Job Description
About the Job
We empower the people who build the world.
Taiyō.AI is the world's first infrastructure intelligence platform, helping some of the world's largest infrastructure companies proactively manage opportunities and threats in a dynamic world. We are building the largest universal, industry-standard database of opportunities (tenders, projects, news) and threats (economy, climate, geopolitics, finance, logistics, etc.) for real assets. Taiyō.AI has been instrumental in shaping how infrastructure companies (infra investors; engineering, procurement, and construction firms; and infra insurers) benchmark new project development opportunities, gain a panoramic and dynamic view of external risks, predict prices and identify their drivers, and mitigate supply-side disruptions.
About The Team
We are looking for a Head of Cloud Backend Engineering to oversee backend operations: managing and monitoring data and the related predictive analytics, providing insight into infrastructure project screening, and dynamically evaluating external risks, with a strong focus on supporting automation, process design, and resource planning.
Your Key Responsibilities
Work on data sourcing
Use web scrapers (BeautifulSoup, Selenium, etc.)
Manage data normalization and standards validation
Parametrize and automate the scrapers
Develop and execute processes for monitoring data sanity and checking data availability and reliability.
Apply exceptional Python scripting practices.
Ensure continuous data accuracy and recognize data discrepancies in systems that require immediate attention/escalation.
Become an expert in the company’s data warehouse and other data storage tools, understanding the definition, context, and proper use of all attributes and metrics.
Create dashboards based on business requirements.
Work with distributed systems, scale, cloud, caching, CI/CD (continuous integration and deployment), distributed logging, data pipelines, recommendation engines, and data-at-rest encryption.
What To Bring
Graduate/postgraduate degree in Computer Science or Engineering.
1-3 years of hands-on experience with AWS OpenSearch 1.0 or Elasticsearch 7.9.
3 years in core software development.
Background in recommendation systems and pipeline software.
Experience with microservices, Django, FastAPI, AWS Cognito and API Gateway, Kubernetes (K8s), Docker, GitLab CI, and Grafana.
Should be able to write complex queries independently.
Minimum 1 year of experience with big data frameworks such as Hadoop, Spark, Impala, or Qubole.
Basic knowledge of programming languages such as Python and Scala.
Ability to work independently and take ownership of your work.
Analytical mindset and strong attention to detail.
Good verbal & written communication skills for coordinating across teams.
OUR VALUES
We are customer obsessed.
We are audacious in vision and action.
We encourage honesty and open dialogue.
We respect everyone and every point of view.
We make objective and data-driven decisions.
We believe trust and accountability go hand-in-hand.
We invest in each other's growth.
We bring our A-game and nothing else.
We take charge and get it done.
Job Details
Posted Date:
December 26, 2025
Job Type:
Technology
Location:
India
Company:
Taiyō.AI
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.