Job Description
Location:
Hyderabad, India (Offshore Development Centre) for a US-based MNC; roles to open as the centre is established in 2026
Function:
Trading Technology / Data Platform Engineering
Reports To:
Trading Data Engineering Development Lead – London
Role Type:
Full-Time, Permanent
Work Schedule:
Aligned with UK and US trading hours (rotational or extended coverage)
Overview
Our client, a leading global commodity trading house, is expanding its Trading Data Engineering
team in India to support the delivery of high-quality market and fundamental data platforms that power trading, analytics, and risk functions across global desks.
The Trading Data Platform Developer will focus on engineering scalable data services and access layers, including APIs and Excel-based integrations, in a cloud-native (AWS-hosted) environment. The role requires deep technical proficiency in Python-based microservices (e.g., FastAPI), data engineering pipelines, and containerized deployments using Kubernetes and Docker.
This is a hands-on engineering position, working closely with global teams in London, Houston, and Singapore to ensure the reliability, performance, and scalability of data systems supporting Oil, Gas, and Power trading.
Key Responsibilities
1. Data Platform Engineering
Design, build, and maintain data pipelines ingesting market data, fundamental data, and reference datasets from internal and external sources.
Develop and optimize ingestion frameworks, transformations, and access layers using Python, AWS services (Glue, Lambda, ECS, S3, Athena, Redshift), and streaming tools such as Kafka.
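The ingest-and-transform work described above can be sketched in outline. This is an illustrative Python sketch only: the field names (`sym`, `px`, `ts`) and record shape are assumptions standing in for a vendor payload, not the client's actual schema.

```python
import json
from datetime import datetime, timezone

def normalize_tick(raw: dict) -> dict:
    """Normalize one raw market-data record into a canonical shape.

    The source field names ('sym', 'px', 'ts') are hypothetical; a real
    pipeline would map each vendor's schema explicitly.
    """
    return {
        "symbol": raw["sym"].upper(),
        "price": float(raw["px"]),
        # Store timestamps as UTC ISO-8601 strings for downstream queries.
        "ts": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
    }

def run_batch(lines: list[str]) -> list[dict]:
    """Parse a batch of JSON lines, skipping records that fail validation."""
    out = []
    for line in lines:
        try:
            out.append(normalize_tick(json.loads(line)))
        except (KeyError, TypeError, ValueError):
            continue  # a production pipeline would dead-letter these instead
    return out
```

The same normalize-then-validate step applies whether records arrive from a Kafka consumer loop or a Glue batch job; only the surrounding I/O changes.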
Collaborate with data scientists, analysts, and quants to ensure data accessibility for trading, pricing, and risk analytics.
2. API and Data Access Layer Development
Develop and maintain RESTful APIs using FastAPI to serve market and fundamental data to internal applications and dashboards.
Build scalable and performant Excel Add-ins and other client interfaces for traders and analysts to access real-time and historical data.
Manage authentication, authorization, and throttling for data access services, ensuring security and compliance.
3. Cloud-Native Infrastructure and DevOps
Deploy and manage applications using Docker containers and Kubernetes pods in AWS-hosted environments.
Implement CI/CD pipelines (GitHub Actions, Jenkins) for automated testing, deployment, and release management.
Monitor application health and performance using CloudWatch, Grafana, or Prometheus, ensuring resilience and observability.
4. Collaboration and Continuous Improvement
Partner with global Data Engineering, Market Data, and Risk Technology teams to align platform enhancements with business priorities.
Contribute to data quality monitoring, lineage tracking, and metadata management initiatives.
Drive continuous improvement by automating repetitive processes and adopting new frameworks or tools that enhance performance and maintainability.
Key Skills and Experience
Technical Skills
Strong proficiency in Python and experience with frameworks such as FastAPI or Flask for microservice development.
Hands-on experience with the AWS cloud ecosystem, including Lambda, ECS, EKS, S3, Glue, CloudFormation, and IAM.
Solid understanding of containerization (Docker) and orchestration using Kubernetes.
Experience with data pipelines, streaming frameworks (Kafka, Kinesis), and distributed data processing.
Familiarity with SQL and data warehouse technologies (Redshift, Snowflake, or similar).
Exposure to API authentication and security (OAuth2, JWT) and Excel integration (COM Add-ins, xlwings, or similar).
Experience with CI/CD, testing automation, and Git-based workflows.
Soft Skills
Strong communication and collaboration skills across global time zones and functional teams.
Analytical thinker with a structured approach to problem-solving in high-pressure trading environments.
Demonstrates initiative in identifying and delivering automation or efficiency improvements.
Detail-oriented with a commitment to clean, maintainable, and well-documented code.
Experience
5–10 years of total experience in data engineering, platform development, or trading technology.
Prior experience in commodity trading, investment banking, or financial data engineering preferred.
Proven experience developing and deploying cloud-native applications in AWS.
Exposure to global support and development collaboration models advantageous.
Education
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical discipline.
Success Indicators
Delivery of reliable and high-performance data APIs and access layers supporting trading and analytics.
Demonstrated ability to deploy and scale containerized workloads in production.
Reduction in latency and improvement in data availability metrics across critical systems.
Positive feedback from trading and data stakeholders regarding platform stability and usability.
Why This Role Matters
This role is foundational to building a scalable and resilient trading data platform
that enables real-time decision-making across global commodity markets.
As the offshore centre grows, the position offers strong career progression towards Senior Data Engineer or Technical Lead roles within the global Trading Technology organization.