Senior Data Engineer - Job Description
Position Summary
The Senior Data Engineer will design, develop, and optimize data solutions to support business processes and analytics within Investment Management. This role requires deep expertise in modern data engineering practices, cloud-native development, and advanced data transformation techniques. The engineer will work extensively with Azure Data Factory (ADF), Python, PL/SQL, dbt, PySpark, Snowflake, Databricks, and job scheduling tools, while leveraging Azure Function Apps and Gen AI tools such as Cortex AI and GitHub Copilot for automation and productivity.
Key Responsibilities
Architect and implement scalable data pipelines and ETL workflows using ADF, PySpark, and dbt.
Design and optimize data models for Snowflake and other cloud-based data platforms.
Develop and maintain complex data transformation logic using Python, PL/SQL, dbt, Snowpark, the Snowsight web interface, Databricks notebooks, and PySpark.
Integrate job schedulers and Azure Function Apps for orchestration and automation.
Apply Gen AI tools (Cortex AI, GitHub Copilot) to accelerate development, improve code quality, and enhance productivity.
Ensure robust data governance, security, and compliance across all solutions.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Mentor junior engineers and provide technical leadership in data engineering best practices.
Participate in Scrum ceremonies and contribute to continuous improvement initiatives.
Requirements & Qualifications
10-12 years of hands-on experience in data engineering with strong coding skills.
Proven expertise in: dbt, Snowflake, Azure Data Factory (ADF), Python, PySpark, PL/SQL, Databricks.
Experience with job schedulers and Azure Function Apps for automation.
Familiarity with Gen AI tools (e.g., Cortex AI, GitHub Copilot) for accelerating development.
Strong hands-on understanding of CI/CD pipelines, DevSecOps, Data Engineering best practices, and Agile methodologies.
Excellent problem-solving skills and ability to work in a fast-paced environment.
Strong communication and stakeholder management skills.
Education & Certifications
Bachelor of Engineering degree in Computer Science, Information Technology, or related field.
Certifications in Azure Data Engineering, Snowflake, or Databricks are a plus.
Exposure to Investment Management or Financial Services domain preferred.
Please Note:
Regular work hours: 11:00 AM to 8:00 PM IST, with flexibility in work hours.
This position allows for hybrid work, which requires the individual to be in the office 2-3 days a week.
The individual must be available to work US Eastern Standard Time hours as business requires.
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.
Job Details
Posted Date:
March 5, 2026
Job Type:
Technology
Location:
India
Company:
Voya India