Job Description
Primary Skillsets: AWS services (EC2, Lambda, Athena, Step Functions); building ETL/ELT workflows using Python, PySpark scripting, SQL, and AWS Databricks
Secondary Skillsets: React, databases, GitHub Actions, Terraform. Should also have worked in an Agile environment (User Stories/Features and all Agile rituals/meetings)
1. Design, develop, and maintain scalable data pipelines using AWS services, including EC2, Lambda, Step Functions, and Athena
2. Maintain EC2 instances with auto-scaling policies; implement and manage IAM roles, policies, and security within AWS applications
3. Strong Python scripting skills: list comprehensions and generators, string manipulation and formatting, data structures (lists, dictionaries, sets, tuples), file I/O operations, and handling a variety of exception scenarios
4. AWS Lambda: Function development, event-driven architecture, different trigger types (S3, DynamoDB, API Gateway, EventBridge, SQS, SNS, CloudWatch)
5. AWS Step Functions: workflow orchestration, state machines, and error handling. Also strong in EC2: instance types, auto-scaling groups, launch configurations/templates, and scaling policies (target tracking, step scaling, scheduled)
6. Worked with AWS Databricks to build and maintain data lakehouse architectures, execute notebook pipelines, optimize performance, and troubleshoot and resolve data pipeline issues
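As a rough illustration of the Python fundamentals called out above (list comprehensions, generators, core data structures, file I/O, and exception handling), here is a minimal sketch. All names and sample data are hypothetical and purely illustrative, not part of the role's actual codebase:

```python
import json


def summarize_orders(orders):
    """Illustrates data structures, comprehensions, and generators."""
    # List comprehension: keep only completed orders
    completed = [o for o in orders if o["status"] == "completed"]
    # Generator expression: lazily sum amounts without building a temp list
    total = sum(o["amount"] for o in completed)
    # Set comprehension: unique regions among completed orders
    regions = {o["region"] for o in completed}
    return {"count": len(completed), "total": total, "regions": sorted(regions)}


def write_summary(path, summary):
    """Illustrates file I/O with exception handling."""
    try:
        with open(path, "w", encoding="utf-8") as f:
            json.dump(summary, f)
    except OSError as err:
        # Handle I/O failures (permissions, missing directory, etc.)
        print(f"Failed to write {path}: {err}")
        raise


orders = [
    {"status": "completed", "amount": 120.0, "region": "us-east-1"},
    {"status": "pending", "amount": 75.0, "region": "us-west-2"},
    {"status": "completed", "amount": 30.5, "region": "us-east-1"},
]
summary = summarize_orders(orders)
print(summary)  # {'count': 2, 'total': 150.5, 'regions': ['us-east-1']}
```

The same patterns (comprehensions for filtering, generators for lazy aggregation, structured exception handling around I/O) carry over directly to the PySpark and Lambda work described in this posting.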
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.
Job Details
Posted Date: February 26, 2026
Job Type: Construction
Location: India
Company: YASH Technologies