Job Description
Job Title: GCP Data Engineer
Location: Gurgaon
Experience: 3-8 years
About the Role:
- Design, build, and maintain large-scale data pipelines on BigQuery and other Google Cloud Platform (GCP) services.
- Use Python and PySpark/Spark to transform, clean, aggregate, and prepare data for analytics/ML (a PySpark sketch follows this list).
- Orchestrate workflows using Cloud Composer (Airflow) to schedule, monitor, and operationalize jobs (a DAG sketch follows this list).
- Optimize query performance and cost in BigQuery through partitioning and clustering design.
- Work with structured, semi-structured and unstructured data, integrating multiple data sources.
- Collaborate with data scientists, analysts, and business stakeholders to translate requirements into data solutions.
- Implement data governance, quality checks, pipeline monitoring, version control, and CI/CD practices.
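
As a flavor of the pipeline work above, here is a minimal PySpark sketch of a typical clean-and-aggregate step. The bucket paths, column names, and table layout are hypothetical placeholders, not details from this posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-daily-agg").getOrCreate()

    # Read raw events from Cloud Storage (hypothetical bucket/path).
    orders = spark.read.json("gs://example-raw-zone/orders/2025-11-28/*.json")

    # Clean: drop malformed rows and normalize types.
    cleaned = (
        orders
        .dropna(subset=["order_id", "amount"])
        .withColumn("amount", F.col("amount").cast("double"))
    )

    # Aggregate: daily revenue per customer, ready for loading into BigQuery.
    daily = (
        cleaned
        .groupBy("customer_id", F.to_date("created_at").alias("order_date"))
        .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
    )

    # Write results back to Cloud Storage as Parquet for a BigQuery load job.
    daily.write.mode("overwrite").parquet("gs://example-curated-zone/orders_daily/")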
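
And a minimal Cloud Composer (Airflow) DAG sketch showing how such a job might be scheduled and chained. The DAG id, task names, and shell commands are illustrative placeholders; in practice the tasks would submit a Dataproc job and run a BigQuery load.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical daily pipeline: names and schedule are illustrative only.
    with DAG(
        dag_id="orders_daily_pipeline",
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        # Placeholder: submit the PySpark transform (e.g., to Dataproc).
        transform = BashOperator(
            task_id="run_pyspark_transform",
            bash_command="echo 'submit Dataproc job here'",
        )
        # Placeholder: load the curated output into BigQuery.
        load = BashOperator(
            task_id="load_to_bigquery",
            bash_command="echo 'run bq load here'",
        )
        transform >> load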
Required Skills / Qualifications:
- Strong hands-on experience with GCP services: BigQuery, Cloud Storage, Dataproc, Dataflow, Pub/Sub, and Cloud Composer; for example, designing pipelines, data ingestion, and transformations.
- Proficiency in Python (scripting, ETL, automation) and PySpark (or Spark) for large-scale data processing.
- Excellent SQL and BigQuery SQL skills, including query optimization and partitioning/clustering design (see the table-design sketch after this list).
- Experience with workflow orchestration tools such as Cloud Composer (Airflow) or equivalent schedulers.
- Experience building and managing ETL/ELT and data-warehouse solutions at scale (data modeling, schemas, star/snowflake, analytics).
- Good understanding of cloud-native architecture, cost optimization, data security, and monitoring; DevOps/CI/CD experience is a plus.
- Preferred: Google Cloud Professional Data Engineer certification, or hands-on experience with large-scale GCP projects.
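
To illustrate the partitioning/clustering design mentioned above, here is a minimal sketch using the BigQuery Python client. The dataset, table, and column names are hypothetical, and the choices shown are one common pattern, not a prescribed design.

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    # Hypothetical table: partitioned by date to prune scans (and cost),
    # clustered on a common filter/join key to co-locate related rows.
    ddl = """
    CREATE TABLE IF NOT EXISTS analytics.orders_daily (
      customer_id STRING,
      order_date DATE,
      revenue FLOAT64,
      order_count INT64
    )
    PARTITION BY order_date
    CLUSTER BY customer_id
    """
    client.query(ddl).result()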
Job Details
Posted Date: November 28, 2025
Job Type: Technology
Location: India
Company: Impetus
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.