Job Description
GCP Data Engineer (BigQuery, Dataflow, Composer)
Role Description
Design and build scalable, secure, and high-performance data pipelines on GCP.
Develop and optimize ETL/ELT workflows using Cloud Composer, Dataflow, Dataproc, and BigQuery.
Implement data ingestion frameworks for batch and streaming data (Pub/Sub, Kafka, Dataflow).
Model, partition, and optimize datasets in BigQuery for analytics use cases.
Collaborate with data scientists, architects, and business teams to deliver end-to-end data solutions.
Ensure data quality and reliability through monitoring, validation, and automation.
Implement CI/CD pipelines for data workflows using Cloud Build, Git, and Terraform.
Optimize cost, performance, and scalability across GCP data services.
Ensure security best practices, IAM policies, and compliance with organizational standards.
Skills
Digital: Big Data and Hadoop Ecosystems
Digital: Google Data Engineering
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.
Job Details
Posted Date:
March 8, 2026
Job Type:
Technology
Location:
Canada
Company:
Astra-North Infoteck Inc. ~ Conquering today’s challenges, achieving tomorrow’s vision!