
GCP Data Architect

📍 Toronto, Canada

Software International · Technology

Job Description

Software International (SI) supplies technical talent to a variety of clients, ranging from Fortune 100/500/1000 companies to small and mid-sized organizations across Canada, the US, and Europe.

We currently have a contract role as a GCP Data Architect with our global consulting client, working remotely. The contract is six months initially, with potential for extension.

Type: Contract

Duration: 6 months to start, with potential extension

Location: Toronto, ON (remote with occasional office visits)

Rate: $100 - $120 CAD/hr C2C, depending on overall experience

GCP Data Architect - Role Overview

We are seeking a highly skilled Google Cloud Platform (GCP) Data Architect with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate combines deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role collaborates with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.

Key Responsibilities

Data Strategy, Security & Governance

- Define and implement an enterprise-wide data strategy aligned with business goals.
- Establish data governance frameworks, data classification, retention, and privacy policies.
- Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, PIPEDA).

Data Modeling

- Design conceptual, logical, and physical data models to support analytics and operational workloads.
- Implement star, snowflake, and data vault models for analytical systems.
- Implement S/4 CDS views in Google BigQuery.
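To make the star-schema modeling concrete, here is a minimal sketch of a fact table joined to two dimensions, with the kind of aggregate query such a model serves. It uses sqlite3 purely as a stand-in for BigQuery, and every table, column, and value is invented for illustration:

```python
import sqlite3

# Hypothetical star schema: one fact table, two dimension tables.
# Names and data are illustrative only, not from the posting.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
CREATE TABLE fact_sales   (customer_key INTEGER, date_key INTEGER, amount REAL);

INSERT INTO dim_customer VALUES (1, 'Acme', 'Ontario'), (2, 'Globex', 'Quebec');
INSERT INTO dim_date     VALUES (20250101, '2025-01-01', 2025);
INSERT INTO fact_sales   VALUES (1, 20250101, 100.0), (2, 20250101, 250.0);
""")

# Typical star-schema query: aggregate the fact table by a dimension attribute.
rows = cur.execute("""
    SELECT c.region, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d     ON d.date_key = f.date_key
    WHERE d.year = 2025
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # → [('Ontario', 100.0), ('Quebec', 250.0)]
```

The shape is the point: facts carry measures and foreign keys only, while descriptive attributes live in the dimensions, which is what keeps analytical joins simple and fast.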

Cloud Architecture & Integration

- Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
- Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
- Integrate data from multiple systems, including SAP BW, SAP HANA, and BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
- Leverage integration tools such as Boomi for system interoperability.
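The ELT pattern behind those pipelines can be sketched in a few lines: land the raw records first, then transform with SQL inside the warehouse. This is a hedged toy version; sqlite3 stands in for BigQuery, the records pretend to come from an SAP source via SLT or Cortex, and in practice Cloud Composer (Airflow) would schedule the steps:

```python
import sqlite3

# Pretend extract: raw records as they might arrive from an SAP source.
raw_records = [
    {"material": "M-01", "qty": "5"},
    {"material": "M-02", "qty": "3"},
    {"material": "M-01", "qty": "2"},
]

conn = sqlite3.connect(":memory:")

# Load step: land the data untyped, exactly as received.
conn.execute("CREATE TABLE raw_stock (material TEXT, qty TEXT)")
conn.executemany("INSERT INTO raw_stock VALUES (:material, :qty)", raw_records)

# Transform step: cast and aggregate inside the "warehouse".
conn.execute("""
    CREATE TABLE stock_by_material AS
    SELECT material, SUM(CAST(qty AS INTEGER)) AS total_qty
    FROM raw_stock GROUP BY material
""")
totals = dict(conn.execute(
    "SELECT material, total_qty FROM stock_by_material ORDER BY material"))
print(totals)  # → {'M-01': 7, 'M-02': 3}
```

Landing raw data before transforming it is what distinguishes ELT from ETL: the warehouse keeps an untouched copy, so transformations can be re-run or corrected without re-extracting from SAP.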

Programming & Analytics

- Develop complex SQL queries for analytics, transformations, and performance tuning.
- Build automation scripts and utilities in Python.
- Apply a good understanding of CDS views and the ABAP language.
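As one example of the "complex SQL for analytics" this role involves, the sketch below ranks orders per customer with a window function. The schema and data are invented; sqlite3 runs it here, and BigQuery accepts the same `ROW_NUMBER() OVER (PARTITION BY ...)` syntax:

```python
import sqlite3

# Illustrative analytics query: top order per customer via a window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_id INTEGER, amount REAL);
INSERT INTO orders VALUES
  ('Acme', 1, 50.0), ('Acme', 2, 80.0), ('Globex', 3, 30.0);
""")

top_orders = conn.execute("""
    SELECT customer, order_id, amount FROM (
        SELECT customer, order_id, amount,
               ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rn
        FROM orders
    ) AS t
    WHERE rn = 1
    ORDER BY customer
""").fetchall()
print(top_orders)  # → [('Acme', 2, 80.0), ('Globex', 3, 30.0)]
```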

System Migration

- Lead on-premise-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
- Manage migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.
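"Ensuring data integrity" during such a migration usually means reconciling source and target. The hedged sketch below shows one common approach: compare row counts plus an order-independent content fingerprint. The row shapes are illustrative; a real check would stream rows from SAP and BigQuery rather than use in-memory lists:

```python
import hashlib

def table_fingerprint(rows):
    """Row count plus XOR of per-row SHA-256 digests.

    XOR makes the fingerprint independent of row order, so source and
    target need not be sorted identically before comparison.
    """
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return len(rows), acc

# Same data, different order: the fingerprints should still match.
source = [("M-01", 5), ("M-02", 3)]
target = [("M-02", 3), ("M-01", 5)]

match = table_fingerprint(source) == table_fingerprint(target)
print(match)  # → True
```

A mismatch in the count catches dropped or duplicated rows; a mismatch in the digest catches silently altered values, which count-only checks miss.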

DevOps for Data

- Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
- Apply infrastructure-as-code principles for reproducible and scalable deployments.

Preferred Skills

- Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
- Strong SQL and Python programming skills.
- Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
- Knowledge of data governance frameworks and data security best practices.
- Experience with Boomi, Informatica, or MuleSoft for SAP and non-SAP integrations.
- Experience with the Google Cortex Framework for SAP-GCP integrations.


Ready to Apply?

Don't miss this opportunity! Apply now and join our team.

Job Details

Posted Date: November 21, 2025
Job Type: Technology
Location: Toronto, Canada
Company: Software International
