Job Description
This range is provided by Confidential Company. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.
Base pay range
CA$130,000.00/yr - CA$160,000.00/yr
Full-time position with paid Vacation & Holidays + Benefits Program
Must be a Canadian Citizen or Canadian Permanent Resident.
100% Remote from within Canada (not Quebec). Must work EST hours.
Small B2B SaaS cloud-based software company (fewer than 800 employees) that is experiencing high growth and rolling out leading-edge AI-enabled features. They need a Data Expert/Data Team Lead/Data Architect who can lead their migration from a monolith to microservices, moving from MongoDB to Redshift using AWS Glue, PySpark, dbt, and Python.
This is NOT a typical Senior Data Engineer role. This new hire must be the actual LEADER of the data migration (Data Team Lead/Architect).
This Lead Data Engineer/Data Architect will be the highest-ranking Data Expert on the team and report to the VP of Technical Engineering. There is no one else.
This person must lead the way and make recommendations on how to do it. This is NOT just a Senior Engineer who follows orders and uses ETL to migrate or modernize the data as directed.
Must be an expert in Database Management and Database Administration, especially MongoDB and Domain Data Modeling.
This seasoned Senior Data Engineer will help lead the modernization of our data infrastructure as we transition from a tightly coupled monolithic system to a scalable, microservices-based architecture. This role is central to decoupling legacy database structures, enabling domain-driven service ownership, and powering real-time analytics, operational intelligence, and AI initiatives across our platform.
Key Responsibilities
Monolith-to-Microservices Data Transition: Lead the decomposition of monolithic database structures into domain-aligned schemas that enable service independence and ownership.
Pipeline Development & Migration: Build and optimize ETL/ELT workflows using Python, PySpark/Spark, AWS Glue, and dbt, including schema/data mapping and transformation from on-prem and cloud legacy systems into data lake and warehouse environments (an illustrative sketch follows this list).
Domain Data Modeling: Define logical and physical domain-driven data models (star/snowflake schemas, data marts) to serve cross-functional needs: BI, operations, streaming, and ML.
Legacy Systems Integration: Design strategies for extracting, validating, and restructuring data from legacy systems with embedded logic and incomplete normalization.
Database Management: Administer, optimize, and scale SQL (MySQL, Aurora, Redshift) and NoSQL (MongoDB) platforms to meet high-availability and low-latency needs.
Cloud & Serverless ETL: Leverage AWS Glue Catalog, Crawlers, Lambda, and S3 to manage and orchestrate modern, cost-efficient data pipelines.
Monitoring & Optimization: Implement observability (CloudWatch, logs, metrics) and performance tuning across Spark, Glue, and Redshift workloads.
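To give a concrete sense of the pipeline work described in the Pipeline Development & Migration responsibility above, here is a minimal, illustrative PySpark sketch of one such ELT step. It assumes MongoDB documents have already been landed in S3 as JSON and are being flattened and loaded into a Redshift staging table over JDBC; the bucket, table, column names, and connection details are hypothetical placeholders, and a production Glue job would typically use the awsglue job wrapper, the Glue Data Catalog, and a secrets manager rather than inline credentials.

    # Minimal sketch of a MongoDB-to-Redshift ELT step. All paths, table
    # names, and credentials below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-mongo-to-redshift").getOrCreate()

    # Read the raw MongoDB export (JSON documents) from the S3 landing zone.
    raw = spark.read.json("s3://example-landing-zone/mongo/orders/")

    # Flatten the nested documents into a domain-aligned, warehouse-friendly shape.
    orders = (
        raw.select(
            F.col("_id").alias("order_id"),
            F.col("customer.id").alias("customer_id"),
            F.col("status"),
            F.to_timestamp("createdAt").alias("created_at"),
            F.col("total.amount").cast("decimal(12,2)").alias("total_amount"),
        )
        .dropDuplicates(["order_id"])
    )

    # Load into a Redshift staging table over JDBC; a Glue job would more
    # commonly use the Glue Redshift connector with an S3 staging path.
    (
        orders.write.format("jdbc")
        .option("url", "jdbc:redshift://example-cluster:5439/analytics")
        .option("dbtable", "staging.orders")
        .option("user", "etl_user")
        .option("password", "REDACTED")
        .option("driver", "com.amazon.redshift.jdbc42.Driver")
        .mode("append")
        .save()
    )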
Minimum Requirements
10+ years in data engineering with a proven record of modernizing legacy data systems and driving large-scale migrations from monolithic to microservices architectures (preparing modernized data to be ready for use in AI modules).
Must have experience as an actual Data Lead/Data Architect leading the way, NOT just a Senior Data Engineer following orders. Must be a LEADER who can give direction, make recommendations, and push back when needed.
Must be an expert in MongoDB and Redshift, and an ace using AWS Glue, PySpark, dbt, and Python.
Must be a "very hands-on" engineer. This is an individual contributor role. There are no other data engineers. Some offshore assistance may be available, but the migration will mostly be up to this person.
Must have great English communication skills and work well in a team environment. The position will work closely with solution architects and domain owners to design resilient pipelines and data models that reflect business context and support scalable, secure, and auditable data access for internal and external consumers.
Must be a Canadian Citizen or Canadian Permanent Resident. We can NOT hire anyone in Quebec. Must live in Canada.
Must be comfortable working for a small company with fewer than 800 employees.
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Information Technology
Industries
IT System Custom Software Development
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.
Job Details
Posted Date:
December 16, 2025
Job Type:
Technology
Location:
Canada
Company:
Confidential Company