Job Description
Job Information
Job Title: Data Architect
Job Requisition ID: 80869
Ministry: Technology and Innovation
Location: Greater Edmonton or Calgary area (Remote)
Full or Part-Time: Full Time
Hours of Work: 36.25 hours per week
Permanent/Temporary: Permanent
Scope: Open Competition
Closing Date: March 24, 2026
Classification: Systems Analyst 3
Salary: $3,756.86 - $4,976.09 biweekly ($98,054 - $129,875 per year)
The Business Technology Operations branch within the Ministry of Technology and Innovation plans for, implements, operates, and supports the digital services foundation technologies and platforms hosting the digital systems and data that empower business partners' services across the Government of Alberta (GoA). The branch ensures that modern technologies, including both on-premises and cloud-based hosting environments, along with an operating framework, are in place to support the GoA's digital and data strategies. The branch is also responsible for the implementation, operation, and support of corporate electronic content management services, as well as legacy business applications, in accordance with policies and standards.
The passionate, solutions-focused people we hire help us drive vital programs and services that affect Albertans. Whatever your position here, you will be part of something great. Join us!
Role Responsibilities
Join the team shaping the future of how the Government of Alberta uses data to drive decisions, deliver services, and power innovation.
The Data Management Platform is the organization's trusted engine for Environmental and Transportation data. It exists to transform data from multiple enterprise source systems into reliable, well-governed, business-ready information products that enable decision-making across all levels of the organization.
As a Data Architect within the Ministry of Technology and Innovation, you will serve in the architectural capacity for assigned data domains, owning the standards, the quality checkpoints, and the design decisions that determine what reaches production and how. Working across delivery teams, platform services, business stakeholders, enterprise system owners, and BI and analytics consumers, you will ensure that data products are built to a consistent, documented standard that makes them trusted for business decision-making and ready for advanced analytics and AI consumption.
This is not a design-only role. You will hold the line at the quality gates, develop the technical capability of the people on the team, govern a pattern library that makes every subsequent delivery faster and safer, and manage the data contracts that allow enterprise systems and AI workloads to depend on the platform reliably.
This role is at the forefront of the enterprise Data Management Platform, a cloud‑based, scalable environment that underpins the GoA’s digital, data, and AI strategies. This is a high‑impact role where your technical leadership will help transform how data is collected, connected, secured, and used across the province.
Responsibilities
Own and govern data architecture across assigned domains: Maintain an active pattern library and Architecture Decision Records. Hold sign-off authority at the quality checkpoints (Architecture Sign-off and Data Contract Confirmed) and return work that does not meet the standard before build begins.
Lead production pipeline design across multiple enterprise integration patterns: Design and govern event-driven, batch, CDC, and API-based ingestion pipelines on Azure and Databricks, including Delta Lake configuration, Unity Catalog governance, orchestration, and performance tuning at production scale. Every pattern is documented, reviewed, and reusable.
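For readers less familiar with the ingestion patterns named above, the core of a CDC (change data capture) feed can be sketched in a few lines: an ordered stream of insert/update/delete records is merged into a keyed target table. This is a conceptual illustration only; the record shape and names below are hypothetical, not GoA or Databricks artifacts, and in the platform described here this merge would typically be expressed as a Delta Lake MERGE rather than plain Python.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChangeRecord:
    """Hypothetical CDC record: op is 'insert', 'update', or 'delete'."""
    op: str
    key: str
    value: Optional[dict] = None

def apply_cdc(target: dict, changes: list) -> dict:
    """Apply an ordered batch of change records to a keyed target table."""
    for rec in changes:
        if rec.op in ("insert", "update"):
            target[rec.key] = rec.value        # upsert by key
        elif rec.op == "delete":
            target.pop(rec.key, None)          # idempotent delete
        else:
            raise ValueError(f"unknown op: {rec.op}")
    return target

table = {"a1": {"status": "open"}}
table = apply_cdc(table, [
    ChangeRecord("update", "a1", {"status": "closed"}),
    ChangeRecord("insert", "b2", {"status": "open"}),
    ChangeRecord("delete", "a1"),
])
# table now holds only the surviving key "b2"
```

Ordering matters: replaying the same batch in a different sequence can yield a different table, which is why CDC pipelines preserve source commit order.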
Author and enforce data contracts for all certified data products: Define grain, schema, data quality rules, SLAs, lineage, and ownership for every dataset entering production. Manage contracts as living commitments, versioning changes and designing for backward compatibility across the enterprise systems, BI reports, and AI workloads that consume the same business-ready layer dataset.
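As a rough sketch of what such a contract captures, the elements listed above (grain, schema, quality rules, SLA, ownership) can be modelled as a versioned, machine-checkable artifact. All names below are illustrative assumptions, not actual GoA contract fields:

```python
from dataclasses import dataclass, field

@dataclass
class DataContract:
    """Illustrative data contract for one certified dataset."""
    dataset: str
    version: str                  # semantic version, e.g. "1.0.0"
    grain: str                    # "one row per ..."
    schema: dict                  # column name -> type name
    quality_rules: list = field(default_factory=list)  # (check_fn, description)
    sla_hours: int = 24           # maximum data freshness
    owner: str = ""

    def validate_row(self, row: dict) -> list:
        """Return the list of contract violations for one row."""
        errors = [f"missing column: {c}" for c in self.schema if c not in row]
        errors += [desc for check, desc in self.quality_rules if not check(row)]
        return errors

contract = DataContract(
    dataset="transport.permits",          # hypothetical dataset name
    version="1.0.0",
    grain="one row per permit per issue date",
    schema={"permit_id": "string", "issued": "date"},
    quality_rules=[(lambda r: bool(r.get("permit_id")),
                    "permit_id must be non-empty")],
    owner="transportation-domain",
)
```

A row that satisfies the schema and every quality rule validates to an empty error list; anything else is returned to the producer before it reaches the business-ready layer.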
Architect and govern complex downstream data sharing across multiple consumer surfaces: Design the integration architecture between the platform and its consumers, including semantic models, downstream enterprise operational systems, cross-domain pipelines, and AI/ML workloads. Manage breaking-change decisions and establish data-sharing protocols that protect downstream consumers from upstream fragility.
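One way to ground the breaking-change decisions mentioned above: compare a contract's old and new schemas and treat column removals and type changes as breaking, while additions remain backward compatible for existing readers. A minimal sketch, with hypothetical column names:

```python
def breaking_changes(old: dict, new: dict) -> list:
    """List schema changes that would break existing consumers.

    Removing a column or changing its type is breaking; adding a new
    column is backward compatible for downstream readers.
    """
    issues = [f"removed: {c}" for c in old if c not in new]
    issues += [f"type changed: {c}" for c in old
               if c in new and old[c] != new[c]]
    return issues

old = {"permit_id": "string", "fee": "int"}
new = {"permit_id": "string", "fee": "decimal", "region": "string"}
# "fee" changed type (breaking); "region" was added (compatible)
```

A non-empty result would force a major version bump on the contract and a migration plan for consumers, rather than a silent in-place change.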
Design data architectures that are AI-ready by default: Ensure business-ready data products meet the quality thresholds, access controls, lineage, and structural completeness required for machine learning pipelines, embedded from the outset rather than retrofitted. Include feature pipeline and Unity Catalog AI governance requirements in every significant design.
Embed data governance, quality, classification, and security into the delivery lifecycle: Apply privacy-by-design, data quality rule frameworks, and classification checkpoints from intake through to production, including sensitive data handling and governance requirements specific to a multi-ministry or multi-domain operating environment.
Build the technical capability of analysts and engineers within assigned domains: Hold named, ongoing accountability for developing Technical Leads, Data Platform Analysts, and Integration Analysts through structured mentoring and technical guidance that produces measurable improvement in independent decision-making, data contract quality, and delivery confidence.
Represent assigned domains at cross-team architecture forums and contribute to shared platform governance: Contribute to the shared pattern library and federated architecture review process. Partner with platform services, cybersecurity, enterprise architecture, and business ministry stakeholders to ensure your domain's architectural choices are coherent with the broader data management environment.
Success Criteria
Architectural authority exercised with clarity: The ability to make a grounded decision, document the reasoning in a form others can use and build on, and hold the standard when delivery pressure creates shortcuts.
Structured problem-solving that finds the root cause: Particularly in data quality failures, integration design conflicts, and downstream consumer breakages, where the presenting symptom and the actual problem are rarely the same thing.
Communication that works at both ends of the room: Translating complex data architecture, systems integration, and governance concepts for delivery engineers and for senior business and government decision-makers, without losing precision in either direction.
An AI workload does not simply consume a Gold layer dataset. It depends on documented lineage to be explainable, consistent quality thresholds to be reliable, versioned access controls to be auditable, and a feature structure designed for model consumption rather than reporting convenience. Architects who understand this difference, and who can design business-ready datasets that serve both BI and a machine learning pipeline without compromising either, will determine how quickly this organization moves from data maturity to genuine AI capability.
Desired Experience
Unity Catalog AI governance and lineage for ML workloads
Agentic AI and automated data pipeline design
Large language model integration patterns and RAG architecture
Machine learning workflows and feature pipeline design
If you’re excited about building the foundation for enterprise‑wide AI capabilities, we encourage you to apply.
Competencies
Agility: Ability to anticipate, assess, and readily adapt to changing priorities, maintain resilience in times of uncertainty, and work effectively in a changing environment.
Drive for Results: Knowing what outcomes are important and maximizing resources to achieve results that are aligned with the goals of the organization, while maintaining accountability to each other and external stakeholders.
Develop self and others: A commitment to lifelong learning and the desire to invest in the development of the long‑term capability of yourself and others.
Systems Thinking: The work done within the APS is part of a larger integrated and inter‑related environment. It is important to know that work done in one part of the APS impacts a variety of other groups/projects inside and outside the APS. Systems thinking allows us to keep broader impacts and connections in mind.
Creative Problem Solving: Ability to assess options and implications in new ways to achieve outcomes and solutions.
Qualifications
Required:
University degree in a related field such as Computer Science, Information Technology, or a closely related discipline.
Minimum of four (4) years of directly related experience in Information Technology, including genuine domain architecture ownership, data engineering, transformation, and enterprise data architecture.
Domain Architecture Ownership (active, multi-domain, recent): Proven experience in enterprise data architecture, with active ownership across multiple distinct data domains within the past two years, evidenced by living documentation, an approved pattern library, and production pipelines currently governed by your decisions. Experience limited to a single domain, or to supporting architecture under someone else's authority, does not meet this criterion.
Gate and Checkpoint Authority. Sus...