
Senior Data & ML Engineer

📍 Australia

Technology · Predelo Pty Ltd

Job Description

Senior Data & ML Engineer (Hybrid)

About Predelo

Predelo is an AI decision agent for back‑office operations, transforming operational data into trusted predictions and automated actions. Built on a state‑of‑the‑art forecasting and optimisation engine developed over five years, we are currently focused on Workforce Management optimisation through a strategic partnership with Deputy, reaching 1.7 million end users. We are trusted by enterprise brands operating across thousands of locations and complex labour environments. We operate at scale: 2.2B+ records in Databricks, ~170,000 shifts generated per week, and 20,000+ employees supported across the US and Australia.

Job Overview

You will own Predelo's data platform component (internally: Cybertron) end‑to‑end. This is a platform‑as‑a‑product role. You don't just "build pipelines"; you build the core platform and standards that enable faster AI product development across forecasting, optimisation, automation, analytics, and customer onboarding.

Your first mandate is to modernise and harden our Databricks lakehouse foundation (governance, orchestration, reliability, and developer experience) so teams can ship AI‑powered product capabilities quickly and safely.

This is not a ticket queue role. You'll have architecture authority, hands‑on delivery expectations, and operational ownership.

What We're Looking For

Deep hands‑on Databricks experience (Spark, Delta Lake) in production.

Unity Catalog governance and migration experience.

Strong SQL and Python/PySpark.

Strong data modelling skills (medallion architecture, dimensional modelling).

Experience with dbt and/or DLT including CI/CD and testing patterns.

AWS fundamentals (S3, IAM/KMS, eventing/Lambda; Step Functions a bonus).

Strong engineering hygiene: version control, testing, observability, operational readiness.

Proven ability to leverage AI tools to improve speed and quality.

Key Responsibilities

Own the Databricks lakehouse foundation: Delta Lake, Unity Catalog, compute patterns, job and workflow orchestration, and performance tuning.

Own the transformation and modelling layer: dbt today, and evaluate where managed patterns (e.g., DLT) are a better fit.

Be the technical owner of CONNECT reliability patterns: idempotency, retries and backoff, replay, and data freshness SLAs from source to gold.

Treat data as a product: define stable data products, contracts, validation, lineage expectations, documentation, and clear ownership for priority datasets.

Build DataOps as code: monitoring, alerting, runbooks, and guardrails that prevent repeat incidents.

Partner with product, ML, and integrations teams to keep platform priorities tied to customer value and AI delivery velocity.

Operate what you build: incident response, postmortems, systematic fixes, and continuous reliability improvement.

For the right candidate: help own and improve our MLOps platform (SageMaker and Step Functions), including reliability, repeatability, and faster experimentation‑to‑production.

Liaise with and manage our Databricks and AWS account teams, and stay active in the Databricks and data platform community to bring back best practices and new capabilities.

Why Join Predelo?

Own high‑impact AI systems operating at real‑world scale.

Work in a small, high‑leverage, AI‑native team.

Ship fast, learn from production, and see direct customer impact.

Competitive compensation + ESOPs.


Ready to Apply?

Don't miss this opportunity! Apply now and join our team.

Job Details

Posted Date: March 4, 2026
Job Type: Technology
Location: Australia
Company: Predelo Pty Ltd
