Job Description
We are looking for a highly experienced Solution Architect with a strong background in building and scaling cloud-native data platforms. This role is Databricks-heavy, focusing on Unity Catalog, Medallion Architecture, Delta Lake, governance, and modern data engineering patterns. You will collaborate with cross-functional teams to translate business needs into well-architected solutions and guide delivery across enterprise environments.
Responsibilities
Design and lead implementation of end-to-end Databricks Lakehouse platforms using Delta Lake, Delta Live Tables, and MLflow.
Architect Medallion Architecture (Bronze/Silver/Gold) for structured, semi-structured, and streaming workloads.
Implement governed Lakehouse patterns using Unity Catalog for access control, lineage, data classification, and secure sharing.
Build scalable ETL/ELT pipelines using Databricks Notebooks, Workflows, SQL Warehouses, and Spark-based transformations.
Develop real-time streaming pipelines with Auto Loader, Structured Streaming, and event-driven platforms (Kafka, Kinesis, Pub/Sub).
Integrate Databricks with cloud-native services such as AWS Glue, Azure Data Factory, and GCP Dataform.
Define distributed integration patterns using REST APIs, microservices, and event-driven architectures.
Enforce data governance, RBAC/ABAC, encryption, secret management, and compliance controls.
Optimize Delta Lake tables, Spark workloads, and cluster configurations using Photon and autoscaling patterns.
Drive cloud cost optimization across storage, compute, and workflow orchestration.
Participate in architecture reviews, set standards, and support engineering teams throughout execution.
Stay current on Databricks capabilities including Unity Catalog updates, Lakehouse Federation, serverless compute, and AI/ML features.
Qualifications
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
8+ years of experience in enterprise software, cloud architecture, or data engineering roles.
Strong hands-on experience with Databricks, Apache Spark, Delta Lake, and Lakehouse platform design.
Experience implementing and administering Unity Catalog for governance, lineage, and fine-grained access control.
Experience designing Medallion Architecture for analytics and engineering workloads.
Hands-on experience with cloud platforms such as AWS, Azure, or GCP, including storage, compute, and networking services.
Experience with streaming technologies such as Kafka, Kinesis, or Pub/Sub.
Strong understanding of data modeling, workflow orchestration (Airflow, Databricks Workflows, dbt), and pipeline automation.
Familiarity with Scala-based Spark workloads in addition to PySpark and SQL pipelines.
Skilled in performance tuning, Spark optimization, cluster policies, and cloud cost management.
Excellent communication skills for technical leadership and stakeholder collaboration.
Certifications in Databricks, AWS/GCP/Azure Solution Architecture, or TOGAF are a plus.
Seniority Level
Mid‑Senior level
Employment Type
Contract
Job Function
Engineering and Information Technology
Industries: IT Services and IT Consulting
Company: CloudTech Innovations
Location: Melbourne, Victoria, Australia
Salary: A$125.00 – A$157.00