
Senior / Mid-Level Microsoft Fabric Data Engineer

๐Ÿ“ India

Technology · Squash Apps

Job Description

Company Description

Squash Apps is a highly rated full-stack consulting company dedicated to creating next-generation, scalable, and robust web applications for visionary clients. Leveraging cutting-edge technologies such as the MEAN, MERN, and MEVN stacks, the Java stack, SQL/NoSQL, Elasticsearch, Redis, and hybrid mobile apps, Squash Apps delivers exceptional software solutions. With a portfolio highlighted on its website and a successful eCommerce product, Panitr, Squash Apps emphasizes innovation and quality. The team is passionate about developing top-tier applications and invites forward-thinkers to join its growing organization.

We are looking for a Senior Microsoft Fabric Data Engineer to lead end-to-end data pipeline development across the Microsoft Fabric ecosystem. The ideal candidate is strong in Data Factory, Lakehouse architecture, Delta Lake, Spark, and Power BI integrations.

Responsibilities

● Design and build scalable ELT/ETL pipelines using Microsoft Fabric (Data Factory, Dataflows, Notebooks, Spark).
● Develop and optimize Lakehouse / Delta Lake layers (Bronze–Silver–Gold).
● Implement data ingestion pipelines using Fabric Data Factory Pipelines and Dataflows Gen2.
● Work with Fabric Lakehouse, Warehouses, KQL DB, and OneLake.
● Build and tune Spark notebooks for transformations and large-scale processing.
● Integrate with Power BI for semantic modeling and data product delivery.
● Ensure data governance using Fabric capacities, security roles, and lineage tools.
● Implement CI/CD using Azure DevOps / GitHub, including branching and deployment.
● Mentor teams, perform code reviews, and drive best practices.
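The Bronze–Silver–Gold (medallion) layering mentioned above amounts to progressively cleaning and aggregating raw data. As a rough, dependency-free sketch (plain Python with hypothetical order records; in Fabric this step would typically be a PySpark notebook reading and writing Delta tables in the Lakehouse):

```python
# Hypothetical raw "Bronze" records as ingested: duplicates and nulls included.
bronze = [
    {"order_id": 1, "order_date": "2024-01-01", "amount": 100.0},
    {"order_id": 1, "order_date": "2024-01-01", "amount": 100.0},  # duplicate
    {"order_id": 2, "order_date": None, "amount": 55.5},           # missing date
]

def to_silver(rows):
    """Bronze -> Silver: deduplicate on order_id and drop rows missing a date."""
    seen, silver = set(), []
    for row in rows:
        if row["order_date"] is None or row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        silver.append(row)
    return silver

def to_gold(rows):
    """Silver -> Gold: aggregate into a reporting-ready shape (total per day)."""
    totals = {}
    for row in rows:
        totals[row["order_date"]] = totals.get(row["order_date"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'2024-01-01': 100.0}
```

The field names and cleaning rules here are illustrative only; the point is that each layer applies a well-defined, repeatable refinement over the previous one.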

Required Skills

● 7–12+ years of Data Engineering; 2–3+ years of solid experience with Microsoft Fabric.
● Strong in PySpark, SQL, Delta Lake, and Spark Notebooks.
● Deep understanding of OneLake, Lakehouse, Warehouses, and Data Factory.
● Experience with Azure.
● Strong in data modeling (star/snowflake), ELT, and Lakehouse architecture.
● Strong hands-on experience with Git, CI/CD, and Azure DevOps.
● Excellent performance tuning skills for Spark workloads.

Good to Have

● Power BI DAX / Semantic Models exposure
● Experience with KQL / real-time analytics
● Azure or Fabric certifications
● Knowledge of DataOps, monitoring, and cost optimization

Microsoft Fabric Data Engineer (4–6 Years Experience)

We're looking for a Microsoft Fabric Data Engineer to build modern, scalable data pipelines across the Microsoft Fabric ecosystem. Perfect for someone with strong skills in Spark, Data Factory, Lakehouse, and SQL.

Responsibilities โ— Build and maintain

ELT/ETL pipelines

using Microsoft Fabric (Data Factory, Dataflows Gen2, Notebooks). โ— Develop

Lakehouse layers (Bronzeโ€“Silverโ€“Gold)

using Delta Lake. โ— Work with

Fabric Lakehouse, Warehouses, KQL DB, OneLake

for end-to-end data solutions. โ— Perform data ingestion, transformation, and performance tuning using

Spark notebooks .โ— Integrate datasets with

Power BI

and semantic models. โ— Manage deployments using

Git + CI/CD

(Azure DevOps / GitHub). โ— Ensure data quality, documentation, and best practices.

Required Skills

● 4–6 years of Data Engineering experience.
● Hands-on with Microsoft Fabric components: Data Factory, Lakehouse, Pipelines, Notebooks.
● Strong in PySpark, SQL, Delta Lake, and Spark transformations.
● Experience with Azure.
● Good understanding of data modeling and Lakehouse architecture.
● Experience with Git and basic CI/CD workflows.

✨ Good to Have

● Exposure to Power BI (DAX or modeling).
● Knowledge of KQL or real-time analytics.
● Azure / Fabric certifications.

Ready to Apply?

Don't miss this opportunity! Apply now and join our team.

Job Details

Posted Date: December 19, 2025
Job Type: Technology
Location: India
Company: Squash Apps
