Job Description
About Angel One:
Angel One is one of India’s fastest-growing fintechs, on a bold mission to make investing simple, smart, and inclusive for every Indian. With over 3 crore clients, we’re building at scale – and building for impact.
Our Super App helps clients manage their investments, trade seamlessly, and access financial tools tailored to their goals. We are working to build personalized financial journeys for our clients, powered by new-age tech, AI, Machine Learning and Data Science.
We're a builder's company at heart. You’ll have the space to experiment, the freedom to move with velocity, and the mandate to make bold, user-first decisions – every single day.
The vibe? Think less hierarchy, more momentum. Everyone has a seat at the table and a shot to build something that lasts.
Be part of a team that’s scaling sustainably, thinking big, and building for the next billion.
Why You'll Love Working at Angel One!
- Tech Systems that run at Scale: From AI to real-time data infra, you’ll work on tech that’s ahead of the curve and solve problems that truly matter.
- Build one of India’s Leading Fintech Platforms: We’re not just disrupting finance – we’re shaping how a billion Indians access wealth.
- Own It. Drive It. Scale It: You’ll have the freedom to lead, the resources to build, and the opportunity to leave your mark.
- Empowered Growth: We invest in your growth and empower you to explore your full potential.
- Exceptional Benefits: Our comprehensive benefits package includes health insurance, wellness programs, learning & development opportunities, and more.
Role: Senior Data & AI Engineer (DE3) – Agentic Data Systems
Location: Mumbai (Hybrid)
At Angel One, we are moving beyond traditional ETL. We are building the next generation of data infrastructure where AI Agents are first-class citizens. We need a Senior Engineer who can bridge the gap between heavy-duty data engineering (Databricks/Spark) and GenAI to build autonomous tools that can "reason" over data, automate pipeline generation, and provide intelligent insights.
What you will do
- Develop Agentic Workflows: Design and deploy multi-agent systems that automate core data engineering tasks—such as automated data profiling, PII detection, and dynamic schema mapping.
- Build Custom Tools for LLMs: Write specialized Python and SQL-based "tools" (Functions) that enable LLMs to interact safely and efficiently with our AWS/Databricks ecosystem.
- Autonomous ETL: Architect "Self-Healing" pipelines that use GenAI to identify failures in Spark jobs, suggest fixes, or re-route data based on metadata changes.
- Metadata-Driven AI: Build RAG (Retrieval-Augmented Generation) systems over technical metadata and data catalogs to allow natural language discovery of our Delta Lake.
- High-Performance Foundations: Ensure the underlying data is modeled (Star/Snowflake/Data Vault) for maximum retrieval efficiency by both human analysts and AI agents.
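As a hypothetical illustration of the "tools" pattern described above, here is a minimal sketch of a Python function plus an OpenAI-style function spec that an agent framework (LangGraph, CrewAI, AutoGen, etc.) could register as a callable tool. The table names and the in-memory catalog are invented stand-ins for a real metadata service such as Unity Catalog.

```python
# Hypothetical sketch: a data-catalog lookup exposed as an agent "tool".
# The _CATALOG dict is a toy stand-in for a real metadata-catalog API.
from typing import Any

_CATALOG: dict[str, dict[str, str]] = {
    "bronze.trades": {"trade_id": "bigint", "client_id": "bigint", "qty": "int"},
    "silver.clients": {"client_id": "bigint", "pan_hash": "string"},
}

def describe_table(table_name: str) -> dict[str, Any]:
    """Tool body: return the column -> type mapping for a table, or an error."""
    schema = _CATALOG.get(table_name)
    if schema is None:
        return {"ok": False, "error": f"unknown table: {table_name}"}
    return {"ok": True, "table": table_name, "schema": schema}

# Tool spec in the JSON-schema shape many agent frameworks accept, so the
# LLM knows the tool's name, purpose, and required arguments.
DESCRIBE_TABLE_SPEC = {
    "name": "describe_table",
    "description": "Look up the schema of a table in the metadata catalog.",
    "parameters": {
        "type": "object",
        "properties": {"table_name": {"type": "string"}},
        "required": ["table_name"],
    },
}
```

Returning a structured `{"ok": ...}` result rather than raising lets the agent observe failures and recover, which matters for the "safely and efficiently" part of the requirement.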
Technical Requirements
1. GenAI & Agentic Programming
- Agent Frameworks: Hands-on experience building multi-agent systems using LangGraph, CrewAI, or AutoGen.
- Tool Development: Proven ability to write "Tools" (API wrappers, Python functions, SQL executors) that allow agents to perform complex data operations.
- Prompt Engineering for DE: Experience in specialized prompting techniques like ReAct, Chain-of-Thought, and Few-Shot specifically for generating valid SQL and Spark code.
- Evaluation: Experience using frameworks like Ragas or TruLens to evaluate the reliability of agentic outputs.
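To make the prompting requirement concrete, here is an illustrative sketch of a few-shot, ReAct-style prompt template for SQL generation. The schema text and the example question are invented for illustration; a real system would pull both from the metadata catalog.

```python
# Illustrative few-shot / ReAct-style template for text-to-SQL prompting.
# The "Thought:" / "Action:" structure asks the model to reason before
# emitting a query; the worked example anchors the expected output format.
SQL_PROMPT = """You are a SQL generator for a Delta Lake warehouse.
Think step by step (Thought), then emit exactly one Spark SQL query (Action).

Schema:
{schema}

Example:
Question: How many clients do we have?
Thought: The clients table has one row per client, so I count rows.
Action: SELECT COUNT(*) FROM silver.clients

Question: {question}
Thought:"""

def build_sql_prompt(schema: str, question: str) -> str:
    """Fill the template; the agent sends the result to the LLM."""
    return SQL_PROMPT.format(schema=schema, question=question)
```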
2. Data Engineering & Cloud (The Core)
- Databricks on AWS: 4+ years of deep technical experience with Unity Catalog, Delta Live Tables (DLT), and Photon-accelerated clusters.
- Spark Mastery: Expert-level PySpark skills with a deep understanding of RDDs, DataFrames, and internal optimization (shuffle tuning, predicate pushdown).
- Data Modeling: Advanced knowledge of Medallion Architecture and designing optimized Parquet/Delta storage layers for both batch and real-time streams.
3. Software Craftsmanship
- Python Expert: You write production-grade, asynchronous Python code.
- Architectural Vision: Ability to design systems that balance the "hallucination" risks of GenAI with the "strictness" of financial data requirements.
What Success Looks Like in 6 Months:
- Month 1-2: Modernize existing Databricks pipelines and optimize Delta table performance.
- Month 3-4: Deploy your first "Data Agent" that can autonomously generate documentation or SQL queries from our metadata catalog.
- Month 6: Build a multi-agent system that monitors data quality across the lake and suggests/executes corrective ETL actions with human-in-the-loop triggers.
Why Join Angel One?
- The Scale: We don't play in a sandbox. You will be building agents that interact with petabytes of financial data.
- The Tech Stack: We are 100% committed to the AWS + Databricks ecosystem, giving you the best tools to succeed.
- Innovation First: You aren't just maintaining pipelines; you are defining what "Data Engineering" looks like in the age of AI.
At Angel One, our thriving culture is rooted in Diversity, Equity, and Inclusion (DEI).
As an equal-opportunity employer, we wholeheartedly welcome people from all backgrounds irrespective of caste, religion, gender, marital status, sexuality, disability, class, or age to be part of our team. We believe that everyone's unique experiences and viewpoints make us stronger together. Come and be a part of #OneSpace, where your individuality is celebrated and embraced.