Job Description
12+ years of experience delivering data-intensive solutions (technical data architecture and engineering).
Streaming and event-driven data experience: building and utilizing Kafka or queue-based architectures, including ordering, replay, schema evolution, and real-time observability.
Strong hands-on expertise in the Apache Kafka ecosystem, including Kafka Connect, Kafka Streams (KStreams), ksqlDB, Schema Registry, and producer/consumer services using Java.
Strong understanding of transactional integrity, idempotency, and error handling in financial data flows. Hands-on experience building Java/Spring Boot-based data processing services that produce and consume Kafka events, apply real-time transformations and enrichment, and persist channel-optimized views into MongoDB to support low-latency omni-channel banking use cases.
Experience designing replay-safe and resilient consumers for banking workloads.
Proven experience implementing and architecting Change Data Capture (CDC) pipelines from heterogeneous sources (RDBMS, NoSQL, SaaS) using tools such as Debezium, Oracle GoldenGate, or native connectors.
Solid understanding of MongoDB: schema design and document modeling, indexing, sharding techniques, and performance tuning. Experience with Azure Cosmos DB for MongoDB (vCore) architecture is a plus.
Hands-on experience designing MongoDB as a real-time, read-optimized data serving layer, decoupled from systems of record, for mobile and omni-channel banking use cases, including performance tuning for low-latency mobile workloads.
Hands-on expertise in architecting omni-channel and Customer 360 data platforms using MongoDB or similar NoSQL databases. Experience building event-driven materialized views in MongoDB to support mobile and web applications.
Strong knowledge of stream and batch processing frameworks such as Spark, Flink, and cloud-native/on-prem data processing services.
Expertise in modern data architecture, including data mesh and unified data platforms across on-prem and cloud (Azure/AWS/GCP).
Strong knowledge of API-led and event-driven architectures working together (REST + async events). Ability to define contracts, versioning, backward compatibility, and domain events aligned to banking domains.
Hands-on experience building Java/Spring Boot services that consume Kafka events and persist real-time, replay-safe updates into MongoDB.
Strong understanding of API‑driven architectures and how they integrate with streaming and event platforms.
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.
Job Details
Posted Date:
March 1, 2026
Job Type:
Finance and Insurance
Location:
Chennai, India
Company:
Optimum Solutions Pte Ltd