Kafka Engineer

📍 Hyderabad, India

EXL

Job Description

Role

- Role: Kafka Engineer
- Skills Required: Kafka, Java, SQL
- Experience: 5-8 Years
- Location: Hyderabad [3 Days a Week]

We are looking for a skilled Kafka Engineer with 5–7 years of experience in designing, building, and maintaining real-time streaming pipelines using Apache Kafka. The ideal candidate will have strong expertise in Java-based development, Kafka architecture, and CI/CD integration. You'll play a key role in enabling real-time data processing and ensuring the scalability, fault tolerance, and governance of data streaming across our enterprise systems.

Key Responsibilities:

- Design and maintain Kafka-based streaming pipelines for high-throughput, low-latency real-time data processing.
- Ensure scalability, fault tolerance, and schema governance using tools like Schema Registry (Avro/Protobuf).
- Develop robust Kafka producers and consumers using Java, adhering to performance and reliability standards (see the producer sketch after this list).
- Collaborate with DevOps teams to automate CI/CD pipelines, deployment, and infrastructure provisioning for Kafka components.
- Implement security best practices including encryption, authentication, and access control for Kafka clusters.
- Monitor and troubleshoot Kafka cluster health, latency, throughput, and message delivery guarantees.
- Document architecture, deployment strategies, and operational runbooks for Kafka-based systems.
- Participate in code reviews, knowledge-sharing sessions, and architectural discussions.
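For context on the producer side of this role, here is a minimal sketch of a Java Kafka producer configured for reliability (acks=all plus idempotence). The broker address, topic, key, and payload are illustrative assumptions, not part of the job description.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Reliability settings: wait for all in-sync replicas and avoid duplicate writes on retry
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}"); // hypothetical topic and payload
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace(); // delivery failed after retries
                } else {
                    System.out.printf("Written to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // closing the producer flushes any buffered records
    }
}
```

The acks=all and idempotence settings trade a little latency for stronger delivery guarantees, which matches the fault-tolerance emphasis above.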

Must-Have Skills:

- 5–7 years of experience in backend or data engineering, with a strong focus on Apache Kafka.
- Proficiency in Java for building Kafka producers/consumers and integration with backend services.
- Deep understanding of Kafka internals (brokers, partitions, consumer groups, offsets, replication, etc.); a consumer-group sketch follows this list.
- Experience with schema management using Confluent Schema Registry, Avro, or Protobuf.
- Experience with CI/CD pipelines for Kafka component deployment and integration.
- Strong collaboration skills with DevOps teams to manage Kafka infrastructure, security, and automation.
- Solid understanding of real-time data processing concepts, fault tolerance, and data consistency.
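As a companion to the internals bullet above, here is a minimal sketch of a Java consumer that joins a consumer group and commits offsets manually for at-least-once processing. The broker address, topic, and group id are illustrative assumptions.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class OrderEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-processors");        // hypothetical consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Disable auto-commit so offsets are committed only after records are processed
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("orders")); // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
                }
                consumer.commitSync(); // offsets are tracked per consumer group
            }
        }
    }
}
```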

Good-to-Have Skills:

- Experience with Kafka Streams, ksqlDB, or Apache Flink for stream processing.
- Familiarity with Kafka Connect and connectors for data ingestion and export.
- Exposure to Confluent Kafka Platform and Kafka REST Proxy.
- Experience with monitoring and observability tools (e.g., Grafana, ELK Stack).
- Understanding of container orchestration platforms like Kubernetes or OpenShift.
- Knowledge of cloud-native Kafka deployments (AWS MSK).
- Familiarity with Spring Boot for integrating Kafka in microservice environments (see the Spring Boot sketch after this list).
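On the Spring Boot point, below is a minimal sketch of a spring-kafka listener in a microservice. The topic, group id, and class name are hypothetical, and broker and serializer settings would come from the application's configuration properties rather than code.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class OrderEventListener {
    // spring-kafka builds the underlying consumer from application properties;
    // "orders" and "order-service" are assumed names for illustration only
    @KafkaListener(topics = "orders", groupId = "order-service")
    public void onOrderEvent(String payload) {
        System.out.println("Received order event: " + payload);
    }
}
```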


Job Details

Posted Date: November 22, 2025
Job Type: Other
Location: Hyderabad, India
Company: EXL

Ready to Apply?

Don't miss this opportunity! Apply now and join our team.