As a Senior Kafka Developer at ProArch, you will be responsible for designing, developing, troubleshooting, and maintaining scalable, efficient Kafka-based messaging solutions and microservices applications. You will work closely with cross-functional teams to ensure seamless data flow, scalability, and reliability across our Kafka ecosystem. You will also develop and deliver proofs of concept (POCs). Hands-on experience is a must.
Key Responsibilities:
- Set up and configure Kafka clusters, including brokers, ZooKeeper, and other components.
- Develop and implement best practices for Kafka configuration, management, and monitoring.
- Troubleshoot and resolve Kafka-related issues, ensuring high availability and performance.
- Collaborate with development teams to integrate Kafka with existing systems and applications.
- Design, develop, and manage Kafka-based data streaming applications and pipelines (a minimal producer sketch follows this list).
- Stay up to date with the latest trends and advancements in Kafka and related technologies.
- Implement security measures and ensure compliance with industry standards.
- Integrate Kafka with other systems and applications, ensuring seamless data flow and real-time processing.
- Conduct thorough code reviews, providing constructive feedback and ensuring adherence to coding standards and best practices.
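To give a concrete flavour of the streaming work described above, here is a minimal sketch of a Kafka producer in Java. It is illustrative only: the broker address, the `orders` topic, and the JSON payload are placeholders, and a real project would define its own topics, serialization (for example Avro with a schema registry), and delivery guarantees.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; real clusters (e.g., MSK or Confluent) are configured per environment.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // acks=all waits for all in-sync replicas, favouring durability over latency.
        props.put("acks", "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}");
            // send() is asynchronous; the callback reports the broker's acknowledgement or any failure.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Wrote to %s-%d at offset %d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```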
Qualifications:
- Experience with cloud platforms such as AWS.
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes.
- Familiarity with other messaging systems and data streaming platforms such as RabbitMQ, Apache Pulsar, or Apache Flink.
- Familiarity with CI/CD pipelines and DevOps practices.
- Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.
- 5-8 years of hands-on experience in software development, with a focus on distributed systems and data streaming technologies.
- Proven experience setting up, configuring, and troubleshooting Kafka clusters in production environments using Amazon MSK (Serverless or Provisioned) or the Confluent Platform is a must.
- Strong understanding of Kafka architecture, including brokers, ZooKeeper, producers, consumers, and Kafka Streams.
- Strong experience with schema registries such as the AWS Glue Schema Registry, and with serialization formats such as Avro.
- Strong experience with the Kafka Connect connector ecosystem, specifically source and sink connectors built on open-source components.
- Strong experience with PostgreSQL and its data types, such as jsonb (see the consumer sketch after this list).
- Proficiency in programming languages such as Java, Scala, or Python.
- Experience designing, building, deploying, and maintaining enterprise cloud solutions on AWS.
- Demonstrable experience with microservices-based architectures in the cloud at scale.
- In-depth understanding of microservices architecture and best practices.
- Experience with RESTful APIs and web services.
- Strong testing skills with JUnit and Mockito.
- Experience with Karate testing framework for API testing.
- Familiarity with version control systems like Git.
- Strong problem-solving skills and the ability to work under pressure.
- Excellent communication and teamwork skills.
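As a small illustration of two of the skills above (Kafka consumers with manual offset management, and PostgreSQL jsonb), the sketch below consumes events and upserts each payload into a jsonb column. The topic, table, credentials, and connection details are hypothetical, and the snippet assumes the PostgreSQL JDBC driver is on the classpath and that `order_events.order_id` has a unique constraint.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderEventSink {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder connection details; real values come from the target environment.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "order-events-sink");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        // Offsets are committed manually, only after the records reach Postgres.
        props.put("enable.auto.commit", "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection db = DriverManager.getConnection(
                     "jdbc:postgresql://localhost:5432/orders", "app", "secret")) {
            consumer.subscribe(List.of("orders"));
            // Upsert each event payload into a jsonb column keyed by the record key.
            String sql = "INSERT INTO order_events (order_id, payload) VALUES (?, ?::jsonb) "
                       + "ON CONFLICT (order_id) DO UPDATE SET payload = EXCLUDED.payload";
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    try (PreparedStatement stmt = db.prepareStatement(sql)) {
                        stmt.setString(1, record.key());
                        stmt.setString(2, record.value());
                        stmt.executeUpdate();
                    }
                }
                consumer.commitSync();
            }
        }
    }
}
```

In practice this kind of sink is often handled by a Kafka Connect sink connector rather than hand-written consumer code; the snippet is only meant to show the underlying mechanics.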