
ProArch

Senior Kafka Developer

Sorry, this job was removed at 11:22 p.m. (IST) on Wednesday, Jan 08, 2025.
India


Description

ProArch is seeking a skilled Senior Kafka Developer with AWS experience to join our innovative team. In this role, you will be responsible for designing, developing, and managing Kafka-based solutions that improve our data streaming capabilities and support our cloud infrastructure on AWS.

Key Responsibilities:

  • Design, develop, and implement Kafka streams and data pipelines to facilitate real-time data processing and analytics.
  • Integrate Kafka with various data sources and sinks, ensuring reliable data flow across applications.
  • Work alongside data architects and engineers to optimize data flow and storage patterns in AWS.
  • Monitor and troubleshoot Kafka clusters, ensuring high availability and reliability of services.
  • Implement security measures and manage data governance to protect sensitive information.
  • Collaborate with cross-functional teams to define and refine requirements for data processing functionalities.
  • Document data pipeline designs, workflows, and operational processes for reference and knowledge sharing.
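The pipeline work described above generally follows a consume-transform-produce pattern. The sketch below is a minimal, self-contained illustration of that pattern only: the `enrich` step and record fields are hypothetical, and a real pipeline would poll records from a Kafka consumer and send results with a producer (e.g. via a client library such as confluent-kafka) rather than iterate over plain Python lists.

```python
import json

def enrich(record: dict) -> dict:
    """Hypothetical transform step: derive a USD amount for each event."""
    out = dict(record)
    out["amount_usd"] = round(record["amount"] * record.get("fx_rate", 1.0), 2)
    return out

def run_pipeline(source_records):
    """Stand-in for a consume-transform-produce loop.

    In a real deployment, source_records would come from a consumer's
    poll loop, and each enriched result would be sent to a downstream
    topic with a producer instead of being appended to a list.
    """
    sink = []
    for raw in source_records:
        record = json.loads(raw)            # deserialize the message value
        enriched = enrich(record)           # per-record transformation
        sink.append(json.dumps(enriched))   # "produce" to the downstream topic
    return sink

# Example input resembling JSON messages on a source topic
events = [json.dumps({"order_id": 1, "amount": 10.0, "fx_rate": 1.1})]
print(run_pipeline(events))
```

The same shape scales to real-time processing because each record is handled independently; stateful operations (joins, windows) are where a framework like Kafka Streams earns its keep.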
Requirements

To be successful in this role, candidates should possess the following qualifications:

  • 4+ years of experience in developing applications using Kafka.
  • Strong understanding of AWS services such as Kinesis, S3, EC2, Lambda, and CloudFormation.
  • Proficient in programming languages like Java, Scala, or Python.
  • Experience with data modeling and schema design using Avro or JSON.
  • Familiarity with monitoring tools for Kafka, such as Confluent Control Center or Grafana.
  • Strong analytical and problem-solving skills.
  • Excellent communication skills and ability to work in a team-oriented environment.
  • Knowledge of data governance, security, and compliance frameworks is a plus.
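As an illustration of the Avro schema design mentioned above, a record schema for a simple order event might look like the following (the record name, namespace, and fields are hypothetical examples, not part of this role's stack):

```json
{
  "type": "record",
  "name": "OrderEvent",
  "namespace": "com.example.streams",
  "fields": [
    {"name": "order_id", "type": "long"},
    {"name": "amount", "type": "double"},
    {"name": "currency", "type": "string", "default": "USD"},
    {"name": "created_at", "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}
```

Schemas like this are typically registered in a schema registry so that producers and consumers agree on record structure and can evolve it compatibly.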

What you need to know about the Chennai Tech Scene

To locals, it's no secret that South India is leading the charge in big data infrastructure. While the environmental impact of data centers has long been a concern, emerging hubs like Chennai are favored by companies seeking ready access to renewable energy resources, which provide more sustainable and cost-effective solutions. As a result, Chennai, along with neighboring Bengaluru and Hyderabad, is poised for significant growth, with a projected 65 percent increase in data center capacity over the next decade.
