
ProArch

Senior Kafka Developer

Sorry, this job was removed at 11:22 p.m. (IST) on Wednesday, Jan 08, 2025.
In-Office
India

Description

ProArch is seeking a skilled Senior Kafka Developer with AWS experience to join our innovative team. In this role, you will be responsible for designing, developing, and managing Kafka-based solutions that improve our data streaming capabilities and support our cloud infrastructure on AWS.

Key Responsibilities:

  • Design, develop, and implement Kafka streams and data pipelines to facilitate real-time data processing and analytics (a minimal example sketch follows this list).
  • Integrate Kafka with various data sources and sinks, ensuring reliable data flow across different applications.
  • Work alongside data architects and engineers to optimize data flow and storage patterns in AWS.
  • Monitor and troubleshoot Kafka clusters, ensuring high availability and reliability of services.
  • Implement security measures and manage data governance to protect sensitive information.
  • Collaborate with cross-functional teams to define and refine requirements for data processing functionalities.
  • Document data pipeline designs, workflows, and operational processes for reference and knowledge sharing.
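
As a rough illustration of the streaming work described above, here is a minimal Kafka Streams topology sketch in Java. It is only an assumption-based example: the application id, broker address, and topic names ("raw-events", "clean-events") are placeholders, not details taken from this posting.

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ClickstreamFilterApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "clickstream-filter");     // placeholder application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");      // placeholder broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> events = builder.stream("raw-events");            // source topic (placeholder)
        events.filter((key, value) -> value != null && !value.isEmpty())          // drop empty records
              .mapValues(value -> value.toUpperCase())                            // trivial transformation
              .to("clean-events");                                                // sink topic (placeholder)

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));         // close cleanly on shutdown
    }
}
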
Requirements

To be successful in this role, candidates should possess the following qualifications:

  • 4+ years of experience in developing applications using Kafka.
  • Strong understanding of AWS services such as Kinesis, S3, EC2, Lambda, and CloudFormation.
  • Proficient in programming languages like Java, Scala, or Python.
  • Experience with data modeling and schema design using Avro or JSON (see the example schema after this list).
  • Familiarity with monitoring tools for Kafka, such as Confluent Control Center or Grafana.
  • Strong analytical and problem-solving skills.
  • Excellent communication skills and ability to work in a team-oriented environment.
  • Knowledge of data governance, security, and compliance frameworks is a plus.
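
To illustrate the Avro schema-design requirement above, the following is a small, hypothetical Java example using Avro's SchemaBuilder; the "Order" record and its fields are invented for illustration and are not part of the role description.

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class OrderSchemaExample {
    public static void main(String[] args) {
        // Define a simple "Order" record schema programmatically (names are illustrative).
        Schema orderSchema = SchemaBuilder.record("Order")
                .namespace("com.example.events")       // placeholder namespace
                .fields()
                .requiredString("orderId")
                .requiredLong("timestamp")
                .optionalDouble("amount")              // nullable field with default null
                .endRecord();

        // Build a record that conforms to the schema.
        GenericRecord order = new GenericData.Record(orderSchema);
        order.put("orderId", "A-1001");
        order.put("timestamp", System.currentTimeMillis());
        order.put("amount", 42.50);

        System.out.println(orderSchema.toString(true)); // print the schema as JSON
        System.out.println(order);
    }
}

In a real pipeline, a schema like this would typically be registered with a schema registry and paired with Avro serdes on the Kafka producers and consumers.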

What you need to know about the Chennai Tech Scene

To locals, it's no secret that South India is leading the charge in big data infrastructure. While the environmental impact of data centers has long been a concern, emerging hubs like Chennai are favored by companies seeking ready access to renewable energy resources, which provide more sustainable and cost-effective solutions. As a result, Chennai, along with neighboring Bengaluru and Hyderabad, is poised for significant growth, with a projected 65 percent increase in data center capacity over the next decade.