
Smarsh

Cloud Engineer III

Posted 11 Days Ago
Hybrid
India
Mid level

Who are we?


Smarsh empowers its customers to manage risk and unleash intelligence in their digital communications. Our growing community of over 6500 organizations in regulated industries counts on Smarsh every day to help them spot compliance, legal or reputational risks in 80+ communication channels before those risks become regulatory fines or headlines. Relentless innovation has fueled our journey to consistent leadership recognition from analysts like Gartner and Forrester, and our sustained, aggressive growth has landed Smarsh in the annual Inc. 5000 list of fastest-growing American companies since 2008.


We are seeking a talented Engineer to join our team, focusing on developing scalable integrations, APIs, and open-source solutions that contribute to our Internal Developer Portal (IDP) ecosystem. As a key team member, you will collaborate with cross-functional teams to design, implement, and maintain APIs and data pipelines that enable seamless data flow into our IDP. If you are passionate about clean code, open-source contributions, and building developer-centric tools, we want to hear from you.

Key Responsibilities

  • API Development:
      • Design, develop, and maintain robust APIs to push data into the IDP.
      • Ensure high performance, scalability, and security in API implementations.
      • Collaborate with teams to integrate APIs with existing systems.
  • Integration Development:
      • Build and maintain open-source integrations for third-party tools (e.g., monitoring systems, CI/CD pipelines, container registries).
      • Write reusable, testable, and efficient Python code to bridge systems with the IDP.
  • Data Processing and Transformation:
      • Develop data pipelines to process, transform, and push data into the IDP.
      • Implement error handling and logging mechanisms to ensure reliability.
      • Design systems for data parsing and transformation, including robust handling of YAML, JSON, and other serialization formats to normalize inputs from disparate sources.
  • Open-Source Contribution:
      • Contribute to open-source projects that enhance the IDP ecosystem.
      • Actively participate in the developer community by publishing and maintaining open-source tools.
  • Collaboration and Communication:
      • Work closely with DevOps, Platform Engineering, and Security teams to understand data requirements.
      • Document APIs, integrations, and workflows for internal and external stakeholders.
  • Code Quality and Testing:
      • Write unit and integration tests to ensure code reliability.
      • Perform code reviews and enforce best practices in Python development.
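The data-processing responsibilities above can be made concrete with a short sketch. This is a hypothetical illustration, not Smarsh's actual pipeline: the entity shape (`identifier`, `source`, `properties`), the `normalize` helper, and the source names are all invented for the example.

```python
# Hypothetical pipeline step: normalize payloads from disparate sources
# into one entity shape before pushing to the IDP. Field names are invented.
import json
import logging
from typing import Optional

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("idp-pipeline")

def normalize(raw: str, source: str) -> Optional[dict]:
    """Parse a JSON payload and map it onto a common entity schema.

    Returns None (and logs) on malformed input instead of raising,
    so one bad record does not stop the whole pipeline.
    """
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        log.error("skipping malformed payload from %s: %s", source, exc)
        return None
    # Different tools name the same field differently; reconcile that here.
    return {
        "identifier": data.get("id") or data.get("name"),
        "source": source,
        "properties": {k: v for k, v in data.items() if k not in ("id", "name")},
    }

records = [
    ('{"id": "svc-1", "language": "python"}', "gitlab"),
    ("not json at all", "jenkins"),
    ('{"name": "svc-2", "replicas": 3}', "kubernetes"),
]
entities = [e for e in (normalize(raw, src) for raw, src in records) if e]
print([e["identifier"] for e in entities])  # ['svc-1', 'svc-2']
```

Returning None instead of raising is one way to satisfy the "error handling and logging" bullet: a single malformed record is logged and skipped rather than halting the whole run.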

Required Skills and Qualifications

  • Education:
      • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Technical Expertise:
      • Proficiency in Python with a focus on building scalable applications.
      • Experience with API frameworks such as FastAPI, Django REST Framework, or Flask.
      • Knowledge of data serialization formats (e.g., JSON, YAML).
      • Knowledge of event-driven architecture.
      • Knowledge of queuing systems such as Kafka, RabbitMQ, and SQS.
      • Knowledge of Role-Based Access Control (RBAC) and least-privilege principles to secure all IDP interactions.
  • Integration Experience:
      • Experience building integrations with third-party tools like Jenkins, GitLab, Prometheus, or AWS.
      • Familiarity with APIs for monitoring tools, container registries, and CI/CD systems.
  • DevOps and Cloud:
      • Understanding of Kubernetes, Docker, and cloud platforms (AWS, GCP, Azure).
      • Familiarity with GitOps practices and tools like ArgoCD.
  • Data Processing:
      • Experience with data pipelines and ETL workflows.
      • Knowledge of PostgreSQL, MongoDB, or other relational/non-relational databases.
      • Ability to design systems for data parsing and transformation, including robust handling of YAML and JSON.
  • Open Source:
      • Proven experience contributing to or maintaining open-source projects.
      • Familiarity with Git and GitHub workflows.
  • Soft Skills:
      • Strong communication skills and the ability to work in a collaborative environment.
      • Analytical mindset with attention to detail and problem-solving skills.
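The event-driven and queuing knowledge listed above reduces to a producer/consumer pattern. Here is a minimal sketch using the standard library's `queue.Queue` as a stand-in for a real broker (Kafka, RabbitMQ, SQS); the event shape and sentinel convention are invented for illustration.

```python
# Producer/consumer sketch of event-driven ingestion. queue.Queue stands in
# for a real broker (Kafka, RabbitMQ, SQS); the event shape is invented.
import queue
import threading

events = queue.Queue()
ingested = []

def producer():
    # In production this might be a webhook handler or a broker consumer loop.
    for i in range(3):
        events.put({"entity": f"svc-{i}", "action": "upsert"})
    events.put(None)  # sentinel: no more events

def consumer():
    while True:
        event = events.get()
        if event is None:
            break
        # A real pipeline would POST the entity to the IDP's API here.
        ingested.append(event)

worker = threading.Thread(target=consumer)
worker.start()
producer()
worker.join()
print(len(ingested))  # 3
```

The sentinel-terminated loop is a common toy version of broker acknowledgement semantics; a real Kafka or SQS consumer would use the client library's polling and commit/delete APIs instead.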

Preferred Qualifications

  • Familiarity with Port or other Internal Developer Portal (IDP) tools.
  • Experience with security practices, including API authentication and data encryption.
  • Understanding of AWS, Kubernetes, and DevOps practices.
  • Knowledge of DORA metrics and CI/CD pipeline observability.
  • Exposure to Infrastructure-as-Code tools (e.g., Terraform, Pulumi).
  • Familiarity with testing frameworks like pytest or unittest.
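As a small, hedged example of the testing frameworks mentioned above, here is a unit test written with the standard library's `unittest` (pytest would express the same checks as bare `assert` statements); the `parse_image_ref` helper is invented for illustration.

```python
# Illustrative unit test with the standard library's unittest; the helper
# under test (parse_image_ref) is a made-up example, not a real Smarsh API.
import unittest

def parse_image_ref(ref: str) -> dict:
    """Split an image reference like 'app:1.2' into name and tag ('latest' default)."""
    name, _, tag = ref.partition(":")
    return {"name": name, "tag": tag or "latest"}

class ParseImageRefTests(unittest.TestCase):
    def test_explicit_tag(self):
        self.assertEqual(parse_image_ref("app:1.2"), {"name": "app", "tag": "1.2"})

    def test_default_tag(self):
        self.assertEqual(parse_image_ref("app"), {"name": "app", "tag": "latest"})

suite = unittest.TestLoader().loadTestsFromTestCase(ParseImageRefTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```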

About our culture


Smarsh hires lifelong learners with a passion for innovating with purpose, humility and humor. Collaboration is at the heart of everything we do. We work closely with the most popular communications platforms and the world’s leading cloud infrastructure platforms. We use the latest in AI/ML technology to help our customers break new ground at scale. We are a global organization that values diversity, and we believe that providing opportunities for everyone to be their authentic self is key to our success. Smarsh leadership, culture, and commitment to developing our people have all garnered Comparably.com Best Places to Work Awards. Come join us and find out what the best work of your career looks like.

Top Skills

Python
