
Capco

Senior DevOps Engineer

Sorry, this job was removed at 01:19 a.m. (IST) on Wednesday, Oct 16, 2024
India
Internship

Job Description: DevOps Engineer (7+ Years of Experience)

Position: DevOps Engineer
Experience: 7+ Years
Location: Gurugram, Bangalore, Hyderabad, Chennai, Kolkata, Pune, Mumbai.



Joining Capco means joining an organization that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table – so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form or you can mention it directly to your recruiter at any stage and they will be happy to help.

ABOUT CAPCO

Capco is a global technology and business consultancy focused on the financial services sector. We are passionate about helping our clients succeed in an ever-changing industry. You will work on engaging projects with some of the largest banks in the world, projects that will transform the financial services industry.


Job Overview:

We are seeking a highly experienced DevOps Engineer with 7+ years of expertise to join our dynamic team. The role will focus on the deployment, integration, and maintenance of BigID across various environments, ensuring compliance with data retention and legal hold requirements. The candidate will also be responsible for API development, database scanning, and developing Python scripts for integration with Google Cloud Platform (GCP) for data storage and further utilization.


Must-have skills: Python, DevOps, Linux, CI/CD, Docker, Git/version control, GCP

Good to have: BigID

Key Responsibilities:

  • Deployment and Integration:
    • Deploy and integrate BigID across multiple environments (Dev, QA, PreProd, Prod).
    • Address issues related to authentication, connectivity, and integration with the existing infrastructure.
    • Ensure successful deployment in both production and development instances by configuring the necessary environments.
  • Data Retention and Legal Hold:
    • Implement BigID’s retention module to manage legal holds and apply retention rules for both structured and unstructured data in compliance with regulatory obligations.
  • API Development:
    • Develop robust APIs for data storage and processing based on project requirements.
  • Database Scanning:
    • Configure BigID to connect with relevant data platforms, including NAS, for performing initial scans.
    • Support the data retention use case by ensuring accurate scanning, reporting, and identification of data subject to retention policies.
  • Python Script Development for GCP Integration:
    • Develop Python scripts to automate data transfer from BigID to GCP, ensuring secure storage and availability for further processing or analytics.
    • Support future use cases by enabling the movement of data from BigID to GCP for various business or regulatory purposes.
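As a rough illustration of the BigID-to-GCP responsibility above, the sketch below covers only the GCP side of such a pipeline: shaping exported records into newline-delimited JSON and writing them to a dated Cloud Storage object via the `google-cloud-storage` client. The object-key layout, `upload_export` helper, and `bigid-sync` naming are illustrative assumptions, not part of BigID's or Capco's actual tooling; the posting does not describe the BigID export API itself.

```python
import json
from datetime import date


def build_object_key(source: str, run_date: date) -> str:
    """Dated GCS object key, e.g. 'bigid/exports/2024/10/16/nas-scan.ndjson'."""
    return f"bigid/exports/{run_date:%Y/%m/%d}/{source}.ndjson"


def records_to_ndjson(records: list[dict]) -> str:
    """Serialize exported records as newline-delimited JSON (loadable by BigQuery)."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)


def upload_export(bucket, source: str, records: list[dict], run_date: date) -> str:
    """Write one export to GCS; `bucket` is a google.cloud.storage.Bucket."""
    blob = bucket.blob(build_object_key(source, run_date))
    blob.upload_from_string(records_to_ndjson(records),
                            content_type="application/x-ndjson")
    return blob.name
```

Keeping the key-building and serialization steps as pure functions makes them unit-testable without GCP credentials; only `upload_export` touches the network.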

Required Skills and Experience:

  • Development and Scripting:
    • Strong experience in development using Python, Bash/Shell scripting, and other scripting languages on Linux-based operating systems.
  • Linux Expertise:
    • Proficiency in managing and deploying applications on Linux servers and operating systems.
  • CI/CD Tools: Strong experience with Jenkins, GitLab CI, CircleCI, or similar tools for continuous integration and continuous deployment.
  • Configuration Management: Proficiency with Ansible, Terraform, Chef, or Puppet for automating deployment and infrastructure configuration.
  • Containerization & Orchestration: Experience with Docker, Kubernetes, and Helm for deploying, managing, and scaling containerized applications.
  • Cloud Platforms: Expertise in GCP and other cloud platforms (AWS, Azure) with hands-on experience in deploying, managing, and integrating cloud services.
  • Version Control: Strong understanding and experience with Git or other version control systems.
  • Monitoring & Logging: Familiarity with monitoring and logging tools (e.g., Prometheus, Grafana, or the ELK stack).
  • Networking and Security: Knowledge of networking, firewall configurations, SSL/TLS, and implementing secure authentication mechanisms.
  • Experience with BigID: Experience with BigID or similar data discovery, privacy, and governance tools.
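To give a concrete shape to the CI/CD and containerization skills listed above, a pipeline might look like the following minimal GitLab CI sketch. The stage names, chart path, and `bigid-sync` release name are placeholders, not a prescribed setup; `CI_REGISTRY_IMAGE` and `CI_COMMIT_SHORT_SHA` are GitLab's predefined variables.

```yaml
# Minimal GitLab CI sketch: build and push a Docker image, run tests,
# then deploy with Helm. All names below are placeholders.
stages: [build, test, deploy]

build-image:
  stage: build
  image: docker:27
  services: [docker:27-dind]
  script:
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

unit-tests:
  stage: test
  image: python:3.12-slim
  script:
    - pip install -r requirements.txt
    - pytest

deploy:
  stage: deploy
  image: alpine/helm:3
  script:
    - helm upgrade --install bigid-sync ./chart
        --set image.tag="$CI_COMMIT_SHORT_SHA"
  environment: production
  when: manual
```

Gating the deploy job with `when: manual` is a common choice for production environments; automated promotion would drop that line.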

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 7+ years of experience in DevOps or related roles.
  • Excellent problem-solving and communication skills.
  • Ability to work collaboratively with cross-functional teams.


WHY JOIN CAPCO?

You will work on engaging projects with some of the largest banks in the world, projects that will transform the financial services industry.

We offer:

  • A work culture focused on innovation and building lasting value for our clients and employees
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A diverse, inclusive, meritocratic culture
  • Enhanced and competitive family-friendly benefits, including maternity / adoption / shared parental leave and paid leave for sickness, pregnancy loss, fertility treatment, menopause and bereavement

NEXT STEPS

If you would like to progress your career with us, please do not hesitate to apply. We look forward to receiving your application.


#LI-Hybrid

#LI-RA1


