
Capco

Data Engineer

Posted An Hour Ago
Hybrid
Pune, Maharashtra
Mid level
The Data Engineer will build complex data pipelines, implement DW projects in a Big Data environment, and work with a variety of databases. Strong skills in Python, PySpark, and Agile methodologies are essential.

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.


CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

MAKE AN IMPACT

Base Skill Requirements

MUST Technical:

• Hands-on expertise in Python, PySpark, Hadoop, Cloudera platforms and Airflow

• Experience in Data Warehouse / Data Lake / Lakehouse projects in a product- or service-based organization

• Expertise in Data Engineering, having implemented multiple end-to-end DW projects in a Big Data environment handling petabyte-scale data.

• Solid experience building complex data pipelines with Spark (Scala/Python/Java) on Hadoop or object storage (a minimal illustrative sketch follows this list)

• Experience working with databases such as Oracle and Netezza, and strong SQL knowledge.

• Proficient in working within an Agile/Scrum framework, including creating user stories with well-defined acceptance criteria and participating in sprint planning and reviews

Optional Technical:

• Experience building NiFi pipelines (preferred)

• Strong analytical skills for debugging production issues, providing root cause analysis and implementing mitigation plans

• Strong communication skills - both verbal and written
• Ability to multi-task across multiple projects and interface with external/internal resources

• Proactive, detail-oriented and able to work independently under pressure, with a high degree of initiative and self-motivation to drive results

• Willingness to quickly learn and implement new technologies and to participate in POCs to explore the best solution for the problem statement

• Experience working with diverse and geographically distributed project teams
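The Spark bullet above describes building batch pipelines on Hadoop or object storage. As a purely illustrative aid, here is a minimal PySpark sketch of that kind of pipeline: it reads raw records from object storage, applies a simple daily aggregation, and writes a partitioned curated layer back out. The bucket paths, column names and application name are hypothetical placeholders, not details from this posting; a real project would plug in its own sources, schemas and orchestration (for example, Airflow).

```python
# Illustrative only: a minimal PySpark batch pipeline.
# Paths, column names and the aggregation logic are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_dw_load").getOrCreate()

# Read raw transaction records from object storage (hypothetical path).
raw = spark.read.parquet("s3a://example-bucket/raw/transactions/")

# Basic cleansing and a simple daily aggregation per account.
daily = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("txn_date", F.to_date("txn_timestamp"))
       .groupBy("txn_date", "account_id")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.count("*").alias("txn_count"),
       )
)

# Write the curated layer back to object storage, partitioned by date.
daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3a://example-bucket/curated/daily_account_totals/"
)

spark.stop()
```

Partitioning the output by date is a common choice for downstream warehouse loads, but the actual layout would depend on the project's query patterns.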

Top Skills

Airflow
Cloudera
Hadoop
Java
Netezza
Oracle
PySpark
Python
Scala
SQL

Similar Jobs at Capco

4 Days Ago
Hybrid
Pune, Maharashtra, IND
Mid level
Fintech • Professional Services • Consulting • Energy • Financial Services • Cybersecurity • Generative AI
Design and deploy AI/ML solutions on AWS, build data pipelines using Databricks and PySpark, and maintain legacy systems. Collaborate with teams to ensure efficiency and performance optimization in data processing.
Top Skills: APIs, AWS, CI/CD, Databricks, Docker, Java, Kubernetes, Netezza, NiFi, Oracle, PL/SQL, PySpark, Scala, Spark, SQL, Unix
9 Days Ago
Hybrid
Pune, Maharashtra, IND
Mid level
Fintech • Professional Services • Consulting • Energy • Financial Services • Cybersecurity • Generative AI
The GCP Data Engineer will develop and maintain data pipelines and ETL/ELT processes using Python and SQL on Google Cloud Platform, working in a collaborative environment to ensure data quality and availability.
Top Skills: BigQuery, Cloud Composer, Cloud Functions, Dataflow, Google Cloud Platform, Pub/Sub, Python, SQL
9 Days Ago
Hybrid
Pune, Maharashtra, IND
Senior level
Fintech • Professional Services • Consulting • Energy • Financial Services • Cybersecurity • Generative AI
Lead technical role for engineering team, focusing on development and operations in Cloud. Involves migrations, scripting, and process enhancements.
Top Skills: Airflow, AWS, BigQuery, Cloud SQL, Confluence, Control-M, GCP, Git, JIRA, PL/SQL, Python, Shell Scripting, SQL

