
Capco

Senior Data Engineer with Python and Snowflake - Pune

Hybrid
Pune, Maharashtra
Senior level

Job Title: Senior Data Engineer with Python and Snowflake - Pune

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients in the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their businesses. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services sectors.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.


Role Description:

Key Skills: Data Engineering, Python, Snowflake, AWS, Git/Bitbucket

Experience: 9+ years

Location: Hinjewadi, Pune

Shift timings: 12:30 PM to 9:30 PM

3 days work-from-office (Tue, Wed, Thu)

Technical Requirements

Job Description: Python & Snowflake Engineer with AI/Cortex Development

  1. 7+ years of experience delivering data engineering and data science projects on the Snowflake AI Data Cloud platform on AWS; Snowpark experience preferred. Experience with different data modeling techniques is required.
  2. 7+ years of Python development experience, using tools such as VS Code or Anaconda, version control with Git or Bitbucket, and Python unit testing frameworks.
  3. 1+ years of experience building Snowflake applications on the Snowflake AI/Cortex platform (specifically Cortex Agents, Cortex Search, and Cortex LLM functions, with an understanding of context enrichment using prompts or Retrieval-Augmented Generation).
  4. Deep understanding of object-oriented programming in Python and of data structures such as pandas DataFrames; writes clean, maintainable engineering code.
  5. Understanding of multi-threading concepts and concurrency implementation in server-side Python custom modules.
  6. Experience implementing object-relational mapping in Python using frameworks such as SQLAlchemy or equivalent.
  7. Proficient in developing and deploying Python applications, such as Lambda functions, on the AWS Cloud platform.
  8. Proficient in deploying web applications on AWS using Docker containers or Kubernetes, with experience using CI/CD pipelines.
  9. Proficient in developing applications with Snowpipe and Snowpark, moving data from cloud sources such as AWS S3, and handling unstructured data from data lakes.
  10. Strong grasp of Snowflake account hierarchy models and account-role-permission strategy.
  11. Experience with data sharing, preferably via an internal Data Marketplace and Data Exchanges for various listings.
  12. Strong grasp of data governance and security concepts within Snowflake, including row- and column-level dynamic data masking using Snowflake tags.
  13. Good understanding of input query enrichment using Snowflake YAMLs and of integrating with LLMs within Snowflake.
  14. Good understanding of relevance search and of building custom interactive applications with LLMs.
  15. Nice to have: experience building Snowflake native applications using Streamlit and deploying them onto AWS Cloud instances (EC2 or Docker containers).
  16. Continuously improves functionality through experimentation, performance tuning, and customer feedback.
  17. Nice to have: cache implementation experience within Python web applications; DuckDB with Apache Arrow experience.
  18. Nice to have: experience implementing CI/CD pipelines for Snowflake applications.
  19. Strong analytical and problem-solving skills, with the ability to communicate technical concepts clearly.
  20. Experience with Agile and Scrum methodologies, preferably with JIRA.
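The concurrency expectation above (multi-threaded, server-side Python for I/O-bound data movement) can be illustrated with a minimal sketch. This is not Capco's code; the function and data names (`transform_partition`, `transform_all`) are hypothetical, and the "partition" here is a plain list standing in for, say, a slice of rows read from S3.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def transform_partition(partition):
    # Hypothetical per-partition transform; in practice this step would
    # wait on the network (an S3 read or a Snowflake query), which is
    # where threads pay off since the GIL is released during I/O.
    return [row.strip().lower() for row in partition]

def transform_all(partitions, max_workers=4):
    """Run transform_partition across partitions concurrently,
    preserving the original partition order in the result."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(transform_partition, p): i
                   for i, p in enumerate(partitions)}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    # Reassemble in submission order regardless of completion order.
    return [results[i] for i in range(len(partitions))]

if __name__ == "__main__":
    parts = [["  Alice ", "BOB"], ["Carol  "]]
    print(transform_all(parts))
```

For CPU-bound transforms the same shape works with `ProcessPoolExecutor`, since threads alone would not bypass the GIL.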
If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Top Skills

AWS
Ci/Cd
CSS
Django
Docker
Fastapi
Flask
Git
Python
React
Snowflake
SQL
Typescript
