The role involves designing, developing, and maintaining data models and ETL pipelines using Python, SQL, and DBT, ensuring efficient data management for analysis-ready datasets.
Our client is a leading Data Management and Analytics company that helps organizations implement optimal Data Management, Analytics, and Cloud solutions to meet their business challenges. We are a trusted consulting services provider specializing in Informatica, AWS, Azure, Snowflake, and GCP, with a focus on Big Data, Cloud, Data Integration, Data Quality, and Data Security solutions. As an Informatica Platinum Partner, we deliver innovative solutions that drive business growth for our clients.
We are also a reseller of Informatica and various other market-leading vendors' products, consistently growing as a recognized leader in data management across industries.
Job Description:
We are seeking a skilled Python + SQL with DBT Developer to join our growing team. As a Python + SQL with DBT Developer, you will be responsible for designing, developing, and maintaining data models and transformations using DBT to create analysis-ready datasets. You will also be responsible for building and optimizing ETL pipelines using Snowflake and Python, ensuring efficient data extraction, transformation, and loading processes.
KEY ROLES AND RESPONSIBILITIES:
- Develop and maintain robust data transformation pipelines using Python and SQL.
- Implement data models and schemas to ensure efficient storage and retrieval of information.
- Perform and deliver in large, complex environments.
- Develop DBT models that transform data into useful, actionable information.
- Create data models and identify patterns.
- Build, test, and maintain database pipeline architectures.
- Ensure compliance with data governance and security policies.
- Collaborate with data engineers, data scientists, and business analysts to understand data requirements and deliver effective solutions.
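To give a flavor of the transformation work described above, here is a minimal, illustrative sketch of an extract-transform-load step in Python. It uses the standard-library sqlite3 module as a stand-in for a warehouse (the role itself targets Snowflake), and the table names `raw_orders` and `customer_totals` are hypothetical examples, not part of any real project.

```python
# Illustrative only: a tiny ETL sketch using sqlite3 as a stand-in warehouse.
import sqlite3


def run_pipeline(rows):
    """Load raw order rows, then transform them into an analysis-ready summary."""
    conn = sqlite3.connect(":memory:")
    # Extract/load: land the raw data in a staging table.
    conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)", rows)
    # Transform: aggregate per customer, similar in spirit to a DBT model,
    # which would express this same SELECT as a versioned, tested model file.
    conn.execute("""
        CREATE TABLE customer_totals AS
        SELECT customer, SUM(amount) AS total_spend
        FROM raw_orders
        GROUP BY customer
        ORDER BY customer
    """)
    return conn.execute("SELECT * FROM customer_totals").fetchall()


print(run_pipeline([("alice", 10.0), ("bob", 5.0), ("alice", 2.5)]))
# → [('alice', 12.5), ('bob', 5.0)]
```

In a DBT project, the aggregation query would live in its own model file with tests and documentation, and DBT would materialize it in the warehouse; the Python layer typically handles orchestration and ingestion around such models.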
SKILLS REQUIRED:
- Bachelor's or Master's degree, or equivalent professional experience.
- 2-6 years of experience as a Data Engineer (Python, SQL).
- Strong knowledge of DBT and its features and capabilities.
- A minimum of 6 months to 1 year of hands-on experience with DBT (data build tool) is a must.
- Experience working with data warehouses such as Snowflake, Redshift, or equivalent.
- Experience working with ETL tools such as Informatica, Matillion, etc.
- Hands-on experience with SQL and database design.
- Strong understanding of basic data warehouse and ETL concepts.
- Excellent team player with strong communication and ownership skills.
Top Skills
DBT
Informatica
Python
Snowflake
SQL