The Data Engineer will build and maintain ETL/ELT pipelines, implement automated data quality checks, optimize data storage, and create interactive visualizations. They will manage CI/CD pipelines and deliver production-ready data solutions.
- Years of experience: 8 to 10 years
Focus: Reliability, Automation, and Data Quality. The Engineer turns the architect's blueprint into a working, automated reality.
- Key Responsibilities:
- Build and maintain ETL/ELT pipelines to move data from source to storage.
- Implement automated data quality testing and observability alerts.
- Optimize data storage formats (e.g., Parquet, Delta Lake) for high-speed querying.
- Manage CI/CD pipelines for data code deployment.
- Create interactive visualizations that deliver actionable insights tailored to each persona.
- Deliverables: Production-ready, optimized data pipelines; cleaned datasets; interactive, intelligent dashboards.
- Tech Stack: AWS
- Data Ingestion & Processing: AWS API Gateway, AWS Lambda, AWS Glue ETL, AWS Glue Crawler
- Data Storage & Analytics: Amazon S3, Amazon Redshift (Data Warehouse), Amazon Athena
- Governance & Security: AWS Lake Formation, AWS IAM, AWS CloudTrail, Amazon CloudWatch
- AI & Analytics: Amazon Bedrock
- Visualization: Amazon QuickSight
- Visa Requirement: Preferred: USA B-1 business visa with a minimum validity of 2 years
Top Skills
Amazon Athena
Amazon Bedrock
Amazon CloudWatch
Amazon QuickSight
Amazon Redshift
Amazon S3
AWS API Gateway
AWS CloudTrail
AWS Glue Crawler
AWS Glue ETL
AWS IAM
AWS Lake Formation
AWS Lambda

