
AGT Software Partners

Data Engineer

Posted 22 Days Ago
Remote
Hiring Remotely in India
Senior level

Job Overview:
We are looking for a skilled Data Engineer to join our dynamic team. You will be responsible for designing, building, and maintaining high-quality data pipelines and workflows, ensuring reliable and accurate data for our Data Warehouses. This is an exciting opportunity to work with cutting-edge cloud technologies and data tools in a collaborative environment.

Start Date: Late March 2026
Time Zones: Sydney, AUS & London
Duration: 4-6 Months
Experience: 7+ years

Key Responsibilities:

  • Design, develop, and maintain data pipelines for Data Warehouses.
  • Build scalable solutions using AWS and GCP.
  • Write efficient and optimized SQL, including analytical and window functions.
  • Transform and model data using dbt and BigQuery.
  • Orchestrate workflows using Airflow.
  • Ensure data quality and implement testing for pipelines.
  • Work in an Agile Delivery environment using JIRA and Confluence.
  • Follow CI/CD best practices using GitHub or Bitbucket.
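As an illustration of the "analytical and window functions" work mentioned above, here is a minimal, self-contained sketch using Python's built-in sqlite3 module (the table and data are hypothetical, not from the posting; warehouse SQL in BigQuery would look very similar):

```python
import sqlite3

# Hypothetical sample data: rank each customer's orders by amount
# using the ROW_NUMBER() window function (SQLite >= 3.25).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('alice', 120), ('alice', 80), ('bob', 200), ('bob', 50);
""")

rows = conn.execute("""
    SELECT customer,
           amount,
           ROW_NUMBER() OVER (
               PARTITION BY customer ORDER BY amount DESC
           ) AS rank_in_customer
    FROM orders
    ORDER BY customer, rank_in_customer
""").fetchall()

for customer, amount, rank in rows:
    print(customer, amount, rank)
# alice 120 1
# alice 80 2
# bob 200 1
# bob 50 2
```

The `PARTITION BY` clause restarts the numbering per customer, which is the typical pattern for deduplication and "latest record per key" logic in warehouse pipelines.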

Required Skills:

  • Strong proficiency in AWS and GCP.
  • Hands-on experience delivering data pipelines for Data Warehouses.
  • Advanced SQL skills, including analytical/window functions.
  • Experience with dbt and BigQuery.
  • Workflow orchestration experience using Airflow.
  • Knowledge of CI/CD processes and tools.
  • Familiarity with data quality checks and testing of pipelines.
  • Experience working in an Agile framework.
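The "data quality checks and testing of pipelines" item above typically means assertion-style checks run on pipeline output before loading. A minimal sketch in plain Python (all names and sample records here are illustrative, not part of the role's actual stack):

```python
# Two common checks: no NULLs in a required column, and no
# duplicate values in a key column. Rows are plain dicts.

def check_not_null(rows, column):
    """Return the rows whose value in `column` is missing (None)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values that appear more than once in `column`."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Hypothetical extract with two problems: a NULL email and a
# duplicated id.
extracted = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "b@example.com"},
    {"id": 2, "email": None},
]

null_rows = check_not_null(extracted, "email")
dupe_ids = check_unique(extracted, "id")
print(len(null_rows), dupe_ids)  # prints: 1 [2]
```

Tools like dbt build these same checks in declaratively (`not_null` and `unique` tests in a model's schema file), failing the run before bad data reaches the warehouse.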

What We Offer:

  • Join a talented international team in a friendly, creative, and dynamic environment that fosters collaboration and support.
  • Opportunities for professional growth and development.
  • Enjoy the flexibility of working 100% remotely from anywhere in the world while contributing to cutting-edge projects.
  • Work 5 days a week (40 hours, Monday to Friday; office hours 9 AM - 5 PM EST, flexible by 1 hour).
  • Competitive compensation: Receive a salary package commensurate with your experience and skill set.
  • Internet bill reimbursement.
  • The right candidate will receive training (all training and probation periods offered at AGT are fully paid; we value all candidates' time).

Candidates must be based in India or Bangladesh

Top Skills

Airflow
AWS
BigQuery
Bitbucket
Confluence
dbt
GCP
Git
JIRA
SQL


