
Delta Exchange

Senior Data Engineer

Posted Yesterday
Remote
Hiring Remotely in India
Senior level
As a Senior Data Engineer, you will manage the ETL lifecycle, build batch pipelines, transform data using SQL and Python, and ensure data quality and performance for analytics.

About the Company

At Delta, we are reimagining and rebuilding the financial system. Join our team to make a positive impact on the future of finance.

🎯 Mission Driven: Re-imagine and rebuild the future of finance.

💡 Most innovative cryptocurrency derivatives exchange, with a daily traded volume of ~$3.5 billion and growing. Delta is bigger than all Indian crypto exchanges combined.

📈 We offer the widest range of derivative products and have been serving traders across the globe since 2018, growing fast.

💪🏻 The founding team comprises IIT and ISB graduates. Business co-founders have previously worked with Citibank, UBS and GIC; our tech co-founder is a serial entrepreneur who previously co-founded TinyOwl and Housing.com.

💰 Funded by top crypto funds (Sino Global Capital, CoinFund, Gumi Cryptos) and crypto projects (Aave and Kyber Network).

Role Summary:

Support our analytics team by owning the full ETL lifecycle—from master data to analytics-ready datasets. You will build and maintain daily batch pipelines that process 1–10 million master-data rows per run (and scale up to tens or hundreds of millions of rows), all within sub-hourly SLAs. Extract from OLTP and time-series sources, apply SQL/stored-procedure logic or Python transformations, then load into partitioned, indexed analytics tables. Reads run exclusively on read-only replicas to guarantee zero impact on the production master DB. You’ll also implement monitoring, alerting, retries, and robust error handling to ensure near-real-time dashboard refreshes.
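As a rough sketch of the extract → transform → load pass described above, the following shows a watermark-based incremental batch in Python. It uses SQLite so it is self-contained; the table and column names (`trades`, `trades_daily`) are illustrative only, not from this posting.

```python
import sqlite3

def run_incremental_batch(conn, watermark):
    """One incremental ETL pass: extract rows newer than the previous
    watermark, transform in Python, load into an analytics table.
    Table/column names are hypothetical stand-ins."""
    # Extract: only rows created since the last successful run.
    rows = conn.execute(
        "SELECT id, ts, amount FROM trades WHERE ts > ?", (watermark,)
    ).fetchall()

    # Transform: aggregate per day (a stand-in for real analytics logic).
    daily = {}
    new_watermark = watermark
    for _id, ts, amount in rows:
        day = ts[:10]  # 'YYYY-MM-DD' prefix of an ISO-8601 timestamp
        daily[day] = daily.get(day, 0.0) + amount
        new_watermark = max(new_watermark, ts)

    # Load: upsert into the analytics-ready table; the watermark keeps
    # already-processed rows from being counted twice on the next run.
    for day, total in daily.items():
        conn.execute(
            "INSERT INTO trades_daily (day, total) VALUES (?, ?) "
            "ON CONFLICT(day) DO UPDATE SET total = total + excluded.total",
            (day, total),
        )
    conn.commit()
    return new_watermark
```

In a production pipeline the watermark would be persisted between runs and the extract would hit a read-only replica, so the master DB is never touched.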


Requirements

Required Skills & Experience: 

* 4+ years in data engineering or analytics roles, building daily batch ETL pipelines at 1–10 M rows/run scale (and up to 100 M+).

* Expert SQL skills, including stored procedures and query optimisation on Postgres, MySQL, or similar RDBMS.

* Proficient in Python for data transformation (pandas, NumPy, SQLAlchemy, psycopg2).

* Hands-on with CDC/incremental load patterns and batch schedulers (Airflow, cron).

* Deep understanding of replicas, partitioning, and indexing strategies.

* Strong computer-science fundamentals and deep knowledge of database internals—including storage engines, indexing mechanisms, query execution plans and optimisers for MySQL and time-series DBs like TimescaleDB.

* Experience setting up monitoring and alerting (Prometheus, Grafana, etc.).
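To illustrate the partitioning and indexing strategies the list above asks for, here is a minimal manual partition-routing sketch: each row is routed to a per-month table with its own index. This simulates (in SQLite, for self-containment) what Postgres declarative partitioning does natively; all names are hypothetical.

```python
import sqlite3

def ensure_month_partition(conn, ts):
    """Create (if needed) and return the per-month partition table for a
    timestamp. A stand-in for Postgres declarative range partitioning;
    the events_* naming is illustrative only."""
    month = ts[:7].replace("-", "_")   # '2024-01' -> '2024_01'
    table = f"events_{month}"
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {table} (id INTEGER, ts TEXT, payload TEXT)"
    )
    # Index the query key so per-partition scans stay fast.
    conn.execute(f"CREATE INDEX IF NOT EXISTS idx_{table}_ts ON {table} (ts)")
    return table

def insert_event(conn, event_id, ts, payload):
    """Route a row to its month partition, mimicking partition pruning
    on the write path."""
    table = ensure_month_partition(conn, ts)
    conn.execute(f"INSERT INTO {table} VALUES (?, ?, ?)", (event_id, ts, payload))
```

Queries scoped to a date range then touch only the relevant month tables, which is the same pruning benefit native partitioning provides.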

Key Responsibilities:

1. Nightly Batch Jobs: Schedule and execute ETL runs.

2. In-Database Transformations: Write optimised SQL and stored procedures.

3. Python Orchestration: Develop Python scripts for more complex analytics transformations.

4. Data Loading & Modelling: Load cleansed data into partitioned, indexed analytics schemas designed for fast querying.

5. Performance SLAs: Deliver end-to-end sub-hourly runtimes.

6. Monitoring & Resilience: Implement pipeline health checks, metrics, alerting, automatic retries, and robust error handling.

7. Stakeholder Collaboration: Work closely with analysts to validate data quality and ensure timely delivery of analytics-ready datasets.
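The resilience responsibilities above (automatic retries, alerting, error handling) can be sketched as a small retry wrapper with exponential backoff. The `alert` hook stands in for a real pager/Slack integration and is purely hypothetical; the `sleep` parameter is injectable so the behaviour is testable.

```python
import time

def run_with_retries(step, max_attempts=3, base_delay=1.0,
                     alert=print, sleep=time.sleep):
    """Run a pipeline step, retrying with exponential backoff.
    Fires the (hypothetical) alert hook on every failure and re-raises
    the last error once attempts are exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            alert(f"attempt {attempt}/{max_attempts} failed: {exc}")
            if attempt == max_attempts:
                raise
            # Backoff doubles each attempt: base, 2*base, 4*base, ...
            sleep(base_delay * 2 ** (attempt - 1))
```

Schedulers like Airflow provide equivalent per-task `retries`/`retry_delay` settings; a wrapper like this is useful when a job runs under plain cron.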

Top Skills

Airflow
Grafana
MySQL
Postgres
Prometheus
Python
SQL
TimescaleDB


