
DRW

Data Engineer

Posted 10 Days Ago
Easy Apply
Hybrid
Chicago, IL
Mid level

DRW is a diversified trading firm with more than three decades of experience bringing sophisticated technology and exceptional people together to operate in markets around the world. We value autonomy and the ability to quickly pivot to capture opportunities, so we operate using our own capital and trade at our own risk.

Headquartered in Chicago with offices throughout the U.S., Canada, Europe, and Asia, we trade a variety of asset classes including Fixed Income, ETFs, Equities, FX, Commodities and Energy across all major global markets. We have also leveraged our expertise and technology to expand into three non-traditional strategies: real estate, venture capital and cryptoassets.

We operate with respect, curiosity and open minds. The people who thrive here share our belief that it's not just what we do that matters; it's how we do it. DRW is a place of high expectations, integrity, innovation and a willingness to challenge consensus.


As a Data Engineer in our Unified Platform organization, you will play an integral role in building data products and defining platform capabilities used by Traders, Quantitative Researchers, and Back-Office personnel to analyze financial markets, determine trading opportunities, establish new strategies, and ensure smooth back-office processes. 

Technical requirements summary:  

  • Have experience designing and building data pipelines and systems
  • Have experience working within modern batch or streaming data ecosystems
  • Are an expert in SQL and have expertise in Java/Scala or Python
  • Can apply data modeling techniques
  • Can own the delivery of data projects, working with analysts and stakeholders to understand requirements and implement solutions
  • Can contribute to project management and project reporting

What you will do in this role:  

  • Help model, build, and manage data products built atop DRW’s Unified Data Platform.
  • Work alongside domain experts to gain and leverage knowledge of the relevant project’s domain to design optimized data products that meet the needs of many different research and trading strategies.
  • Work closely with Traders and Researchers to determine appropriate data sources and implement processes to onboard and manage new data sources for analysis to unlock future trading opportunities.
  • Design and develop data solutions that help discover, purchase, and organize data sets, track their usage, manage rights, and control their quality to address the needs of various DRW trading teams and strategies.
  • Continually monitor data ingestion pipelines and data quality to ensure stability, reliability, and quality of the data. Contribute to the monitoring and quality control software and processes. 

What you will need in this role: 

  • 3+ years of experience working with modern data technologies and/or building data-first products
  • Extensive familiarity with SQL and a proven ability to construct maintainable queries and productionize them as ongoing pipelines.
  • Experience leveraging data modeling techniques on complex datasets and ability to articulate the trade-offs of different approaches.
  • Strong proficiency in Java/Scala or Python
  • Familiarity with cloud-native technologies used for scalable data processing.
  • Experience with one or more data processing technologies (e.g. Flink, Spark, Polars, Dask, etc.)
  • Experience with multiple data storage technologies (e.g. S3, RDBMS, NoSQL, Delta/Iceberg, Cassandra, Clickhouse, Kafka, etc.) and knowledge of their associated trade-offs.
  • Experience with multiple data formats and serialization systems (e.g. Arrow, Parquet, Protobuf/gRPC, Avro, Thrift, JSON, etc.)
  • Experience managing data pipeline orchestration systems (e.g. Kubernetes, Argo Workflows, Airflow, Prefect, Dagster, etc.)
  • Proven experience in managing the operational components of large data pipelines such as backfilling datasets, rerunning batch jobs, and handling dead letter queues.
  • Prior experience triaging data quality control processes, correcting data gaps and inaccuracies.
  • Strong technical problem-solving skills
  • Proven ability to work in a collaborative, agile, and fast-paced environment, prioritize multiple tasks and projects, and efficiently handle the demands of a trading environment
  • Excellent written and verbal communication skills. 

Location: Chicago or New York   

The annual base salary range for this position is $150,000 to $200,000, depending on the candidate's experience, qualifications, and relevant skill set. The position is also eligible for an annual discretionary bonus. In addition, DRW offers a comprehensive suite of employee benefits including group medical, pharmacy, dental and vision insurance, 401(k) (with discretionary employer match), short- and long-term disability, life and AD&D insurance, health savings accounts, and flexible spending accounts.

For more information about DRW's processing activities and our use of job applicants' data, please view our Privacy Notice at https://drw.com/privacy-notice.

California residents, please review the California Privacy Notice for information about certain legal rights at https://drw.com/california-privacy-notice.

Top Skills

Airflow
Argo Workflows
Arrow
Avro
Cassandra
ClickHouse
Dagster
Dask
Delta
Flink
gRPC
Iceberg
Java
JSON
Kafka
Kubernetes
NoSQL
Parquet
Polars
Prefect
Protobuf
Python
RDBMS
S3
Scala
Spark
SQL
Thrift
