
Fidelity Investments

Data Engineer

Posted Yesterday
In-Office
Chennai, Tamil Nadu, IND
Junior
Job Description:

Job Title – Data Engineer

The Purpose of This Role

FRC’s Customer Protection Technology team is responsible for processing and providing data and building technology-based applications that detect and report fraud activity. This helps the extended FRC team respond to and mitigate fraud. Data is also shared with business units (WI, Brokerage and Wealth, FI) and Finance teams.

 

The Value You Deliver

We are looking for a Software Engineer to join our Customer Protection Technology group. The new hire will be responsible for expanding and optimizing our data and data pipelines, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced modern data technologist with a special focus on data ingestion, processing, and consumption technologies. The candidate must be self-directed and comfortable supporting the data needs of multiple teams (Client Management), systems, and products, both in the cloud and on-prem. The right candidate will be excited by the prospect of optimizing or even redesigning our organization’s data architecture to support our next generation of products and data initiatives.

 

The Skills that are Key to this role

Technical / Behavioral

  • Bachelor’s or Master’s degree in a technology-related field (e.g., Engineering, Computer Science).

  • 1+ years of experience developing software with SQL, relational databases, Snowflake, Python, and Java.

  • Experience with Postgres, YugabyteDB, Aerospike, S3, and similar data management platforms is preferred.

  • Object-oriented Python programming.

  • Experience with Docker, Kubernetes, and AWS cloud deployment/application management is preferred.

  • Experience using continuous integration pipelines and automated deployment tools such as Jenkins is preferred.

  • Hands-on experience with event-based systems, functional programming, new technologies, and messaging frameworks such as Kafka is preferred.

  • ETL/ELT experience with both structured and unstructured data stores.

  • Understanding of data modeling and experience with continuous integration tools (e.g., Jenkins, Git, Concourse, Terraform).

 

The Skills that are Good to Have for this role

  • Experience with Snowflake and Data Science/AI/ML

  • Experience with object-oriented scripting languages such as Java and Python is preferred


How Your Work Impacts the Organization

The Customer Protection Technology team supports the FRC (Fraud Risk and Control) team with fraud detection and prevention. As part of this effort, we source data in both batch and real time and store it in different forms of databases. We are also building a new data warehouse solution for fraud loss reporting.

 

The Expertise we’re looking for

  • 1-4 years of experience in Data Warehousing, SQL, and Python or Java

  • Graduate / Postgraduate

Location: Chennai

Shift timings: 11:00 am - 8:00 pm


Category: Information Technology

Top Skills

Aerospike
AWS
Docker
Java
Jenkins
Kafka
Kubernetes
Postgres
Python
S3
Snowflake
SQL
YugabyteDB


