Develop and maintain scalable data pipelines, optimize workflows, manage cloud storage, ensure data security, and collaborate across teams.
Requisition Number: 2355730
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health that are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Primary Responsibilities:
- Develop, enhance, and maintain scalable data pipelines to support enterprise analytics and reporting needs
- Design and optimize data processing workflows using Python, Spark, and Scala, following data warehousing and data-modeling best practices
- Build and support real-time and near-real-time data ingestion using Kafka and streaming frameworks
- Perform platform-stability activities, including incident resolution and operational support
- Identify and remediate security vulnerabilities, ensuring compliance with enterprise and healthcare data-security standards
- Execute tool, framework, and software version upgrades to maintain platform currency and reliability
- Create, manage, and monitor Azure Data Factory (ADF) pipelines across development, test, and production environments
- Support and optimize Databricks workloads for batch and streaming use cases
- Manage cloud storage and data movement using Azure Blob Storage and AzCopy
- Collaborate with cross-functional teams to troubleshoot issues, improve performance, and deliver reliable data solutions
- Contribute to documentation, knowledge transfer, and continuous improvement of the data platform
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
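The responsibility bullets above center on building and maintaining data pipelines. As a purely illustrative sketch of what such work involves, not Optum code, the following plain-Python transform step mirrors a common pattern in enterprise pipelines: validate raw records against a required key and normalize values before loading to a warehouse table. All field names and rules here are hypothetical examples.

```python
# Illustrative only: a minimal transform step in plain Python.
# Field names ("member_id", "amount") and rules are hypothetical, not Optum schemas.

def transform(records):
    """Normalize raw records before loading to a warehouse table."""
    cleaned = []
    for rec in records:
        if not rec.get("member_id"):  # reject rows missing the join key
            continue
        cleaned.append({
            "member_id": rec["member_id"].strip().upper(),
            "amount_usd": round(float(rec.get("amount", 0)), 2),
        })
    return cleaned

raw = [
    {"member_id": " a123 ", "amount": "19.999"},
    {"member_id": "", "amount": "5"},  # rejected: no member_id
]
print(transform(raw))
# → [{'member_id': 'A123', 'amount_usd': 20.0}]
```

In a production pipeline this kind of logic would typically run inside a Spark or Databricks job and be orchestrated by ADF or Airflow, but the validate-then-normalize shape is the same.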
Required Qualifications:
- Bachelor's degree or equivalent experience
- Solid experience with UNIX/Linux environments and scripting
- Hands-on experience with IBM DataStage and Teradata
- Experience with Apache Spark (batch and/or streaming)
- Experience with Databricks and Snowflake in enterprise data platforms
- Experience working in or supporting healthcare data environments
- Working knowledge of Airflow for workflow orchestration
- Familiarity with GitHub and GitHub Copilot for source control and development productivity
- Solid understanding of data warehousing concepts, data modeling, and ETL/ELT best practices
- Proficiency in Python for data engineering and automation
- Excellent communication, analytical, and problem-solving skills
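One of the ETL/ELT best practices the qualifications above refer to is idempotent loading: re-running a batch after a pipeline retry must not duplicate or corrupt data. A minimal sketch in plain Python, with hypothetical table and key names, shows the merge-by-business-key pattern that warehouse MERGE statements implement:

```python
# Illustrative only: an idempotent "merge by business key" upsert in plain
# Python, mirroring the warehouse MERGE pattern. Names are hypothetical.

def merge_by_key(target, incoming, key="id"):
    """Upsert incoming rows into target by business key; re-running the
    same batch leaves the result unchanged, so loads survive retries."""
    by_key = {row[key]: row for row in target}
    for row in incoming:
        by_key[row[key]] = row  # insert new key or overwrite existing row
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}]
batch = [{"id": 1, "status": "closed"}, {"id": 2, "status": "open"}]
once = merge_by_key(target, batch)
twice = merge_by_key(once, batch)  # retrying the same batch is a no-op
assert once == twice
print(once)
# → [{'id': 1, 'status': 'closed'}, {'id': 2, 'status': 'open'}]
```

The same idea appears in SQL as `MERGE INTO ... WHEN MATCHED THEN UPDATE WHEN NOT MATCHED THEN INSERT` on platforms such as Snowflake and Databricks.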
Preferred Qualifications:
- Experience supporting operations in large-scale data platforms
- Hands-on experience with Azure cloud services, especially ADF, Blob Storage, and Databricks
- Experience working in regulated or compliance-driven environments
- Exposure to Kafka-based streaming architectures
- Proven ability to mentor junior engineers and contribute to platform standardization efforts
Optum Chennai, Tamil Nadu, IND Office
Chennai, Tamil Nadu, India

