Design and maintain scalable data pipelines, optimize ETL workflows, implement data validation, and collaborate with teams for data modeling and analytics.
Requisition Number: 2343878
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Primary Responsibilities:
- Design and maintain scalable data pipelines across AWS (Glue, Lambda, RDS) and Azure Databricks
- Build and optimize end-to-end ETL/ELT workflows for structured and semi-structured data
- Develop Spark transformations using PySpark and SQL, with an emphasis on performance and cost efficiency
- Translate business needs into data models, curated datasets, and reusable assets
- Implement data validation, reconciliation, and anomaly checks for pipeline accuracy
- Manage and optimize cloud storage and compute, ensuring security and governance compliance
- Use Git and CI/CD for efficient, reliable development and deployment
- Troubleshoot and resolve pipeline failures and performance issues
- Partner with infrastructure and security teams to meet architecture and compliance standards
- Leverage AI tools (Copilot, IntelliCode, Databricks Assistant, Visual Studio AI tools) to accelerate development
- Participate in design/code reviews and improve engineering standards
- Provide operational support, monitoring, and on-call coverage for production pipelines
- Document pipelines, workflows, designs, and runbooks for maintainability
- Support data migration, modernization, and cloud platform initiatives
- Collaborate with analytics and AI teams to enable advanced reporting and ML use cases
- Contribute across design, development, testing, deployment, and operations as needed
- Maintain living documentation (system diagrams, SLOs, on-call guides) and self-healing playbooks
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
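The validation, reconciliation, and anomaly-check duties above can be sketched in plain Python. This is a minimal illustration of the kind of checks meant, not Optum's actual tooling; the function names `reconcile_counts` and `detect_anomalies` are hypothetical:

```python
import statistics

def reconcile_counts(source_count: int, target_count: int, tolerance: float = 0.0) -> bool:
    """Return True when the target row count matches the source within a relative tolerance."""
    if source_count == 0:
        return target_count == 0
    drift = abs(source_count - target_count) / source_count
    return drift <= tolerance

def detect_anomalies(values: list, z_threshold: float = 3.0) -> list:
    """Flag values more than z_threshold sample standard deviations from the mean."""
    if len(values) < 2:
        return []
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [v for v in values if abs(v - mean) / stdev > z_threshold]
```

In a real pipeline, checks like these would typically run as a post-load step (for example, comparing Glue job output counts against the source extract) and fail the run or page on-call when a threshold is breached.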
Required Qualifications:
- Experience or knowledge in the following:
- Cloud: AWS (Glue, Step Functions, Lambda, RDS) and Azure (Databricks)
- ETL Development: Designing and building pipelines across AWS & Azure
- Programming: Python, Spark, SQL
- Version Control: Git
- AI Productivity Tools (Mandatory): GitHub Copilot, GenAI API, IntelliCode, Databricks Assistant, Visual Studio AI Studio extensions
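As a hedged illustration of the Python ETL skills listed above (handling semi-structured data bound for a relational store such as RDS), the sketch below flattens a nested JSON record into a flat row; the function name `flatten_record` and the dotted-key convention are assumptions for this example only:

```python
import json

def flatten_record(raw: str) -> dict:
    """Flatten one semi-structured JSON record into a flat row
    suitable for loading into a relational table."""
    rec = json.loads(raw)
    row = {}
    for key, value in rec.items():
        if isinstance(value, dict):
            # Promote nested fields using a dotted prefix, e.g. "meta.src".
            for sub_key, sub_value in value.items():
                row[f"{key}.{sub_key}"] = sub_value
        else:
            row[key] = value
    return row
```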
Top Skills
AI Tools
AWS Glue
AWS Lambda
AWS RDS
Azure Databricks
CI/CD
Git
PySpark
SQL

