The role involves designing and developing cloud-native ETL pipelines on Azure and Databricks, optimizing Spark-based ETL frameworks, collaborating with cross-functional teams, and ensuring data quality.
Requisition Number: 2357622
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Primary Responsibilities:
- Contribute hands-on to the design, ETL development, and operation of cloud-native data pipelines on Azure and Databricks
- Design and optimize Spark-based ETL frameworks for large-scale batch and incremental processing
- Implement complex transformations and performance tuning using Python and Spark SQL
- Modernize on-prem ETL workloads to Databricks/Snowflake, ensuring enterprise standards for scalability, reliability, and governance
- Collaborate with cross-functional teams to understand complex data requirements and deliver tailored solutions
- Ensure data quality and integrity by implementing robust data validation and monitoring processes
- Develop comprehensive documentation for data engineering processes and systems
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
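The incremental-processing responsibility above follows a common watermark pattern: process only records newer than the last successful load, then advance the watermark. A minimal sketch in plain Python (in practice this logic would be expressed in PySpark on Databricks; the function and field names here are illustrative assumptions):

```python
from datetime import datetime

def incremental_batch(records, last_watermark):
    """Select only records newer than the last processed watermark and
    advance it -- the core pattern behind incremental (as opposed to
    full-reload) ETL loads.

    records: iterable of dicts carrying an 'updated_at' datetime field.
    Returns (new_records, new_watermark).
    """
    new_records = [r for r in records if r["updated_at"] > last_watermark]
    # If nothing new arrived, the watermark stays where it was.
    new_watermark = max(
        (r["updated_at"] for r in new_records), default=last_watermark
    )
    return new_records, new_watermark

# Illustrative run: only the record after the watermark is picked up.
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 3)},
]
batch, wm = incremental_batch(rows, datetime(2024, 1, 2))
```

Keeping the watermark advance tied to the records actually selected (rather than "now") makes reruns idempotent, which matters for the reliability standards named above.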
Required Qualifications:
- Bachelor's degree (B.E./B.Tech) in any engineering or related discipline
- Hands-on experience with Databricks and cloud-native data platforms, preferably on Azure
- Solid experience building ETL pipelines using Python, PySpark, and Spark SQL
- Experience with ETL tools (e.g., DataStage) and large-scale data warehouse implementations
- Experience migrating or modernizing on-prem ETL workloads to cloud architectures
- Solid knowledge of building AI solutions into data pipelines
- Solid understanding of data warehousing concepts, including dimensional modeling, data quality, reconciliation, and performance tuning
- Proficient in SQL and experienced with relational and distributed database systems
- Familiarity with cloud ecosystems (Azure/AWS/GCP) and workflow orchestration tools such as Airflow
- Proven analytical and problem-solving skills with the ability to deliver scalable, reliable data solutions
- Demonstrated ability to lead a team
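The reconciliation skill listed above typically means comparing row counts and control totals between a source extract and the loaded target. A hedged sketch in plain Python (names are assumptions; on Databricks this check would usually be written in Spark SQL):

```python
def reconcile(source_rows, target_rows, key="amount"):
    """Compare row counts and a summed control total between source and
    target tables; returns a dict of discrepancies (empty means the
    load reconciled cleanly)."""
    issues = {}
    if len(source_rows) != len(target_rows):
        issues["row_count"] = (len(source_rows), len(target_rows))
    src_total = sum(r[key] for r in source_rows)
    tgt_total = sum(r[key] for r in target_rows)
    if src_total != tgt_total:
        issues["control_total"] = (src_total, tgt_total)
    return issues

# Illustrative data: the target dropped 5 units from the control total.
src = [{"amount": 10}, {"amount": 20}]
tgt = [{"amount": 10}, {"amount": 25}]
```

Running such a check after every load, and alerting on a non-empty result, is one straightforward way to implement the "robust data validation and monitoring" the responsibilities call for.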
Preferred Qualification:
- Demonstrated ability to collaborate effectively across teams; leadership or mentoring experience
Top Skills
Airflow
Azure
Databricks
DataStage
PySpark
Python
Spark SQL
SQL
Optum Chennai, Tamil Nadu, IND Office
Chennai, Tamil Nadu, India

