
Optum

Senior Data Engineering Lead

Posted Yesterday
In-Office
Bangalore, Bengaluru Urban, Karnataka
Senior level
Lead the design and operation of data pipelines for analytics and AI/ML applications, while ensuring data quality and compliance. Mentor engineers and manage cross-team collaborations to optimize performance and cost.
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
  • Design, build, and operate reliable data pipelines (batch/stream) that power analytics and AI/ML use cases end-to-end
  • Develop and curate high-quality datasets for AI features (e.g., prompt/RAG inputs, training data, evaluation sets) with clear contracts, SLAs, and lineage
  • Implement/use data quality, observability, and incident response practices (tests, anomaly detection, monitoring, runbooks, on-call readiness)
  • Build and/or maintain data models and semantic layers that make data easy and safe for internal customers (analytics, operations, product)
  • Partner with ML/AI engineers and product teams to productionize AI systems (feature stores where appropriate, embedding pipelines, vector index refresh, offline/online consistency)
  • Establish and adhere to governance for sensitive data used in AI (PII/PHI handling, access controls, retention, auditability) and ensure compliance requirements are met
  • Optimize performance and cost across the data platform (query tuning, partitioning/clustering, workload management, storage lifecycle)
  • Define and enforce data contracts with upstream/downstream teams; lead discovery to clarify requirements, value, and adoption readiness
  • Create reusable frameworks/templates for pipelines, validation, and AI data preparation to reduce friction and increase consistency
  • Measure impact: instrument usage, quality, and business outcomes; support experimentation and A/B testing where relevant
  • Mentor engineers, drive technical standards, and lead design reviews with a focus on long-term maintainability
  • Contribute to architecture decisions: cloud data platform, orchestration, streaming, metadata/lineage, and AI-enablement tools
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
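To make the "data contracts" and "data quality" responsibilities above concrete, here is a minimal, self-contained sketch in Python. All names (`DataContract`, `validate_batch`, the `claims_daily` dataset) are hypothetical illustrations, not part of any actual Optum system; in practice this role would likely use a framework such as Great Expectations or dbt tests rather than hand-rolled checks.

```python
from dataclasses import dataclass

# Hypothetical illustration of a "data contract": a dataset declares the
# columns and basic quality rules that downstream consumers can rely on.
@dataclass(frozen=True)
class DataContract:
    name: str
    required_columns: set[str]
    non_null_columns: set[str]

def validate_batch(rows: list[dict], contract: DataContract) -> list[str]:
    """Return a list of contract violations for one batch (empty = pass)."""
    violations = []
    for i, row in enumerate(rows):
        missing = contract.required_columns - row.keys()
        if missing:
            violations.append(f"row {i}: missing columns {sorted(missing)}")
        for col in contract.non_null_columns:
            if row.get(col) is None:
                violations.append(f"row {i}: null in non-null column '{col}'")
    return violations

# Example contract and batch (entirely made-up data for illustration).
claims_contract = DataContract(
    name="claims_daily",
    required_columns={"claim_id", "member_id", "amount"},
    non_null_columns={"claim_id", "member_id"},
)

batch = [
    {"claim_id": "c1", "member_id": "m1", "amount": 120.0},
    {"claim_id": "c2", "member_id": None, "amount": 45.5},
]
print(validate_batch(batch, claims_contract))
# -> ["row 1: null in non-null column 'member_id'"]
```

The point of the pattern is that a violation list (rather than an exception) lets a pipeline route bad batches to quarantine and alerting, which is what the observability and incident-response bullets describe.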

Required Qualifications:
  • Bachelor's degree in Computer Science, Engineering, or a related technical field
  • 8+ years of production data engineering experience delivering and operating data pipelines/datasets used by multiple teams
  • Experience building training data pipelines and evaluation frameworks: ground truth management, reproducibility, data leakage prevention, and regression testing
  • Advanced SQL plus solid Python skills (or Scala/Java) for building robust data pipelines, libraries, and automation
  • Hands-on experience with Snowflake and Databricks (Delta/Unity concepts helpful), including modeling, performance tuning, and operational best practices
  • Orchestration experience with Apache Airflow and/or Azure Data Factory/Data Lake Flow; solid understanding of scheduling, retries, idempotency, and backfills
  • Solid Azure experience (storage patterns, networking basics, identity/access patterns); Infrastructure-as-Code experience (Terraform/Bicep) preferred
  • Demonstrated experience supporting RAG and/or LLM applications: embedding pipelines, document preprocessing, metadata strategies, vector stores, and retrieval evaluation
  • Proven ability to implement data quality/observability (Great Expectations/dbt tests/monitoring equivalents) and operate on-call with clear runbooks
  • Solid security/governance mindset, especially for regulated domains (PII/PHI), including auditing and least-privilege access
  • Excellent cross-team communication: able to turn ambiguous asks into scoped work with clear success metrics and adoption plans
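The orchestration qualification above calls out idempotency and backfills. A minimal sketch of the pattern, using a plain dict as a stand-in "table" (hypothetical; real work would target Snowflake or Delta tables via Airflow or Azure Data Factory): each run overwrites its own date partition instead of appending, so retries and backfills can never double-count.

```python
from datetime import date, timedelta

# Toy "table" keyed by partition date; stand-in for a warehouse table.
table: dict[date, list[str]] = {}

def run_for_day(ds: date) -> None:
    """Idempotent daily task: recompute and replace the ds partition."""
    rows = [f"event-{ds.isoformat()}-{i}" for i in range(2)]  # fake extract
    table[ds] = rows  # overwrite the partition, never append

def backfill(start: date, end: date) -> None:
    """Re-run every day in [start, end]; safe because run_for_day is idempotent."""
    d = start
    while d <= end:
        run_for_day(d)
        d += timedelta(days=1)

run_for_day(date(2024, 1, 2))
run_for_day(date(2024, 1, 2))                    # retry: no duplicates
backfill(date(2024, 1, 1), date(2024, 1, 3))     # backfill overlaps the retry
print(len(table), len(table[date(2024, 1, 2)]))  # 3 partitions, 2 rows each
```

Overwrite-by-partition is one common way to get this guarantee; MERGE/upsert keyed on a natural key is another, chosen per table.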

Preferred Qualifications:
  • Experience with ML tooling (MLflow/model registries), vector database operations, prompt/version management, and drift monitoring signals for LLM systems
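The RAG qualifications mention "retrieval evaluation", and the preferred tooling mentions drift signals for LLM systems. One standard, minimal metric behind both is recall@k over a set of judged-relevant documents; tracked per release, a drop in it is a simple drift signal. The sketch below is generic (document IDs and judgments are made up), not any specific vector-store API.

```python
def recall_at_k(retrieved: list[str], relevant: set[str], k: int) -> float:
    """Fraction of judged-relevant documents found in the top-k retrieved list."""
    if not relevant:
        return 0.0
    hits = sum(1 for doc_id in retrieved[:k] if doc_id in relevant)
    return hits / len(relevant)

# One query: what the vector store returned vs. the judged relevant set.
retrieved = ["d3", "d7", "d1", "d9"]
relevant = {"d1", "d2", "d3"}
print(recall_at_k(retrieved, relevant, k=3))  # 2 of 3 relevant docs in top 3
```

Averaged over a fixed evaluation query set, this gives a regression-testable number for embedding-model or index changes.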

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Top Skills

Apache Airflow
Azure Data Factory
Azure Data Lake
Bicep
Databricks
Java
MLflow
Python
Scala
Snowflake
SQL
Terraform


