
Fractal

Data Engineer/Senior Data Engineer- Azure DigiOps

Posted Yesterday
In-Office
8 Locations
Mid level

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

Job Description: 

We need someone with a strong data engineering skill set to ensure production (operations/support) activities are delivered per SLA. The role involves working on issues/requests, bug fixes, and minor changes; coordinating with the development team when issues arise; and delivering enhancements.

 

Role Details 
You will be part of the operations team providing L2 support to a client, working either during specified business hours or in a 24x7 support model.

Provide Level-2 (L2) technical support for data platforms and pipelines built on Azure Data Factory (ADF), Databricks, SQL, and Python. This role involves advanced troubleshooting, root cause analysis, code-level fixes, performance tuning, and collaboration with engineering teams to ensure data reliability and SLA compliance. Must adhere to ITIL processes for Incident, Problem, and Change management. 

 
 

Key Responsibilities 

Advanced Troubleshooting & RCA 

  • Investigate complex failures in ADF pipelines, Databricks jobs, and SQL processes beyond L1 scope. 

  • Perform root cause analysis for recurring issues, document findings, and propose permanent fixes. 

  • Debug Python scripts, SQL queries, and Databricks notebooks to resolve data ingestion and transformation errors. 

  • Analyze logs, metrics, and telemetry using Azure Monitor, Log Analytics, and Databricks cluster logs. 

Code-Level Fixes & Enhancements 

  • Apply hotfixes for broken pipelines, scripts, or queries in non-production and coordinate controlled deployment to production. 

  • Optimize ADF activities, Databricks jobs, and SQL queries for performance and cost efficiency. 

  • Implement data quality checks, schema validation, and error handling improvements. 

Incident & Problem Management 

  • Handle escalated incidents from L1; ensure resolution within SLA. 

  • Create and maintain Known Error Database (KEDB) and contribute to Problem Records. 

  • Participate in Major Incident calls, provide technical insights, and lead recovery efforts when required. 

Monitoring & Automation 

  • Enhance monitoring dashboards, alerts, and auto-recovery scripts for proactive issue detection. 

  • Develop Python utilities or Databricks notebooks for automated validation and troubleshooting. 

  • Suggest improvements in observability and alert thresholds. 

Governance & Compliance 

  • Ensure all changes follow ITIL Change Management process and are properly documented. 

  • Maintain secure coding practices, manage secrets via Key Vault, and comply with data privacy regulations. 

 

 

Technical skills 
 

  • Azure Data Factory (ADF): Deep understanding of pipeline orchestration, linked services, triggers, and custom activities. 

  • Databricks: Proficient in Spark, cluster management, job optimization, and notebook debugging. 

  • SQL: Advanced query tuning, stored procedures, schema evolution, and troubleshooting. 

  • Python: Strong scripting skills for data processing, error handling, and automation. 

  • Azure Services: ADLS, Key Vault, Synapse, Log Analytics, and Azure Monitor. 

  • Familiarity with CI/CD pipelines (Azure DevOps/GitHub Actions) for data workflows. 

Non-technical skills 

  • Strong knowledge of ITIL (Incident, Problem, Change). 

  • Ability to lead technical bridges, communicate RCA, and propose permanent fixes. 

  • Excellent documentation and stakeholder communication skills. 

  • Drive incident/problem resolution by assisting the operations team with key activities across delivery, fixes, and supportability. 

  • Experience working in ServiceNow is preferred. 

  • Attention to detail is a must, with a focus on quality and accuracy. 

  • Able to handle multiple tasks with appropriate priority and strong time management skills. 

  • Flexible about work content and enthusiastic to learn. 


  • Strong relationship skills to work with multiple stakeholders across organizational and business boundaries at all levels. 

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!


Top Skills

Azure Data Factory
Azure Services
CI/CD Pipelines
Databricks
Python
SQL

Fractal Chennai, Tamil Nadu, IND Office

Chennai, Tamil Nadu, India, 600034


