The Lead Data Engineer will focus on Azure data engineering with Microsoft Fabric, applying skills in AKS, Event Hub, Cosmos DB, and Azure DevOps to design and implement data solutions.
About the Role
We are seeking a Microsoft Fabric Data Engineer with 7+ years of experience for a lead role. The ideal candidate will be responsible for designing, developing, and deploying data pipelines, ensuring efficient data movement and integration within Microsoft Fabric.
Responsibilities
Data Pipeline Development: Design, develop, and deploy data pipelines within Microsoft Fabric, leveraging OneLake, Data Factory, and Apache Spark to ensure efficient, scalable, and secure data movement across systems.
ETL Architecture: Architect and implement ETL workflows optimized for Fabric’s unified data platform, streamlining ingestion, transformation, and storage.
Data Integration: Build and manage integration solutions that unify structured and unstructured sources into Fabric’s OneLake ecosystem. Utilize SQL, Python, Scala, and R for advanced data manipulation.
Fabric OneLake & Synapse: Leverage OneLake as the single data lake for enterprise-scale storage and analytics, integrating with Synapse Data Warehousing for big data processing and reporting.
Cross-functional Collaboration: Partner with Data Scientists, Analysts, and BI Engineers to ensure Fabric’s data infrastructure supports Power BI, AI workloads, and advanced analytics.
Performance Optimization: Monitor, troubleshoot, and optimize Fabric pipelines for high availability, fast query performance, and minimal downtime.
Data Governance & Security: Implement governance and compliance frameworks within Fabric, ensuring data lineage, privacy, and security across the unified platform.
Leadership & Mentorship: Lead and mentor a team of engineers, oversee Fabric workspace design, code reviews, and adoption of new Fabric features.
Automation & Monitoring: Automate workflows and orchestration using Fabric Data Factory, Azure DevOps, and Airflow, ensuring smooth operations.
Documentation & Standards: Document Fabric pipeline architecture, data models, and ETL processes. Contribute to Fabric engineering best practices and enterprise guidelines.
Innovation: Stay current with Fabric’s evolving capabilities (Real-Time Analytics, Data Activator, AI integration) and drive innovation within the team.
Qualifications
7+ years of experience in data engineering, with a strong focus on Microsoft Fabric and related technologies.
Required Skills
Proficiency in SQL, Python, Scala, and R.
Experience with data pipeline development and ETL processes.
Strong understanding of data governance and security practices.
Ability to lead and mentor a team effectively.
Preferred Skills
Experience with Azure DevOps and Airflow.
Familiarity with Power BI and AI workloads.
Knowledge of Fabric Real-Time Analytics and Data Activator features.
Pay range and compensation package
Competitive salary based on experience and qualifications.
Equal Opportunity Statement
We are committed to creating a diverse and inclusive environment for all employees. We encourage applications from individuals of all backgrounds and experiences.
Primary Skills
- AKS, Event Hub, Azure DevOps, Cosmos DB, Azure Functions
Top Skills
AKS
Azure
Azure DevOps
Azure Functions
Cosmos DB
Event Hub
Microsoft Fabric
Brillio Office: Chennai, Tamil Nadu, India