Synechron

Microsoft Fabric & Azure Data Engineer with Delta Lake and Cloud Expertise

Remote
Hiring Remotely in Hinjawadi, Pune, Mahārāshtra, IND
Mid level

Job Summary
Synechron is seeking a highly skilled Data Engineer with Microsoft Azure, Fabric Stack, and Lakehouse expertise to support enterprise data platform initiatives. This role focuses on designing, developing, and maintaining scalable data pipelines leveraging Microsoft Fabric (OneLake, Data Factory, Lakehouse, Warehouse) and Azure data services. The successful candidate will work closely with cross-functional teams to implement modern lakehouse architectures, optimize data workflows, and support intelligent data solutions, ensuring data quality, security, and performance align with organizational goals.

Software Requirements

  • Required:

    • Hands-on experience with Microsoft Fabric components including OneLake, Data Factory, Lakehouse, and Warehouse

    • Strong proficiency in Azure data services: Azure Data Factory, Azure SQL, Key Vault, RBAC

    • Practical experience working with Delta Lake within the Microsoft ecosystem

    • Advanced Python and SQL skills for building, optimizing, and managing scalable data pipelines

    • Familiarity with enterprise data platforms and integration within Microsoft-native environments

  • Preferred:

    • Experience with AI-enabled data solutions and data science integration

    • Knowledge of additional data processing tools such as Power BI or Data Lake Analytics

    • Experience working in cross-platform or multi-cloud environments

    • Exposure to DevOps practices for data deployment and automation

Overall Responsibilities

  • Design, develop, and maintain scalable, high-performance data pipelines using Microsoft Fabric and related Azure services.

  • Collaborate with data analysts, product teams, and business units to understand data requirements and translate them into resilient data architectures.

  • Implement and optimize storage, processing, and data governance strategies for lakehouse architectures.

  • Conduct performance tuning, resource management, and troubleshooting for Data Factory, Lakehouse, and database components.

  • Develop data validation, quality checks, and monitoring routines to ensure data integrity and compliance.

  • Automate data workflows and deployment processes using Azure DevOps or other automation tools.

  • Support data security policies, including data encryption, access controls (RBAC), and key management.

  • Stay informed of industry trends and emerging Microsoft Data and AI technologies to recommend innovative solutions.

  • Document architecture, configurations, and operational procedures for ongoing support and compliance.
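To give a concrete flavor of the validation and quality-check work described above, here is a minimal, platform-independent sketch of a row-level data-quality routine. The column names and rules are illustrative assumptions, not part of any Synechron or Microsoft Fabric API; in practice such checks would run inside a Fabric or PySpark pipeline.

```python
# Minimal sketch of a row-level data-quality check; column names and
# rules here are hypothetical examples, not a specific platform's API.
from dataclasses import dataclass, field

@dataclass
class ValidationResult:
    passed: int = 0
    failures: list = field(default_factory=list)  # (row index, failed rule names)

def validate_rows(rows, rules):
    """Apply each named rule to every row and collect failures."""
    result = ValidationResult()
    for i, row in enumerate(rows):
        errors = [name for name, rule in rules.items() if not rule(row)]
        if errors:
            result.failures.append((i, errors))
        else:
            result.passed += 1
    return result

# Example rules: a required key and a non-negative numeric field.
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: (r.get("amount") or 0) >= 0,
}

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": -5.0},
]
report = validate_rows(rows, rules)
```

In a production pipeline the failure list would typically feed a monitoring dashboard or quarantine table rather than being inspected in memory.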

Technical Skills (By Category)

  • Data Platform & Storage (Essential):

    • Microsoft Fabric (OneLake, Data Factory, Lakehouse, Warehouse)

    • Delta Lake technology and architecture within Microsoft ecosystem

    • Cloud storage and data lake management in Azure

  • Data Processing & Automation (Essential):

    • PySpark for data transformation and processing

    • SQL for data modeling, validation, and querying

    • Python scripting for automation and pipeline management

  • Cloud & Infrastructure (Preferred):

    • Azure Data Services, Azure Data Factory, Azure SQL, Key Vault, RBAC

    • Containerization and orchestration (Docker, Kubernetes) for deployment (preferred)

  • Analytics & BI (Preferred):

    • Power BI, Data Lake Analytics — optional but advantageous for reporting

  • Development Tools & Methodologies (Essential):

    • Git for version control

    • Azure DevOps, Jenkins or similar CI/CD tools

    • Data architecture modeling (UML, Data Flow Diagrams)

Experience Requirements

  • Minimum of 3 years supporting or developing data pipelines in enterprise environments, ideally leveraging Microsoft Fabric and Azure data services.

  • Proven expertise developing and managing scalable Lakehouse architectures and Delta Lake solutions.

  • Experience working within MS Azure cloud platforms supporting data processing, automation, and security.

  • Experience supporting business-critical data workflows in regulated industries is a plus.

  • Alternative pathways include extensive hands-on experience designing data lakes, pipelines, and cloud-native data solutions.

Day-to-Day Activities

  • Develop, configure, and optimize data pipelines within Microsoft Fabric and Azure.

  • Collaborate with technical and business teams to gather requirements and deliver scalable data solutions.

  • Monitor, troubleshoot, and performance tune data workflows for efficiency and reliability.

  • Implement data validation, quality assurance, and security protocols to ensure data integrity.

  • Automate deployment and pipeline management using Azure DevOps or similar tools.

  • Lead documentation of architecture, configurations, and operational procedures.

  • Stay updated on new features, best practices, and innovative technologies related to Microsoft Fabric and Azure Data ecosystems.

  • Support system upgrades, optimize resource utilization, and enforce compliance standards.

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.

  • 3+ years of experience with Microsoft Fabric, Data Lake, and Azure data services.

  • Experience designing and implementing scalable lakehouse architectures using Delta Lake.

  • Strong Python and SQL skills for data pipeline development and management.

  • Relevant certifications (e.g., Microsoft Certified: Data Engineer Associate, Azure Architect) are a plus.

  • Proven ability to support enterprise data initiatives with a focus on security, compliance, and performance.

Professional Competencies

  • Critical thinking with a focus on designing scalable, reliable data architectures.

  • Strong analytical and troubleshooting skills for complex data workflows.

  • Effective communication and stakeholder management across technical and business teams.

  • Adaptability and continuous learning to keep pace with evolving data and cloud technologies.

  • Ownership of data quality, security, and operational efficiency.

  • Ability to handle multiple projects, prioritize tasks, and meet deadlines.

SYNECHRON’S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.

All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.


Top Skills

Azure Data Factory
Azure DevOps
Azure SQL
Delta Lake
Microsoft Fabric
Power BI
Python
RBAC
SQL

