About Innocap
Innocap is the world’s leading provider of managed account platform services. With over US$100 billion in assets under management, more than 500 employees, and offices in five countries, we are shaping the future of alternative investments for institutional asset owners and allocators. Our mission is to revolutionize the asset management industry by providing customized expert services and an exceptional client experience.
We are seeking forward-thinking individuals to join us on our exciting journey. Innocap’s success is built on the diversity of our people and the strength of their ambitions. We empower our teams and foster a culture of inclusivity, collaboration, innovation, and growth. At Innocap, you'll have the opportunity to enhance your career, work on exciting projects, and make a real impact.
About the Role
We are looking for a highly skilled Lead Data Ingestion Engineer with 7–10+ years of experience designing and leading scalable, secure, and automated data ingestion frameworks. This role requires deep expertise in SSIS, Databricks, Python, and Azure Functions, as well as strong foundational skills in C#, .NET, DevOps, CI/CD, and the Azure cloud. You will mentor a team of data engineers, define architectural best practices, ensure the reliability of ingestion ecosystems, and collaborate closely with data platform, analytics, and business teams.
Key Responsibilities
Technical Leadership & Architecture
· Lead the design and development of enterprise-grade ingestion architectures using Databricks, SSIS, Azure Functions, and cloud services.
· Define standards, frameworks, and reusable components for ingestion across the organization.
· Drive modernization initiatives such as SSIS-to-ADF migration, batch-to-real-time ingestion, and automation improvements.
· Ensure cost optimization, observability, data quality, and governance best practices.
Hands-On Engineering
· Build and enhance ETL/ELT workflows using Python, Databricks (PySpark), SQL, and Azure components.
· Develop C#/.NET services or functions for ingestion, validation, and integration with APIs/microservices.
· Optimize ingestion workflows for performance, cost, and scalability.
· Troubleshoot ingestion failures and production issues, and provide root cause analysis (RCA).
Team & Stakeholder Collaboration
· Mentor and guide junior and mid-level data engineers.
· Conduct architectural reviews, code reviews, and provide technical coaching.
· Partner with Architects, DevOps Engineers, QA, and business stakeholders to align ingestion solutions with enterprise strategy.
Operations & Delivery
· Own the CI/CD deployment process for ingestion components using Azure DevOps.
· Ensure monitoring, alerting, automated validation, and recovery frameworks are in place.
· Lead production releases, incident management, and lifecycle maintenance.
· Design ingestion and data engineering frameworks with AI-powered monitoring and anomaly detection to automatically flag ingestion failures and propose remediation steps.
· Automate unit and integration test generation using AI agents to expand test coverage, detect edge cases proactively, and speed up QA cycles.
Qualifications
· Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or related field.
· 7–10+ years of experience in data engineering/data ingestion, with at least 2–3 years in a lead role.
· Deep experience with Databricks, Azure Functions, SSIS, and Python.
· Strong programming background in Python and C#/.NET.
· Solid hands-on experience with SQL Server, T-SQL, and performance tuning.
· Experience building CI/CD pipelines with Azure DevOps.
· Strong leadership, communication, and stakeholder management skills.
Must-Have Skills
· Strong Python development skills for ingestion, automation, and transformation.
· Experience migrating SSIS workloads to Azure-native tools.
· Proficiency with Azure Functions for serverless ingestion pipelines.
· Hands-on experience with Databricks (PySpark) for distributed data processing.
· Solid C#, .NET background for integration and ingestion services.
· Strong SQL and relational database ingestion experience.
· Experience with Git, CI/CD, and DevOps automation.
Nice-to-Have Skills
· Experience with Delta Lake, Lakehouse architecture, or Synapse.
· Experience with Docker/Kubernetes for containerized workloads.
· Knowledge of data governance, lineage, and security practices.
· Domain knowledge in BFSI (banking, financial services, and insurance) or financial datasets.
Our Offerings
Hybrid work culture: Flexible working environment to promote a healthy work-life balance.
A smart, talented & agile team: International collaboration across locations and time zones, with a focus on learning, sharing, and fun.
Competitive compensation package: Attractive salary, comprehensive insurance, fitness discounts, and employee assistance programs.
Continuous learning opportunities: Access to professional development platforms and resources.
Diversity and Inclusion: Commitment to fostering a diverse and inclusive environment for all employees.
#LI-Hybrid
Innocap's Global Privacy Notice