
HighLevel

Senior Data Engineer

Posted Yesterday
Remote
Hiring Remotely in India
Senior level
The Senior Data Engineer will design and maintain pipelines for event data ingestion and validation, ensuring operational reliability and consistency for analytics.

About HighLevel:
HighLevel is an AI-powered, all-in-one white-label sales & marketing platform that empowers agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. We are proud to support a global and growing community of over 1 million businesses, comprising agencies, consultants, and companies of all sizes and industries. HighLevel gives users all the tools needed to capture, nurture, and close new leads into repeat customers. As of mid-2025, HighLevel processes over 4 billion API hits and handles more than 2.5 billion message events every day. Our platform manages over 470 terabytes of data distributed across five databases, operates a network of over 250 microservices, and supports over 1 million hostnames.

Our People
With over 1,500 team members across 15+ countries, we operate in a global, remote-first environment. We are building more than software; we are building a global community rooted in creativity, collaboration, and impact. We take pride in cultivating a culture where innovation thrives, ideas are celebrated, and people come first, no matter where they call home.
 
Our Impact
As of mid-2025, our platform powers over 1.5 billion messages, helps generate over 200 million leads, and facilitates over 20 million conversations for the more than 1 million businesses we serve each month. Behind those numbers are real people growing their companies, connecting with customers, and making their mark, and we get to help make that happen.

About the Role:

We are looking for a Senior Product Data Engineer to own the event ingestion and identity layer that connects product instrumentation to downstream analytical systems.

This role focuses on the operational reliability and correctness of event and identity data as it moves through the data platform. You will design and operate pipelines, schema validation, and replay workflows that ensure product events remain consistent and safe to use for analytics and customer-facing reporting.

You will work closely with product engineering teams on instrumentation patterns, with the CDP team on event contracts and definitions, and with platform teams to ensure event infrastructure and analytical systems scale reliably. This role builds the foundational event and identity datasets required for reliable downstream modeling. Behavioral models, canonical entities, and business analytics datasets are owned by the analytics engineering team.

Responsibilities:

  • Define event schemas, required fields, and compatibility rules in collaboration with the CDP team
  • Implement automated validation and contract enforcement to prevent breaking schema changes
  • Maintain versioning and compatibility guarantees for event producers and downstream consumers
  • Build and maintain pipelines that ingest, validate, and process high-volume product events
  • Ensure event streams are deduplicated, ordered correctly, and safe for downstream consumption
  • Partner with platform teams to ensure ingestion pipelines scale with product growth
  • Define and maintain identity stitching logic across anonymous and authenticated users
  • Handle identity merges, splits, and corrections while preserving tenant boundaries
  • Ensure identity resolution remains explainable, deterministic, and safe for downstream datasets
  • Design workflows that allow event datasets and identity graphs to be replayed or rebuilt safely
  • Build tooling for historical corrections, schema evolution, and dataset reprocessing
  • Ensure downstream models can be rebuilt without manual intervention when definitions evolve
  • Provide guidance and tooling that help product teams emit events consistently
  • Maintain validation checks and schema enforcement that catch instrumentation issues early
  • Collaborate with engineering teams to evolve instrumentation safely over time
  • Ensure deletion and suppression requests propagate correctly through event and identity pipelines
  • Partner with governance and security teams to support policy requirements
  • Define requirements and interfaces for event infrastructure and downstream analytical systems
  • Work with platform teams to ensure pipelines remain reliable, scalable, and observable
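The schema-validation and deduplication responsibilities above can be sketched roughly as follows. This is a minimal illustration, not HighLevel's actual stack: the event fields, the schema, and the in-memory dedup store are all assumptions (a production pipeline would typically use a schema registry and a persistent or windowed dedup store).

```python
# Illustrative only: these required fields are an assumption,
# not HighLevel's actual event contract.
EVENT_SCHEMA = {
    "event_id": str,
    "event_name": str,
    "tenant_id": str,
    "occurred_at": str,  # ISO-8601 timestamp, kept as a string here
}

def validate_event(event: dict) -> list[str]:
    """Return a list of contract violations; an empty list means the event is valid."""
    errors = []
    for field, expected_type in EVENT_SCHEMA.items():
        if field not in event:
            errors.append(f"missing required field: {field}")
        elif not isinstance(event[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    return errors

class Deduplicator:
    """Drop events whose event_id was already seen, turning at-least-once
    delivery into effectively-once consumption. In-memory for illustration."""
    def __init__(self):
        self._seen: set[str] = set()

    def accept(self, event: dict) -> bool:
        key = event["event_id"]
        if key in self._seen:
            return False
        self._seen.add(key)
        return True
```

In practice the validation step would run at the ingestion edge so malformed events are quarantined (not dropped silently), preserving the option to replay them after a fix.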
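Similarly, the identity-stitching duties (linking anonymous and authenticated IDs while preserving tenant boundaries) can be sketched with a tenant-scoped union-find. All names here are illustrative assumptions; a real identity graph would be persistent and would also support merge audits, splits, and corrections.

```python
class IdentityGraph:
    """Minimal union-find identity stitching. Every node is keyed by
    (tenant, id), so merges can never cross tenant boundaries."""
    def __init__(self):
        self._parent: dict[tuple[str, str], tuple[str, str]] = {}

    def _find(self, node: tuple[str, str]) -> tuple[str, str]:
        # Walk to the root, then compress the path for determinism and speed.
        root = node
        while self._parent.get(root, root) != root:
            root = self._parent[root]
        while self._parent.get(node, node) != root:
            self._parent[node], node = root, self._parent[node]
        return root

    def stitch(self, tenant: str, anon_id: str, user_id: str) -> None:
        """Link an anonymous ID to an authenticated user within one tenant."""
        a = self._find((tenant, anon_id))
        b = self._find((tenant, user_id))
        if a != b:
            self._parent[a] = b

    def canonical(self, tenant: str, any_id: str) -> str:
        """Resolve any known ID to its canonical identity for that tenant."""
        return self._find((tenant, any_id))[1]
```

Keying every node by tenant is what makes the "preserving tenant boundaries" requirement structural rather than a runtime check: two tenants can share an ID string without their graphs ever touching.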

Requirements:

  • 4+ years of experience in data engineering, platform engineering, or product data roles
  • Strong experience building and operating event ingestion or streaming pipelines
  • Experience implementing schema validation, data contracts, or event governance frameworks
  • Strong SQL and Python, with experience building data processing or validation tooling
  • Familiarity with identity resolution, entity resolution, or customer identity systems
  • Experience operating analytical data systems or large-scale event datasets

EEO Statement:
The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government record-keeping, reporting, and other legal requirements. Providing this information is voluntary and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.
#LI-Remote #LI-NJ1

Top Skills

Python
SQL


