
Workiva

Senior Data Engineer

Sorry, this job was removed at 07:04 p.m. (IST) on Monday, Jun 23, 2025
Remote
Hiring Remotely in USA


As a Senior Data Engineer on the Workiva Carbon Engineering Team, you’ll partner closely with product engineers to design, build, and evolve the data products that power our mission-critical applications. You’ll leverage our central self-service data platform—built on dbt, DLT, Snowflake, Kafka, and more—to craft and maintain complex dbt models, enforce data quality guardrails, and deliver high-performance data products tailored to application use cases. You’ll operate within a highly regulated environment, collaborating through well-governed CI/CD pipelines without direct production access, and ensuring every release meets stringent compliance standards.

What You’ll Do

Data Product Development

  • Model & Transform: Design, build, and evolve a complex suite of dbt models—implementing best practices for testing, version control, and lineage tracking—to serve application-specific data needs

  • Ingestion & Processing: Author and maintain DLT pipelines for reliable batch and real-time ingestion, and develop automation and operational tasks in Python and Dagster

  • SQL Mastery: Write, optimize, and document advanced SQL queries and scripts to support ad-hoc analyses, model performance tuning, and data validation routines

  • APIs & Interfaces: Build APIs to expose curated data products to downstream applications and services
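The SQL patterns the bullets above call for (CTEs and window functions) can be sketched in a self-contained way. The snippet below runs a latest-record-per-key query against an in-memory SQLite table; the `events` table, its columns, and the helper function are hypothetical illustrations for this posting, not part of Workiva's actual stack (which centers on Snowflake).

```python
import sqlite3

# A CTE plus a ROW_NUMBER() window function: rank each user's events by
# recency, then keep only the newest row per user. Requires SQLite >= 3.25
# for window-function support (standard in current Python builds).
QUERY = """
WITH ranked AS (
    SELECT
        user_id,
        event_ts,
        ROW_NUMBER() OVER (
            PARTITION BY user_id
            ORDER BY event_ts DESC
        ) AS rn
    FROM events
)
SELECT user_id, event_ts
FROM ranked
WHERE rn = 1
"""

def latest_event_per_user(rows):
    """Return each user's most recent (user_id, event_ts), sorted by user."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (user_id TEXT, event_ts TEXT)")
    con.executemany("INSERT INTO events VALUES (?, ?)", rows)
    result = sorted(con.execute(QUERY).fetchall())
    con.close()
    return result
```

The same shape (dedupe-by-window, then filter on the rank) scales to Snowflake unchanged, which is why it is a common interview and code-review touchstone for roles like this one.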

Data Quality & Compliance

  • Quality Frameworks: Implement data quality checks, monitoring, and alerting within dbt (e.g., custom tests, freshness checks) to enforce SLAs

  • Governed Releases: Navigate complex, regulated release pipelines using GitOps/CI/CD workflows—author pull requests, manage promotions through dev/test/prod, and collaborate with platform/infrastructure teams for gated approvals

  • Security & Controls: Adhere to information-protection policies, ensuring role-based access controls, audit logging, and encryption standards are in place
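As one concrete illustration of the dbt quality checks described above, a schema file along these lines declares column tests and a source-freshness SLA. The model, source, and column names here are hypothetical placeholders, not taken from the posting.

```yaml
version: 2

models:
  - name: orders              # hypothetical model name
    columns:
      - name: order_id
        tests:
          - unique            # fail the build on duplicate keys
          - not_null

sources:
  - name: app_events          # hypothetical source
    tables:
      - name: raw_events
        loaded_at_field: _loaded_at
        freshness:
          warn_after: {count: 6, period: hour}
          error_after: {count: 24, period: hour}
```

`dbt test` and `dbt source freshness` evaluate these declarations in CI, which is how checks like these become release gates rather than ad-hoc scripts.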

Collaboration & Mentorship

  • Cross-Functional Partnership: Work hand-in-hand with product managers, software engineers, data analysts, and central data platform teams to translate application requirements into scalable data solutions

  • Best Practices Evangelist: Mentor peers on dbt coding conventions, SQL performance tuning, and deployment processes; participate in code reviews and design discussions

Innovation & Strategy

  • Platform Feedback: Relay application-team insights back to the central data platform roadmap—identifying enhancements to tooling, documentation, or self-service capabilities

  • Continuous Learning: Stay current on emerging data-engineering technologies, advanced analytics patterns, and regulatory trends, and propose pilot projects to advance our analytics maturity

What You’ll Need

Minimum Qualifications

  • 5+ years in data engineering or analytics engineering, with hands-on ownership of dbt projects and data-model lifecycles

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field—or equivalent professional experience

Preferred Qualifications

  • SQL Expertise: Demonstrated mastery of SQL—able to write and optimize multi-join, window-function, and CTE-based queries at scale

  • Python Proficiency: Comfortable building ETL/ELT scripts, APIs, and automation frameworks in Python

  • Regulated Environments: Proven track record working in highly protected or regulated domains, following stringent release controls and compliance standards

  • Platform Tooling: Familiarity with DLT, Snowflake (Snowpipe, streaming, external tables), Kafka, and Superset (or equivalent BI tools)

  • Data Quality & Observability: Experience implementing dbt test suites, Great Expectations, or similar frameworks, plus monitoring via tools like Prometheus/Grafana or cloud-native services

  • Analytics Mindset: Background in analytical problem solving, statistical reasoning, or data science collaborations

  • Infrastructure as Code: Exposure to Terraform, AWS CloudFormation, or similar IaC tools for defining data-platform resources

Why Join Us?

  • Impactful Work: Directly shape the data products that drive your application’s success and delight end users

  • Collaborative Culture: Thrive in a fast-paced, high-trust environment where your voice on design and tooling truly matters

  • Growth & Learning: Access expert mentorship, internal training, and opportunities to pilot next-gen data-engineering technologies

Work Conditions & Requirements

  • Reliable high-speed internet for remote work

  • Occasional travel for team offsites or industry conferences (as needed)

How You’ll Be Rewarded

✅ Salary range in the US: $111,000.00 - $178,000.00

✅ A discretionary bonus typically paid annually

✅ Restricted Stock Units granted at time of hire

✅ 401(k) match and comprehensive employee benefits package

The salary range represents the low and high end of the salary range for this job in the US. Minimums and maximums may vary based on location. The actual salary offer will carefully consider a wide range of factors, including your skills, qualifications, experience and other relevant factors.

Employment decisions are made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other protected characteristic.

Workiva is committed to working with and providing reasonable accommodations to applicants with disabilities. To request assistance with the application process, please email talentacquisition@workiva.com.
 

Workiva employees are required to undergo comprehensive security and privacy training tailored to their roles, ensuring adherence to company policies and regulatory standards.

Workiva supports employees in working where they work best - either from an office or remotely from any location within their country of employment.

#LI-MJ2
