Senior Data Engineer

Remote - IA, United States

Workiva

Workiva's integrated reporting software connects finance, sustainability, audit, and risk for assured, data-driven insights.



As a Senior Data Engineer on the Workiva Carbon Engineering Team, you’ll partner closely with product engineers to design, build, and evolve the data products that power our mission-critical applications. You’ll leverage our central self-service data platform—built on dbt, DLT, Snowflake, Kafka, and more—to craft and maintain complex dbt models, enforce data quality guardrails, and deliver high-performance data products tailored to application use cases. You’ll operate within a highly regulated environment, collaborating through well-governed CI/CD pipelines without direct production access, and ensuring every release meets stringent compliance standards.

What You’ll Do

Data Product Development

  • Model & Transform: Design, build, and evolve a complex suite of dbt models—implementing best practices for testing, version control, and lineage tracking—to serve application-specific data needs

  • Ingestion & Processing: Author and maintain DLT pipelines for reliable batch and real-time ingestion; develop automation and operational tasks in Python and Dagster

  • SQL Mastery: Write, optimize, and document advanced SQL queries and scripts to support ad-hoc analyses, model performance tuning, and data validation routines

  • APIs & Interfaces: Build APIs to expose curated data products to downstream applications and services

Data Quality & Compliance

  • Quality Frameworks: Implement data quality checks, monitoring, and alerting within dbt (e.g., custom tests, freshness checks) to enforce SLAs

  • Governed Releases: Navigate complex, regulated release pipelines using GitOps/CI/CD workflows—author pull requests, manage promotions through dev/test/prod, and collaborate with platform/infrastructure teams for gated approvals

  • Security & Controls: Adhere to information-protection policies, ensuring role-based access controls, audit logging, and encryption standards are in place

Collaboration & Mentorship

  • Cross-Functional Partnership: Work hand-in-hand with product managers, software engineers, data analysts, and central data platform teams to translate application requirements into scalable data solutions

  • Best Practices Evangelist: Mentor peers on dbt coding conventions, SQL performance tuning, and deployment processes; participate in code reviews and design discussions

Innovation & Strategy

  • Platform Feedback: Relay application-team insights back to the central data platform roadmap—identifying enhancements to tooling, documentation, or self-service capabilities

  • Continuous Learning: Stay current on emerging data-engineering technologies, advanced analytics patterns, and regulatory trends, and propose pilot projects to advance our analytics maturity

What You’ll Need

Minimum Qualifications

  • 5+ years in data engineering or analytics engineering, with hands-on ownership of dbt projects and data-model lifecycles

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field—or equivalent professional experience

Preferred Qualifications

  • SQL Expertise: Demonstrated mastery of SQL—able to write and optimize multi-join, window-function, and CTE-based queries at scale

  • Python Proficiency: Comfortable building ETL/ELT scripts, APIs, and automation frameworks in Python

  • Regulated Environments: Proven track record working in highly protected or regulated domains, following stringent release controls and compliance standards

  • Platform Tooling: Familiarity with DLT, Snowflake (Snowpipe, streaming, external tables), Kafka, and Superset (or equivalent BI tools)

  • Data Quality & Observability: Experience implementing dbt test suites, Great Expectations, or similar frameworks, plus monitoring via tools like Prometheus/Grafana or cloud-native services

  • Analytics Mindset: Background in analytical problem solving, statistical reasoning, or data science collaborations

  • Infrastructure as Code: Exposure to Terraform, AWS CloudFormation, or similar IaC tools for defining data-platform resources

Why Join Us?

  • Impactful Work: Directly shape the data products that drive your application’s success and delight end users

  • Collaborative Culture: Thrive in a fast-paced, high-trust environment where your voice on design and tooling truly matters

  • Growth & Learning: Access expert mentorship, internal training, and opportunities to pilot next-gen data-engineering technologies

Work Conditions & Requirements

  • Reliable high-speed internet for remote work

  • Occasional travel for team offsites or industry conferences (as needed)

How You’ll Be Rewarded

✅ Salary range in the US: $111,000.00 - $178,000.00

✅ A discretionary bonus typically paid annually

✅ Restricted Stock Units granted at time of hire

✅ 401(k) match and comprehensive employee benefits package

The salary range represents the low and high end of the salary range for this job in the US. Minimums and maximums may vary based on location. The actual salary offer will carefully consider a wide range of factors, including your skills, qualifications, experience, and other relevant factors.

Employment decisions are made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other protected characteristic.

Workiva is committed to working with and providing reasonable accommodations to applicants with disabilities. To request assistance with the application process, please email talentacquisition@workiva.com.
 

Workiva employees are required to undergo comprehensive security and privacy training tailored to their roles, ensuring adherence to company policies and regulatory standards.

Workiva supports employees in working where they work best - either from an office or remotely from any location within their country of employment.

#LI-MJ2