Data Warehouse Engineer

Remote

FingerprintJS

The Fingerprint device intelligence platform works across web and mobile applications to identify all visitors with industry-leading accuracy — even if they’re anonymous.

Fingerprint empowers developers to stop online fraud at the source.

We work on turning radical new ideas in the fraud detection space into reality. Our products are developer-focused and our clients range from solo developers to publicly traded companies. We are a globally dispersed, 100% remote company with a strong open-source focus. Our flagship open-source project is FingerprintJS (20K stars on GitHub).

We have raised $77M and are backed by Craft Ventures (previously invested in Tesla, Facebook, Airbnb), Nexus Venture Partners (previously invested in Postman, Apollo.io, MinIO, Druva), and Uncorrelated Ventures (previously invested in Redis, Rollbar, and Gradle).

We have noticed a rise in recruiting impersonations across the industry, where scammers attempt to access candidates' personal and financial information through fake interviews and offers. All Fingerprint recruiting email communications will always come from the @fingerprint.com domain. Any outreach claiming to be from Fingerprint via other sources should be ignored.

At Fingerprint, we’re redefining online security and identity verification through cutting-edge technology and data-driven insights. As a Data Warehouse Engineer, you’ll play a key role in shaping the data backbone that powers our industry-leading platform.

We’re looking for a collaborative self-starter with a passion for data engineering and a knack for building efficient, scalable data pipelines and warehouses. In this role, you’ll work closely with data scientists, analysts, and engineers to design and implement robust solutions that drive smarter decision-making and fuel innovation.

What You’ll Do:

  • Architect and build scalable, high-performance data warehouse solutions that empower analytics and operational excellence across our platform.
  • Develop and maintain reliable, well-documented data pipelines and ELT processes, ensuring data quality and consistency.
  • Continuously tune and optimize data warehouse performance—partitioning, indexing, and query optimization are your bread and butter.
  • Partner with data analysts and scientists to transform raw data into insightful, actionable analytics and machine learning models.
  • Troubleshoot, monitor, and refine our data warehouse infrastructure, balancing speed and cost efficiency.
  • Foster a data-driven culture by sharing best practices and championing clean, well-structured data.
  • Work with modern cloud-based data warehouses like ClickHouse and ensure seamless integration with our services.
  • Contribute to the development of new developer-facing data tools that empower teams across the company.

What We’re Looking For:

  • A team player who loves collaborating in a tight-knit, cross-functional team.
  • A high degree of ownership and autonomy, with a proven ability to thrive in ambiguous environments.
  • Strong communication skills in English to navigate our global, remote team.
  • Extensive experience designing, implementing, and optimizing data warehouses in cloud environments (AWS, GCP, etc.).
  • Hands-on expertise with data modeling and transformation tools like dbt.
  • Excellent SQL skills and familiarity with modern columnar storage data warehouses (e.g., ClickHouse, Databricks, BigQuery, or Snowflake).
  • Experience with data pipeline orchestration tools like Prefect or Airflow.
  • Familiarity with BI tools (e.g., Apache Superset, Sigma, Tableau, Looker) is a plus.
  • A solid foundation in software engineering best practices: version control, code reviews, testing, and automation.
  • Exposure to infrastructure-as-code tools (like Terraform) and a DevOps mindset is a bonus.
  • Bonus points for scripting/automation experience in Bash, Python, or Go.

Technologies You’ll Work With:

  • Data Warehouses: ClickHouse, Redshift
  • Data Streaming: Confluent Kafka
  • Data Transformation: dbt, SQL
  • Data Orchestration: Prefect
  • Infrastructure: AWS, Terraform
  • Visualization: Apache Superset, Sigma

At Fingerprint, we believe that diverse perspectives fuel innovation, and we’re committed to building a team where everyone can thrive. If you’re passionate about data engineering and want to make a real impact, we’d love to hear from you!

Offers vary depending on factors including, but not limited to, relevant experience, education, certifications/licenses, skills, training, and market conditions.

Due to regulatory and security reasons, there’s a small number of countries where we cannot have Fingerprint teammates based. Additionally, because Fingerprint is an all-remote company and people can join our workforce from almost any country, we do not sponsor visas. Fingerprint teammates need to be authorized to work from their home location.

We are dedicated to creating an inclusive work environment for everyone. We embrace and celebrate the unique experiences, perspectives and cultural backgrounds that each employee brings to our workplace. Fingerprint strives to foster an environment where our employees feel respected, valued and empowered, and our team members are at the forefront in helping us promote and sustain an inclusive workplace. We highly encourage people from underrepresented groups in tech to apply.

If you are applying as a resident of California, please read our CCPA notice here

If you are applying as a resident of the EU, please read our GDPR notice here
