Data Engineer

San Francisco

HappyRobot

Automate communication across channels with AI workers that integrate with your systems, manage conversations, & log data.



About HappyRobot

HappyRobot is a platform to build and deploy AI workers that automate communication.

Our AI workers connect to any system or data source to handle phone calls, emails, messages, and more.

We target the logistics industry, which relies heavily on communication to book, check on, and pay for freight. We work primarily with freight brokers, 3PLs, freight forwarders, shippers, warehouses, and other supply chain enterprises and tech startups.

We raised a Series A round from a16z and YC, and we're growing very fast, on track to hit double-digit ARR by June this year.

We're looking for rockstars with relentless drive, unstoppable energy, and a true passion for building something great: people ready to embrace the challenge, push limits, and thrive in a fast-paced, high-intensity environment, working from 9 to 9, six days a week.

About The Role

We are looking for a Data Engineer to design, build, and optimize scalable data solutions that drive business insights and decision-making.

What You’ll Do

  • Collaborate with business stakeholders to understand and define data needs

  • Partner with product, engineering, and third-party teams to gather and integrate required data

  • Design and implement scalable, high-performance data pipelines and models for both our Data Lake and Data Warehouse

  • Establish and maintain data quality checks, validation processes, and monitoring systems

  • Enhance the scalability and robustness of our ETL infrastructure

  • Manage and evolve a portfolio of reliable data products that deliver trustworthy insights

  • Support and mentor new engineers as they join the team

Must Have

  • 3+ years of experience in data engineering, BI, or a similar technical role

  • Proficiency in programming languages like Python or Java

  • Hands-on experience with ETL orchestration tools (e.g., Airflow, Oozie, or Azkaban) on AWS or GCP

  • Deep knowledge of SQL, database design, and distributed computing

  • Experience with big data technologies such as Spark, Hive, Druid, and Presto, and streaming platforms like Kafka or Flink

  • Familiarity with modern data warehouses and databases such as Snowflake, Redshift, and PostgreSQL

  • Effective communication across both technical and non-technical teams

  • Experience with BI tools like Tableau, Looker, or Superset

  • Ability to thrive in a fast-paced, self-directed environment, with strong organizational skills

Why join us?

  • Opportunity to work at a high-growth AI startup, backed by top investors.

  • Fast Growth - Backed by a16z and YC, on track for double-digit ARR.

  • Ownership & Autonomy - Take full ownership of projects and ship fast.

  • Top-Tier Compensation - Competitive salary + equity in a high-growth startup.

  • Comprehensive Benefits - Healthcare, dental, vision coverage.

  • Work With the Best - Join a world-class team of engineers and builders.

The personal data provided in your application and during the selection process will be processed by Happyrobot, Inc., acting as Data Controller.

By sending us your CV, you consent to the processing of your personal data for the purpose of evaluating and selecting you as a candidate for the position. Your personal data will be treated confidentially and will only be used for the recruitment process of the selected job offer.

Regarding the retention period of your personal data: it will be deleted after three months of inactivity, in compliance with the GDPR and applicable data protection legislation.

If you wish to exercise your rights of access, rectification, erasure, portability, or objection in relation to your personal data, you can do so by contacting security@happyrobot.ai, subject to the GDPR.

For more information, visit https://www.happyrobot.ai/privacy-policy

By submitting your application, you confirm that you have read and understood this clause and agree to the processing of your personal data as described.

