Senior Data Analytics Engineer

San Francisco, CA

CloudTrucks

CloudTrucks is a new way to manage your trucking business. Maximize revenue and minimize headaches with technology-driven products that give you everything you need to make the most of your trucking business.


Apply now Apply later

The trucking industry is the backbone of the global economy. More than 70% of what we consume in the U.S. is moved by trucks. Those trucks are powered by more than 3.5 million drivers and generate over $700B in annual revenue. Trucking is a massive but traditional industry, and like many traditional industries, it is ripe for innovation.

CloudTrucks is building the operating system for trucking and is the first platform specifically designed to empower truck drivers. Our all-in-one, “business in a box” solution optimizes and automates operations and accelerates cash-flow for drivers, so they can focus on building their business.

The Data Analytics team at CloudTrucks owns all business reporting end-to-end, and develops both internal and customer-facing data analytics products in collaboration with Product development teams.

As a Senior Data Analytics Engineer, you’ll be responsible for maintaining and scaling our Data Infrastructure. You’ll also have the opportunity to collaborate with teams across the company and support their data needs, from ingesting new data sources to helping them design the proper data architecture for more intricate features and reporting.

Examples of projects that you may work on

  • Help engineering teams ingest data from 3rd party sources to build pipelines that power user-facing features.

  • Collaborate with our Operations Data Analyst and Operations stakeholders to develop efficient and scalable data-driven solutions for their most pressing operational reporting.

  • Work with our Data Analytics Engineer to audit our Data Infrastructure and identify opportunities to scale it.

  • Work with Machine Learning Engineers and Data Scientists to support their ETL and data pipeline needs.

  • Leverage existing tooling, or introduce new tooling, that helps Data Analysts version control their analyses and iterate on them with an analytics-as-code mindset.

Responsibilities

  • Build, audit, and evolve data ingestion processes, always with performance and scalability in mind - we use a mix of Google Cloud services, Airflow, and Segment

  • Evolve and scale our data warehouse

    • Add additional data sources, and maintain and organize our data warehouse

    • Apply engineering best practices to our data transformation layer - we use Google Cloud's Dataform

    • Improve the efficiency of our most demanding transformation queries with performant SQL code

  • Enable operational analytics by syncing data to 3rd party tools. We have several integrations with 3rd party systems like Salesforce, Marketo, Heap and Segment

  • Be the keystone for self-service analytics and data visualization

    • Manage data visualization in Looker; build and own mission critical dashboards

    • Support the organization to answer questions with data through training, tooling, process and your ingenuity

  • Collaborate across the company to ensure the right data is available for all projects

  • Define, drive and own service level agreements for customer-facing, as well as internal, data analytics products

  • Champion data best practices across engineering, especially around efficiency, coding standards, data observability, data security and operations.

  • Own the Data Infrastructure roadmap, and work with the Head of Data Analytics to define the strategy for the data warehouse and data infrastructure

  • Collaborate with software engineers on data needs for Machine Learning pipelines

What we are looking for

  • 5+ years of experience working with data warehouses: building, monitoring, maintaining and scaling ETL pipelines, with a focus on data quality, integrity and security

  • Expertise in software engineering principles - version control, code reviews, testing, CI - as well as git and command line interfaces

  • Expertise in writing complex, efficient and DRY SQL code, as well as handling large data sets, preferably in Python, and identifying and resolving bottlenecks in production systems

  • Understanding of data engineering architectures, tools and resources - databases, computation engines, stream processors, workflow orchestrators and serialization formats - especially cloud hosted and managed versions

  • An efficient, customer-focused approach to development, pursuing pragmatic solutions to deliver the best results

  • Expertise with managing analytics, data engineering, and visualization tools; Looker is preferred

  • Strong experience with GCP (BigQuery, Dataform) as well as with Airflow or other orchestration tooling

  • Strong analytical skills with an ability to work with both structured and unstructured datasets. Ability to perform data extraction, cleaning, and analysis, and to present insights to both technical and non-technical stakeholders

  • Demonstrated ability to translate business requirements into technical solutions and actionable insights, while leveraging project management tools to successfully organize work and deliver results.

  • Strong written and verbal communication skills are paramount for this role

  • Comfortable working in the dynamic, collaborative environment of a fast-growth startup

Nice to haves

  • Experience with Python or R

  • Experience with Salesforce architecture

  • Experience working in Freight Operations or Logistics

  • Experience working at high-growth startups. Preference for experience in consumer tech, marketplace, or SaaS industries.

A bit about our culture

We value high autonomy, ownership, and delivering results - in short - whatever it takes to set our customers up for success. We encourage each other to push the envelope, execute quickly, and be resilient to failure. We also work occasional late nights or weekends to deliver an above-and-beyond customer experience, while respecting and celebrating each other's personal background, values and commitments. In return, we are well compensated, take pride in seeing outsized impact to our product, and have memorable experiences learning and growing alongside a truly exceptional set of peers.

About CloudTrucks

CloudTrucks is a virtual trucking carrier in the multi-billion dollar trucking space. Core to this industry are over 3.5M truck drivers. They move more than 70% of all goods transported around the U.S., yet operate in a highly fragmented industry with huge opportunities for products, services and automation. We strive to deliver solutions that help truck drivers operate with much greater efficiency, increase their revenue, and offload business complexity. We are looking for uniquely exceptional people to join us on our journey as we massively scale into an industry-defining business.

We provide equal employment opportunities to all employees and applicants for employment and prohibit discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.




