Data Engineering Lead

Manila, Metro Manila, Philippines - Remote

Zoomo

Check out our reliable electric bikes and enjoy flexible lease plans in Australia, the US, the UK, France and more!

At Zoomo, our vision is to transition every urban mile to Light Electric Vehicles (LEVs). Zoomo offers the world’s leading platform for commercial-use e-bikes. We operate across the USA, UK, Europe, Canada and Australia. In four years Zoomo has helped transition millions of urban miles to light electric vehicles; built a team of more than 300 world-class engineers, operators, sales staff and vehicle repair technicians; developed and deployed the world's best last-mile delivery electric bikes and fleet management software; and successfully sold these fleet solutions to the world's leading logistics businesses.

Zoomo vehicles are used by major players in the food, grocery and parcel delivery segments with partners including UberEats, Doordash, JustEat Takeaway, Deliveroo, Domino’s, Pizza Hut, Amazon, FedEx, Getir, Ocado, GoPuff and many more.

Expect to join a high-performing team where you are trusted to make a direct impact on our business, our customers and our planet.

THE ROLE

We are looking for an experienced Data Engineer with a solid background in Google Cloud Platform (GCP) to join our Technology Team and lead our data infrastructure efforts.

The successful candidate will be responsible for designing, building, and maintaining our data infrastructure on GCP, ensuring high performance and availability of our systems as we scale. This will involve designing, implementing and maintaining data pipelines; configuring our infrastructure to balance scalability and efficiency; and writing SQL queries which incorporate business logic to provide data to key stakeholders across the business.

The person in this role will collaborate closely with counterparts across the Technology Team (including Software Engineering, Hardware Engineering, Product and Central Operations), as well as with other functions such as Finance, Marketing and Strategy. To be successful, they should be comfortable working as a sole expert and hungry to identify opportunities with large business impact; the Technology Team will empower and support them to implement these. The role is predominantly hands-on engineering, but it also has a people management aspect.

RESPONSIBILITIES

  • Designs, develops and maintains end-to-end data pipelines, either via SaaS solutions (such as Fivetran) or via custom pipelines (for example, ones that query REST APIs; see the illustrative sketch after this list).
  • Owns, maintains and improves Zoomo’s GCP infrastructure, optimising for cost and implementing new solutions to support our Software Engineers and Product Managers.
  • Understands business requirements in order to design and develop solutions that clean, aggregate and surface data to various business stakeholders (including Leadership).
  • Delivers fit-for-purpose reports, dashboards (e.g. using Looker), data extracts and advanced analytics which can be consumed by business stakeholders across the organisation.
  • Monitors data quality to identify and rectify issues and ensure our datasets are trusted.
  • Collaborates with stakeholders across various business functions to gather requirements to support their data needs.
  • Stays on top of the latest best practices and provides technical leadership and expertise to engineers and analysts across the business.
  • Adheres to and champions high quality code standards, testing and technical documentation.
  • Coordinates and communicates with stakeholders to ensure awareness of project status and achievement of project outcomes.
  • Provides line management and personal development for a skilled data engineer, a remit which may grow into a team.
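
To give a concrete flavour of the custom pipelines referenced above, the Python sketch below pulls JSON records from a REST API and appends them to a BigQuery table. It is purely illustrative: the endpoint URL and table ID are hypothetical placeholders, not Zoomo systems.

    # Illustrative sketch only: a minimal REST-API-to-BigQuery batch load.
    # The API URL and table ID are hypothetical placeholders.
    import requests
    from google.cloud import bigquery

    API_URL = "https://api.example.com/v1/rides"        # placeholder endpoint
    TABLE_ID = "example-project.analytics.rides_raw"    # placeholder table

    def fetch_records() -> list[dict]:
        """Fetch one page of JSON records from the (hypothetical) REST API."""
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        return resp.json()["data"]

    def load_to_bigquery(rows: list[dict]) -> None:
        """Append the rows to BigQuery, letting it infer the schema."""
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            write_disposition="WRITE_APPEND",
            autodetect=True,
        )
        client.load_table_from_json(rows, TABLE_ID, job_config=job_config).result()

    if __name__ == "__main__":
        load_to_bigquery(fetch_records())

In practice a job like this would usually run under an orchestrator such as Cloud Composer or Cloud Functions, with pagination, incremental loads and an explicit schema rather than autodetect.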

Requirements

  • Advanced SQL skills; preferably BigQuery and PostgreSQL, but other SQL flavours are acceptable.
  • Experience with building data lake and data warehouse ETL pipelines.
  • Experience building within GCP, in particular with BigQuery, Pub/Sub, Cloud Functions, Dataflow, Composer, Firestore and IAM configuration.
  • Experience interacting with REST APIs and webhooks.
  • Proficient coding skills in Python.
  • Experience deploying using Infrastructure-as-Code; preferably Terraform.
  • Experience working with both batch and streaming data processing (see the illustrative sketch after this list).
  • Experience working with visualisation tools to build dashboards; preferably Looker, but Tableau is also acceptable.
  • Experience with data governance (managing PII) and metadata management.
  • Comfortable working with both technical and non-technical stakeholders to receive, discuss and shape requirements, and to drive projects to completion.
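
As a minimal illustration of the streaming half of the batch-and-streaming requirement above, the Python sketch below publishes JSON events to a Google Cloud Pub/Sub topic; downstream, a Dataflow pipeline or BigQuery subscription would typically consume them. All project, topic and field names are hypothetical placeholders, not Zoomo systems.

    # Illustrative sketch only: publishing events to a Pub/Sub topic.
    # Project, topic and field names are hypothetical placeholders.
    import json
    from google.cloud import pubsub_v1

    PROJECT_ID = "example-project"     # placeholder project
    TOPIC_ID = "vehicle-telemetry"     # placeholder topic

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

    def publish_event(event: dict) -> str:
        """Publish one telemetry event as a JSON-encoded Pub/Sub message."""
        data = json.dumps(event).encode("utf-8")
        future = publisher.publish(topic_path, data=data)
        return future.result()  # blocks until Pub/Sub returns the message ID

    if __name__ == "__main__":
        message_id = publish_event({"vehicle_id": "demo-001", "battery_pct": 87})
        print(f"Published message {message_id}")

The batch half of the requirement maps onto scheduled loads like the REST-API-to-BigQuery sketch shown under Responsibilities.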

Benefits

We offer you the chance to be part of a team at the cutting edge of the world’s electrification journey, including:

  • Working with a switched-on team that strives to make the streets greener and serve the rider
  • A competitive salary
  • Global and country-specific benefits packages
  • Monthly team outings & events

Zoomers currently represent 45 nationalities and we celebrate diversity and inclusion with equal opportunities for all.

#LI-REMOTE
