Senior Data Engineer

London Area, England, United Kingdom

About Zego

At Zego, we understand that traditional motor insurance holds good drivers back. It's too complicated, too expensive, and it doesn't reflect how well you actually drive. Since 2016, we have been on a mission to change that by offering the lowest priced insurance for good drivers.

From van drivers and gig workers to everyday car drivers, our customers are the driving force behind everything we do. We've sold tens of millions of policies and raised over $200 million in funding. And we’re only just getting started.

Overview of the Data Engineering team

At Zego, the Data Engineering team is integral to our data platform, working closely with Software Engineers, Data Scientists and Data Analysts, as well as other areas of the business. We use a variety of internal and external tooling to maintain our data repositories. We are looking for people who have a solid understanding of ETL and ELT paradigms, are comfortable using Python and SQL, appreciate good software engineering and data infrastructure principles, are eager to work with complex and fast-growing datasets, have a strong desire to learn, and communicate well.

Our stack includes, but is not limited to, Airflow, dbt (data build tool), a range of AWS services, Stitch and CI/CD pipelines. As a Data Engineer you will have the opportunity to champion emerging technologies where they can add value to the business and to promote better ways of working.
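
To give a flavour of this stack (illustrative only, not Zego's actual code), a typical ELT pipeline might pair an Airflow DAG with a dbt run, along the lines of the sketch below. It assumes a recent Airflow 2.x release with dbt available on the worker, and every name in it (the DAG id, the extract script, the project path) is hypothetical.

  # Illustrative sketch only: a minimal Airflow DAG that lands raw data and then runs dbt.
  # Assumes a recent Airflow 2.x with dbt installed on the worker; all ids and paths are hypothetical.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.bash import BashOperator

  with DAG(
      dag_id="example_elt_pipeline",
      start_date=datetime(2024, 1, 1),
      schedule="@daily",
      catchup=False,
  ) as dag:
      # Extract/load step: land raw source data in the warehouse.
      extract_orders = BashOperator(
          task_id="extract_orders",
          bash_command="python /opt/pipelines/extract_orders.py",
      )

      # Transform step: dbt builds the modelled tables on top of the raw data.
      dbt_run = BashOperator(
          task_id="dbt_run",
          bash_command="dbt run --project-dir /opt/dbt_project",
      )

      extract_orders >> dbt_run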

It is an exciting time to join, and you'll partner with world-class engineers, analysts and product managers to help make Zego the best-loved insurtech in the world.

Over the next 12 months you will:

  • Assist in developing and maintaining our ETL and ELT pipelines.
  • Support our data scientists in the development and implementation of our ML pricing models and experiments.
  • Help drive and evolve the architecture of our data ecosystem.
  • Collaborate with product managers and across teams to bring new products and features to market.
  • Help drive data as a product by growing our data platform with a focus on strong data modelling, quality, usage and efficiency.
  • Build tailored data replication pipelines as our backend application is broken up into microservices.

About you

We are looking for somebody with a working knowledge of building data pipelines and the underlying infrastructure. Experience designing data warehouses and following best practices during implementation is a big plus. You have worked with (or are keen to work with) Data Analysts, Data Scientists and Software Engineers.

Practical knowledge of (or a strong desire to learn) the following or similar technologies:

  • Python
  • Airflow
  • Databases (PostgreSQL)
  • Data Warehousing (Redshift / Snowflake)
  • SQL (we use dbt for modelling data in the warehouse)
  • Data Architecture including Dimensional Modelling
  • Infrastructure as code tools (e.g. Terraform)

Otherwise, an interest in learning these, with the support of the team, is essential. We're looking for people with a commitment to building, nurturing, and iterating on an ever-evolving data ecosystem.

Other beneficial skills include:

  • Familiarity with Docker and/or Kubernetes (EKS)
  • Experience implementing or contributing to a Data Lake or Data Mesh
  • Experience with a wide variety of AWS services
  • Open Table Formats (e.g. Apache Iceberg)

How we work

We believe that teams work better when they have time to collaborate and space to get things done. We call it Zego Hybrid.

Our hybrid way of working is unique. We don't mandate fixed office days. Instead, we foster a flexible approach that empowers every Zegon to perform at their best. We ask you to spend at least one day a week in our central London office, and you have the flexibility to choose the day that works best for you and your team. We cover the costs of all company-wide events (three per year) and provide a separate hybrid contribution to help with other travel costs. We think it's a good mix of collaborative face time and flexible home working, setting us up to achieve the right balance between work and life.

Benefits

We reward our people well. Join us and you’ll get a market-competitive salary, private medical insurance, company share options, generous holiday allowance, and a whole lot of wellbeing benefits. And that’s just for starters.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, national origin, gender, sexual orientation, age, marital status, or disability status.

Category: Engineering Jobs

Tags: Airflow Architecture AWS Data pipelines Data warehouse Data Warehousing dbt Docker ELT Engineering ETL Kubernetes Machine Learning Microservices Pipelines PostgreSQL Python Redshift Snowflake SQL Terraform

Perks/benefits: Career development Competitive pay Equity / stock options Flex hours Health care Insurance Team events

Region: Europe
Country: United Kingdom
