Data Engineer
Berlin
About the role
About LANCH
LANCH, the fastest-growing consumer company in DACH, is seeking a talented and motivated Data Engineer to join our dynamic team.
Founded in 2023 and headquartered in Berlin, LANCH partners with restaurants and top creators to launch delivery-first food brands such as Happy Slice pizza, Loco Chicken, and the new Korean-style Koco Chicken. Beyond virtual kitchens, we are rolling out a network of physical restaurants and retail brands (“Happy Chips”, “Loco Tortillas”) that already reach thousands of supermarkets. Backed by €26 million in Series A funding (Feb 2025), our Tech & Data team is building the platforms - LANCH OS and the Partner Portal - that power everything from menu management to supply-chain automation.
The Role
We’re looking for our first Data Engineer to lay the foundations of LANCH’s end-to-end data platform. You’ll own everything that turns operational events into trusted, analysis-ready datasets - from real-time streaming and batch pipelines to the orchestration frameworks that keep them humming. Working hand-in-hand with product, engineering, and ops, you will design and implement the data infrastructure that powers menu optimisation, delivery routing, brand performance dashboards, and much more.
Key Responsibilities
- Architect and launch a scalable event-streaming platform (e.g., Pub/Sub, Kafka) that captures orders, logistics updates, and app interactions in real time.
- Build and maintain a modern Reverse ETL layer (e.g., Census, Hightouch) to push clean warehouse data back to internal applications like our Partner Portal, LANCH OS, or our CRM.
- Evolve our Airflow and ELT environment: modular DAG design, automated testing, CI/CD, observability, and cost-efficient GCP execution (see the sketch after this list).
- Collaborate with backend engineers to instrument services for analytics & tracing; champion event naming conventions and schema governance.
- Set engineering standards - code reviews, documentation, security, and infra as code (Terraform) - that will scale as we 10x the team and data volume.
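To give a flavour of the orchestration work above, here is a minimal, purely illustrative sketch of the kind of modular Airflow DAG this role would own: landing raw order events in a BigQuery staging table and handing off to a transformation step. Every project, table, and task name is an assumption for illustration, not an actual LANCH convention.

```python
# Illustrative sketch only - all names are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2025, 1, 1), catchup=False, tags=["orders"])
def orders_elt():
    @task
    def load_raw_events() -> str:
        # In a real pipeline this could be a BigQuery load job over the
        # Pub/Sub sink bucket; stubbed here to keep the sketch self-contained.
        return "raw.orders_events"

    @task
    def transform(staging_table: str) -> None:
        # Placeholder for a dbt run or BigQuery MERGE into curated tables.
        print(f"transforming {staging_table} into curated.orders")

    transform(load_raw_events())


orders_elt()
```

Keeping each step a small, testable task like this is what makes automated testing, CI/CD, and observability tractable as the DAG count grows.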
About you
What will make you thrive at LANCH:
- 2+ years building data infrastructure in cloud environments.
- Professional experience in designing and developing ELT pipelines.
- Hands-on experience with at least one streaming technology (Pub/Sub, Kafka, Kinesis, Dataflow, ...).
- Fluent in Python for data processing; comfortable writing performant SQL (BigQuery dialect a plus).
- Proven track record orchestrating pipelines with Airflow (or Dagster, Prefect) and deploying via Docker & GitHub Actions.
- Product mindset: you enjoy sitting with ops teams or restaurant managers to translate fuzzy business challenges into robust data pipelines.
- Bias for action and ownership: you prototype quickly, measure impact, and iterate - yesterday’s idea should be today’s scheduled DAG.
- Collaborative communicator - fluent English; conversational German.
- Eager to work mostly on-site in our Berlin Prenzlauer Berg office.
Our Tech Stack
- Data Warehouse: BigQuery
- Transformation & Modelling: dbt, SQL
- Orchestration: Airflow
- Streaming / Messaging: Google Pub/Sub, Apache Kafka (greenfield)
- Backend & APIs: Python, FastAPI, SQLModel, PostgreSQL (see the sketch after this list)
- Infrastructure: GCP, Terraform, Docker, GitHub Actions
- Analytics & BI: Metabase, Pandas, Notebook-based exploration
- Reverse ETL: Census, Hightouch, ... (greenfield)
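As a purely illustrative example of how the backend and streaming pieces above fit together, the sketch below shows a FastAPI endpoint publishing a schema-governed order event to Pub/Sub. The project ID, topic, event fields, and attribute names are assumptions, not our actual conventions, and the Pub/Sub client needs GCP credentials at runtime.

```python
# Illustrative sketch only - project, topic, and schema names are hypothetical.
from fastapi import FastAPI
from google.cloud import pubsub_v1
from pydantic import BaseModel

app = FastAPI()
publisher = pubsub_v1.PublisherClient()
TOPIC = publisher.topic_path("example-project", "order-events")  # hypothetical


class OrderEvent(BaseModel):
    order_id: str
    brand: str        # e.g. "happy_slice" or "loco_chicken"
    total_cents: int


@app.post("/orders")
def create_order(event: OrderEvent) -> dict:
    # Publish a versioned event; a Pub/Sub subscription can land it in
    # BigQuery, where the ELT layer picks it up.
    publisher.publish(
        TOPIC,
        data=event.model_dump_json().encode("utf-8"),
        event_name="order_created",   # naming convention per governance
        schema_version="1",           # attributes must be strings
    )
    return {"status": "queued", "order_id": event.order_id}
```

Carrying event_name and schema_version as message attributes lets downstream consumers route and validate messages without parsing payloads first.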
If shaping the data foundation of a high-growth food tech startup excites you, we’d love to meet you.
Perks/benefits: Startup environment, team events