Senior Data Engineer (Berlin, hybrid)
Berlin
Kittl
Kittl is the intuitive design platform for professionals, featuring real-time collaboration, advanced tools, and a vast library of fonts and assets. Design, deliver, and collaborate seamlessly. We're in an exciting growth phase following our Series B funding and are expanding our team. Our hybrid working culture includes three office days per week (Monday, Wednesday, and Friday) at our Berlin headquarters. Looking ahead, we're also growing London as our second location, with a new office planned for 2025.
A little about us
Redefining graphic design: Kittl is transforming how creators work with an intuitive platform that stands as a modern, competitive alternative to traditional design tools, building the new Adobe of tomorrow
Rapid growth: Millions of users within just two years of launch
Diverse team: 120+ team members representing 30+ different countries
Truly product-led company: Engineers, product managers, and designers are at the core of Kittl, shaping an engineering-driven working culture
Strong funding: Raised over $45M from world-renowned investors who have also backed companies like Slack, Dropbox, and Figma
Learn more: www.kittl.com/career
Your role at Kittl
As a Senior Data Engineer at Kittl, you will be the architect and driver of our data platform, owning end-to-end infrastructure that powers everything from real-time event pipelines to analytics tooling. You'll lead initiatives across GCP, Pub/Sub, Dagster, and dbt, ensuring reliable data flow for product analytics, experimentation, and decision-making. Working cross-functionally with backend engineers, product managers, and analysts, you'll be the go-to technical expert who translates business goals into scalable data architecture.
What you'll do
Own and evolve data infrastructure: Continuously improve our data systems across GCP, including Compute Engine, Pub/Sub, and BigQuery, to support a scalable analytics foundation
Develop event-driven pipelines: Build reliable, event-based data pipelines across frontend and backend systems to enable experimentation, personalization, and real-time analysis
Champion engineering excellence: Lead CI/CD processes, manage secrets securely, and ensure high-quality deployments using Docker, while promoting clean code and peer-review practices on GitHub
Support analytics and data science: Design and implement advanced data transformations in Python and SQL, and scale Dagster orchestration and dbt models to meet evolving analytical needs
Ensure reliable data tracking: Act as a key link between engineering and analytics, maintaining traceable event tracking and A/B testing infrastructure (Growthbook) to support growth initiatives
Monitor and optimize systems: Maintain ingestion tools like Airbyte, monitor and troubleshoot pipelines, and drive architectural improvements across backend and frontend data capture while optimizing for cost and performance
What you’ll need
Experience: Solid background in deploying and managing cloud-native data infrastructure, ideally in GCP
Infrastructure: Production-level experience with containerized deployments and service orchestration
Data lifecycle: Comfortable owning the full data journey from event design to modeling, transformation, and activation
Analytics: Deep understanding of event-driven architecture and the trade-offs between various analytics data sources
Collaboration: Proven track record of translating business needs into scalable solutions in cross-functional environments
Bonus skills: Experience supporting ML workflows or deploying models in production is a strong plus
Interview process
Recruiter interview (30 min)
Interview with Data Lead (45 min)
Technical take-home assignment
Technical interview (60 min) & stakeholder interview with Senior Product Analyst & Senior Data Scientist (45 min)
Bar raiser interview with our Chief of Staff, right hand to the CEO (45 min)
We are looking for someone
Exceptionally motivated to drive impact and challenge the status quo
Who takes extreme ownership & gets things done
Who goes above and beyond in their role
Benefits
Maximise your impact: Whether you're leading a team or standing out through your domain expertise, all we care about is supporting you to maximise your impact
Hackathons: Our quarterly hackathons provide an environment to experiment with new concepts, push boundaries, and potentially deliver the next big thing
Kittl Week: Each year, our global team gathers for a whole week to work, celebrate, get inspired, and have fun
Flexible working hours: Our core hours are 11am–5pm CET, leaving the rest of your schedule flexible to fit your style
Remote work: Work up to 50 days (10 weeks) fully remote per year from anywhere in the world, as long as you maintain our core hours
Learning & development: Our L&D budget supports your professional growth
Mobility benefit: We fully cover your monthly BVG public transport ticket
Health and fitness: Urban Sports Club membership discount
Vacation: Up to 30 vacation days per year
🌈 At Kittl, we embrace diversity and value every team member's unique background, identity, and experience. We're all about respect, honesty, and inclusivity. Together, we create a safe and supportive work environment where everyone thrives. Join us on this exciting journey of making our company and product even better!