Senior Data Engineer

Lisboa, Portugal

Sweatcoin

Join our 120M+ users and become part of the movement economy. The Sweatcoin app converts your steps into sweatcoins – a virtual currency you can spend on products and services. Walking pays off!

Apply now

At Sweatcoin, we're driven by a shared mission to make the world more active, and we value creativity, collaboration, and a passion for solving complex problems.

Our iOS and Android apps have more than 200M installs, 15M+ active users, and more than 500 commercial partners, and independent academic research has confirmed our ability to make users up to 20% more active.

We were the most downloaded Health app of the year (2022) and have been the No. 1 overall app in 56 countries and counting!

This is an exciting opportunity to make a huge impact on our data platform as we transition to a more scalable and efficient architecture. If you thrive on solving complex data challenges and want to be part of a high-performing team, we’d love to hear from you!

What we offer:

  • A team of exceptional people who celebrate our community by being supportive and creative all the way. Our head of data once developed a mind-reading helmet; one of our software developers runs a certified psychological practice; we have a QA engineer who is a vet, another who is a skipper, and enough musicians to start our own band. Together we multiply each other's talents, which inspires us to build a product we're all proud of.
  • A product that promotes health and fitness in 100 countries. Sweatcoin has proven to help people and create multiple inspiring stories like this one: https://blog.sweatco.in/one-sweatcoiners-journey-to-100000-steps/ .
  • A startup that actually works. We are completely self-sufficient, yet our investors are excited to provide us with even more resources to keep growing. We recently set a new record of 10M new users in a single week.
  • Step-verification models that prevent cheating, so there is no way a dog can earn coins for its owner.
  • Automated A/B tests, analytics deeply integrated into the product, and a modern data stack (Jupyter, BigQuery, Airflow, Looker).

What you will do:

We are looking for a Senior Data Engineer to join our team and play a key role in migrating our data infrastructure to a new architecture. You will work alongside two other engineers on this large-scale migration and help shape the future of our data platform. Over time, our team will evolve into a platform-focused group, building automation tools, improving performance, ensuring data pipeline resilience, and strengthening data governance.

  • Lead and execute the migration from Firebase-BigQuery-Looker to a self-hosted stack including Snowplow, Kafka, ClickHouse, Trino, Spark, S3, and Redash.
  • Design, develop, and optimise scalable, high-performance data pipelines.
  • Automate data processing and workflow orchestration.
  • Enhance data infrastructure reliability, scalability, and cost-efficiency.
  • Collaborate with engineers and analysts to define best practices for data processing, storage, and governance.
  • Develop internal tools for data quality monitoring, lineage tracking, and debugging.
  • Optimize query performance and ensure efficient data modeling.

What we expect from you:

  • Expertise in Data Engineering: Strong experience building, maintaining, and optimizing ETL/ELT pipelines.
  • Strong Coding Skills: Proficiency in Python and SQL for data processing and analytics.
  • Distributed Systems Experience: Hands-on experience with Kafka, Trino, ClickHouse, Spark, or similar.
  • Cloud & Storage: Experience with S3 or equivalent object storage solutions.
  • Infrastructure & Tooling: Proficiency with Docker, Kubernetes, Git, and CI/CD pipelines.
  • Orchestration & Automation: Familiarity with workflow orchestration tools like Airflow or dbt.
  • Analytical Thinking: Ability to optimize system performance and troubleshoot complex data issues.
  • Self-Starter Mentality: Comfortable working in a fast-paced, evolving environment with minimal supervision.
  • Strong Communication Skills: Fluent English to ensure smooth collaboration within the team.

Nice to have:

  • Experience with Snowplow for event tracking and data collection.
  • Knowledge of data governance and security best practices.
  • Familiarity with machine learning pipelines and real-time analytics.

What you get in return:

  • Remote-friendly & flexible working hours. The flexibility is incredible: performance is based on output rather than hours worked, and you can be wherever you want!
  • Apple devices for work
  • Team buildings abroad in exciting locations!
  • Health insurance coverage
  • WellBeing program, which supports up to 2 counselling sessions per month
  • Unlimited time-off policy

If you feel that's a match, we would be excited to have you on our team!

Category: Engineering Jobs

Tags: A/B testing Airflow Architecture BigQuery CI/CD Data governance Data pipelines Data quality dbt Distributed Systems Docker ELT Engineering ETL Git Kafka Kubernetes Looker Machine Learning Pipelines Python Redash Research Security Spark SQL

Perks/benefits: Flex hours Flex vacation Health care Startup environment Unlimited paid time off

Region: Europe
Country: Portugal
