Data Engineer

Athens, Attica, Greece - Remote

Blueground

Rent beautiful, fully furnished apartments for monthly stays in the best locations around the world. Thousands of furnished flats for short and long-term stays.

🌍 Redefining how people live.

At Blueground, we believe that when your base is reliable, the world opens up. That’s why we’re building the world’s leading platform for living.

Every year, 350 million people move between cities—yet today’s housing options haven’t caught up with the needs of this modern, mobile generation. Blueground was built to change that.

With 40,000+ homes across the globe, available for stays from a few days to a year or more, we’re just getting started. We’re on an exponential growth path to redefine living and create an entirely new category. Our edge? Powerful proprietary tech, operational excellence, and a team that executes with speed and discipline.

Our culture is grounded in five principles:

  • Guests First – Every decision starts with their experience.
  • Move Fast – We value speed, momentum, and action.
  • Dive In – The magic is always in the details, and we go deep.
  • Embrace Change – Change isn’t a disruption; it’s how we grow.
  • Keep It Honest – Transparency accelerates progress—and strengthens relationships.

If you’re ready to do the best work of your life and help reshape how the world lives, we’d love to meet you.

We are looking for a Data Engineer to join our Data Engineering team and play a pivotal role in shaping the future of our data platform. As we scale our business and analytics capabilities, we are investing heavily in modern, reliable, and developer-friendly data infrastructure. You’ll work at the core of this transformation—designing and building robust data pipelines, developing modular data models, and laying the foundation for data products that empower teams across the company.

You’ll collaborate closely with analysts, engineers, and stakeholders from various teams to ensure that high-quality, trusted data is available when and where it’s needed. This is a high-impact role with significant autonomy, ideal for someone who thrives in a dynamic environment and enjoys owning complex problems end-to-end.

Our Stack

  • Cloud: AWS (EKS, RDS, S3, IAM, etc.)
  • Orchestration & IaC: Kubernetes, Terraform
  • Data Orchestration & Modelling: Airflow / DBT
  • Data Warehouse: Snowflake
  • Data Reporting: Metabase, Tableau
  • Languages: Python, Java
  • CI/CD: GitHub Actions
  • Observability: Datadog

What you'll do

  • Design, build, and maintain scalable and cost-efficient ETL/ELT data pipelines on AWS using tools like Apache Airflow and DBT, supporting ingestion, transformation, and delivery of data across multiple systems (see the sketch after this list).
  • Solve complex data engineering challenges related to data lineage, embedded and semi-structured data, and the delivery of reliable, auditable data products that serve both operational and analytical needs.
  • Work with Snowflake to implement and manage robust data warehousing solutions.
  • Write clean, modular, and reusable code in Python for data engineering tasks, automation, and pipeline orchestration.
  • Partner with data analysts and business stakeholders to understand requirements and translate them into technical solutions.
  • Work on data engineering best practices, including CI/CD, testing, and version control in data workflows.
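
For a flavor of the pipelines described above, here is a minimal sketch, assuming Airflow 2.x with dbt installed on the workers and a Snowflake profile already configured. The DAG name, task names, paths, and dbt selectors are illustrative assumptions, not taken from our codebase:

```python
# Minimal sketch of an ELT DAG: land raw data, then run and test dbt models.
# Assumes Airflow 2.x; DBT_DIR and the "staging+" selector are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_bookings(**context):
    """Placeholder extract step: pull raw records from a source system and
    land them in the warehouse's raw schema (e.g. via S3 + COPY INTO)."""
    ...


with DAG(
    dag_id="bookings_elt",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["elt", "dbt", "snowflake"],
) as dag:
    extract = PythonOperator(
        task_id="extract_bookings",
        python_callable=extract_bookings,
    )

    # Transform and test with dbt; paths and selectors are assumptions.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd $DBT_DIR && dbt run --select staging+",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd $DBT_DIR && dbt test --select staging+",
    )

    extract >> dbt_run >> dbt_test
```

In a production setup the extract step would typically land files in S3 and load them into Snowflake, and the DAG itself would be tested and deployed through CI/CD (GitHub Actions) onto a Kubernetes-based Airflow deployment.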

What to expect

  • To work in agile, cross-functional, DevOps-enabled :) teams
  • To build lots of stuff from scratch
  • Your opinion to matter
  • To get your hands on the latest goodies - we're open-minded geeks
  • To enjoy some quality engineering
  • A rapidly growing company
  • Cool colleagues :)

Requirements

  • 4+ years of experience in data engineering or a related role.
  • Strong expertise in DBT (Data Build Tool) for data transformation and modelling.
  • Experience with Airflow (or similar orchestrators like Dagster, Prefect) and Kubernetes.
  • Solid background in Python programming for data workflows, plus strong SQL skills.
  • Deep hands-on experience with Snowflake or other cloud data warehouses.
  • Proven track record of implementing data quality frameworks.
  • Excellent at solving complex problems and translating technical insights into business-friendly language.
  • Excellent communication skills in English.
  • BS/MS degree in Computer Science or a related subject.
  • Bonus: Understanding of data governance, data quality, and lineage.
  • Bonus: Experience or strong interest in greenfield data initiatives, including the design and implementation of data contracts to ensure scalable and reliable data exchange across teams (a minimal sketch follows this list).
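
As a hedged illustration of what a lightweight data contract could look like (the dataset, field names, and check are invented for this posting, not taken from our codebase), the sketch below declares a schema plus nullability expectations and returns violations that a CI job, e.g. in GitHub Actions, could fail the build on:

```python
# Hypothetical data contract sketch: the producing team declares the schema
# and quality expectations; a check enforces them on each published batch.
# This sketch only enforces column presence and nullability; type checks
# against `dtype` are omitted for brevity.
from dataclasses import dataclass


@dataclass(frozen=True)
class Column:
    name: str
    dtype: str
    nullable: bool = False


# Contract for a hypothetical "bookings" dataset shared between teams.
BOOKINGS_CONTRACT = [
    Column("booking_id", "string"),
    Column("property_id", "string"),
    Column("check_in_date", "date"),
    Column("nightly_rate_eur", "decimal", nullable=True),
]


def validate(rows: list[dict], contract: list[Column]) -> list[str]:
    """Return human-readable violations; an empty list means the batch conforms."""
    violations = []
    expected = {c.name for c in contract}
    for i, row in enumerate(rows):
        missing = expected - row.keys()
        if missing:
            violations.append(f"row {i}: missing columns {sorted(missing)}")
        for col in contract:
            if not col.nullable and row.get(col.name) is None:
                violations.append(f"row {i}: null in non-nullable column '{col.name}'")
    return violations
```

In practice many of these expectations would live as dbt tests or in a dedicated data quality framework; the point of the contract is that it is versioned, owned by the producing team, and enforced automatically.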

Benefits

  • Mobile telephone and data plan
  • Workstation of your choice
  • Access to training budget and resources (safaribooksonline.com, frontendmasters.com, etc.)
  • Being part of a working culture that embraces autonomy and initiative-taking