Data Engineer
Athens, Attica, Greece - Remote
Blueground
Rent beautiful, fully furnished apartments for monthly stays in the best locations around the world. Thousands of furnished flats for short- and long-term stays. Redefining how people live.
At Blueground, we believe that when your base is reliable, the world opens up. That's why we're building the world's leading platform for living.
Every year, 350 million people move between cities, yet today's housing options haven't caught up with the needs of this modern, mobile generation. Blueground was built to change that.
With 40,000+ homes across the globe, available for stays from a few days to a year or more, we're just getting started. We're on an exponential growth path to redefine living and create an entirely new category. Our edge? Powerful proprietary tech, operational excellence, and a team that executes with speed and discipline.
Our culture is grounded in five principles:
- Guests First – Every decision starts with their experience.
- Move Fast – We value speed, momentum, and action.
- Dive In – The magic is always in the details, and we go deep.
- Embrace Change – Change isn't a disruption; it's how we grow.
- Keep It Honest – Transparency accelerates progress and strengthens relationships.
If you're ready to do the best work of your life and help reshape how the world lives, we'd love to meet you.
We are looking for a Data Engineer to join our Data Engineering team and play a pivotal role in shaping the future of our data platform. As we scale our business and analytics capabilities, we are investing heavily in modern, reliable, and developer-friendly data infrastructure. You'll work at the core of this transformation, designing and building robust data pipelines, developing modular data models, and laying the foundation for data products that empower teams across the company.
You'll collaborate closely with analysts, engineers, and stakeholders from various teams to ensure that high-quality, trusted data is available when and where it's needed. This is a high-impact role with significant autonomy, ideal for someone who thrives in a dynamic environment and enjoys owning complex problems end-to-end.
Our Stack
- Cloud: AWS (EKS, RDS, S3, IAM, etc.)
- Orchestration & IaC: Kubernetes, Terraform
- Data Orchestration & Modelling: Airflow / DBT
- Data Warehouse: Snowflake
- Data Reporting: Metabase, Tableau
- Languages: Python, Java
- CI/CD: GitHub Actions
- Observability: Datadog
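To give a feel for how this stack typically hangs together, here is a minimal, hypothetical sketch of an Airflow DAG that orchestrates dbt runs and tests against Snowflake. The DAG name, schedule, and project paths are illustrative assumptions, not details taken from this posting.

```python
# Illustrative sketch only: a tiny Airflow DAG that orchestrates dbt.
# The dbt project path, profiles directory, and schedule are assumed
# placeholders, not taken from this job posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["dbt", "snowflake"],
) as dag:
    # Build dbt models; Snowflake credentials live in the dbt profile.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    # Run dbt tests once the models have built successfully.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```

In a stack like the one above, a DAG of this shape would typically run on the Kubernetes-hosted Airflow deployment, with GitHub Actions linting and testing the dbt project before it ships.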
What you'll do
- Design, build, and maintain scalable and cost-efficient ETL/ELT data pipelines on AWS using tools like Apache Airflow and DBT, supporting ingestion, transformation, and delivery of data across multiple systems.
- Solve complex data engineering challenges related to data lineage, embedded and semi-structured data (see the sketch after this list), and the delivery of reliable, auditable data products that serve both operational and analytical needs.
- Work with Snowflake to implement and manage robust data warehousing solutions.
- Write clean, modular, and reusable code in Python for data engineering tasks, automation, and pipeline orchestration.
- Partner with data analysts and business stakeholders to understand requirements and translate them into technical solutions.
- Work on data engineering best practices, including CI/CD, testing, and version control in data workflows.
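To illustrate the semi-structured data point above, here is a minimal, hypothetical sketch of flattening nested JSON records into flat rows before loading them into the warehouse. The function name and sample payload are invented for illustration only.

```python
# Illustrative sketch: flatten nested, semi-structured JSON records into
# flat dicts suitable for loading into a warehouse table. All names here
# (flatten_record, the sample booking payload) are hypothetical examples.
import json
from typing import Any, Dict


def flatten_record(record: Dict[str, Any], parent_key: str = "", sep: str = "_") -> Dict[str, Any]:
    """Recursively flatten nested dicts; arrays are kept as JSON strings."""
    flat: Dict[str, Any] = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten_record(value, new_key, sep))
        elif isinstance(value, list):
            # Preserve arrays as JSON text so the warehouse can parse them later.
            flat[new_key] = json.dumps(value)
        else:
            flat[new_key] = value
    return flat


if __name__ == "__main__":
    booking = {
        "id": 42,
        "guest": {"name": "Alex", "contact": {"email": "alex@example.com"}},
        "amenities": ["wifi", "parking"],
    }
    print(flatten_record(booking))
    # {'id': 42, 'guest_name': 'Alex', 'guest_contact_email': 'alex@example.com',
    #  'amenities': '["wifi", "parking"]'}
```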
What to expect
- To work in agile, cross-functional, DevOps-enabled teams :)
- To build lots of stuff from scratch
- Your opinion to matter
- To get your hands on the latest goodies - we're open-minded geeks
- To enjoy some quality engineering
- A rapidly growing company
- Cool colleagues :)
Requirements
- 4+ years of experience in data engineering or a related role.
- Strong expertise in DBT (Data Build Tool) for data transformation and modelling.
- Experience with Airflow (or similar orchestrators like Dagster, Prefect) and Kubernetes.
- Solid background in Python programming for data workflows, plus strong SQL skills.
- Deep hands-on experience with Snowflake or other cloud data warehouses.
- Proven track record of implementing data quality frameworks.
- Excellent at solving complex problems and translating technical insights into business-friendly language.
- Excellent communication skills in English.
- BS/MS degree in Computer Science or a related subject.
- Bonus: Understanding of data governance, data quality, and lineage.
- Bonus: Experience or strong interest in greenfield data initiatives, including the design and implementation of data contracts to ensure scalable and reliable data exchange across teams (see the sketch after this list).
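For the data contracts bonus point, here is a minimal, hypothetical sketch of what a lightweight contract check might look like in Python. The schema and field names are invented for illustration and are not Blueground's actual data model.

```python
# Illustrative sketch of a lightweight data contract: the producing team
# publishes an expected schema, and consumers validate records against it.
# Field names and types are hypothetical, not Blueground's schema.
from datetime import date
from typing import Any, Dict, List

# The "contract": column name -> expected Python type.
BOOKING_CONTRACT: Dict[str, type] = {
    "booking_id": int,
    "property_id": int,
    "check_in": date,
    "nightly_rate": float,
}


def validate(record: Dict[str, Any], contract: Dict[str, type]) -> List[str]:
    """Return a list of contract violations for a single record."""
    errors: List[str] = []
    for column, expected_type in contract.items():
        if column not in record:
            errors.append(f"missing column: {column}")
        elif not isinstance(record[column], expected_type):
            errors.append(
                f"{column}: expected {expected_type.__name__}, "
                f"got {type(record[column]).__name__}"
            )
    return errors


if __name__ == "__main__":
    good = {"booking_id": 1, "property_id": 7, "check_in": date(2024, 5, 1), "nightly_rate": 120.0}
    bad = {"booking_id": "1", "property_id": 7, "nightly_rate": 120.0}
    print(validate(good, BOOKING_CONTRACT))  # []
    print(validate(bad, BOOKING_CONTRACT))
    # ['booking_id: expected int, got str', 'missing column: check_in']
```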
Benefits
- Competitive full-time salary
- Mobile telephone and data plan
- Workstation of your choice
- Access to a training budget and resources (safaribooksonline.com, frontendmasters.com, and others)
- Being part of a working culture that embraces autonomy and taking initiative