(Senior) Data Engineer—Data Platform (m/f/d)

All Unzer Locations


ABOUT US
Unzer is a leading European fintech company with a mission to simplify international payments for e-commerce and retail businesses. Our brand was formed from 13 companies that now contribute to building a unique product covering the entire payment flow. 
At Unzer, we are driven by the belief that customers should enjoy a seamless shopping experience, no matter where they choose to shop. We are a team of over 750 experts of 70 different nationalities, dedicated to creating a state-of-the-art unified commerce platform. Our goal is to enable businesses to delight their customers with a seamless payment experience. 
Whether you're a tech enthusiast, payment expert, or a dedicated support professional, we are looking for individuals who are passionate about making a difference. 
Our offices
We are based across Austria, Denmark, Germany, and Luxembourg, with our headquarters in Berlin.

ABOUT THE PROJECT
We’re looking for an experienced freelance Data Engineer to support Unzer’s Data Team in scaling and maintaining our self-serve data platform. This is a contract-based role focused on building and optimizing data pipelines, enhancing infrastructure, and enabling data-driven product features.
Please note: Residency within the EU is a requirement for this position.

What your work will look like:

  • Operate, monitor, and extend our data platform infrastructure, pipelines, and services.
  • Implement and maintain Airflow workflows for orchestration.
  • Use DBT for data transformation, materialization, and data quality checks.
  • Develop and optimize schemas on PostgreSQL/MySQL.
  • Collaborate with analytics stakeholders to improve our semantic layer from an engineering perspective.
  • Work with Redshift to design and implement scalable data models and ingestion pipelines, and optimize performance for large-scale analytics workloads.
  • Work with Kafka and other streaming technologies to enable real-time data processing and integration into the data platform.

OUR TECH STACK
  • AWS (Redshift, Athena, S3, EMR, DMS)
  • Airflow, DBT, Terraform
  • PostgreSQL, MySQL
  • AWS DMS, AWS Glue
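
To give a flavour of the DBT data quality work mentioned above, here is a minimal sketch of a dbt `schema.yml` using built-in generic tests. The model and column names (`payments`, `payment_id`, `merchants`, etc.) are purely illustrative assumptions, not Unzer's actual schema:

```yaml
# models/marts/schema.yml -- illustrative sketch; all names are hypothetical
version: 2

models:
  - name: payments
    description: "One row per processed payment (example model)."
    columns:
      - name: payment_id
        description: "Primary key."
        tests:
          - unique
          - not_null
      - name: currency
        tests:
          - accepted_values:
              values: ['EUR', 'USD', 'GBP']
      - name: merchant_id
        tests:
          - not_null
          - relationships:
              to: ref('merchants')
              field: merchant_id
```

Running `dbt test` compiles each of these declarations into a SQL query against the warehouse (e.g. Redshift) and fails the run if any rows violate the constraint.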

What you need to be successful in this role:

  • Strong SQL skills and experience with AWS data stack.
  • Proven experience building and operating ETL/ELT pipelines using Airflow and DBT.
  • Hands-on with Terraform and infrastructure-as-code in cloud environments.
  • Ability to own delivery: from requirements to production monitoring.
  • Self-driven and communicative, able to work independently with async teams.

What’s next?

  • Does it sound exciting? Apply with your CV in English. Please don't shy away if you don't meet all the requirements; we're looking forward to meeting you.
  • We will get back to you within 14 days of receiving your application.  