(Senior) Data Engineer (m/f/x)

Neckarsulm, DE

Schwarz Gruppe

With 575,000 employees in 32 countries worldwide, the Schwarz Gruppe is one of the leading retail companies. Learn more about us.


How can we change the world to make marketing both relevant and impactful? With your help! At Schwarz Media Platform, we are on a mission to build Europe's largest and most advanced ad network for retail - a real-life AdTech application with a big impact on consumers, stores, and advertisers. It is based on Europe's largest retail data pool from Europe's No. 1 retailer, Schwarz Group, and on cutting-edge technology that understands individual consumer behavior at scale. If you are interested in this vision and excited about how data and engineering excellence can help us get there, you will love Schwarz Media Platform.

What you'll do

  • Work in a cross-functional product team to design and implement data-centered features for Europe's largest Ad Network
  • Help scale our data stores, data pipelines, and ETL jobs that handle terabytes of data from one of the largest retail companies
  • Design and implement efficient data processing workflows (a minimal sketch follows this list)
  • Extend our reporting platform for external customers and internal stakeholders to measure advertising performance
  • Continue to develop our custom data processing pipeline and keep looking for ways to improve our technology stack as we grow in scale
  • Work with machine learning engineers and software engineers to build and integrate fully automated and scalable reporting, targeting, and ML solutions
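As a minimal sketch of the kind of data processing workflow described above (not the actual Schwarz Media Platform pipeline; the table names, columns, and storage paths are illustrative assumptions), a PySpark job might aggregate impression and transaction data into a per-campaign performance report:

```python
# Minimal PySpark sketch: aggregate hypothetical ad-impression and transaction
# data into a per-campaign performance report. All paths and column names are
# illustrative assumptions, not the real Schwarz Media Platform schema.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("campaign-report-sketch").getOrCreate()

# Hypothetical Parquet inputs: impressions (campaign_id, user_id, ts)
# and transactions (user_id, amount, ts).
impressions = spark.read.parquet("gs://example-bucket/impressions/")
transactions = spark.read.parquet("gs://example-bucket/transactions/")

# Attribute revenue to campaigns via a simple user-level join,
# then aggregate to one row per campaign.
report = (
    impressions
    .join(transactions, on="user_id", how="inner")
    .groupBy("campaign_id")
    .agg(
        F.countDistinct("user_id").alias("reached_users"),
        F.sum("amount").alias("attributed_revenue"),
    )
)

report.write.mode("overwrite").parquet("gs://example-bucket/reports/campaign_performance/")
```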

What you’ll bring along

  • 5+ years of professional experience working on data-intensive applications
  • Fluency with Python and good knowledge of SQL
  • Experience with developing scalable data pipelines with Apache Spark
  • Good understanding of efficient algorithms and the know-how to analyze them
  • Curiosity about how databases and other data processing tools work internally
  • Familiarity with git
  • Ability to write testable and maintainable code that scales
  • Excellent communication skills and a team-player attitude


Great if you also have

  • Experience with Kubernetes
  • Experience with Google Cloud Platform
  • Experience with Snowflake, BigQuery, Databricks, and Dataproc
  • Knowledge of columnar databases and file formats like Apache Parquet
  • Knowledge of "Big Data" technologies like Delta Lake
  • Experience with workflow management solutions like Apache Airflow (see the scheduling sketch after this list)
  • Affinity for data science tasks to prototype reporting and ML solutions
  • Knowledge of Dataflow / Apache Beam
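A similarly minimal sketch of how such a job could be scheduled with a workflow management tool like Apache Airflow (the DAG id, schedule, and task are illustrative assumptions):

```python
# Minimal Airflow sketch: a daily DAG that triggers a (placeholder) reporting job.
# DAG id, schedule, and callable are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def build_daily_report(**context):
    # Placeholder: in practice this could submit the Spark job sketched above.
    print(f"Building campaign report for {context['ds']}")


with DAG(
    dag_id="daily_campaign_report",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(
        task_id="build_daily_report",
        python_callable=build_daily_report,
    )
```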

We look forward to your application!

Schwarz Dienstleistung KG · Larissa Blümich · Reference no. 43709
Stiftsbergstraße 1 · 74172 Neckarsulm 
www.it.schwarz
