Data Platform Engineer (f/m/d)

Frankfurt am Main, DE


Deutsche Börse

The official website of Deutsche Börse Group, with information about the company and the areas of Investor Relations, Media, Careers, Sustainability...



Your area of work:

Big Data & Advanced Analytics provides the data platform and services that enable data science and analytics development by the businesses across the value chain served by Deutsche Börse Group. We standardize and automate processes with modern technologies and frameworks.

As part of our cloud migration and digital transformation journey, we are looking for an experienced and passionate Data Platform Engineer with a diverse range of skills, who brings fresh, innovative ideas and enjoys designing and building big data solutions with a positive attitude.

In this role you will join a growing and diverse team of big data experts. Topics range from contributing to the definition and deployment of the overall cloud data architecture, including cloud service evaluation, platform design, and configuration, to supporting the implementation of business use cases in the big data and data science fields at one of the biggest exchanges in the world.

 

Your responsibilities:
  • Provision and configure Databricks workspaces using Terraform, the CLI, and the SDK (see the first sketch after this list)
  • Set up workspace-level settings, including clusters, libraries, and compute policies
  • Define and manage catalogs, schemas, and tables across workspaces (see the second sketch below)
  • Ensure the data platform is operational, secure, scalable, and reliable
  • Contribute to defining a Data Mesh paradigm across different data domains
  • Act as a technical advisor to data scientists, analysts, and business users
  • Write and maintain technical documentation and work within an Agile methodology
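
For illustration, here is a minimal sketch of the provisioning work described in the first two bullets, using the Databricks SDK for Python (one of the Terraform/CLI/SDK options named above). The authentication setup, policy name, and policy contents are illustrative assumptions, not part of this posting:

```python
import json

from databricks.sdk import WorkspaceClient

# Authenticates via environment variables (DATABRICKS_HOST / DATABRICKS_TOKEN)
# or a configured profile; no credentials are hard-coded here.
w = WorkspaceClient()

# A compute policy pinning the Spark version and bounding autoscaling.
# Policy definitions are JSON documents of per-attribute constraints.
policy_definition = {
    "spark_version": {"type": "fixed", "value": "14.3.x-scala2.12"},
    "autoscale.min_workers": {"type": "range", "minValue": 1, "maxValue": 2},
    "autoscale.max_workers": {"type": "range", "minValue": 2, "maxValue": 8},
}

policy = w.cluster_policies.create(
    name="team-small-autoscaling",  # hypothetical policy name
    definition=json.dumps(policy_definition),
)
print(f"Created policy {policy.policy_id}")
```

The same policy could equally be declared as a databricks_cluster_policy resource in Terraform, which suits a GitOps-style workflow.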

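And a sketch of the catalog/schema/table duty, assuming Unity Catalog and Spark SQL run from a Databricks notebook or job; the object names are illustrative:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Three-level Unity Catalog namespace: catalog -> schema -> table.
spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.trading")
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.trading.daily_prices (
        trade_date DATE,
        isin       STRING,
        close      DECIMAL(18, 4)
    )
""")
```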
 

Your profile:
  • Hands-on experience with cloud-native big data technologies (Azure/GCP)
  • Experience building data pipelines with tools such as Azure Data Factory, Apache Beam, or Apache Airflow (a minimal Airflow sketch follows this list)
  • Familiarity with at least one data platform or processing framework such as Kafka, Spark, or Flink; Delta Lake and Databricks experience is a big plus (see the Delta Lake sketch below)
  • Demonstrated experience in one or more programming languages, preferably Python
  • Knowledge of CI/CD tools such as GitHub Actions is a plus
  • Knowledge of data management, monitoring, security, and privacy
  • Strong team player willing to cooperate with colleagues across office locations and functions
  • Very strong English language skills are a must
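
As referenced in the pipelines bullet above, a minimal Apache Airflow sketch of a daily two-task pipeline using the TaskFlow API (Airflow 2.4+); the DAG name, task bodies, and sample record are placeholders:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_ingest():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull raw records from a source system.
        return [{"isin": "DE0005810055", "close": 180.25}]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write the records to the data platform.
        print(f"Loading {len(records)} records")

    # extract() feeds load(); Airflow infers the dependency from the data flow.
    load(extract())


daily_ingest()
```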
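And the Delta Lake sketch referenced above: an idempotent upsert (MERGE) with the delta-spark Python API into the hypothetical table from the earlier Unity Catalog sketch; the batch values are illustrative:

```python
from datetime import date

from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# A one-row batch of updated prices; values are illustrative.
updates = spark.createDataFrame(
    [(date(2024, 1, 2), "DE0005810055", 181.10)],
    ["trade_date", "isin", "close"],
)

# MERGE keyed on (isin, trade_date) makes re-runs of the same batch a no-op.
target = DeltaTable.forName(spark, "analytics.trading.daily_prices")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.isin = u.isin AND t.trade_date = u.trade_date")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```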
