Data Platform Engineer (f/m/d)
Prague, CZ
Deutsche Börse
Your area of work:
Big Data & Advanced Analytics provides the data platform and services that enable the development of data science and analytics by the businesses across the value chain served by Deutsche Börse Group. We standardize and automate processes with modern technologies and frameworks.
As part of our cloud migration and digital transformation journey, we are looking for an experienced and passionate Data Platform Engineer with a diverse range of skills, who brings fresh and innovative ideas and enjoys designing and building big data solutions with a positive attitude.
In this role you will join a growing and diverse team of big data experts, covering topics that range from defining the overall big data architecture (including cloud service evaluation, platform design and configuration, and ETL/ELT deployment) to supporting the implementation of business use cases in the big data and data science fields at one of the largest exchanges in the world.
Your responsibilities:
- Define the migration steps for moving solutions from legacy systems to a cloud Lakehouse (see the pipeline sketch after this list)
- Evaluate different cloud services to build out open-format data layers
- Ensure the data platform is operational, secure, scalable, and reliable
- Contribute to defining a Data Mesh paradigm across different data domains
- Collaborate with data scientists, analysts, and other stakeholders to understand their data needs and provide solutions
- Write and maintain technical documentation and work within an Agile methodology
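For a flavour of this kind of work, here is a minimal, illustrative sketch of an Apache Airflow DAG for a daily legacy-to-Lakehouse load. It is a sketch under stated assumptions, not a description of Deutsche Börse's systems: the DAG id, task name, and the body of the load step are hypothetical placeholders, and it assumes Airflow 2.4 or later (where the `schedule` argument exists).

    # Illustrative sketch only; all names are hypothetical, not actual Deutsche Börse pipelines.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_load():
        # Placeholder step: read from a legacy source and land the data in cloud
        # object storage in an open format (e.g. Parquet) that a Lakehouse layer
        # such as Delta Lake can then ingest.
        ...

    with DAG(
        dag_id="legacy_to_lakehouse",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",              # assumes Airflow 2.4+
        catchup=False,
    ) as dag:
        PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)

In practice such a migration would split extraction, landing, and Lakehouse ingestion into separate tasks so each step can be retried independently; the single-task version above only shows the overall shape.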
Your profile:
- At least 2 years of experience in a similar position
- Hands-on experience with cloud-native big data technologies (Azure/GCP)
- Ability to design and implement engineering solutions for business opportunities
- Knowledge of data management, monitoring, security, and privacy
- Experience building data pipelines with tools such as Azure Data Factory, Apache Beam, or Apache Airflow
- Demonstrated experience in one or more programming languages, preferably Python
- Familiarity with at least one data platform or processing framework such as Kafka, Spark, or Flink; Delta Lake and Databricks experience is a big plus (a short sketch follows this list)
- Knowledge of CI/CD tools such as GitHub Actions is a plus
- Strong team player willing to cooperate with colleagues across office locations and functions
- Very strong English language skills are a must
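As a small illustration of the Spark and Delta Lake skills listed above, the following sketch de-duplicates a raw dataset and writes it as a Delta table with PySpark. The storage paths, column names, and filter are hypothetical examples, and it assumes a Spark environment with the delta-spark package available.

    # Illustrative sketch only; paths and column names are hypothetical.
    from pyspark.sql import SparkSession

    # Assumes the delta-spark package is on the classpath; these two configs
    # are the standard way to enable Delta Lake in a Spark session.
    spark = (
        SparkSession.builder.appName("lakehouse-demo")
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .getOrCreate()
    )

    # Read raw Parquet from cloud storage (placeholder ADLS path).
    raw = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/trades/")

    # Basic cleansing: drop duplicate records and obviously invalid rows.
    cleaned = raw.dropDuplicates(["trade_id"]).filter("price > 0")

    # Persist as an open-format Delta table in the curated layer.
    cleaned.write.format("delta").mode("overwrite").save(
        "abfss://curated@example.dfs.core.windows.net/trades/"
    )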
You can look forward to our benefit package:
- Hybrid work and flexible working hours
- Work from abroad - 12 days of remote work from EU countries per year
- Group Share Plan - discount on company shares
- Pension fund contribution - 3% of your gross salary (5% after 5 years with us)
- Health & Wellbeing - fully covered Multisport card, life & accident insurance, sick days and 100% salary contribution during sick leave (up to 56 days)
- 25 vacation days
- Mobility - fully covered public transport in Prague & free parking
- Flexible Benefit Account (Pluxee) - 1200 per month
- Personal Development - annual budget of €690 ... and way more!