DataOps Engineer (m/f/x)
Mülheim an der Ruhr, NW, DE, 45481
ALDI SÜD
At ALDI DX, we develop innovative digital products and services for our employees as well as our customers in 11 ALDI SÜD countries and over 7,300 ALDI SÜD stores worldwide. We drive digital value to offer great quality at the lowest price.
Along the way, we are guided by the three core values of the ALDI SÜD Group: simplicity, reliability and responsibility. Our team and our performance are at the heart of everything we do at ALDI DX.
Your Job
What you give your best for.
- Monitoring, restarting, analysing, fixing and improving existing data pipelines between source systems and the data lake in both directions
- Communicating the impact of service degradations to the data lake user community and the internal service management team
- Handling incident and problem management for the team
- Observing, controlling and optimising the cluster configuration (e.g. setup, versions, credentials) in collaboration with the cloud team
- Developing and maintaining squad-specific data architecture and pipelines that adhere to defined ETL and data lake principles
- Solving technical data problems that help the business area achieve its goals
- Proposing and contributing to education and improvement plans for IT operations capabilities, standards, tools and processes
Your Profile
What you should have.
- Background in computer science
- Three years of experience in an IT operations role, working with solutions in distributed computing, big data and advanced analytics
- Expertise in SQL, data analysis and at least one programming language (e.g. Python)
- Understanding of database administration, ideally using Databricks/Spark and SQL Server DB, as well as knowledge of relational, NoSQL and cloud database technologies
- Proficiency in distributed computing and the underlying concepts, preferably Spark and MapReduce
- Familiarity with Microsoft Azure tools, e.g. Azure Data Factory, Azure Databricks, Azure Event Hub
- Operational knowledge of ETL, scheduling, reporting tools and data warehousing, as well as of structured and unstructured data
- Familiarity with the Unix operating system, especially shell scripting
- Basic understanding of network level problems and connectivity requirements
- Excellent communication skills and business fluency in English; knowledge of German is a plus
Your Benefits
How we value your work.
- Mobile working within Germany and flexible working hours
- State-of-the-art technologies
- Attractive remuneration as well as holiday and Christmas bonuses
- Future-oriented training and development
- Modular onboarding and a buddy programme
- Health activities
Your Tech Stack
What you work with, among other things.
- Azure Databricks
- Azure Data Factory
- Python
- PySpark
- ServiceNow
- M365
- Many more depending on the job