Data Engineer (all genders)
Hamburg, Germany
diconium
We are a global company specializing in building software delivery organizations and enabling value creation from software, data, and AI.
Join our global team of experts
At diconium, you will work on projects that create value from software, data, and AI, enabling businesses to achieve more with less. You will collaborate with over 2,300 fellow experts to support global leading companies in maximizing the impact of digital efforts and delivering solutions with measurable business impact. We prioritize people and genuine human connection, ensuring a supportive and inclusive work environment. And we give you maximum flexibility thanks to our hybrid workplace.
WHAT YOU CAN EXPECT
You will support our customers in sourcing data efficiently and transparently, and thus create the basis for the development of smart data products in exciting areas such as mobility, automotive, industry, consumer goods, finance, and non-profit organizations.
You will develop and operate scalable data pipelines that enable our internal and external customers to efficiently process their data and turn it into valuable insights.
You will be part of a dynamic team working on cutting-edge projects to develop innovative and customized solutions for our clients.
You will be responsible for the continuous monitoring and optimization of data pipelines to ensure high data quality and performance.
You will act as a technical advisor to internal and external stakeholders, helping them to make informed technology decisions that support their business goals.
You will work with various data formats - raw, structured, semi-structured, and unstructured - and utilize both batch and real-time processing frameworks to create valuable data solutions.
Must haves:
- At least 3 years of relevant work experience
- Sound knowledge of Python for ETL/ELT processes and SQL for data queries and optimization
- Practical experience with cloud platforms (Azure, AWS or GCP)
- Knowledge of developing and managing data pipelines and workflows (e.g. Apache Airflow)
- Familiarity with container technologies such as Docker and Kubernetes for managing data applications
- Experience with CI/CD tools for the automation of deployment processes
- Experience with data warehouses and NoSQL databases
- Fluency in English
Nice to have:
- Knowledge of real-time data processing (e.g. Apache Kafka or Azure Event Hubs)
- Experience with big data technologies such as Apache Spark and distributed data processing frameworks
- Knowledge of Data Lakes and Data Vault / Snowflake architectures
- Experience with data versioning using Delta Lake or Apache Iceberg
- An additional programming language (e.g. Java, Scala, Golang, Rust)
- Basic knowledge of machine learning frameworks (e.g. TensorFlow or PyTorch)
- German language skills
WHAT WE HAVE TO OFFER
Discover new skills and improve your strengths, adapt your working day to your personal lifestyle, celebrate community, sustainability and diversity. And sweeten your working life with awesome perks and benefits!
Professional & Personal Growth: Develop yourself both professionally and personally through training programs, free language courses, competence centers and an active tech community.
Flexible Work-Life Balance: Benefit from hybrid work, workation, flexible hours, parental support and sabbaticals.
Embrace Diversity & Sustainability: Engage in our Sustainability Hub, diverse communities, Diversity Taskforce and after-work activities.
Comprehensive Benefits: Enjoy public transport tickets, job bikes, health offers, supplementary insurances, a pension plan and various discounts.
WHAT WE VALUE
At diconium, we value and recognize the unique perspectives and experiences of each individual. With this in mind, we welcome and cherish every single application equally. At the same time, we stand up against any type of discrimination and harassment based on gender, age, skin color, religion, sexual orientation, origin, disability, gender identity and other protected characteristics.
If you have any questions, feel free to reach out.
Your contact person is