DataOps Engineer (m/f/d)

Madrid, ES

BASF

Our aspiration is to grow profitably and create value for society. This is how we create chemistry for a sustainable future.



ABOUT US

At BASF Digital Hub Madrid we develop innovative digital solutions for BASF, create exciting new customer experiences and business growth, and drive efficiencies in processes, helping to strengthen BASF's position as the digital leader in the chemical industry. We believe the right path is through creativity, trial and error, and great people working and learning together. Become part of our team and develop the future with us - in a global team that embraces diversity and equal opportunities.

WHAT YOU CAN EXPECT

At our unit "Data Foundation - Big Data Management" we aim to offer organizations a robust and scalable solution for managing and deriving insights from vast quantities of data. By utilizing Azure's PaaS components, our platform streamlines the deployment and handling of Big Data workloads, empowering our clients to make data-driven decisions and propel business expansion. Our team is responsible for building and maintaining the Enterprise Data Lake (EDL), a Big Data platform built on Azure PaaS components. We collaborate closely with stakeholders to understand their needs and devise solutions that meet their expectations. The team is also in charge of guaranteeing the platform's scalability, reliability, and security, and of staying abreast of the latest technologies and trends in the Big Data domain.


About the Job: 

We are seeking a highly motivated and detail-oriented candidate who will develop new and existing software and data pipelines with Python and Databricks, and who will maintain and develop CI/CD pipelines and processes in an ambitious team.

RESPONSIBILITIES

  • Develop, analyse, improve and test quality software to meet both user and internal needs.
  • Develop data pipelines, troubleshoot issues, and implement data quality control measures to ensure data accuracy and consistency (an illustrative example follows this list).
  • Work in a cloud environment (Azure and Databricks).
  • Collaborate with users and the data science team to understand their data needs and provide suitable solutions.
  • Test new features and improvements and suggest their implementation where beneficial.
  • Communicate fluently in English (spoken and written).
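
For illustration only, here is a minimal sketch of the kind of pipeline step this role involves: a small PySpark job of the sort that runs on Databricks, applying a basic data quality check before writing a curated Delta table. The table names, the key column and the threshold are hypothetical examples, not part of BASF's actual EDL platform.

    # Illustrative sketch only; table names, columns and thresholds are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    # On Databricks a `spark` session is already provided; getOrCreate() reuses it.
    spark = SparkSession.builder.getOrCreate()

    def run_pipeline(source_table: str, target_table: str, max_null_ratio: float = 0.01) -> None:
        """Load a source table, enforce a simple data quality rule, and persist the result."""
        df = spark.read.table(source_table)

        # Data quality check: reject the batch if too many rows are missing the business key.
        total = df.count()
        missing = df.filter(F.col("order_id").isNull()).count()
        if total == 0 or missing / total > max_null_ratio:
            raise ValueError(f"Data quality check failed: {missing}/{total} rows without order_id")

        # Light transformation: de-duplicate on the key and add a load timestamp for lineage.
        cleaned = (
            df.dropDuplicates(["order_id"])
              .withColumn("loaded_at", F.current_timestamp())
        )

        # Persist to the curated layer as a Delta table (the default table format on Databricks).
        cleaned.write.format("delta").mode("overwrite").saveAsTable(target_table)

    run_pipeline("raw.orders", "curated.orders")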


QUALIFICATIONS

  • Bachelor's degree in Computer Science, Information Technology, Engineering, Business, or a related field.
  • Minimum 3-4 years of related working experience as a DataOps Engineer using Azure, Python, Azure Data Factory and Databricks.
  • Strong software engineering and development skills with Python and Git.
  • Experience with Databricks and Azure's cloud environment.
  • Experience with databases, data structures and data manipulation.
  • Experience in an Agile way of working with a DevOps mindset.
  • Knowledge of automation and scripting with tools such as Azure DevOps Pipelines.
  • In-depth know-how of Big Data concepts and Databricks.
  • Team player with strong interpersonal, written, and verbal communication skills.

NICE TO HAVE

  • Experience with event streaming, message brokers and other event-driven architectures (Kafka, RabbitMQ, etc.), and familiarity with Big Data concepts.
  • Ideally, experience managing and operating solutions via command-line utilities and working to improve automation and adopt modern practices (DevOps, CI/CD, Terraform, etc.).
  • Knowledge of integrating the Collibra Data Catalog with Big Data platforms.
  • Ideally, backed by vendor certifications (e.g. Microsoft, Linux).

BENEFITS

  • A secure work environment, because your health, safety and wellbeing are always our top priority.
  • Flexible work schedules and home-office options, so that you can balance your working life and private life.
  • Learning and development opportunities
  • 23 holiday days per year
  • 5 additional days (readjustment)
  • 2 cultural days
  • A collaborative, trustful and innovative work environment
  • Being part of an international team and working on global projects
  • Relocation assistance to Madrid provided
  • Developing and supporting custom-made business intelligence software across a broad range of business topics in an international team

At BASF, the chemistry is right.

Because we are counting on innovative solutions, on sustainable actions, and on connected thinking. And on you. Become a part of our formula for success and develop the future with us - in a global team that embraces diversity and equal opportunities irrespective of gender, age, origin, sexual orientation, disability or belief.
