Data Engineer

NLD-LI Heerlen Earl Bakkenstraat, Netherlands


Medtronic

Medtronic is a global leader in health technologies, services, and solutions. We collaborate broadly to address healthcare's biggest challenges. Find out how we do it.



At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You’ll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world.

A Day in the Life

Are you passionate about harnessing the power of data to drive innovation and business success? Join our dynamic team as a Data Engineer, where you'll play a pivotal role in designing and building cutting-edge data pipelines, optimizing cloud infrastructures, and exploring innovative tools like LLMs and agentic systems. This is an exciting opportunity to collaborate with cross-functional teams, solve complex challenges, and make a meaningful impact on our data ecosystem. If you're ready to take your expertise to the next level and thrive in a fast-paced, technology-driven environment, we want to hear from you!

This role offers a flexible working setup, with the option for hybrid work. Occasional travel may be required to collaborate with team members or stakeholders on key projects. You'll be joining a forward-thinking organization that values innovation, collaboration, and continuous learning, making it the perfect place to grow your career while contributing to transformative solutions.

Responsibilities may include the following and other duties may be assigned:

  • Design & Build Data Pipelines: Architect, implement, and optimize ETL/ELT processes to ingest data from various sources (databases, APIs, streaming systems); see the illustrative sketch after this list
  • Data Modelling: Develop and maintain logical and physical data models in data warehouses/lakes like AWS S3 & Athena or Oracle
  • Deploy and manage data infrastructure on cloud platforms (AWS, Azure, or GCP) using IaC tools to ensure scalability, reliability and cost efficiency
  • Data Quality & Governance: Implement data validation, monitoring, and alerting frameworks to ensure accuracy, consistency, and security of data assets
  • Collaboration & Documentation: Understand requirements, translate business needs into technical solutions, and document data schemas, pipeline designs, and best practices
  • Explore and integrate innovative data tools, including LLMs and agentic systems, into the existing data ecosystem to enhance analytics and automation.
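To make the pipeline and data-lake work above more concrete, here is a minimal sketch (illustrative only, not part of the role description): a small Python ETL step that pulls records from a hypothetical REST API, normalises them, and lands them as date-partitioned Parquet in S3, where a query engine such as Athena can read them. The endpoint, bucket, and field names are assumptions made up for the example.

```python
# Minimal illustrative ETL step: REST API -> date-partitioned Parquet on S3.
# The endpoint, bucket, and field names below are hypothetical placeholders.
import datetime as dt
import io

import boto3
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/device-events"  # hypothetical source API
BUCKET = "example-data-lake"                           # hypothetical S3 bucket
PREFIX = "raw/device_events"

def extract(since: str) -> list[dict]:
    """Pull raw JSON records added since the given ISO date."""
    resp = requests.get(API_URL, params={"since": since}, timeout=30)
    resp.raise_for_status()
    return resp.json()["items"]

def transform(records: list[dict]) -> pd.DataFrame:
    """Normalise nested JSON, enforce types, and drop duplicate events."""
    df = pd.json_normalize(records)
    df["event_ts"] = pd.to_datetime(df["event_ts"], utc=True)
    return df.drop_duplicates(subset="event_id")

def load(df: pd.DataFrame, run_date: dt.date) -> str:
    """Write a date-partitioned Parquet file to S3 and return its key."""
    buf = io.BytesIO()
    df.to_parquet(buf, index=False)  # requires pyarrow (or fastparquet)
    key = f"{PREFIX}/dt={run_date:%Y-%m-%d}/events.parquet"
    boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=buf.getvalue())
    return key

if __name__ == "__main__":
    today = dt.date.today()
    rows = extract(since=(today - dt.timedelta(days=1)).isoformat())
    print("loaded", load(transform(rows), today))
```

Partitioning by date (dt=YYYY-MM-DD) is a common lake layout because engines like Athena can prune partitions and scan only the days a query actually touches.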

Required Knowledge and Experience:

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field (or equivalent work experience). Master’s degree is a bonus
  • 3+ years of hands-on experience in data engineering or related roles
  • Experience building and optimizing data pipelines, architectures, and data sets. Strong understanding of ETL/ELT concepts and best practices
  • Proficient in SQL and relational databases (e.g., PostgreSQL, MySQL, Oracle). Knowledge of data warehousing solutions (e.g., Snowflake). Experience with big data processing frameworks (e.g., Apache Spark, Hadoop). Experience with data transformation, metadata management, dependencies, and workload orchestration (see the short orchestration sketch after this list). Hands-on experience with cloud data services (e.g., AWS Glue, Azure Data Factory, GCP Dataflow)
  • Familiarity with scripting languages such as Python (required), Scala, or Java. Comfortable with shell scripting and command-line tools (e.g., Bash, CLI). Experience with version control systems, especially Git, including Git user interfaces (e.g., GitHub, GitLab). Experience using LLMs and agentic tools for process automation
  • Soft skills: Strong analytical and problem-solving skills. Effective verbal and written communication skills. Customer-focused mindset with the ability to adapt to evolving needs. Collaborative, proactive, and able to build trust with both internal and external stakeholders. Comfortable navigating ambiguity and solving complex, large-scale problems. Curious, resilient, and motivated to continuously improve and explore new solutions.
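The "dependencies and workload orchestration" point above can be illustrated with a deliberately tiny sketch (an assumption-laden example, not a description of the team's actual stack): running pipeline steps in dependency order using only Python's standard library.

```python
# Illustrative only: run pipeline tasks in dependency order with the standard library.
# Real workloads would typically use a dedicated orchestrator or managed service.
from graphlib import TopologicalSorter

def extract_orders():     print("extract orders")
def extract_customers():  print("extract customers")
def build_dim_customer(): print("build customer dimension")
def build_fact_orders():  print("build orders fact table")
def refresh_dashboard():  print("refresh BI extract")

# Each task maps to the set of tasks it depends on.
dag = {
    build_dim_customer: {extract_customers},
    build_fact_orders: {extract_orders, build_dim_customer},
    refresh_dashboard: {build_fact_orders},
}

for task in TopologicalSorter(dag).static_order():
    task()  # each task runs only after all of its dependencies have run
```

The same idea, declaring what each task depends on and letting a scheduler work out the order, is what full orchestrators add at scale, along with scheduling, retries, monitoring, and alerting.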

Desirable Qualifications

  • Experience with containerization and orchestration (Docker, Kubernetes)
  • Familiarity with real-time data streaming
  • Understanding of data privacy and security standards (GDPR, HIPAA)
  • Exposure to BI tools (Looker, Tableau, Power BI) and dashboard development
  • Certification in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, Azure Data Engineer Associate)

Physical Job Requirements

The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position. 

Benefits & Compensation

Medtronic offers a competitive salary and flexible benefits package.
A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage.
 

About Medtronic

We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions.
Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 95,000+ passionate people. 
We are engineers at heart — putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves, and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary.

Learn more about our business, mission, and our commitment to diversity here  
 



Category: Engineering Jobs

Tags: APIs Architecture Athena AWS AWS Glue Azure Big Data Computer Science Data Analytics Dataflow Data pipelines Data quality Data Warehousing Docker ELT Engineering ETL GCP Git GitHub GitLab Hadoop Healthcare technology Java Kubernetes LLMs Looker MySQL Oracle Pipelines PostgreSQL Power BI Privacy Python R R&D RDBMS Scala Security Shell scripting Snowflake Spark SQL Streaming Tableau

Perks/benefits: Career development Competitive pay Equity / stock options Flex hours Health care Salary bonus

Region: Europe
Country: Netherlands
