Data Engineer

Sant Cugat del Valles

Applications have closed

Roche

As a pioneer in healthcare, we have been committed to improving lives since the company was founded in 1896 in Basel, Switzerland. Today, Roche creates innovative medicines and diagnostic tests that help millions of patients globally.


Roche fosters diversity, equity and inclusion, representing the communities we serve. When dealing with healthcare on a global scale, diversity is an essential ingredient to success. We believe that inclusion is key to understanding people’s varied healthcare needs. Together, we embrace individuality and share a passion for exceptional care. Join Roche, where every voice matters.

The Position

Data Engineer

Full-Time (Sant Cugat) 

Diabetes is a pesky monster — and that’s putting it mildly. If you’re serious about helping us face it head-on, come join us!

As a global leader in integrated Personalized Diabetes Management (iPDM), Roche Diabetes Care collaborates with pioneers around the globe, including people with diabetes, caregivers, healthcare professionals, and payers. We aim to transform and advance care provision and foster sustainable care structures. Under the brands RocheDiabetes, Accu-Chek, and mySugr, spanning glucose monitoring, insulin delivery systems, and digital solutions, we unite with our partners to create patient-centered value. By building and collaborating in an open ecosystem, connecting devices and digital solutions, and contextualizing relevant data points, we aim to support the optimization of therapy and enable better-informed therapy decisions.

For a behind-the-scenes look, check out our Roche code4life to see what makes Roche Diabetes Care tick. 

Here’s what we’re looking for:

We’re looking for an experienced data engineer to make a significant impact on our platform by making data accessible. You will work with different data stakeholders to gain a good understanding of their data needs. You will define and develop data pipelines, data products, and data services, and you will drive automation wherever possible and feasible to increase quality and efficiency.

Your upcoming mission:

  • Build and maintain the data & analytics platform ensuring FAIR principles are achieved.

  • Assemble large and complex datasets within the platform and make them available for different purposes and stakeholders. 

  • Build scalable automated data pipelines to produce quality datasets from different sources meeting functional and non-functional requirements. Define and implement automated testing of the data pipelines and orchestrations.

  • Define and implement data models and data products that give end stakeholders access to the data. 

  • Implement the tools and processes to access, manage and work with the data for the reporting, advanced analytics and evidence generation teams. 

  • Collaborate with data team members to define data quality rules and KPIs for monitoring and exception alerts.

  • Diagnose and triage infrastructure problems and outages related to the data & analytics platform.
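To illustrate the kind of work the mission above describes (a minimal sketch, not part of the posting, with all names and thresholds hypothetical): a simple data quality rule that validates incoming glucose readings and collects exceptions that a monitoring job could turn into alerts.

```python
# Hypothetical sketch of a data quality rule: validate records against
# simple rules and collect the failures for exception alerting.
from dataclasses import dataclass


@dataclass
class QualityReport:
    passed: list
    failed: list

    @property
    def failure_rate(self) -> float:
        total = len(self.passed) + len(self.failed)
        return len(self.failed) / total if total else 0.0


def check_glucose_records(records: list[dict]) -> QualityReport:
    """Apply basic rules: required fields present, value in a plausible range."""
    passed, failed = [], []
    for rec in records:
        ok = (
            rec.get("patient_id") is not None
            and isinstance(rec.get("glucose_mgdl"), (int, float))
            and 10 <= rec["glucose_mgdl"] <= 600  # illustrative range rule
        )
        (passed if ok else failed).append(rec)
    return QualityReport(passed, failed)


report = check_glucose_records([
    {"patient_id": "p1", "glucose_mgdl": 110},
    {"patient_id": "p2", "glucose_mgdl": -5},   # fails the range rule
    {"patient_id": None, "glucose_mgdl": 95},   # missing required field
])
```

In practice such rules would run inside the pipeline (e.g., as a PySpark job) with the failure rate feeding a KPI dashboard and alert thresholds.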

Essential skills for this mission:

  • You have a B.S. degree in Computer Science, Information Systems, Math, Statistics, Engineering or equivalent training.

  • You have at least 3 years of professional experience working with large-scale data platforms in data operations or data engineering, using agile methodologies.

  • You are proficient in implementing data pipelines (e.g., PySpark, Spark SQL, Scala), orchestration tools/services (e.g., Airflow, Data Factory), and testing frameworks. 

  • You have experience with databases (columnar, NoSQL, and relational: Redshift, DynamoDB, Aurora, Postgres, and/or Snowflake), data modeling, and data management tools (Hive, Jupyter, Zeppelin, and/or Databricks).

  • You have experience with one of the main cloud providers (AWS, Google Cloud, or Azure) and its big data services (EMR, Databricks, Synapse, HDInsight, Kinesis, etc.).

  • You are familiar with orchestration, automation, integration, and continuous delivery frameworks such as Jenkins or StreamSets.

  • You are proficient in software engineering best practices, such as unit and integration testing, and in software development tools such as IDEs, Maven, Git, and Docker, among others.

  • You solve technical challenges autonomously, with a problem-solving mindset.
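One of the practices listed above, unit-testable pipelines, is easiest when transformation steps are written as pure functions. A hypothetical sketch (all names illustrative): a deduplication step that keeps only the most recent row per key, a common cleansing operation that can be unit-tested without any cluster.

```python
# Hypothetical sketch of a testable pipeline step: a pure function that
# deduplicates rows, keeping the most recent row per key.
def dedupe_latest(rows: list[dict], key: str, ts: str) -> list[dict]:
    """Keep only the row with the highest timestamp value for each key."""
    latest: dict = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row
    return list(latest.values())


rows = [
    {"id": 1, "updated": "2024-01-01", "v": "old"},
    {"id": 1, "updated": "2024-02-01", "v": "new"},
    {"id": 2, "updated": "2024-01-15", "v": "only"},
]
cleaned = dedupe_latest(rows, key="id", ts="updated")
```

Because the function takes plain data in and returns plain data out, the same logic could later be ported to a PySpark window function while the unit tests pin down the expected behavior.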

Bonus skills:

  • Experience with reporting systems and visualization tools (preferred: QuickSight, Superset; optional: Tableau, Looker).

  • Experience with DevOps, DataOps and MLOps.

  • Experience with security and privacy regulations (GDPR, HIPAA).

  • Knowledge of the health industry.

  • Strong analytical skills and statistical knowledge.

Here's what you can expect from us: 

  • We are ambitious and passionate people building meaningful products.

  • We actively promote personal and career development so if you like writing blog posts, contributing to open source projects, or are an active participant and/or speaker at conferences and meet-ups, we are here to support you.

  • An innovative agile working environment allowing for collaboration and knowledge sharing in cross-functional teams.

  • Hybrid home office/on-site working model. 

  • Loads of benefits (brand new Apple hardware, lunch benefit, training, ...)

  • Oh, and did we mention the best team in the world?

Who we are

At Roche, more than 100,000 people across 100 countries are pushing back the frontiers of healthcare. Working together, we’ve become one of the world’s leading research-focused healthcare groups. Our success is built on innovation, curiosity and diversity.

Roche is an Equal Opportunity Employer.

