Data Engineer - AI/ML

Canada Pharma Campus

Roche

As a pioneer in healthcare, we have been committed to improving lives since the company was founded in 1896 in Basel, Switzerland. Today, Roche creates innovative medicines and diagnostic tests that help millions of patients globally.



At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.

The Position

A healthier future. That’s what drives us. 

Galileo is a strategic Roche Informatics program that aims to enable high-value AI use cases (initial focus: Generative AI, or GenAI) at Roche through fit-for-purpose platforms and services, establishing the foundation for a Center of Excellence in AI. The recently formed Use Case Delivery (UCD) Team, consisting of several delivery squads, is tasked with building innovative GenAI applications.

We are looking for a highly skilled and dedicated Data Engineer to join a new AI solutions development squad building cutting-edge applications that leverage Large Language Models (LLMs). The squad will build AI solutions end to end: from concept through prototyping and productization to operations. The Data Engineer will design, build, and maintain robust data infrastructure to support AI applications. The ideal candidate has expertise in handling structured and unstructured data, vector databases, real-time data processing, and cloud-based AI solutions (AWS or Azure).

The Opportunity:

  • Generative AI Application Co-creation: Collaborate with AI engineers, data scientists, product owners, and other developers in Agile teams to integrate LLMs into scalable, robust, fair and ethical end-user applications, focusing on user experience, relevance, and real-time performance

  • Data Infrastructure Development and Data Integration: Design and implement scalable, high-performance data pipelines for AI/GenAI applications, ensuring efficient data ingestion, transformation, storage and retrieval; integrate different databases, requiring understanding of data architectures / Domain data ecosystem

  • Vector Database Management: Work with vector databases (e.g., AWS OpenSearch or Azure AI Search) to store and retrieve high-dimensional data for Generative AI workloads

  • Cloud-Based Data Engineering: Build and maintain cloud-based data solutions using AWS (OpenSearch, S3) or Azure (Azure AI Search, Azure Blob Storage)

  • Snowflake Implementation: Design and optimize data storage and processing using Snowflake for scalable, cloud-native analytics solutions

  • Data Processing & Transformation: Develop ETL/ELT pipelines to enable real-time and batch data processing

  • Support AI Model Workflows: Collaborate with AI/ML Engineers and Data Scientists to ensure seamless integration of data pipelines with AI finetuning, inference and training workflows

  • Performance Optimization: Optimize data storage, retrieval, and processing strategies for efficiency, scalability, and cost-effectiveness
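To make the vector database responsibility above concrete, here is a minimal, illustrative sketch of similarity retrieval over stored embeddings, which is the core operation that services like AWS OpenSearch or Azure AI Search perform at scale. The function names and the toy three-dimensional vectors are hypothetical; production workloads use managed indexes and embeddings with hundreds or thousands of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query, index, k=2):
    """Return the k document ids whose embeddings are most similar to the query."""
    scored = sorted(index.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy "index": document id -> embedding vector.
index = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.05, 0.0], index, k=2))  # ['doc_a', 'doc_b']
```

A managed vector store replaces the brute-force `sorted` scan with an approximate nearest-neighbor index, which is what keeps retrieval fast at the scale GenAI workloads require.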

Who you are:

  • Experience: A minimum of 5-7+ years in data engineering, preferably supporting AI/ML applications, and a B.Sc., B.Eng., or higher (or equivalent) in Computer Science, Data Engineering, or a related field

  • Programming: Proficiency in Python, SQL, and vector-database query languages

  • Databases: Experience with relational, NoSQL, and vector databases, Snowflake in particular

  • Cloud Platforms: Hands-on experience with AWS (OpenSearch, S3, Lambda) or Azure (Azure AI Search, Azure Blob Storage, Azure Automation)

  • ETL/ELT Pipelines: Experience building scalable ETL/ELT workflows using dbt, Apache Airflow, or similar

  • APIs & Microservices: Ability to design and integrate RESTful APIs for data exchange

  • Data Security & Governance: Understanding of encryption and role-based access controls

  • Version Control & DevOps: Familiarity with Git, CI/CD, containerization (Docker, Kubernetes), and Infrastructure as Code (Terraform, CloudFormation)

  • Generative AI Support: Experience working with AI-specific data needs, such as embeddings, RAG (Retrieval Augmented Generation), and LLM fine-tuning data preparation
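As one example of the "LLM fine-tuning data preparation" and RAG skills listed above, the sketch below splits a document into overlapping fixed-size windows before embedding, a common preprocessing step in RAG pipelines. The `chunk_text` helper and its parameter values are illustrative assumptions, not any specific tool's API.

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping fixed-size chunks, a common step before
    computing embeddings for Retrieval Augmented Generation (RAG)."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # the final window already reaches the end of the text
    return chunks

# Overlap lets a passage that straddles a boundary still appear whole in one chunk.
doc = "".join(str(i % 10) for i in range(500))
chunks = chunk_text(doc)
print(len(chunks))  # 3
```

The overlap parameter trades storage against retrieval quality: larger overlaps duplicate more text across chunks but reduce the chance that a relevant passage is cut in half at a chunk boundary.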

Relocation benefits are not available for this job posting.


 

Who we are

A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high so we can deliver life-changing healthcare solutions that make a global impact.


Let’s build a healthier future, together.

Roche is an Equal Opportunity Employer.






Region: North America
Country: Canada
