Data Engineer

Hyderabad, India

Sanofi

Sanofi pushes scientific boundaries to develop breakthrough medicines and vaccines. We chase the miracles of science to improve people’s lives.

Job Title: Senior Data Engineer

Location: Hyderabad

Opella is the Consumer Healthcare Business Unit of Sanofi, and the purest and third-largest player globally in the Over-The-Counter (OTC) & Vitamins, Minerals & Supplements (VMS) market. We believe in the power of self-care and the role it can play in creating a healthier society and a healthier planet. That’s why we want to make self-care as simple as it should be by being consumer-led, always. Through our over 100 loved brands such as Allegra, Dulcolax and Buscopan, we deliver our mission: helping more than half a billion consumers worldwide take their health in their hands. This mission is brought to life by an 11,000-strong team, 13 best-in-class manufacturing sites, and 4 specialized science and innovation development centers. We are proudly B Corp certified in multiple markets.

We aim to be a positive force by embedding sustainability throughout our operations and culture. To succeed, we seek talented individuals who can transform our business and support our mission to become the world's best fast-moving consumer healthcare (FMCH) company in and for the world.

About the Role

  • As a Data Engineer at Opella, you will play a critical role in designing, developing, and maintaining our data infrastructure and pipelines.

  • The ideal candidate will apply modern cloud data technologies, work both independently and as part of a team, and partner with diverse stakeholders including analysts, scientists, engineers, architects, and project managers to deliver business-aligned outcomes.

Key Responsibilities

  • Design and maintain comprehensive data pipelines using Apache Airflow, dbt, Databricks, and Snowflake (see the orchestration sketch after this list)

  • Create and refine Python scripts for efficient ETL/ELT processes

  • Oversee Snowflake cloud database operations with a focus on security, performance, and availability

  • Implement structured data transformations through dbt for enhanced modeling and reporting

  • Utilize Elementary for comprehensive data quality monitoring and reliability assurance

  • Partner with diverse teams to capture requirements, design data models, and drive data initiatives

  • Ensure optimal workflow performance through continuous monitoring and optimization to meet business standards

  • Apply governance and security best practices to maintain data integrity and compliance

  • Support analytics teams by preparing high-quality datasets for analysis and machine learning projects
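
To give a concrete flavor of the orchestration work described above, the sketch below shows a minimal Airflow DAG that runs a Python extraction step and then triggers dbt from the command line; the dbt test step is also where a data-quality package such as Elementary would typically hook in. This is an illustration only: the DAG id, project path, and model selection are hypothetical placeholders, not part of any actual Opella codebase.

    # Minimal illustrative sketch of the Airflow + dbt pattern described above.
    # All names (dag_id, dbt project path, model selector) are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def extract_orders(**context):
        """Placeholder extraction step for one logical day of data."""
        # A real pipeline might pull from an API or SAP extractor here,
        # then land files in cloud storage for Snowflake to ingest.
        print(f"extracting orders for {context['ds']}")


    with DAG(
        dag_id="daily_orders_pipeline",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(
            task_id="extract_orders",
            python_callable=extract_orders,
        )

        # dbt is commonly invoked from Airflow via its CLI; the path is a placeholder.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/project && dbt run --select orders",
        )

        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt/project && dbt test --select orders",
        )

        extract >> dbt_run >> dbt_test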

Qualifications

Required Experience & Skills:

  • Technical Expertise: 10+ years of hands-on data engineering experience, with proficiency in AWS/Azure, Snowflake, dbt, Airflow, Python, and Databricks or Iceberg; hands-on Informatica (IICS) experience is a plus

  • Python Development: Strong capabilities in Python programming for data manipulation, automation, and scripting

  • Data Orchestration: Deep understanding of Apache Airflow for pipeline orchestration and workflow management

  • Cloud Database: Extensive experience with Snowflake architecture, including Snowpipe implementation, warehouse optimization, backup and recovery planning, and query performance tuning (see the Snowpipe sketch after this list)

  • Data Sources: Experience with the SAP ecosystem as a data source

  • Data Transformation: Expertise in using dbt for building scalable data models and transformation workflows

  • Data Quality: Practical experience with Elementary for pipeline observability and data quality assurance

  • SQL Proficiency: Advanced SQL skills for complex data querying and transformation

  • Data Architecture: Proven experience in data modeling, schema design, and performance optimization

  • Governance & Security: Solid understanding of data governance frameworks, security best practices, and privacy regulations

  • Collaboration: Excellent problem-solving abilities with strong attention to detail, capable of working both independently and in team environments

  • Functional Domain: Fast-moving consumer healthcare (FMCH)
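
As a small illustration of the Snowpipe experience listed above, here is a hedged sketch that uses the snowflake-connector-python library to create an external stage and an auto-ingest pipe over it. Every identifier (account, credentials, bucket, database objects) is a hypothetical placeholder; a real deployment would use a storage integration and a secrets manager rather than inline credentials.

    # Illustrative only: creating a Snowpipe that auto-ingests staged files.
    # Account, credentials, and object names are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345.eu-west-1",   # placeholder account locator
        user="ETL_SERVICE",            # placeholder service user
        password="...",                # use a secrets manager in practice
        warehouse="LOAD_WH",
        database="RAW",
        schema="SALES",
    )

    ddl_statements = [
        # External stage pointing at cloud storage (bucket name is a placeholder).
        """
        CREATE STAGE IF NOT EXISTS orders_stage
          URL = 's3://example-bucket/orders/'
          FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """,
        # Snowpipe: AUTO_INGEST loads new files as storage event notifications arrive.
        """
        CREATE PIPE IF NOT EXISTS orders_pipe AUTO_INGEST = TRUE AS
          COPY INTO raw.sales.orders
          FROM @orders_stage
        """,
    ]

    try:
        cur = conn.cursor()
        for ddl in ddl_statements:
            cur.execute(ddl)
    finally:
        conn.close()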

Preferred Additional Skills:

  • Enthusiasm for AI

  • Automation expertise

  • Experience with AWS/Azure/GCP cloud platforms

  • Knowledge of CI/CD methodologies and Git version control

  • Understanding of modern data architectures including data lakes and real-time processing

  • Familiarity with BI tools such as Power BI, Tableau, Looker

  • Experience ingesting and maintaining data from sources such as Google Analytics, TRAX, and Salesforce CRM (see the extraction sketch below)
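
As a rough sketch of the kind of source ingestion mentioned in the last item, the snippet below pulls records from Salesforce with the simple-salesforce library and writes them to a CSV for downstream staging. Credentials, the SOQL fields, and the output path are all hypothetical placeholders.

    # Hedged sketch: extract Salesforce Account records to a CSV for staging.
    # Credentials, queried fields, and the output path are placeholders.
    import csv

    from simple_salesforce import Salesforce

    sf = Salesforce(
        username="etl@example.com",    # placeholder credentials
        password="...",
        security_token="...",
    )

    # query_all pages through results automatically; records come back as dicts.
    result = sf.query_all("SELECT Id, Name, Industry FROM Account")

    fields = ("Id", "Name", "Industry")
    with open("accounts.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(fields))
        writer.writeheader()
        for record in result["records"]:
            # Each record also carries an 'attributes' key, so keep only our fields.
            writer.writerow({k: record.get(k) for k in fields})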

Education

  • Bachelor’s degree in Computer Science, Information Technology, or a similar quantitative field of study

  • Fluent in English

  • Ability to work effectively in teams with varied cultural backgrounds and areas of expertise
