Data Engineer Expert

Hyderabad

Sanofi

Explore Sanofi's global impact through our science, healthcare R&D, and partnerships. Committed to advancing global health with innovative solutions.



About Opella: We are Opella, the purest and third-largest global Over-the-Counter (OTC) and Vitamins, Minerals & Supplements (VMS) player. With roughly 11,000 employees and a brand-led approach that puts consumers at the core, we are unified behind one mission: Health. In Your Hands. We make self-care as simple as it should be. We believe that health is more than a set of symptoms; it is inextricably linked to the world around us. That is why we challenge the conventions of care, defining a whole new way to think about health for our people, our communities, and our planet.

Position Overview: As a Data Engineer at Opella, you will play a critical role in designing, developing, and maintaining our data infrastructure and pipelines. You will work closely with data scientists, analysts, and other stakeholders to ensure that data is accessible, accurate, and actionable. The ideal candidate will have expertise in Snowflake, Python, and dbt, and be proficient across the major cloud platforms (AWS, Google Cloud, Azure). Experience with Informatica Cloud and Airflow is a plus.

Key Responsibilities:

  • Design, implement, and maintain data pipelines and ETL processes using Snowflake, Python, and dbt.
  • Develop and optimize data models and schemas to support analytical and reporting needs.
  • Collaborate with cross-functional teams to understand data requirements and deliver scalable data solutions.
  • Manage data integration and migration projects across various cloud platforms including AWS, Google Cloud, and Azure.
  • Monitor and troubleshoot data workflows and ensure high data quality and availability.
  • Stay current with industry trends and best practices in data engineering and cloud technologies.
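To give a flavour of the day-to-day work the responsibilities above describe, here is a minimal, purely illustrative sketch of a transform step with a basic data-quality gate, written in plain Python. All names (`clean_orders`, `REQUIRED_FIELDS`) are hypothetical; in a real Opella pipeline this logic would typically live in a dbt model or a Python task that loads into Snowflake.

```python
# Illustrative only: a tiny ETL-style transform with a data-quality check.
# A real pipeline would load the cleaned rows into Snowflake (e.g. via dbt).
from datetime import date

# Hypothetical schema contract: rows missing any of these fields are rejected.
REQUIRED_FIELDS = {"order_id", "amount", "order_date"}

def clean_orders(raw_rows):
    """Drop incomplete rows and normalise types before loading."""
    cleaned = []
    for row in raw_rows:
        if not REQUIRED_FIELDS <= row.keys():
            continue  # data-quality gate: skip incomplete records
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": round(float(row["amount"]), 2),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return cleaned

rows = [
    {"order_id": "1", "amount": "19.99", "order_date": "2024-01-05"},
    {"order_id": "2", "amount": "5.00"},  # missing order_date, dropped
]
print(clean_orders(rows))
```

In practice the same pattern scales out: extraction and loading are orchestrated (e.g. by Airflow), while the transform and its quality rules are versioned as dbt models over Snowflake.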

Required Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field.
  • Proven experience with Snowflake database design and optimization.
  • Proficiency in Python for data manipulation, automation, and ETL/ELT processes.
  • Strong experience with dbt for data transformation and modeling.
  • Hands-on experience with major cloud providers (AWS, Google Cloud, Azure).
  • Solid understanding of data warehousing concepts and practices.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication skills with the ability to work collaboratively in a team environment.

Optional Qualifications:

  • Experience with Informatica Cloud for data integration and cloud data management.
  • Familiarity with Apache Airflow for workflow automation and scheduling.
  • Knowledge of data governance and data security best practices.

Pursue progress, discover extraordinary

Better is out there. Better medications, better outcomes, better science. But progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people.

At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, ability or gender identity.

Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!

Category: Engineering Jobs


Region: Asia/Pacific
Country: India
