Data Engineering Analyst - RWD

Hyderabad, India

Sanofi

Sanofi pushes scientific boundaries to develop breakthrough medicines and vaccines. We chase the miracles of science to improve people’s lives.

Main responsibilities:

  • Own the entire back end of the application, including the design, implementation, testing, and troubleshooting of core application logic, databases, data ingestion and transformation, data processing and pipeline orchestration, APIs, CI/CD integration, and other processes
  • Fine-tune and optimize queries using Snowflake platform and database techniques
  • Optimize ETL/data pipelines to balance performance, functionality, and other operational requirements (see the Snowpark sketch after this list)
  • Assess and resolve data pipeline issues to ensure performance and timeliness of execution
  • Assist with technical solution discovery to ensure technical feasibility
  • Assist in setting up and managing CI/CD pipelines and development of automated tests
  • Develop and manage microservices using Python
  • Conduct peer reviews for quality, consistency, and rigor of production-level solutions
  • Design application architecture for efficient concurrent user handling, ensuring optimal performance during high usage periods
  • Promote best practices and standards for code management, automated testing, and deployments
  • Own all areas of the product lifecycle: design, development, test, deployment, operation, and support
  • Create detailed documentation on Confluence to support and maintain the codebase and its functionality
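
As a concrete illustration of the query-tuning and pipeline-optimization responsibilities above, the following is a minimal Snowpark sketch, assuming a Snowflake account and the snowflake-snowpark-python package. The connection placeholders and the table and column names (RAW_CLAIMS, CURATED_CLAIM_SUMMARY, and so on) are hypothetical, not part of the role description.

```python
# Minimal Snowpark pipeline step; all table and column names are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Credentials would normally come from a secrets manager, not be hard-coded.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Push the filter and aggregation down into Snowflake instead of pulling
# raw rows into Python -- the basic query-optimization move.
claims = session.table("RAW_CLAIMS")
summary = (
    claims
    .filter(col("STATUS") == "APPROVED")
    .group_by(col("PATIENT_ID"))
    .agg(sum_(col("CLAIM_AMOUNT")).alias("TOTAL_APPROVED"))
)

# Materialize the result as a curated table for downstream consumers.
summary.write.mode("overwrite").save_as_table("CURATED_CLAIM_SUMMARY")
```

The design point is pushdown: the filter and aggregation compile to SQL and execute inside the Snowflake warehouse, so only the aggregated result ever leaves the platform.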

Key Functional Requirements & Qualifications: 

  • 3+ years of relevant experience developing backend, integration, data-pipeline, and infrastructure solutions
  • Bachelor’s degree in computer science, engineering, or a similar quantitative field of study
  • Expertise in database optimization and performance improvement
  • Expertise in Python, PySpark, and Snowpark
  • Experience with data warehouses and object-relational databases (Snowflake and PostgreSQL), and with writing efficient SQL queries
  • Experience in cloud-based data platforms (Snowflake, AWS)
  • Proficiency in developing robust, reliable APIs using Python and the FastAPI framework (see the FastAPI sketch after this list)
  • Understanding of data structures and algorithms
  • Experience with modern testing and code-quality tools (SonarQube; k6 is a plus)
  • Strong collaboration skills and a willingness to work with others to ensure seamless integration of server-side and client-side components
  • Knowledge of DevOps best practices and associated tools is a plus, especially their setup, configuration, maintenance, and troubleshooting:
  • Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
  • Infrastructure as code (Terraform)
  • Monitoring and logging (CloudWatch, Grafana)
  • CI/CD pipelines
  • Scripting and automation (Python, GitHub, GitHub Actions)
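
For the API requirement above, here is a minimal sketch of a FastAPI endpoint backed by PostgreSQL, assuming the fastapi and psycopg2-binary packages. The connection string and the pipeline_runs table are hypothetical.

```python
# Minimal FastAPI read endpoint; DSN and table are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
import psycopg2

app = FastAPI()

class PipelineRun(BaseModel):
    run_id: int
    status: str

@app.get("/runs/{run_id}", response_model=PipelineRun)
def get_run(run_id: int) -> PipelineRun:
    # A production service would use a connection pool and an async driver.
    conn = psycopg2.connect("dbname=rwd user=api")  # hypothetical DSN
    try:
        with conn.cursor() as cur:
            # Parameterized query -- never interpolate user input into SQL.
            cur.execute(
                "SELECT run_id, status FROM pipeline_runs WHERE run_id = %s",
                (run_id,),
            )
            row = cur.fetchone()
    finally:
        conn.close()
    if row is None:
        raise HTTPException(status_code=404, detail="run not found")
    return PipelineRun(run_id=row[0], status=row[1])
```

Served with, for example, `uvicorn app:app`, this returns JSON validated against the Pydantic model and a proper 404 when the row does not exist.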

Pursue progress, discover extraordinary

Better is out there. Better medications, better outcomes, better science. But progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people.

At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability or gender identity.

Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!
