Data Engineering Analyst - RWD
Hyderabad
Sanofi
Sanofi pushes scientific boundaries to develop breakthrough medicines and vaccines. We chase the miracles of science to improve people’s lives.
Primary Responsibilities
Own the entire back end of the application, including the design, implementation, testing, and troubleshooting of the core application logic, databases, data ingestion and transformation, data processing and pipeline orchestration, APIs, CI/CD integration, and other processes
Fine-tune and optimize queries using the Snowflake platform and database tuning techniques
Optimize ETL/data pipelines to balance performance, functionality, and other operational requirements
Assess and resolve data pipeline issues to ensure performance and timeliness of execution
Assist with technical solution discovery to ensure technical feasibility
Assist in setting up and managing CI/CD pipelines and development of automated tests
Develop and manage microservices using Python
Conduct peer reviews for quality, consistency, and rigor of production-level solutions
Design application architecture for efficient concurrent user handling, ensuring optimal performance during high usage periods
Promote best practices and standards for code management, automated testing, and deployments
Own all areas of the product lifecycle: design, development, test, deployment, operation, and support
Create detailed documentation on Confluence to support and maintain the codebase and its functionality
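As an illustration of the pipeline development and automated testing the responsibilities above call for, here is a minimal sketch in Python. The function and field names (`normalize_record`, `patient_id`, `visit_date`) are hypothetical, not taken from Sanofi's actual codebase:

```python
from datetime import date, datetime

def normalize_record(raw: dict) -> dict:
    """Hypothetical pipeline step: trim strings, parse dates, drop empty fields."""
    out = {}
    for key, value in raw.items():
        if value is None or value == "":
            continue  # drop empty fields before loading downstream
        if isinstance(value, str):
            value = value.strip()
        out[key] = value
    if "visit_date" in out:
        # parse ISO date strings into date objects for downstream processing
        out["visit_date"] = datetime.strptime(out["visit_date"], "%Y-%m-%d").date()
    return out

# A unit-style automated test guarding the transform against regressions
record = normalize_record({"patient_id": " 123 ", "visit_date": "2024-01-05", "note": ""})
assert record == {"patient_id": "123", "visit_date": date(2024, 1, 5)}
```

Keeping transforms as small pure functions like this makes them easy to cover in a CI/CD test suite before they run inside an orchestrated pipeline.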
About you
Qualifications
3+ years of relevant experience developing backend services, integrations, data pipelines, and infrastructure
Bachelor’s degree in computer science, engineering, or similar quantitative field of study
Expertise in database optimization and performance improvement
Expertise in Python, PySpark, and Snowpark
Experience with data warehouses and object-relational databases (Snowflake and PostgreSQL), and with writing efficient SQL queries
Experience in cloud-based data platforms (Snowflake, AWS)
Proficiency in developing robust, reliable APIs using Python and the FastAPI framework
Understanding of data structures and algorithms
Experience with modern testing frameworks (SonarQube; k6 is a plus)
Strong collaboration skills and willingness to work with others to ensure seamless integration of server-side and client-side components
Knowledge of DevOps best practices and associated tools (a plus), especially in the setup, configuration, maintenance, and troubleshooting of:
Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
Infrastructure as code (Terraform)
Monitoring and Logging (CloudWatch, Grafana)
CI/CD Pipelines (JFrog Artifactory)
Scripting and automation (Python, GitHub, GitHub Actions)
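To illustrate the database-optimization and SQL-tuning skills listed above, here is a minimal sketch using Python's built-in sqlite3 module (Snowflake itself is not assumed here, and the table and column names are hypothetical). It compares the query plan for a filtered aggregate before and after adding a covering index:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, patient_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO claims (patient_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM claims WHERE patient_id = 42"

# Without an index on patient_id, the filter forces a full table scan
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# A covering index lets the engine seek directly to the matching rows
conn.execute("CREATE INDEX idx_claims_patient ON claims (patient_id, amount)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][3])  # e.g. a SCAN over the claims table
print(plan_after[0][3])   # e.g. a SEARCH using the covering index
```

The same habit of reading the query plan before and after a change carries over to Snowflake's query profile, even though the engine and syntax differ.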
Better is out there. Better medications, better outcomes, better science. But progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people.
At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, ability or gender identity.
Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!