Senior Engineer, Python Data Engineer
Gurugram, India (Remote-first)
Nagarro
A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.

Company Description
👋🏼 We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
Job Description
REQUIREMENTS:
- Total experience of 3+ years.
- Hands-on experience as a Snowflake SQL Developer.
- Strong hands-on expertise in SQL programming and Snowflake development.
- Experience working with structured and semi-structured data (JSON, Parquet, Avro) in Snowflake (see the illustrative sketch after this list).
- Knowledge of Snowflake features like Streams, Tasks, Snowpipe, Time Travel, and Cloning.
- Background in data modeling and warehouse design (Star/Snowflake schema).
- Strong analytical skills and attention to detail.
- Experience with pharmaceutical data (e.g., clinical trials, sales, patient data, regulatory, RWE) is preferred.
- Familiarity with data governance, compliance (HIPAA, GDPR, CFR Part 11), and security standards.
- Hands-on experience with ETL/ELT tools (dbt, Matillion, Informatica, Talend, Fivetran).
- Cloud platform experience (AWS, Azure, GCP) and integration techniques.
- Certifications in Snowflake or cloud platforms are a plus.
- Familiarity with scripting languages (Python, Shell) and BI tools (Tableau, Power BI, Looker).
- Excellent communication and collaboration skills.
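To make the semi-structured-data and Python scripting requirements above more concrete, here is a minimal, illustrative sketch of querying nested JSON stored in a Snowflake VARIANT column via the snowflake-connector-python package. The connection credentials, the raw_clinical_events table, and its record column are hypothetical placeholders, not systems referenced by this role.

```python
# Minimal sketch: querying semi-structured JSON in Snowflake from Python.
# Assumes the snowflake-connector-python package and a hypothetical
# RAW_CLINICAL_EVENTS table with a VARIANT column named RECORD.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",      # placeholder credentials
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
    database="PHARMA_DB",
    schema="RAW",
)

# LATERAL FLATTEN expands a nested JSON array (here, a diagnoses list)
# into one row per element; :: casts convert VARIANT paths to SQL types.
query = """
    SELECT
        e.record:patient_id::string   AS patient_id,
        e.record:visit_date::date     AS visit_date,
        d.value:code::string          AS diagnosis_code
    FROM raw_clinical_events e,
         LATERAL FLATTEN(input => e.record:diagnoses) d
    WHERE e.record:study_id::string = %s
"""

try:
    cur = conn.cursor()
    cur.execute(query, ("STUDY-001",))
    for patient_id, visit_date, diagnosis_code in cur.fetchall():
        print(patient_id, visit_date, diagnosis_code)
finally:
    conn.close()
```

LATERAL FLATTEN is Snowflake's standard way to turn a nested array into rows, which is why comfort with VARIANT paths and casts matters when working with event-style pharma data.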
RESPONSIBILITIES:
- Design, develop, and maintain performant SQL scripts, stored procedures, and queries in Snowflake.
- Work closely with business analysts and domain experts to translate pharmaceutical data requirements into scalable solutions.
- Build and optimize ELT pipelines using Snowflake features and external ETL tools (see the sketch following this list).
- Create and maintain data models and data marts to support analytics and reporting.
- Collaborate with cross-functional teams to troubleshoot issues and improve system efficiency.
- Implement data security, governance, and compliance protocols based on industry regulations.
- Tune performance of complex queries over large data sets to ensure efficient warehouse utilization.
- Develop reusable components and utilities for accelerated development.
- Support production rollouts, troubleshoot incidents, and drive resolution.
- Participate in knowledge-sharing initiatives such as white papers, technical discussions, and learning sessions.
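As a sketch of the incremental ELT work these responsibilities describe, the snippet below uses the same connector to create a Snowflake Stream and a scheduled Task that MERGEs changed rows into a data mart. All object names (raw_sales, sales_mart, TRANSFORM_WH) are hypothetical examples, and the sketch assumes a role with the relevant CREATE STREAM / CREATE TASK privileges.

```python
# Minimal sketch: an incremental ELT step built on Snowflake Streams and Tasks.
# A stream captures changed rows on a hypothetical RAW_SALES table; a scheduled
# task merges them into a SALES_MART table. Object and warehouse names are
# placeholders. Assumes snowflake-connector-python.
import snowflake.connector

statements = [
    # Change-data-capture stream over the raw landing table.
    "CREATE OR REPLACE STREAM raw_sales_stream ON TABLE raw_sales",
    # Hourly task that runs only when the stream actually has new data.
    """
    CREATE OR REPLACE TASK load_sales_mart
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE  = '60 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_SALES_STREAM')
    AS
      MERGE INTO sales_mart m
      USING raw_sales_stream s
        ON m.sale_id = s.sale_id
      WHEN MATCHED THEN UPDATE SET m.amount = s.amount, m.sold_at = s.sold_at
      WHEN NOT MATCHED THEN INSERT (sale_id, amount, sold_at)
        VALUES (s.sale_id, s.amount, s.sold_at)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK load_sales_mart RESUME",
]

conn = snowflake.connector.connect(
    account="your_account",  # placeholder credentials
    user="your_user",
    password="your_password",
    warehouse="TRANSFORM_WH",
    database="PHARMA_DB",
    schema="MART",
)
try:
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
finally:
    conn.close()
```

Keeping the MERGE inside a task keeps the load incremental and server-side: the task only fires when SYSTEM$STREAM_HAS_DATA reports new rows, so the warehouse is not spun up for empty runs.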
Qualifications
Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.