Senior Staff Engineer, Data Engineering

Gurugram, India

Nagarro

A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.



Company Description

šŸ‘‹šŸ¼We're Nagarro.

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

Job Description

REQUIREMENTS:

  • Total experience of 10+ years.
  • Experience in data engineering and database management.
  • Expert knowledge in PostgreSQL (preferably cloud-hosted on AWS, Azure, or GCP).
  • Experience with Snowflake Data Warehouse and strong SQL programming skills.
  • Deep understanding of stored procedures, performance optimization, and handling large-scale data.
  • Knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
  • Strong understanding of index design and performance tuning techniques.
  • Familiarity with SQL security techniques, including data encryption, Transparent Data Encryption (TDE), signed stored procedures, and user permission assignments.
  • Competence in data preparation and ETL tools to build and maintain data pipelines and flows.
  • Experience in data integration by mapping various source platforms into Entity Relationship Models (ERMs).
  • Exposure to source control systems like Git, Azure DevOps.
  • Expertise in Python and Machine Learning (ML) model development.
  • Experience in automated testing and test coverage tools.
  • Hands-on experience with CI/CD automation tools.
  • Programming experience in Golang.
  • Understanding of Agile methodologies (Scrum, Kanban).
  • Ability to collaborate with stakeholders across Executive, Product, Data, and Design teams.
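Several of the techniques listed above (SQL data cleaning, de-duplication, keeping the earliest record per key) can be illustrated with a short sketch. This is a hypothetical example for orientation only; the `events` table and its columns are invented, and an in-memory SQLite database stands in for a cloud-hosted PostgreSQL or Snowflake instance:

```python
import sqlite3

# Hypothetical table with duplicate rows; in practice this would live in
# PostgreSQL or Snowflake rather than SQLite.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (id INTEGER, user_id INTEGER, payload TEXT);
    INSERT INTO events VALUES (1, 10, 'a'), (2, 10, 'a'), (3, 11, 'b');
""")

# De-duplication: keep only the earliest row per (user_id, payload) pair.
conn.execute("""
    DELETE FROM events WHERE id NOT IN (
        SELECT MIN(id) FROM events GROUP BY user_id, payload
    )
""")
rows = conn.execute("SELECT id, user_id, payload FROM events ORDER BY id").fetchall()
print(rows)  # -> [(1, 10, 'a'), (3, 11, 'b')]
```

The same `GROUP BY`/`MIN` pattern (or a `ROW_NUMBER() OVER (PARTITION BY …)` window function) is a common de-duplication idiom across PostgreSQL and Snowflake.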

RESPONSIBILITIES:

  • Design and maintain an optimal data pipeline architecture.
  • Assemble large, complex data sets to meet functional and non-functional business requirements.
  • Develop pipelines for data extraction, transformation, and loading (ETL) using SQL and cloud database technologies.
  • Prepare and optimize ML models to improve business insights.
  • Support stakeholders by resolving data-related technical issues and enhancing data infrastructure.
  • Ensure data security across multiple data centers and regions, maintaining compliance with national and international data laws.
  • Collaborate with data and analytics teams to enhance data systems functionality.
  • Conduct exploratory data analysis to support database and dashboard development.
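The extraction, transformation, and loading flow described in the responsibilities can be sketched end to end. This is a minimal, assumed example, not the company's actual pipeline; the CSV source, the `sales` table, and the per-region aggregation are all invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: read rows from a (hypothetical) CSV source.
raw = io.StringIO("region,amount\nNorth, 120 \nSouth,80\nNorth,40\n")
rows = list(csv.DictReader(raw))

# Transform: trim stray whitespace and cast amounts to integers.
clean = [(r["region"].strip(), int(r["amount"].strip())) for r in rows]

# Load: write into a target table and aggregate per region.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
db.executemany("INSERT INTO sales VALUES (?, ?)", clean)
totals = dict(db.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))
print(totals)  # -> {'North': 160, 'South': 80}
```

In a production setting the same three stages would typically be orchestrated by an ETL tool and land in a warehouse such as Snowflake, as the posting describes.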

Additional Information

Bachelor's or master's degree in Computer Science, Information Technology, or a related field.



Region: Asia/Pacific
Country: India
