Staff Engineer, Data Engineer

Remote, India

Nagarro

A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.


Company Description

👋🏼We're Nagarro.

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people are everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS:

  • 7+ years of total experience.
  • Strong working experience in data engineering.
  • Strong experience in building scalable ETL/ELT data pipelines.
  • Hands-on experience with Big Data technologies such as Hadoop, MapReduce, and Spark (including tuning and optimization).
  • Hands-on with Python, PySpark, Azure (ADF, ADLS Gen2, Synapse Analytics), and Kafka for streaming data.
  • Strong knowledge of cloud computing (preferably AWS): EC2, S3, RDS, Redshift, Glue, EMR.
  • Strong experience with Databricks, Delta Lake, and Databricks SQL.
  • Hands-on experience in data modeling (star schema, snowflake schema, normalization/denormalization) and data governance and security (lineage, encryption, access control).
  • Experience with SQL and MySQL.
  • Experience with data quality frameworks, and proficiency in building and maintaining ETL/ELT pipelines using tools such as Apache Airflow, dbt, or similar.
  • Familiarity with data visualization tools and strategies (e.g., Power BI, Tableau) is a plus.
  • Knowledge of data governance, security practices, and compliance standards such as GDPR and CCPA.
  • Excellent problem-solving, communication, and collaboration skills.

RESPONSIBILITIES:

  • Writing and reviewing high-quality code.
  • Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements.
  • Mapping decisions to requirements and translating them clearly for developers.
  • Identifying different solutions and narrowing down to the option that best meets the client's requirements.
  • Defining guidelines and benchmarks for non-functional requirement (NFR) considerations during project implementation.
  • Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for developers.
  • Reviewing architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
  • Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
  • Understanding technology integration scenarios and applying these learnings in projects.
  • Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken.
  • Carrying out POCs to make sure the suggested design/technologies meet the requirements.

Qualifications

Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
