Staff Engineer, Big Data Engineer

Remote, India

Nagarro

A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.

Company Description

đŸ‘‹đŸŒWe're Nagarro.

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital media, and our people exist everywhere in the world (18,000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

Job Description

REQUIREMENTS:

  • Total experience of 7+ years.
  • Excellent knowledge of and experience in big data engineering.
  • Strong hands-on experience with Apache Spark (PySpark) and Scala (see the sketches after this list).
  • Proficiency in SQL for data transformation and querying.
  • Deep understanding of AWS data services: S3, Glue, Redshift, and EMR.
  • Experience with streaming data tools (Kafka or Kinesis).
  • Familiarity with CI/CD practices and version control systems (e.g., Git).
  • Solid knowledge of data modeling, data lake architecture, and cloud-native data pipelines.
  • Knowledge of data governance and security practices in cloud-based environments.
  • A problem-solving mindset and the ability to tackle complex data engineering challenges.
  • Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.
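
As a purely illustrative reference point for the Spark, SQL, and S3 requirements above, here is a minimal PySpark sketch of a batch pipeline: raw JSON events are read from S3, aggregated with a SQL transformation, and written back to the data lake as partitioned Parquet. All bucket names, paths, and column names are hypothetical.

    # Minimal batch sketch: S3 -> SQL aggregation -> partitioned Parquet on S3.
    # All bucket names, paths, and column names are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("daily-orders-aggregation")  # hypothetical job name
        .getOrCreate()
    )

    # Read raw JSON events from the (hypothetical) landing zone on S3.
    orders = spark.read.json("s3://example-raw-bucket/orders/")

    # Express the transformation in SQL, matching the SQL requirement above.
    orders.createOrReplaceTempView("orders")
    daily_totals = spark.sql("""
        SELECT customer_id,
               DATE(order_ts) AS order_date,
               SUM(amount)    AS total_amount,
               COUNT(*)       AS order_count
        FROM orders
        GROUP BY customer_id, DATE(order_ts)
    """)

    # Write partitioned Parquet back to the (hypothetical) curated zone.
    (daily_totals.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-bucket/daily_order_totals/"))

For the streaming requirement, a similarly minimal Structured Streaming sketch that consumes from Kafka follows; the broker address and topic name are likewise assumptions, and a real pipeline would write to a durable sink (S3, Redshift) rather than the console.

    # Minimal Structured Streaming sketch reading from Kafka.
    # Broker and topic are hypothetical; needs the spark-sql-kafka package.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-stream").getOrCreate()

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
        .option("subscribe", "orders")                       # hypothetical topic
        .load()
    )

    # Kafka delivers key/value as bytes; cast the value to a string for parsing.
    parsed = events.select(F.col("value").cast("string").alias("raw_event"))

    # Console sink keeps the demo self-contained.
    query = parsed.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()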

RESPONSIBILITIES: 

  • Writing and reviewing high-quality code.
  • Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements.
  • Mapping decisions to requirements and translating them for developers.
  • Identifying candidate solutions and narrowing them down to the option that best meets the client's requirements.
  • Defining guidelines and benchmarks for NFR considerations during project implementation.
  • Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for the developers.
  • Reviewing architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
  • Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
  • Understanding technology integration scenarios and applying these learnings in projects.
  • Resolving issues raised during code reviews through exhaustive, systematic root-cause analysis, and justifying the decisions taken.
  • Carrying out POCs to make sure that the suggested design/technologies meet the requirements.

Qualifications

Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.



