Senior Staff Engineer, Big Data Engineer (Scala and Snowflake)

Remote, India

Nagarro

A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.



Company Description

👋🏼We're Nagarro.

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS:

  • Total experience of 10+ years.
  • Excellent knowledge of and experience in big data engineering.
  • Strong working experience with architecture, design, and development in Spark, Scala, AWS (EMR, EC2, Lambda, Glue, S3, etc.), and SQL Server/NoSQL.
  • Hands-on experience in Hadoop, Spark, Scala, SQL, and Kafka.
  • Expertise in AWS and Snowflake; since Snowflake procedures are orchestrated through Airflow, working knowledge of Python and Airflow is required.
  • Experience building and maintaining cloud systems.
  • Familiarity with data modeling, data warehousing, and building distributed systems. 
  • Expertise in Spanner for high-availability, scalable database solutions. 
  • Knowledge of data governance and security practices in cloud-based environments. 
  • Problem-solving mindset with the ability to tackle complex data engineering challenges. 
  • Strong communication and teamwork skills, with the ability to mentor and collaborate effectively. 
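To illustrate the Snowflake-plus-Airflow requirement above, a minimal DAG sketch is shown below: Airflow schedules the run, while the actual work happens inside a Snowflake stored procedure. This is a hypothetical illustration, not part of the role description; the DAG id, connection id, and procedure name are all made up, and running it requires Apache Airflow with the Snowflake provider package installed.

```python
# Hypothetical sketch: an Airflow DAG that calls a Snowflake stored
# procedure on a daily schedule. All names below are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="snowflake_procedure_refresh",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # CALL executes the stored procedure inside Snowflake; Airflow only
    # orchestrates, so the heavy lifting stays in the warehouse.
    refresh_mart = SnowflakeOperator(
        task_id="refresh_sales_mart",
        snowflake_conn_id="snowflake_default",  # assumed connection id
        sql="CALL REFRESH_SALES_MART();",       # hypothetical procedure
    )
```

This division of labor (orchestration in Airflow, computation in Snowflake) is the pattern the requirement implies, which is why both Python and Airflow knowledge are listed alongside Snowflake itself.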

RESPONSIBILITIES: 

  • Writing and reviewing great quality code. 
  • Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements.
  • Mapping decisions to requirements and translating them for developers.
  • Identifying different solutions and being able to narrow down the best option that meets the client’s requirements. 
  • Defining guidelines and benchmarks for NFR considerations during project implementation.
  • Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
  • Reviewing architecture and design on aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
  • Developing and designing the overall solution for defined functional and non-functional requirements; and defining technologies, patterns, and frameworks to materialize it. 
  • Understanding and relating technology integration scenarios and applying these learnings in projects.
  • Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and justifying the decisions taken.
  • Carrying out POCs to make sure the suggested design and technologies meet the requirements.

Qualifications

Bachelor's or master's degree in Computer Science, Information Technology, or a related field.


Regions: Remote/Anywhere Asia/Pacific
Country: India
