Associate Staff Engineer, Big Data Engineer

Bengaluru, India

Nagarro

A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.



Company Description

đŸ‘‹đŸŒWe're Nagarro.

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS: 

  • 5+ years of total experience.
  • Hands-on experience in Data Engineering, Data Lakes, Data Mesh, or Data Warehousing/ETL environments.
  • Strong working knowledge of Python, SQL, Airflow, and PySpark.
  • Hands-on experience implementing projects applying SDLC practices.
  • Hands-on experience building data pipelines and data frameworks for unit testing, data lineage tracking, and automation.
  • Experience building and maintaining cloud-based systems.
  • Familiarity with databases like DB2 and Teradata.
  • Strong working knowledge of Apache Spark, Apache Kafka, Hadoop, and MapReduce.
  • Strong troubleshooting skills and ability to design for scalability and flexibility.
  • Expertise in Spanner for high-availability, scalable database solutions. 
  • Knowledge of data governance and security practices in cloud-based environments. 
  • Problem-solving mindset with the ability to tackle complex data engineering challenges.
  • Familiar with containerization technologies (Docker/Kubernetes).
  • Excellent communication and collaboration skills.
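As a rough illustration of the pipeline-building and unit-testing skills listed above, here is a minimal sketch in plain Python, with SQLite standing in for a warehouse table. The schema, function names, and sample data are hypothetical, chosen only to show a transform step paired with a verifiable load:

```python
# Minimal ETL sketch: extract -> transform -> load, with an assertable result.
# All names and the schema below are illustrative, not from any real project.
import sqlite3

def clean_records(rows):
    """Drop rows with a missing id and normalize names to lowercase."""
    return [(rid, name.strip().lower()) for rid, name in rows if rid is not None]

def load(conn, rows):
    """Load cleaned rows into a staging table."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)", rows)
    conn.commit()

# Raw input with one invalid record (missing id).
raw = [(1, "  Alice "), (None, "ghost"), (2, "BOB")]

conn = sqlite3.connect(":memory:")
cleaned = clean_records(raw)
load(conn, cleaned)
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
```

In a real pipeline the same pattern scales up: the transform becomes a PySpark job, the load targets a warehouse, and the assertions move into a unit-test suite run by CI or an Airflow task.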

RESPONSIBILITIES: 

  • Writing and reviewing great quality code.
  • Understanding the client’s business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements.
  • Mapping decisions to requirements and translating them clearly for developers.
  • Identifying different solutions and being able to narrow down the best option that meets the client’s requirements.
  • Defining guidelines and benchmarks for NFR considerations during project implementation.
  • Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
  • Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensure that all relevant best practices are followed.
  • Developing and designing the overall solution for defined functional and non-functional requirements; and defining technologies, patterns, and frameworks to materialize it 
  • Understanding and relating technology integration scenarios and applying these learnings in projects.
  • Resolving issues raised during code reviews through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken.
  • Carrying out POCs to make sure that suggested design/technologies meet the requirements.

Qualifications

Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.





Region: Asia/Pacific
Country: India
