Associate Principal Engineer, Big Data Engineer

Hyderabad, India

Nagarro

A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.



Company Description

👋🏼We're Nagarro.

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale, across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 36 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS:

  • Total experience of 11+ years.
  • Excellent knowledge of and experience in Big Data.
  • Strong expertise in GCP services such as Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub.
  • Proficiency in designing and implementing data processing frameworks for ETL/ELT, batch, and real-time workloads.
  • In-depth understanding of data modeling, data warehousing, and distributed data processing using tools like Dataproc and Spark.
  • Hands-on experience with Python programming.
  • Strong working experience in SQL and modern data engineering practices.
  • Knowledge of data governance, security, and compliance best practices on GCP.
  • Strong problem-solving, leadership, and communication skills, with the ability to guide teams and engage stakeholders.
  • Strong communication and collaboration skills, with experience working in cross-functional teams.

RESPONSIBILITIES:

  • Writing and reviewing high-quality code
  • Understanding the client’s business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements
  • Mapping decisions to requirements and translating them for developers
  • Identifying different solutions and narrowing down to the option that best meets the client’s requirements
  • Defining guidelines and benchmarks for NFR considerations during project implementation
  • Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for developers
  • Reviewing architecture and design on aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed
  • Designing and developing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to realize it
  • Understanding and relating technology integration scenarios and applying these learnings in projects
  • Resolving issues raised during code reviews through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken
  • Carrying out POCs to make sure that the suggested design and technologies meet the requirements


Qualifications

Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.





