Associate Staff Engineer (Big Data)
Hyderabad, India
Nagarro
A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.
Company Description
👋🏼 We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 36 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
- Experience: 5+ Years
- Strong expertise in GCP services: Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub.
- Proficiency in designing and implementing data processing frameworks for ETL/ELT, batch, and real-time workloads.
- In-depth understanding of data modeling, data warehousing, and distributed data processing using tools like Dataproc and Spark.
- Hands-on experience with Python, SQL, and modern data engineering practices.
- Knowledge of data governance, security, and compliance best practices on GCP.
- Strong problem-solving, leadership, and communication skills, with the ability to guide teams and engage stakeholders.
RESPONSIBILITIES:
- Understanding the client’s business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements.
- Mapping design decisions to requirements and translating them for developers.
- Identifying candidate solutions and narrowing them down to the option that best meets the client’s requirements.
- Defining guidelines and benchmarks for non-functional requirement (NFR) considerations during project implementation.
- Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for developers.
- Reviewing architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Designing and developing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to realize it.
- Understanding technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken.
- Carrying out POCs to verify that the suggested design and technologies meet the requirements.
Qualifications
Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.