Staff Engineer, Big Data Engineer
Remote, India
- Remote-first
- Website
- @nagarro
Nagarro
A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.
Company Description
We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
Job Description
REQUIREMENTS:
- Total experience of 7+ years.
- Excellent knowledge of and experience in big data engineering.
- Strong hands-on experience with Apache Spark (PySpark) and Scala.
- Proficiency in SQL for data transformation and querying.
- Deep understanding of AWS Data Services: S3, Glue, Redshift, EMR.
- Experience with streaming data tools (Kafka or Kinesis).
- Familiarity with CI/CD practices and version control systems (e.g., Git).
- Solid knowledge of data modeling, data lake architecture, and cloud-native data pipelines.
- Knowledge of data governance and security practices in cloud-based environments.
- Problem-solving mindset with the ability to tackle complex data engineering challenges.
- Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.
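For illustration, the kind of cloud-native pipeline work these requirements describe might look like the minimal PySpark sketch below. This is a sketch only: the bucket paths, the orders dataset, and its column names are hypothetical, and working S3 connectors/credentials are assumed.

```python
# Illustrative only: bucket paths, the "orders" dataset, and its columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

# Read raw JSON events landed in the data lake (e.g. delivered by Kinesis Firehose).
orders = spark.read.json("s3://example-raw-bucket/orders/2024-01-01/")

# SQL-style transformation: de-duplicate and aggregate revenue per customer per day.
orders.createOrReplaceTempView("orders")
daily_revenue = spark.sql("""
    SELECT customer_id,
           CAST(order_ts AS DATE) AS order_date,
           SUM(amount)            AS revenue
    FROM (SELECT DISTINCT * FROM orders)
    GROUP BY customer_id, CAST(order_ts AS DATE)
""")

# Write to the curated zone as partitioned Parquet, queryable from Redshift Spectrum or Athena.
(daily_revenue
    .withColumn("order_date", F.col("order_date").cast("string"))
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/daily_revenue/"))

spark.stop()
```

In practice a job like this would typically run on AWS Glue or EMR and be promoted through a Git-based CI/CD pipeline, in line with the requirements above.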
RESPONSIBILITIES:
- Writing and reviewing high-quality code.
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions to requirements and translating them for developers.
- Identifying different solutions and narrowing down to the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR (non-functional requirement) considerations during project implementation.
- Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design for aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to realize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken.
- Carrying out POCs to make sure the suggested design and technologies meet the requirements.
Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.