Staff Engineer (Big Data)
Bengaluru, India
Nagarro
A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.
Company Description
👋🏼 We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 36 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
- Experience: 7+ Years
- Excellent knowledge of Hadoop, Hive, and Spark with Scala, with demonstrable hands-on experience in performance tuning and debugging issues
- Good knowledge of stream processing with Spark/Java and Kafka, and of integration with REST APIs
- Good knowledge of functional programming and OOP concepts, SOLID principles, and design patterns for developing scalable data engineering applications
- Familiarity with build tools like Maven
- Must have experience writing unit and integration tests using ScalaTest (a minimal illustration follows the skills lists below)
- Must have experience using a version control system such as GitHub
- Must have experience with CI/CD pipelines such as Jenkins
- Experience with shell scripting and Oozie
Good to Have Skills:
- Airflow, Databricks, Azure, Splunk
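For context on the ScalaTest requirement above, here is a minimal, self-contained sketch of the kind of unit test it refers to; the object, method, and suite names are hypothetical and not part of this posting.

```scala
// Minimal ScalaTest sketch (hypothetical code): a small helper object
// and a unit test suite exercising it in the AnyFunSuite style.
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical helper under test: computes the average of a sequence of doubles.
object Metrics {
  def average(xs: Seq[Double]): Double =
    if (xs.isEmpty) 0.0 else xs.sum / xs.size
}

class MetricsSuite extends AnyFunSuite {

  test("average of a non-empty sequence") {
    assert(Metrics.average(Seq(1.0, 2.0, 3.0)) === 2.0)
  }

  test("average of an empty sequence defaults to 0.0") {
    assert(Metrics.average(Seq.empty) === 0.0)
  }
}
```

Assuming ScalaTest is on the test classpath (for example, "org.scalatest" %% "scalatest" % "3.2.x" % Test in build.sbt), such a suite would typically be run with sbt test.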
RESPONSIBILITIES:
- Understanding the client’s business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions to requirements and translating them clearly to developers.
- Identifying candidate solutions and narrowing them down to the option that best meets the client’s requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for developers
- Reviewing architecture and design on aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to realize it
- Understanding and relating technology integration scenarios and applying these learnings in projects
- Resolving issues raised during code reviews through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken.
- Carrying out POCs to make sure that the suggested design and technologies meet the requirements.
Qualifications
Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.