Associate Staff Engineer, Big Data Engineer
Bengaluru, India
- Remote-first
Nagarro
A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.
Company Description
We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale, across all devices and digital mediums, and our people are everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
- Total experience of 5+ years.
- Hands-on experience in Data Engineering, Data Lakes, Data Mesh, or Data Warehousing/ETL environments.
- Strong working knowledge of Python, SQL, Airflow, and PySpark.
- Hands-on experience implementing projects applying SDLC practices.
- Hands-on experience building data pipelines and data frameworks for unit testing, data lineage tracking, and automation.
- Experience building and maintaining cloud systems.
- Familiarity with databases like DB2 and Teradata.
- Strong working knowledge of Apache Spark, Apache Kafka, Hadoop, and MapReduce.
- Strong troubleshooting skills and the ability to design for scalability and flexibility.
- Expertise in Spanner for highly available, scalable database solutions.
- Knowledge of data governance and security practices in cloud-based environments.
- Problem-solving mindset with the ability to tackle complex data engineering challenges.
- Familiarity with containerization technologies (Docker/Kubernetes).
- Excellent communication and collaboration skills.
RESPONSIBILITIES:
- Writing and reviewing high-quality code.
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements.
- Mapping decisions to requirements and translating them for developers.
- Identifying different solutions and narrowing down the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken.
- Carrying out POCs to make sure that the suggested design/technologies meet the requirements.
Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.