Senior Engineer, Big Data Engineer
Hyderabad, India
- Remote-first
A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.

Company Description
👋🏼 We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
- Total experience of 3+ years.
- Excellent knowledge of and experience in big data engineering.
- Strong hands-on experience with PySpark and Apache Spark development.
- Proficiency in Python for data processing and automation.
- Expertise in SQL for querying, transformation, and performance tuning.
- Hands-on experience with Databricks or similar cloud-based big data platforms.
- Proven ability to work with large datasets in distributed computing environments.
- Solid understanding of data engineering best practices, including version control, CI/CD, and testing.
- Knowledge of data modeling, ETL frameworks, and workflow orchestration tools (e.g., Airflow, Azure Data Factory).
- Experience with cloud platforms such as Azure, AWS, or GCP.
- Familiarity with Delta Lake, Lakehouse architecture, or data warehousing concepts.
- Excellent problem-solving and communication skills.
RESPONSIBILITIES:
- Writing and reviewing high-quality code
- Understanding functional requirements thoroughly and analyzing the client’s needs in the context of the project
- Envisioning the overall solution for defined functional and non-functional requirements, and being able to define technologies, patterns and frameworks to realize it
- Determining and implementing design methodologies and tool sets
- Enabling application development by coordinating requirements, schedules, and activities
- Leading and supporting UAT and production rollouts
- Creating, understanding and validating WBS and estimated effort for given module/task, and being able to justify it
- Addressing issues promptly, responding positively to setbacks and challenges with a mindset of continuous improvement
- Giving constructive feedback to team members and setting clear expectations
- Helping the team troubleshoot and resolve complex bugs
- Proposing solutions to issues raised during code/design reviews and justifying the decisions taken
- Carrying out POCs to validate that suggested designs/technologies meet the requirements
Qualifications
Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.