Senior Data Engineer
Remote, Romania
- Remote-first
Nagarro
A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.
Company Description
👋🏼 We're Nagarro.
We are a digital product engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (17,500+ experts across 37 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
By this point in your career, it is not just about the tech you know or how well you can code. It is about what more you want to do with that knowledge. Can you help your teammates proceed in the right direction? Can you tackle the challenges our clients face while always looking to take our solutions one step further to succeed at an even higher level? Yes? You may be ready to join us.
Job Description
We are looking for a Senior Data Engineer to join our team and drive the design, implementation, and optimization of data pipelines. The ideal candidate will have expertise in Azure, Fabric, and Business Intelligence.
Key Responsibilities:
- Lead and mentor junior data engineers while driving QA initiatives.
- Develop end-to-end data pipelines from source to end consumers.
- Implement source integration, data ingestion, transformation, and data modeling.
- Design and develop Power BI dashboards for business intelligence insights.
- Contribute to data platform enhancements, including CI/CD, infrastructure as code, and networking.
- Work with Azure technologies to maintain and improve the data platform.
- Collaborate with cross-functional teams to ensure seamless data flow and optimal performance.
Qualifications
- 5+ years of experience in data engineering with strong expertise in Azure Databricks.
- Proficiency in Python and PySpark for data processing and automation.
- Strong knowledge of SQL for data modeling and source system analysis.
- Hands-on experience with Azure services, including Azure Data Factory, Data Lake, and SQL-based solutions.
- Familiarity with Fabric and Business Intelligence tools like Power BI.
- Experience in data profiling, cataloging, and technical data flow design.
- Exposure to event-based streaming technologies (Kafka, Event Hubs) is a plus.
- Experience working in cloud-based data platforms and implementing best practices in data governance and security.
Additional Information
- Previous experience with Azure (Databricks, Fabric, and related services).
- Strong expertise in visualization and reporting using Power BI.
- Strong knowledge of ETL processes, data modeling, and transformation.
- Hands-on experience with CI/CD, infrastructure as code, and cloud networking.