Data Engineering Specialist - ASL

Assago, IT


Nestlé

Nestlé is the world's largest food & beverage company. We unlock the power of food to enhance quality of life for everyone, today and for generations to come.


Position Snapshot
•    Global role in the Analytics, Data and Integration stream, part of Nestlé IT
•    Nestlé welcomes people with disabilities
•    Located in the Nestlé Milan Global IT Hub
•    Permanent Contract, Full Time
•    Fluent Technical/Business English
 
Position Summary
Are you a passionate Data Engineer ready to join our Analytical Service Line team? We need you to design and build strategic data assets and products that bring tangible business value. Collaborate with our team and stakeholders to apply the latest cloud technologies and data architecture principles, contributing to our digital transformation journey.
As a Data Engineering Specialist, you'll leverage your technical expertise to build scalable distributed software on the cloud. You'll combine cognitive computing and advanced analytics with traditional data engineering to transform enterprise business processes. Your background in handling both unstructured and structured data using modern data stack tools, including Azure and Snowflake, will be essential.
 
A Day in the Life of a Data Engineering Specialist:
•    Collaborate with Market and Centre Functions, Global and Regional Managed Businesses, and Above Market entities to identify business needs and opportunities and create data assets and products that enhance our analytics initiatives.
•    Collaborate with other Business Analytics teams and networks to leverage existing or new data sources, enabling insight generation and re-applicability across markets.
•    Conduct end-to-end analyses – including data discovery and gathering, requirements specification, data processing tasks, data modeling, ongoing business deliverables, and internal customer presentations.
•    Design, implement, and optimize robust data pipelines.
•    Develop and maintain ETL processes, integrating data from various sources like APIs, databases, and third-party applications.
•    Collaborate with data architects to identify and apply best practices.
•    Ensure high data quality, accuracy, and reliability within our data lakes and data warehouses, while also enhancing findability, accessibility, interoperability, and reusability.
•    Solve difficult, non-routine analysis problems, applying advanced analytical methods as needed.
•    Analyse large and complex data sets to answer strategic and operational business questions, or help troubleshoot and resolve data-related issues across complex distributed systems.
•    Stay updated on emerging trends and tools in cloud computing, big data, and analytics, including AI and GenAI applied to data management.
 
What else will make you successful?
•    Bachelor's degree in computer science, information technology, statistics, mathematics, engineering or equivalent experience (Master’s preferred).
•    5+ years of experience in similar technical roles with a proven track record.
•    Strong expertise in Azure (e.g. Data Lake, Data Factory, Synapse) and Snowflake, or other equivalent cloud-based data platforms.
•    Hands-on experience with Databricks and proficiency in Python/PySpark for data manipulation and processing.
•    Strong SQL skills and experience in building data models and pipelines.
•    Familiarity with dbt for data transformation.
•    Experience with Azure DevOps for CI/CD pipelines.
•    Solid understanding of data warehouse methodologies and best practices. SAP HANA knowledge and experience would be a plus.
•    Strong knowledge of Power BI for creating semantic models and analytical reports.
•    Experience in gathering and processing data at scale (including writing scripts, web scraping, calling APIs, writing complex SQL queries, etc.).
•    Strategic and critical thinking, strong business acumen, and effective communication skills.
•    Ability to organize and manage several projects at the same time.
•    Experience working in a global environment and with virtual teams.

Only CVs written in English will be considered.

We are Nestlé, the world's largest food and beverage company. We are 308,000 employees strong, driven by the purpose of enhancing quality of life and contributing to a healthier future. Our values are rooted in respect: respect for ourselves, respect for others, respect for diversity and respect for our future. With more than CHF 91.4 billion in sales in 2018, we have an expansive presence, with 413 factories in more than 85 countries. We believe our people are our most important asset, so we'll offer you a dynamic, inclusive, international working environment with many opportunities across different businesses, functions and geographies, working with diverse teams and cultures. Want to learn more? Visit us at www.nestle.com.
