Data Engineer

Business Office (Joy House 2) - Mumbai, India

Mondelēz International

Mondelēz International, Inc. (NASDAQ: MDLZ) is one of the world’s largest snacks companies, empowering people to snack right in over 150 countries.



Job Description

Are You Ready to Make It Happen at Mondelēz International?

Join our Mission to Lead the Future of Snacking. Make It With Pride.

You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.

How you will contribute

You will:

  • Operationalize and automate activities for efficiency and timely production of data visuals
  • Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
  • Search for ways to acquire new data sources and assess their accuracy
  • Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
  • Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
  • Validate information from multiple sources
  • Assess issues that might prevent the organization from making maximum use of its information assets

What you will bring

A desire to drive your future and accelerate your career and the following experience and knowledge:

  • Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc., and experience setting up, testing and maintaining new systems
  • Experience with a wide variety of languages and tools (e.g. scripting languages) to retrieve, merge and combine data
  • Ability to simplify complex problems and communicate to a broad audience

In This Role

As a Senior Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, implementing data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, collaborate closely with data teams, product owners, and other stakeholders, and stay updated with the latest cloud technologies and best practices.

Role & Responsibilities:

  • Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
  • Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
  • Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
  • Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
  • Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.

Technical Requirements:

  • Programming: Python, PySpark, Go/Java
  • Database: SQL, PL/SQL
  • ETL & Integration: dbt, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran
  • Data Warehousing: SCD, Schema Types, Data Mart
  • Visualization: Databricks Notebooks, Power BI (optional), Tableau (optional), Looker
  • GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
  • AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
  • Azure Cloud Services: Azure Data Lake Storage Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics
  • Supporting Technologies: Graph Database/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow

Soft Skills:

  • Problem-Solving: The ability to identify and solve complex data-related challenges.
  • Communication: Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
  • Analytical Thinking: The capacity to analyze data and draw meaningful insights.
  • Attention to Detail: Meticulousness in data preparation and pipeline development.
  • Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.

Within-country relocation support is available; for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary

At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about.

We have a rich portfolio of strong brands globally and locally including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum.

Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type

Regular

Data Science

Analytics & Data Science


Category: Engineering Jobs

Tags: Airflow AWS Azure BigQuery Databricks Dataflow Data pipelines Dataproc Data quality Data Warehousing dbt Engineering ETL FiveTran GCP Informatica Java Kafka Kinesis Lambda Looker Neo4j Pentaho Pipelines Power BI PySpark Python Redshift Security SQL Tableau Talend Testing

Perks/benefits: Career development Relocation support

Region: Asia/Pacific
Country: India
