(Senior) Data Engineer

Subang Jaya (Office), Malaysia

ROCKWOOL Group

Our stone wool insulation is a key component in fire-resilient buildings.

We are looking for a (Senior) Data Engineer based in our Malaysia location to join ROCKWOOL's global Data Science & Engineering team and support the development of our global factory data platform, which collects datasets from the IoT infrastructure in our plants. The platform is built on Azure, leveraging Databricks, Airflow, Docker, Blob Storage, and MongoDB. You will address technical challenges, implement new features, streamline CI/CD pipelines, automate processes, and enhance governance and monitoring. We are also planning a migration to a new tech stack, most likely based on Databricks, Azure Event Hubs, and Kafka, and you may contribute to tasks on the new platform in the future.

You will join the operational part of a global Data & AI team and collaborate with colleagues in Poland and Denmark. The operational team in Malaysia will comprise data scientists, data engineers, and ML engineers, and will work closely with parallel teams in other locations.

Key Requirements

  • Technical Skills:
    • 3-5+ years of experience in data engineering focused on data integration, ETL, and data warehousing.
    • Proficiency in Databricks and Spark.
    • Experience with the Azure platform and its services.
    • Programming expertise in Python.
    • Familiarity with DevOps/DataOps practices.
  • Communication:
    • Fluency in English (C1).
  • Personal Attributes:
    • Proactive and self-reliant in task execution.
    • Capable of proposing solutions and effectively collaborating with stakeholders.

Big Advantages

  • Technical Expertise:
    • Experience with MongoDB and NoSQL databases.
    • Knowledge of scheduling tools, especially Airflow.
    • Proficiency in containerization using Docker.
    • Familiarity with advanced Databricks features such as Unity Catalog or Delta Live Tables (DLT).
    • Expertise in SQL databases.
    • Experience managing on-premises or cloud-based servers.
    • Understanding of data structures and algorithms.
    • Version control expertise (Git, preferably GitHub).
    • Advanced understanding of data pipelines and data architecture (data warehouses, lakes, vaults).
    • Experience with infrastructure-as-code (IaC) concepts and tools, preferably Terraform.
    • Familiarity with LLM tools (e.g., ChatGPT, GitHub Copilot) to enhance productivity.
  • Nice to Have:
    • Experience with data modeling.
    • Knowledge of additional programming languages (e.g., C#, Java, Scala).
    • dbt.
    • Snowflake.
    • Kafka.

Other Expectations

  • Familiarity with Scrum/Kanban methodologies and tools like Jira.
  • Ability to independently gather requirements and collaborate with business stakeholders.
  • Experience working in an international team.

Apply now
