Data Engineer

South Jakarta, Indonesia


Devoteam

Transform your business with Devoteam, the AI-driven tech consulting firm. Become a leading company embracing AI for sustainable value.


Company Description

Devoteam is a leading consulting firm focused on digital strategy, tech platforms, and cybersecurity.

By combining creativity, tech, and data insights, we empower our customers to transform their business and unlock the future.

With 25 years of experience and 8,000 employees across Europe and the Middle East, Devoteam promotes responsible tech for people and works to create better change.

#Creative Tech for Better Change

In January 2021, Devoteam launched its new strategic plan, Infinite 2024, with the ambition of becoming the #1 EMEA partner of the leading cloud-based platform companies (AWS, Google Cloud, Microsoft, Salesforce, ServiceNow), further reinforced by deep expertise in digital strategy, cybersecurity, and data.

Job Description

The Data Engineer will be responsible for the following activities: 

  • Work closely with data architects and other stakeholders to design scalable and robust data architectures that meet the organization's requirements
  • Develop and maintain data pipelines that extract data from various sources, transform it to ensure quality and consistency, and load it into data warehouses or other storage systems
  • Manage data warehouses and data lakes, ensuring their performance, scalability, and security
  • Integrate data from different sources, such as databases, APIs, and external systems, to create unified and comprehensive datasets
  • Perform data transformations and implement Extract, Transform, Load (ETL) processes to convert raw data into formats suitable for analysis and reporting (a minimal sketch follows this list)
  • Collaborate with data scientists, analysts, and other stakeholders to establish data quality standards and implement data governance practices
  • Optimize data processing and storage systems for performance and scalability
  • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions
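
For illustration, here is a minimal sketch of such an ETL pipeline in Python, assuming pandas, requests, and SQLAlchemy are available; the API URL, connection string, and table name (e.g. ORDERS_API_URL, orders_clean) are hypothetical placeholders, not part of this role's actual stack.

```python
# Minimal ETL sketch: extract from a REST API, transform with pandas,
# and load into a SQL data warehouse. All endpoints, credentials, and
# table names below are hypothetical placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

ORDERS_API_URL = "https://example.com/api/orders"           # hypothetical source
WAREHOUSE_DSN = "postgresql://user:password@warehouse/dwh"  # hypothetical target


def extract() -> pd.DataFrame:
    """Pull raw records from the source API."""
    response = requests.get(ORDERS_API_URL, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply basic quality and consistency rules."""
    cleaned = raw.drop_duplicates(subset="order_id")
    cleaned = cleaned.dropna(subset=["order_id", "amount"])
    cleaned["amount"] = cleaned["amount"].astype(float)
    return cleaned


def load(df: pd.DataFrame) -> None:
    """Append the processed data to a warehouse table."""
    engine = create_engine(WAREHOUSE_DSN)
    df.to_sql("orders_clean", engine, if_exists="append", index=False)


if __name__ == "__main__":
    load(transform(extract()))
```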

Qualifications


  • Programming Skills: Proficiency in programming languages such as Python, Java, Scala, or SQL is essential for data engineering roles. Data engineers should have experience in writing efficient and optimized code for data processing, transformation, and integration.


  • Database Knowledge: Strong knowledge of relational databases (e.g., SQL) and experience with database management systems (DBMS) is crucial. Familiarity with data modeling, schema design, and query optimization is important for building efficient data storage and retrieval systems.


  • Big Data Technologies: Understanding and experience with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka is highly beneficial. Knowledge of distributed computing and parallel processing frameworks is valuable for handling large-scale data processing.


  • ETL and Data Integration: Proficiency in Extract, Transform, Load (ETL) processes and experience with data integration tools like Apache NiFi, Talend, or Informatica is desirable. Knowledge of data transformation techniques and data quality principles is important for ensuring accurate and reliable data.


  • Data Warehousing: Familiarity with data warehousing concepts and experience with popular data warehousing platforms like Amazon Redshift, Google BigQuery, or Snowflake is advantageous. Understanding dimensional modeling and experience in designing and optimizing data warehouses is beneficial.


  • Cloud Platforms: Knowledge of cloud computing platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP) is increasingly important. Experience in deploying data engineering solutions in the cloud and utilizing cloud-based data services is valuable (see the BigQuery sketch after this list).


  • Data Pipelines and Workflow Tools: Experience with data pipeline and workflow management tools such as Apache Airflow, Luigi, or Apache Oozie is beneficial. Understanding how to design, schedule, and monitor data workflows is essential for efficient data processing (see the Airflow sketch after this list).


  • Problem-Solving and Analytical Skills: Data engineers should have strong problem-solving abilities and analytical thinking to identify data-related issues, troubleshoot problems, and optimize data processing workflows.


  • Communication and Collaboration: Effective communication and collaboration skills are crucial for working with cross-functional teams, including data scientists, analysts, and business stakeholders. Data engineers should be able to translate technical concepts into clear and understandable terms.
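
To illustrate the workflow-orchestration item above, here is a minimal Apache Airflow DAG sketch in Python, assuming Airflow 2.4+ and its TaskFlow API; the DAG name, schedule, and task bodies are hypothetical placeholders that only show how extract, transform, and load steps can be chained and scheduled.

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load chain.
# Task bodies are placeholders standing in for real pipeline logic.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 12, 1), catchup=False)
def daily_orders_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: would query the source API or database here.
        return [{"order_id": 1, "amount": 42.0}, {"order_id": None, "amount": 7.5}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder: drop incomplete records.
        return [row for row in rows if row.get("order_id") is not None]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: would write the records to the warehouse here.
        print(f"Loading {len(rows)} rows")

    load(transform(extract()))


daily_orders_pipeline()
```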
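
And since the role emphasizes GCP, here is a minimal sketch of loading a processed dataset into BigQuery with the official Python client; the project, dataset, and table IDs are hypothetical, and credentials are assumed to come from the environment (application-default credentials).

```python
# Minimal BigQuery load sketch using the google-cloud-bigquery client.
# Project, dataset, and table names are hypothetical placeholders.
import pandas as pd
from google.cloud import bigquery


def load_to_bigquery(df: pd.DataFrame,
                     table_id: str = "my-project.analytics.orders_clean") -> None:
    client = bigquery.Client()  # uses application-default credentials
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_APPEND")
    job = client.load_table_from_dataframe(df, table_id, job_config=job_config)
    job.result()  # wait for the load job to finish


if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2], "amount": [42.0, 13.5]})
    load_to_bigquery(sample)
```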

Additional Information

Education and Experience

  • Bachelor’s degree in Engineering required.
  • A minimum of two years of related experience is highly preferred.
  • Two GCP certifications (to be obtained within three months of joining).

Status: Full-Time

Duration: -

Beginning date: December 2024

The Devoteam Group is committed to equal opportunities, promoting its employees on the basis of merit and actively fighting against all forms of discrimination. We believe that diversity contributes to the creativity, dynamism, and excellence of our organization. All our positions are open to people with disabilities.


Category: Engineering Jobs


Region: Asia/Pacific
Country: Indonesia
