Data Engineer Denodo

Geneva, GE, Switzerland

Expleo

Expleo is a trusted partner for end-to-end, integrated engineering, quality services and management consulting for digital transformation.

Overview

Expleo offers a unique range of integrated engineering, quality and strategic consulting services for digital transformation. At a time of unprecedented technological acceleration, we are the trusted partner of innovative companies. We help them develop a competitive advantage and improve the daily lives of millions of people. Joining Expleo Switzerland means joining a group of 19,000 people across 32 countries, with a turnover of €1.5 billion in 2023. We offer:

  • Technical and human support for each project and effective career management.
  • Training to develop your professional skills.
  • Participation in special dedicated events.
  • A dynamic team.

To support our growth in the Geneva region of French-speaking Switzerland, we are looking for a DATA ENGINEER DENODO.

Responsibilities

As a valued member of the Data Engineering team, you will play a crucial role in maintaining and optimizing data pipelines within the Denodo platform.

Your primary responsibilities will include addressing evolving business requirements, refining ETL processes, and ensuring the seamless flow of energy data across our systems.

Design, Develop, and Maintain Robust Data Workflows:

  • Create and maintain scalable data workflows on Denodo.
  • Collaborate closely with cloud and frontend teams to unify data sources and establish a coherent data model.

Ensure Data Pipeline Reliability and Performance:

  • Guarantee the availability, integrity, and performance of data pipelines.
  • Proactively monitor workflows to maintain high data quality.

Collaborate for Data-Driven Insights:

  • Engage with cross-functional teams to identify opportunities for data-driven enhancements and insights.
  • Analyze platform performance, identify bottlenecks, and recommend improvements.

Documentation and Continuous Learning:

  • Develop and maintain comprehensive technical documentation for ETL implementations.
  • Stay abreast of the latest Denodo/Spark features and best practices, contributing to the continuous improvement of our data management capabilities.

Essential skills

Profile:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 3 years' experience as a Data Engineer, with a proven track record of implementing pipelines in Denodo.
  • Experience in cloud environments (AWS or Azure) is a plus.
  • Strong expertise in PySpark.
  • Proficiency in SQL and scripting languages (e.g., Python).
  • Excellent analytical and problem-solving skills.
  • Strong communication skills in French (both written and verbal) and fluency in English.

Desired skills

Additional Preferred Skills:

  • Familiarity with industry-specific regulations and compliance requirements.
  • Previous experience in the energy trading domain is a nice-to-have.
  • Ability to work effectively in a fast-paced, collaborative environment.
  • Detail-oriented with effective task prioritization skills.
  • Demonstrated adaptability and a keen willingness to learn new technologies and tools.
  • Strong customer orientation.
