Data Engineer Intern

Luxembourg, LU


About CANAL+ Group

 

Founded as a French subscription-TV channel 40 years ago, CANAL+ is now a global media and entertainment company. The group has 26.9 million subscribers worldwide, over 400 million monthly active users on its OTT and video streaming platforms, and a total of more than 9,000 employees. It generates revenues in 195 countries and operates directly in 52 countries, with leading positions in Pay-TV in 20 of them. CANAL+ operates across the entire audio-visual value chain, including production, broadcast, distribution and aggregation.

 

It is home to STUDIOCANAL, a leading film and television studio with worldwide production and distribution capabilities; Dailymotion, one of the world’s largest short-form video streaming platforms; Thema, a production and distribution company specialized in creating and distributing diverse content and channels; and telecommunication services, through GVA in Africa and CANAL+ Telecom in the French overseas jurisdictions and territories. It also operates the iconic performance venues L’Olympia and Théâtre de l’Œuvre in France and CanalOlympia in Africa.

 

CANAL+ also has significant equity stakes across Africa, Europe and Asia, namely in MultiChoice (the Pay-TV leader in English- and Portuguese-speaking Africa), Viaplay (the Pay-TV leader in Scandinavia) and Viu (a leading OTT platform in South-East Asia). From the CANAL+ office in Luxembourg, the group coordinates its activities for Austria, Belgium, the Czech Republic, Germany, Hungary, the Netherlands, Romania, Slovakia, Switzerland and SPI.

 

Job Objective

 

Design, implement, and optimize scalable data pipelines to transform multi-terabyte OTT usage data, CRM records, and marketing datasets into reliable and trusted assets within Snowflake. Enable advanced analytics, Power BI reporting, and real-time personalization through robust data engineering practices and continuous pipeline monitoring.

 

The position is based in Luxembourg and reports to the Head of Business Intelligence & Data Analytics.

 

Your responsibilities

 

1. Pipeline Development

  • Design batch & streaming jobs with Airflow, dbt, and Python/SQL/JS on AWS & GCP (see the DAG sketch below).
  • Integrate multiple CRM sources, OTT play-logs, and campaign data.
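To make the first bullet concrete, here is a minimal Airflow DAG sketch in Python (assuming Airflow 2.4+). The DAG id, task bodies, and the CRM/Snowflake handoff are hypothetical placeholders, not the team's actual pipeline:

    # Minimal sketch of a daily batch pipeline, assuming Airflow 2.4+.
    # DAG id, tasks and source/target names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_crm(**context):
        # Placeholder: pull the day's CRM delta from a hypothetical source.
        print(f"extracting CRM records for {context['ds']}")


    def load_to_snowflake(**context):
        # Placeholder: COPY the staged files into a Snowflake staging table.
        print(f"loading staged data for {context['ds']}")


    with DAG(
        dag_id="crm_ott_daily",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ):
        extract = PythonOperator(task_id="extract_crm", python_callable=extract_crm)
        load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

        extract >> load  # extract must finish before the load starts

In a real pipeline the placeholder callables would hand off to staging on S3/GCS and a Snowflake COPY, with retries and SLAs configured per task.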

 

2. Data Quality & Monitoring

  • Implement dbt tests, Airflow alerts, and Power BI monitors for freshness & completeness (see the freshness sketch below).
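In dbt itself, freshness and completeness are usually declared as source freshness rules and schema tests; the standalone Python sketch below shows the same freshness idea via the Snowflake connector. The staging.ott_play_logs table, loaded_at column, and 6-hour SLA are all hypothetical:

    # Standalone freshness check using the Snowflake Python connector.
    # Table, column and SLA are hypothetical; in dbt this would be a
    # source freshness rule rather than a script.
    import os

    import snowflake.connector

    MAX_LAG_HOURS = 6  # hypothetical freshness SLA

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    cur = conn.cursor()
    cur.execute(
        "SELECT DATEDIFF(hour, MAX(loaded_at), CURRENT_TIMESTAMP()) "
        "FROM staging.ott_play_logs"  # hypothetical staging table
    )
    lag_hours = cur.fetchone()[0]
    if lag_hours is None or lag_hours > MAX_LAG_HOURS:
        raise RuntimeError(f"ott_play_logs is stale: lag {lag_hours}h, SLA {MAX_LAG_HOURS}h")
    print(f"freshness OK: {lag_hours}h lag")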

 

3. Warehouse & Architecture

  • Manage Snowflake layers (staging → cleansed → marts).
  • Expose governed views to BI, data science, and personalisation APIs (see the model sketch below).
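As one hypothetical illustration of the cleansed → marts step, here is a dbt Python model (supported on Snowflake via Snowpark since dbt 1.3) that aggregates a cleansed play-log model into a subscriber-level mart; the model and column names are invented for the example:

    # Hypothetical dbt Python model (dbt 1.3+ on Snowflake/Snowpark) that
    # builds a subscriber-level mart from a cleansed layer; all model and
    # column names are invented for the example.
    import snowflake.snowpark.functions as F


    def model(dbt, session):
        dbt.config(materialized="table")

        plays = dbt.ref("cleansed_ott_play_logs")  # hypothetical upstream model

        # One row per subscriber with total viewing time, for BI and the
        # personalisation APIs to consume.
        return (
            plays.group_by("subscriber_id")
            .agg(F.sum("watch_seconds").alias("total_watch_seconds"))
        )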

 

4. FinOps & Performance

  • Tune storage/compute (partitions, clusters, warehouses).
  • Analyse cost reports (Snowflake, S3, GCS) and propose savings (see the credit-usage sketch below).
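On the Snowflake side, credit consumption is exposed through the built-in ACCOUNT_USAGE views, which are a natural starting point for this kind of cost analysis. A minimal sketch (credentials from environment variables are an assumption, as in the freshness sketch above):

    # Rank warehouses by credits consumed over the last 30 days, using
    # Snowflake's ACCOUNT_USAGE share.
    import os

    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    cur = conn.cursor()
    cur.execute("""
        SELECT warehouse_name, ROUND(SUM(credits_used), 1) AS credits_30d
        FROM snowflake.account_usage.warehouse_metering_history
        WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
        GROUP BY warehouse_name
        ORDER BY credits_30d DESC
    """)
    for warehouse, credits in cur.fetchall():
        print(f"{warehouse}: {credits} credits")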

 

5. DevOps

  • Maintain CI/CD with GitHub Actions & Terraform.
  • Automate documentation generation (dbt docs) and publish it to the Notion workspace (see the CI sketch below).
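The CI steps themselves would live in a GitHub Actions workflow; purely as a sketch of what such a job runs, here they are as a Python script driving the standard dbt CLI (publishing the generated docs to Notion would be a separate, team-specific step):

    # The steps a CI job would run, sketched as a Python script driving
    # the standard dbt CLI; fails fast on any non-zero exit.
    import subprocess

    for cmd in (
        ["dbt", "deps"],              # install package dependencies
        ["dbt", "build"],             # run models and tests together
        ["dbt", "docs", "generate"],  # emit the static documentation site
    ):
        subprocess.run(cmd, check=True)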

Skills & Experience

  • Final-year Master’s student (M2), or third-year student at an engineering school (école d’ingénieurs), specialising in Computer Science, Data Engineering or a related field.
  • Strong foundation in SQL and Python, with initial hands-on experience using cloud platforms or ETL tools.
  • Analytical mindset, curiosity, and a focus on optimisation & reliability.
  • Fluent in English (spoken and written); French or Dutch is a plus.
Category: Engineering Jobs

Tags: Airflow APIs Architecture AWS Business Intelligence CI/CD Computer Science Data Analytics Data pipelines Data quality dbt DevOps Engineering ETL GCP GitHub Pipelines Power BI Python Snowflake SQL Streaming Terraform

Region: Europe
Country: Luxembourg
