Data Engineer
Lyon, France
ABOUT US
We’re like-minded, curious, excitable people at Chiliz, working well in teams spread across the globe. Chiliz is a global blockchain company that powers Socios.com, the creators of Fan Tokens and a popular fan rewards platform.
Socios has partnered with some of the world’s best teams, including Paris Saint-Germain, Juventus, FC Barcelona, Atlético de Madrid, UFC, Galatasaray, Manchester City FC, and many more.
The curious nature of a Chilizen is what drives this company forward, and since we’re looking to grow even more, apply for your dream role today.
OUR BRANDS & CHANNELS
We are building the web3 infrastructure for sports & entertainment!
Founded in 2018, Chiliz is a blockchain provider focused on the sports and entertainment industry. We build scalable, secure blockchain-enabled solutions that supercharge fan experiences using digital assets.
$CHZ is the native digital token for the Chiliz sports & entertainment ecosystem currently powering Socios.com and the Chiliz Chain blockchain.
Socios.com is a fan engagement and rewards app that allows fans to engage with their favourite teams and clubs through digital assets known as Fan Tokens.
THE ROLE
The Data Engineer will play a key role in designing, developing, and maintaining our data infrastructure, enabling the organization to extract valuable insights from diverse datasets. The candidate will collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to ensure the seamless flow of data and facilitate data-driven decision-making.
Duties & Responsibilities:
Implement a scalable data lake/lakehouse solution on AWS.
Design and implement big data pipelines using batch and streaming tools.
Develop data modelling workflows with DBT.
Orchestrate workflows with Airflow.
Collaborate with global teams to ensure data governance standards are met.
Build and deploy end-to-end CI/CD pipelines, incorporating DevOps best practices.
Requirements:
3+ years' experience in a similar role.
Bachelor's degree or higher in a quantitative/technical field (e.g. Computer Science, Engineering or similar).
Great at communicating ideas and working in a team.
Solid proficiency in Python for developing data ingestion pipelines.
Advanced knowledge of REST APIs and WebSockets.
Experience with Delta Lake infrastructures and processing large datasets with Apache Spark / PySpark.
AWS experience preferred, with proficiency in a wide range of AWS services (e.g. EC2, S3, Glue, Fargate, Lambda, IAM, EMR, Kinesis).
Working knowledge of DBT for Data Modeling.
Expert knowledge of SQL and experience using a cloud-based DWH such as Redshift, Snowflake, or Google BigQuery.
Experience designing CI/CD pipelines and working with IaC, Terraform preferred.
Experience building data pipelines and working with orchestration tools such as Airflow.
Good understanding of cloud infrastructure platforms.
Self-starter and detail-oriented. You can complete projects with minimal supervision.
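To give a flavour of the ingestion work described above, here is a minimal sketch of a paginated REST ingestion loop in Python. The `fetch` interface and the `{"items": ..., "next": ...}` payload shape are hypothetical, chosen for illustration; in a real pipeline `fetch` would wrap an HTTP client call against an actual API.

```python
from typing import Callable, Dict, Iterator, List

def paginate(fetch: Callable[[int], Dict], start_page: int = 0) -> Iterator[Dict]:
    """Yield records from a paginated REST-style endpoint.

    `fetch(page)` is a hypothetical callable returning a payload of the
    form {"items": [...], "next": bool} — an assumed shape, not a real API.
    """
    page = start_page
    while True:
        payload = fetch(page)
        # Emit each record from the current page before deciding to continue.
        yield from payload.get("items", [])
        if not payload.get("next"):
            break
        page += 1

# Usage with a stubbed two-page endpoint:
_pages = [
    {"items": [{"id": 1}, {"id": 2}], "next": True},
    {"items": [{"id": 3}], "next": False},
]
records: List[Dict] = list(paginate(lambda p: _pages[p]))
```

Injecting the fetch callable keeps the pagination logic separate from the HTTP layer, which makes the loop unit-testable without network access.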
What We Offer:
We offer you the chance to grow, to learn, to flex your creative muscles and to work on a project that is providing excitement to thousands of users.
Our interview phase is a 3-step process where you’ll be able to ask us anything and get to know your team too. From HR right through to your team lead, we need this process to work both ways: it's not just about you fitting in, but about us being the right fit for you too.
Are you ready to work with the world’s best teams? Are you happy to try, fail and bounce back? Are you excited to keep pushing the boundaries of technology?
We’ve got offices across the world, over 30 nationalities in our ranks and the most important superpower of all - flexibility. Our competitive salaries, wellness allowance, healthcare and pension plan are just the tip of the iceberg. You’ll gain friends, experience and a good challenge, we’ll gain you.
Are you ready?