Analytics Data Engineer

Paris

Smart AdServer

Equativ - Built To Fulfill The Promise Of Advertising Technology

About the team
At Equativ, we’re on a mission to develop advertising technologies that empower our customers to reach their digital business goals. This means we rely on massively scalable, widely distributed, highly available, and efficient software systems; the platform handles over 200 billion auctions per day and sends 20 TB of data per day to the data stack.
Our data engineering team is composed of 10 skilled engineers based in Paris. We are part of the R&D department, which comprises 150+ engineers spread across Paris, Nantes, Limoges, Krakow and Berlin, all working in an Agile environment and ready to tackle the most complex technical challenges.
Our Mission
Our Data Engineering team is central to Equativ’s data-centric business and is responsible for ingesting, transforming, modeling and redistributing all data coming from our ad tech platform. We aim to build scalable and robust Big Data platforms, from ingestion to business-actionable consumption. Our Big Data ecosystem must handle massive log ingestion, short- and long-term data storage, complex data modeling, and real-time and batch ELT, as well as provide external access through dedicated APIs.
We enhance and deliver Equativ’s data directly to our customers and throughout the company, whether for BI analysis, feeding data science models, customer reporting, invoicing, quick feedback loops and more.
We work with the latest technologies (ClickHouse, BigQuery, Airflow, Kafka, Flink, DBT…) at very high scale, driven by the objectives of building best-in-class data consumption capabilities, increasing modeling automation, optimizing performance and simplifying access to our raw data.
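For a flavor of the stack, here is a minimal, illustrative Airflow DAG sketching the kind of batch ELT orchestration described above. The DAG name, ingestion script and dbt project path are hypothetical placeholders, not Equativ’s actual pipeline, and the sketch assumes Airflow 2.4+.

```python
# Illustrative sketch only: daily batch ELT orchestration with Airflow + dbt.
# All names and paths below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_auction_logs_elt",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ parameter
    catchup=False,
) as dag:
    # Load the day's raw logs into the warehouse (in practice this could be
    # a Kafka -> ClickHouse/BigQuery sink rather than a script).
    ingest = BashOperator(
        task_id="ingest_raw_logs",
        bash_command="python ingest_logs.py --date {{ ds }}",  # hypothetical script
    )

    # Transform raw logs into business-facing data marts with dbt.
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
    )

    # Run dbt tests as a lightweight data quality gate.
    test = BashOperator(
        task_id="run_dbt_tests",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    ingest >> transform >> test
```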
What you will do
As an Analytics Data Engineer at Equativ, you'll be a pivotal player in driving our data-driven approach to adtech. You'll work alongside a talented team to build innovative data models and reporting tools that support our product, R&D, and analytics teams. In this role, you will apply best-in-class data modeling strategies, orchestrate data transformations and optimize the cost and performance of our data warehouses (ClickHouse, BigQuery, Snowflake):

  • Be responsible for the performance, integrity and security of our data warehouses, and be highly involved in planning, developing and troubleshooting our data marts and orchestrators:
    - Provide guidance and support to other analysts and engineers
    - Assist with schema design, code review and SQL query tuning
    - Plan and carry out schema and data upgrades
    - Proactively and regularly recommend modeling improvements
  • Develop best-in-class data modeling expertise across all of Equativ’s business lines and work closely with product teams to streamline our data models
  • Contribute to defining the product line data roadmap, in coordination with the teams that derive value from our data (product, R&D feature teams, analysts and data scientists), in order to build best-in-class data platforms that generate insights for Equativ’s analytics
  • Perform end-to-end monitoring to ensure high availability and reliability of production data in our warehouses. This also includes setting up a robust data quality framework to improve trust in our modeling (see the sketch after this list)
  • Work closely with other data engineers on the evolution and industrialization of our data pipelines and reporting APIs, to facilitate modeling and access to the data in our warehouses
  • Take part in improving and deploying data engineering standards, documentation and operational guidelines around data usage at Equativ (data catalogs, schemas, training…)
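As a concrete and purely illustrative example of the kind of check a data quality framework automates, the sketch below validates the row count and freshness of a mart table. sqlite3 stands in for a real warehouse client (e.g. clickhouse-driver or google-cloud-bigquery); the table name, columns and thresholds are hypothetical.

```python
# Illustrative sketch only: row-count and freshness checks for a mart table.
# sqlite3 stands in for a real warehouse client; names and thresholds are
# hypothetical.
import sqlite3
from datetime import datetime, timedelta

MIN_ROWS = 1_000               # hypothetical minimum daily row count
MAX_LAG = timedelta(hours=2)   # hypothetical freshness budget


def check_auction_mart(conn: sqlite3.Connection) -> list[str]:
    """Return a list of failed checks for a (hypothetical) auctions_daily mart."""
    failures = []

    # Volume check: a sudden drop in row count often signals a broken upstream load.
    rows = conn.execute("SELECT COUNT(*) FROM auctions_daily").fetchone()[0]
    if rows < MIN_ROWS:
        failures.append(f"row count {rows} below minimum {MIN_ROWS}")

    # Freshness check: the newest load timestamp must be within the lag budget.
    latest = conn.execute("SELECT MAX(loaded_at) FROM auctions_daily").fetchone()[0]
    if latest is None or datetime.utcnow() - datetime.fromisoformat(latest) > MAX_LAG:
        failures.append(f"stale data (latest load: {latest})")

    return failures
```

Checks like these would typically run after each load and alert the team, or block downstream models, on failure.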

About you

  • Master’s degree in Computer Science or a similar technical field of study
  • 3+ years of experience as an analytics data engineer, BI engineer, analyst or data engineer
  • Product-driven and willing to gain deep expertise in the adtech business
  • Passion for working with large datasets and a commitment to data-driven decision-making
  • SQL mastery is a must
  • Experience in data warehouse management (ClickHouse, BigQuery, Snowflake) and data transformation tools (knowledge of DBT is a plus)
  • Knowledge of at least one data programming language is a plus (Python, Scala, Java), along with a good understanding of the software development process (Git, CI/CD, testing, Scrum)
  • Entrepreneurial spirit, strong business acumen and the ability to identify opportunities for improvement
  • Working proficiency and communication skills in verbal and written English

Tags: Agile Airflow APIs Big Data BigQuery CI/CD Computer Science Data pipelines Data quality Data warehouse dbt ELT Engineering Flink Git Java Kafka Pipelines Python R&D Scala Scrum Security Snowflake SQL

Region: Europe
Country: France
