BI Data Engineer

Tel Aviv-Yafo, Tel Aviv District, IL

Skai

With Skai’s omnichannel marketing platform, brands and agencies get data-driven marketing intelligence, connected media, and measurement technology at scale.



Description

Who are we?

Skai (formerly Kenshoo) is a leading omnichannel marketing platform that leverages advanced AI and machine learning to deliver intelligent, data-driven solutions for performance media, enabling smarter decision-making, increased efficiency, and maximized returns for businesses around the world. Its partners include Google, Meta, Amazon, Microsoft, and more. Approximately $7 billion in ad spend is managed on the Skai™ platform every year.

Established in 2006, we’re 700 employees strong. We work in a hybrid model with a great home/office mix.


What will you do?

As a BI Data Engineer at Skai, you will play a key role in designing, developing, and maintaining scalable data solutions that generate business impact. 

Key Responsibilities:

  • Design and implement robust data models, ETL pipelines, and data flows that support advanced analytics and reporting across the organization.
  • Develop new data solutions from scratch while maintaining and enhancing existing infrastructure.
  • Partner with the Data Architect to manage and evolve Skai’s data architecture, ensuring it accurately reflects and supports the evolving business model.
  • Work closely with business stakeholders to convert complex requirements into actionable data insights that drive informed decisions.
  • Become a focal point for all data-related matters in the organization.



Requirements

  • Bachelor's degree in Industrial Engineering, Information Systems, or a related field.
  • 3+ years of experience in a BI Developer or Data Engineer role.
  • Advanced SQL.
  • Python scripting – for data manipulation, automation, and pipeline support.
  • Experience with BI tools such as Tableau or QlikView for building dashboards and visual insights.
  • Data warehouse experience.
  • ETL optimization – experience tuning SQL and improving data workflows.
  • Experience with Snowflake and Airflow – a big advantage.
  • Exposure to CI/CD pipelines with Jenkins – an advantage.
  • Familiarity with modern data stack tools such as Databricks, Kafka, Spark, or Docker – a plus.




