Data Engineer

Tel Aviv-Yafo, Tel Aviv District, IL

Blockaid

Trusted by chains, protocols, wallets, exchanges, banks, and digital asset managers to understand and secure what matters most.

Description

Blockaid builds trust in blockchain technology through better security. Our on-chain security platform is trusted by the biggest names in web3—including Coinbase, Metamask, Uniswap, Backpack, Stellar, and more—to detect, understand, and automatically prevent or mitigate damage from fraud, scams, hacks, and financial risks. Blockaid is backed by leading global venture capital firms, including Ribbit Capital, Sequoia Capital, and Cyberstarts — investors known for supporting some of the most innovative and impactful companies in tech and cybersecurity.

As we continue to expand and evolve, we’re looking for a Data Engineer to join our growing team and help shape the future of trust and security in the decentralized world. 

We have a large amount of varied, exciting, and unique data on our hands, and we’re already squeezing value out of it for our customers, but there’s so much more in there. Your goal will be to help structure this data in ways that let both our business users and our research group dig deeper. As our very first dedicated data engineer, you’ll have a huge impact, but you’ll also need the independence and proactiveness to own it.

What You’ll Do:

  • Design and build complex data pipelines to ingest, process, and transform data from a variety of sources, especially logs and textual inputs.
  • Collaborate closely with Software Engineering and Product teams to ensure data is accessible and usable.
  • Develop efficient ETL processes using frameworks such as dbt, Airflow, or their equivalents (an illustrative sketch follows this list).
  • Own and optimize your data environment (e.g., Snowflake), focusing on performance tuning, governance, and reliability.
  • Build dashboards for various company-wide use cases.
  • Implement best practices for data management, quality assurance, and security within cloud infrastructures (AWS).
  • Enable ML and analytics teams by building pipelines that feed feature stores and model training workflows.
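
The role doesn’t prescribe specific tooling, but as a rough, non-authoritative sketch of the kind of pipeline described above, here is a minimal Airflow TaskFlow DAG (assuming Airflow 2.4 or newer): an hourly ingest, transform, and load flow over textual log payloads. The DAG name, stub records, and load step are hypothetical placeholders, not Blockaid’s actual setup.

    # Illustrative sketch only; assumes Airflow 2.4+. Names and data are hypothetical.
    from datetime import datetime
    import json

    from airflow.decorators import dag, task


    @dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
    def log_ingestion_pipeline():
        @task
        def extract() -> list[dict]:
            # A real task would pull raw log lines from object storage (e.g. an S3 prefix).
            return [{"raw": '{"tx": "0xabc", "verdict": "malicious"}'}]

        @task
        def transform(records: list[dict]) -> list[dict]:
            # Parse the textual payloads into structured rows.
            return [json.loads(r["raw"]) for r in records]

        @task
        def load(rows: list[dict]) -> None:
            # Placeholder: a real task would bulk-load into the warehouse
            # (e.g. Snowflake) via a provider hook or a staged COPY.
            print(f"loaded {len(rows)} rows")

        load(transform(extract()))


    log_ingestion_pipeline()

Splitting extract, transform, and load into separate tasks keeps each step independently retryable and observable, which lines up with the governance and reliability points above.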

Requirements

  • 4+ years of hands-on Data Engineering experience in a cybersecurity or security-adjacent environment.
  • Proficiency in Python and SQL, with proven experience handling large or unstructured data (see the sketch after this list).
  • Familiarity with data warehouse technologies (Snowflake, BigQuery).
  • Experience with big data infrastructure (Snowflake/Databricks), orchestration tools (Airflow), and cloud platforms (AWS).
  • Solid understanding of data governance, quality assurance, and pipeline observability.
  • Ability to deliver end-to-end solutions—from ingestion to production-ready datasets with minimal supervision.
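
As a hedged illustration of the Python and SQL work these requirements imply, the sketch below parses semi-structured log lines into rows and loads them through a generic DB-API cursor. The log format, regular expression, and raw_events table are hypothetical, and a production pipeline would more likely use a bulk or staged load into the warehouse than row-level inserts.

    # Illustrative sketch only; the log format and target table are hypothetical.
    import json
    import re
    from typing import Iterable, Iterator

    LINE_RE = re.compile(r"^(?P<ts>\S+) (?P<level>\w+) (?P<payload>\{.*\})$")


    def parse_lines(lines: Iterable[str]) -> Iterator[dict]:
        # Yield structured records, skipping lines that don't match the expected shape.
        for line in lines:
            match = LINE_RE.match(line.rstrip("\n"))
            if not match:
                continue
            payload = json.loads(match.group("payload"))
            yield {
                "ts": match.group("ts"),
                "level": match.group("level"),
                "event": payload.get("event"),
                "chain": payload.get("chain"),
            }


    def load(cursor, records: Iterable[dict]) -> None:
        # Parameterized insert via any DB-API 2.0 cursor; the Snowflake and
        # PostgreSQL connectors both expose this interface.
        cursor.executemany(
            "INSERT INTO raw_events (ts, level, event, chain)"
            " VALUES (%s, %s, %s, %s)",
            [(r["ts"], r["level"], r["event"], r["chain"]) for r in records],
        )

Keeping the parser as a generator lets arbitrarily large log files stream through without being held in memory.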

Nice to Have:

  • Experience in cybersecurity, threat intelligence, or blockchain data processing.
  • Experience orchestrating large-scale ETLs in Snowflake.
  • Experience using dbt in production.
  • Knowledge of OLTP databases (e.g., PostgreSQL).

Category: Engineering Jobs

Tags: Airflow AWS Big Data BigQuery Blockchain Databricks Data governance Data management Data pipelines Data warehouse dbt Engineering ETL Machine Learning Model training Pipelines PostgreSQL Python Research Security Snowflake SQL Unstructured data

Region: Middle East
Country: Israel
