Senior Data Engineer
Tel Aviv-Yafo, Tel Aviv District, IL
IVIX
Detect financial crime and close the tax gap in the digital age with IVIX.
Description
We’re looking for a brilliant, hands-on, and mission-driven Senior Data Engineer to join our R&D team at IVIX. Our technology helps governments uncover hidden business activity and fight financial crime by turning massive volumes of public web data into actionable intelligence. You’ll play a central role in designing and scaling the data infrastructure and pipelines that process hundreds of millions of records across diverse domains, from eCommerce to crypto to social media.
Equal Opportunity Employer
About the company
IVIX is the first AI-powered solution designed specifically to address a $20 trillion problem: illuminating the global shadow economy. IVIX leverages Open-Source Intelligence (OSINT) and cutting-edge technologies to reveal illicit business activity around the world, empowering governments in their mission to fight financial crime and close the tax gap.
IVIX employs a variety of AI tools (deep neural networks, large language models, and predictive modeling) as well as advanced data analytics to rapidly pinpoint large-scale illicit business activity, so government authorities can combat financial crime in the digital age.
Led by security, tech, tax and financial crime experts, and advised by a diverse team of former IRS commissioners, IVIX works with dozens of state and federal governments globally.
About the position
- Build and maintain robust, scalable data pipelines using Spark, Kafka, and Python (see the illustrative sketch after this list).
- Architect and optimize big data systems that fuel our AI, ML, and graph-based analytics layers.
- Develop ingestion frameworks for processing unstructured and structured public web data at scale.
- Collaborate with data scientists, software engineers, and product teams to support business-driven data flows.
- Manage and monitor workloads running in AWS (e.g., EMR, Glue, S3, Redshift, Athena, Lambda).
- Take ownership of data quality, reliability, and performance across our pipelines.
- Drive improvements in observability, tooling, and infrastructure resilience.
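To give a concrete flavor of the pipeline work described above, the sketch below shows a minimal Spark Structured Streaming job that consumes records from Kafka and lands them in S3. It is illustrative only: the broker address, topic, record schema, and bucket paths are hypothetical placeholders, not details of IVIX's actual stack.

```python
# Minimal, illustrative sketch: ingest public-web records from Kafka with
# Spark Structured Streaming and land them as Parquet in S3.
# Broker, topic, schema, and bucket names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType

spark = (
    SparkSession.builder
    .appName("public-web-ingest")
    .getOrCreate()
)

# Hypothetical schema for an ingested public-web record.
record_schema = StructType([
    StructField("source", StringType()),
    StructField("url", StringType()),
    StructField("payload", StringType()),
])

# Read the raw stream from Kafka (placeholder broker and topic).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "public-web-records")
    .load()
)

# Parse the JSON payload into typed columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), record_schema).alias("r"))
       .select("r.*")
)

# Write micro-batches to a bronze layer in S3 (placeholder bucket).
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/bronze/public_web/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/public_web/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```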
Requirements
- 5+ years of hands-on experience in data engineering roles.
- Strong programming skills in Python (Scala/Java a plus).
- Proven experience with Apache Spark and streaming technologies like Kafka.
- Solid background in building scalable data pipelines and working with big data architectures.
- Experience working in cloud environments, preferably AWS.
- Experience with orchestration tools like Airflow (a minimal DAG sketch follows this list).
- Familiarity with data lakes, data warehouses, and schema design.
- Excellent analytical skills, self-motivated attitude, and strong communication skills.
- BSc in Computer Science, Engineering, or equivalent.
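As an illustration of the orchestration requirement, here is a minimal Airflow 2.x DAG with two dependent tasks. The DAG id, schedule, and task callables are hypothetical placeholders.

```python
# Illustrative only: a minimal Airflow 2.x DAG showing task dependencies.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull a batch of public web records from a source system.
    pass


def load():
    # Placeholder: write the transformed batch to the data lake.
    pass


with DAG(
    dag_id="public_web_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```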