Data Engineer

Tel Aviv-Yafo, Tel Aviv District, IL

Faye

Travel insurance that goes the distance to cover you at every step of your journey with incredible coverage, 24/7 support, instant claims processing and more.



Description

Hey, we’re Faye!

Faye is the first-ever digital, consumer-centric travel insurance for Americans, with a product that redefines travel coverage and care, turning it from a forgettable add-on into a must-have advantage that enhances the entire trip experience. Faye's whole-trip protection, coupled with its proprietary technology, enables 24/7 immediate assistance, claims processing and reimbursements anywhere in the world, setting a new standard and over-delivering in an industry synonymous with doing the opposite.

What we're looking for

As a Data Engineer at Faye, you will be instrumental in building and maintaining the data infrastructure that powers our analytics and decision-making processes. Working closely with the broader data team, R&D, and various stakeholders, you will design, implement, and optimize data pipelines and storage solutions, ensuring efficient and reliable data flow across the organization.

Responsibilities

  • Design, develop, and maintain scalable data pipelines using tools such as Airflow and dbt (an illustrative sketch follows this list).
  • Manage and optimize our data warehouse in Snowflake, ensuring data integrity and performance.
  • Collaborate with analytics and business teams to understand data requirements and deliver appropriate solutions.
  • Implement and maintain data integration processes between various systems and platforms.
  • Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption.
  • Stay up to date with the latest industry trends and technologies to continually improve our data infrastructure.
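
For illustration only, not part of the formal role description: a minimal sketch of the kind of Airflow-orchestrated dbt pipeline this role would own, assuming Airflow 2.4+ and dbt invoked via the BashOperator. The DAG name, schedule, project path, and target below are hypothetical placeholders.

```python
# Illustrative sketch only. Assumes Airflow 2.4+; the DAG name, schedule,
# dbt project path, and target are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="dbt_daily_models",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    # Build dbt models in the Snowflake warehouse
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )

    # Run dbt tests so downstream dashboards only see validated data
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )

    dbt_run >> dbt_test
```

In practice a dedicated integration (for example the dbt Cloud provider or astronomer-cosmos) might replace the BashOperator, but the shape of the orchestration, transform then test, stays the same.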

Requirements

  • 3+ years of experience in data engineering or a related field.
  • Proficiency in SQL and experience with modern lakehouse modeling.
  • Hands-on experience with data pipeline orchestration tools such as Apache Airflow.
  • Experience with dbt for data transformation and modeling.
  • Familiarity with data visualization tools such as Tableau.
  • Strong programming skills in languages such as Python or Java.
  • Hands-on experience with AWS data solutions (or equivalent solutions from another major cloud vendor).
  • Excellent problem-solving skills and attention to detail.
  • Strong communication skills and the ability to work collaboratively in a team environment.
  • A relevant academic degree in Computer Science, Engineering, or a related field (or equivalent work experience).

Preferred Qualifications

  • Experience in the travel or insurance industries.
  • Familiarity with Mixpanel or similar analytics platforms.
  • Knowledge of data security and privacy best practices.
