Data Engineer
Tel Aviv-Yafo, Tel Aviv District, IL
Applications have closed
Faye
Travel insurance that goes the distance to cover you at every step of your journey with incredible coverage, 24/7 support, instant claims processing and more.
Description
Hey, we're Faye!
Faye is the first-ever digital, consumer-centric travel insurance for Americans with a product redefining travel coverage and care, taking it from a forgettable add-on to a must-have advantage that enhances the entire trip experience. Faye's whole-trip protection, coupled with its proprietary technology, enables 24/7 immediate assistance, claims processing and reimbursements anywhere in the world, setting a new standard and over-delivering in an industry synonymous with doing the opposite.
What we're looking for
As a Data Engineer at Faye, you will be instrumental in building and maintaining the data infrastructure that powers our analytics and decision-making processes. Working closely with the broader data team, R&D, and various stakeholders, you will design, implement, and optimize data pipelines and storage solutions, ensuring efficient and reliable data flow across the organization.
Responsibilities
- Design, develop, and maintain scalable data pipelines using tools such as Airflow and DBT (a minimal illustrative sketch follows this list).
- Manage and optimize our data warehouse in Snowflake, ensuring data integrity and performance.
- Collaborate with analytics and business teams to understand data requirements and deliver appropriate solutions.
- Implement and maintain data integration processes between various systems and platforms.
- Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption.
- Stay updated with the latest industry trends and technologies to continually improve our data infrastructure.
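To give a concrete flavor of this kind of work, here is a minimal, illustrative sketch of an Airflow DAG that runs an extraction step and then a dbt build. It is not Faye's actual pipeline: the DAG id, schedule, file paths, and dbt project layout are assumptions made up for the example, and it assumes Airflow 2.4+ with dbt available on the worker.

# Illustrative only: run an assumed extraction script, then dbt models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_bookings_refresh",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ argument
    catchup=False,
) as dag:
    # Land raw data into the warehouse (script path is an assumption).
    extract_load = BashOperator(
        task_id="extract_load_bookings",
        bash_command="python /opt/pipelines/extract_load_bookings.py",
    )

    # Transform loaded data with dbt (project path and target are assumptions).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )

    extract_load >> dbt_run

Chaining the tasks with >> keeps orchestration in Airflow while dbt owns the SQL transformations, which is the usual division of labor between the two tools.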
Requirements
- 3+ years of experience in data engineering or a related field.
- Proficiency in SQL and experience with modern lakehouse modeling (a small Python-plus-SQL example follows this list).
- Hands-on experience with data pipeline orchestration tools like Apache Airflow.
- Experience with DBT for data transformation and modeling.
- Familiarity with data visualization tools such as Tableau.
- Strong programming skills in languages such as Python or Java.
- Hands-on experience with AWS data solutions (or other major cloud vendor).
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work collaboratively in a team environment.
- Relevant academic degree in Computer Science, Engineering, or related field (or equivalent work experience).
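For illustration only, the sketch below combines the SQL and Python skills listed above, using the snowflake-connector-python client to run a small aggregate query. The account settings, warehouse, database, schema, and the fct_policies table are all invented for the example and are not part of the posting.

# Illustrative only: query an assumed Snowflake table from Python.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # assumed warehouse
    database="ANALYTICS",      # assumed database
    schema="MARTS",            # assumed schema
)
cur = conn.cursor()
try:
    # Daily policy counts from a hypothetical fact table.
    cur.execute(
        """
        SELECT DATE_TRUNC('day', created_at) AS day, COUNT(*) AS policies
        FROM fct_policies
        GROUP BY 1
        ORDER BY 1 DESC
        LIMIT 7
        """
    )
    for day, policies in cur.fetchall():
        print(day, policies)
finally:
    cur.close()
    conn.close()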
Preferred Qualifications:
- Experience in the travel or insurance industries.
- Familiarity with Mixpanel or similar analytics platforms.
- Knowledge of data security and privacy best practices.
Tags: Airflow AWS Computer Science Data pipelines Data visualization Data warehouse dbt Engineering Java Pipelines Privacy Python R R&D Security Snowflake SQL Tableau