Full Stack Data Engineer
Tel Aviv-Jaffa, Tel Aviv District, IL
Darrow
Darrow offers AI-driven legal intelligence to connect top plaintiffs' attorneys with high-value, impactful cases.
Description
We’re Darrow - a fast-growing legal tech startup with an open, action-based culture unlike any other. We are committed to pursuing our vision of "frictionless justice," using advanced Machine Learning & AI to revolutionize the justice system.
We’re not just looking for someone to maintain pipelines, databases, or APIs. This role goes beyond typical engineering, offering you a unique opportunity to influence business-critical decisions, shape the technology stack, and work at the intersection of data, business, and product innovation.
If you are passionate about solving complex problems and thrive in a dynamic, entrepreneurial environment, this is your chance to make an impact.
Responsibilities
- Design, implement, and manage scalable, reliable data products aligned with business goals
- Implement and manage ETL pipelines and backend infrastructure
- Collaborate with development teams to improve deployment processes
- Design and implement full-stack products
Why you should join us:
- Join at a pivotal moment: We are in the early stages of developing our data platform, and your technical expertise will shape our architecture and technology choices for years to come.
- Product & Impact-Driven Mindset: You won't just work with data; you'll collaborate closely with product managers, data scientists, engineers, and customers to ensure our solutions drive meaningful results across all company products.
- Core Team Influence: The Data team is central to everything we do. Your work will have a direct impact on the success of our products and the company as a whole. You'll collaborate with multiple departments, providing the backbone that powers our mission of frictionless justice.
Requirements
- 3+ Years of ETL Pipeline Expertise: Proven experience designing, developing, and maintaining ETL pipelines, focusing on data integration, cleaning, and transformation.
- 3+ Years of Full-Stack Expertise: Proven experience designing, developing, and maintaining back-end and front-end products.
- Cloud Data Warehousing Experience: Hands-on experience with cloud-based data warehouses such as Snowflake, BigQuery, or Redshift.
- Programming Skills: Proficiency in SQL and Python (or other scripting languages). Familiarity with orchestration tools like Airflow is a plus.
- Data Quality and Monitoring: Experience with data validation, testing, and monitoring tools to ensure the integrity and quality of data throughout the pipeline.
- Collaboration Skills: Strong ability to work collaboratively in cross-functional teams, translating business requirements into technical solutions that have a tangible impact on the business.
- Problem-Solving: Ability to troubleshoot complex issues and optimize data workflows to improve performance and reliability.