Senior ML Engineer
Tel Aviv-Jaffa, Tel Aviv District, IL
Darrow
Darrow offers AI-driven legal intelligence to connect top plaintiffs' attorneys with high-value, impactful cases.
About Darrow
Darrow is on a mission to pursue frictionless justice—using data and AI to uncover large-scale harms and bring justice to the people.
We’re a fast-growing startup of more than 150 team members based in Tel Aviv and New York, backed by world-class investors including Georgian, F2 Venture Capital, Entree Capital, NFX, and Y Combinator.
At Darrow, we build products that turn data into justice. Our platform uncovers legal violations—like environmental crimes, financial fraud, and privacy breaches—and helps bring legal action to scale. The result? A justice system that works better, faster, and more fairly.
Our Research team is growing, and we’re looking for a Senior ML Engineer to help us push the boundaries of applied machine learning and large language models (LLMs) in production.
About the Role
As a Senior ML Engineer at Darrow, you will lead the design, deployment, and maintenance of production-ready machine learning systems that power legal insights at scale. You’ll work across disciplines—integrating LLMs, building pipelines, and optimizing infrastructure—to deliver real-world impact.
You’ll take full ownership of the systems you build, from selecting the tools and platforms to deploying services in production. You’ll work closely with DevOps and Data Engineering to ship robust solutions that drive meaningful change in the world.
Responsibilities
- Design and implement production-grade ML systems, including APIs, batch jobs, and streaming pipelines, using Databricks, MLflow, and AWS/GCP.
- Build and manage ML infrastructure, including data pipelines, model training, deployment, and monitoring.
- Develop and maintain end-to-end ML/LLM pipelines—from data ingestion and labeling to synthetic data generation, model registry, and rollout.
- Own and improve MLOps practices: automated testing, CI/CD, monitoring, alerting, and model governance.
- Write clean, maintainable Python code and uphold best practices in engineering and documentation.
- Collaborate with DevOps and Data Engineering teams to scale systems and improve performance.
- Research and recommend the best tools, platforms, and practices to support ML at scale.
Requirements
- BSc in Computer Science or a related field.
- 6+ years of experience building and deploying ML systems in production environments.
- Proficiency in Python and production frameworks and platforms such as FastAPI, Databricks, SageMaker, and MLflow.
- Proven track record in deploying and maintaining ML/LLM services (APIs, microservices, serverless, or containerized).
- Strong understanding of software engineering fundamentals: object-oriented design, testing, version control, CI/CD, and performance optimization.
- Experience working with agentic workflows or LLM-based agents.
- Ability to work independently and break down complex, ambiguous problems into structured solutions.
- Strong communication skills—able to explain technical concepts to both technical and non-technical stakeholders.
Advantages
- Hands-on experience with Kubernetes, Airflow, Spark, ArgoCD, and Docker.
- Experience working with databases such as Elasticsearch, PostgreSQL, and vector databases, along with strong SQL skills.
- Experience fine-tuning or integrating open-source LLMs in production environments (e.g., RAG, LoRA, agent frameworks).
- Contributions to open-source ML or MLOps projects.
Perks/benefits: Startup environment