Data Engineer
Ramat Gan, Tel Aviv District, IL
Description
About Agmatix
Agmatix is a leading agro-informatics company that delivers data-driven solutions to the agriculture industry. Our innovative platform standardizes agronomic data and transforms it into actionable insights, empowering Ag professionals to make informed decisions. We leverage cutting-edge agronomy data science to accelerate field trial research, enhance crop yields, and drive sustainable agriculture practices. By addressing the critical need for data standardization and harmonization, Agmatix brings advanced AI technology to Ag professionals globally. Our platform fosters collaboration across the industry, enabling professionals to work together and advance agricultural innovation for a more resilient future.
What We Offer
This role provides the opportunity to drive significant impact through scalable data solutions while working with a collaborative, growth-focused team. You'll work with cutting-edge AWS technologies, solve complex data challenges, and play a key role in shaping our data platform's future. You'll also work with unique graph relationship models and gain valuable graph database experience.
Ideal for: A passionate data engineer who enjoys solving complex problems, thrives in collaborative environments, and is excited about continuous learning in the rapidly evolving data engineering landscape.
Requirements
Core Technical Requirements
AWS Data Engineering Stack (Required)
- Core Services: AWS Lambda (serverless ETL functions, event-driven processing), Step Functions (workflow orchestration, state management), SQS (message queuing, decoupled architectures, dead letter queues), S3 (data lake design, partitioning strategies, lifecycle management), Glue (data catalog, ETL development, schema evolution, data lineage), CloudWatch (monitoring), RDS (warehousing)
- Graph Databases: Relationship modeling, query optimization, and integration with analytical workflows using Neptune
- Relational Databases: Advanced SQL optimization, indexing strategies, query performance analysis
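To give a flavor of the S3 data-lake work above, here is a minimal sketch of Hive-style date partitioning, the kind of layout that lets Glue and downstream query engines prune partitions. It uses only the standard library; the dataset and file names are illustrative, not Agmatix's actual layout.

```python
from datetime import date

def s3_partition_key(dataset: str, event_date: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 key (dt=YYYY-MM-DD) so that
    catalog-aware engines can prune partitions at query time."""
    return f"{dataset}/dt={event_date.isoformat()}/{filename}"

# Example: a daily field-trial export lands under its date partition.
key = s3_partition_key("field_trials", date(2024, 7, 1), "part-0000.parquet")
print(key)  # field_trials/dt=2024-07-01/part-0000.parquet
```

Partitioning by a low-cardinality key such as date keeps prefix listings cheap and makes lifecycle rules (e.g., expiring old partitions) straightforward.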
Data Engineering Core Skills
- Python: Advanced proficiency with AWS SDK (boto3), pandas/numpy, data validation frameworks
- PySpark: Distributed processing, performance tuning, EMR integration, memory optimization
- SQL and OpenCypher: Complex analytical queries, window functions, performance tuning across multiple engines
- Pipeline Architecture: End-to-end design, dependency management, fault tolerance, scalability patterns
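As a small taste of the window-function work mentioned above, the sketch below computes a season-over-season yield delta with `LAG()`. It runs on Python's built-in sqlite3 module (window functions require SQLite 3.25+); the table and column names are illustrative only.

```python
import sqlite3

# In-memory database with a toy yields table (illustrative names only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE yields (field TEXT, season INTEGER, tons REAL)")
conn.executemany(
    "INSERT INTO yields VALUES (?, ?, ?)",
    [("A", 2022, 4.1), ("A", 2023, 4.6), ("B", 2022, 3.8), ("B", 2023, 3.5)],
)

# Window function: season-over-season change per field via LAG().
rows = conn.execute("""
    SELECT field, season, tons,
           tons - LAG(tons) OVER (PARTITION BY field ORDER BY season) AS delta
    FROM yields
    ORDER BY field, season
""").fetchall()
for row in rows:
    print(row)
```

The `PARTITION BY field` clause restarts the window per field, so the first season in each partition has a NULL delta rather than comparing across fields.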
The Checklist
Essential Personal & Professional Skills
End-to-End Solution Ownership
- Project Leadership: Drive complex data initiatives from requirements gathering through production deployment
- Stakeholder Engagement: Translate business requirements into technical solutions, manage expectations
- Knowledge Sharing: Actively mentor team members, conduct technical reviews, contribute to best practices
- Sprint Methodology: Thrive in iterative development cycles, participate actively in agile ceremonies
- Performance Optimization: Identify bottlenecks, optimize resource utilization, reduce processing costs
Accountability & Domain Curiosity
- Quality Ownership: Take personal responsibility for solution reliability, performance, and data accuracy
- Business Understanding: Demonstrate genuine curiosity about different business domains and their data challenges
- Proactive Improvement: Identify optimization opportunities and drive implementation without being asked
- Results Focus: Commit to measurable outcomes and follow through on deliverables consistently
Experience & Background
Professional Requirements
- 4+ years hands-on data engineering experience with cloud-native solutions
- 2+ years production AWS experience building scalable data pipelines
- Proven track record of end-to-end project delivery in collaborative environments
- Experience with high-volume data processing (TB+ datasets, real-time streaming)
The Goods
A fast-growing startup, still small enough to have fun and enjoy ourselves.
Our offices are in Sapir Tower, Ramat Gan, near Savidor Merkaz train station, and we foster a remote-friendly environment with work from home two days a week.
Our perks: 10bis, the bttr benefits club, happy hours, a PlayStation, and parking.
We believe in quality work while maintaining a work-home balance.
There is some agronomic geek stuff, but it's all for the greater good :)