Software Engineer - Data Core
Tel Aviv-Yafo, Tel Aviv District, IL
4M Analytics Ltd
4M Analytics' technology provides a subsurface utility map backed by a complete, accurate, and up-to-date database of subsurface infrastructure in the US.
Description
Who We Are:
The 4M story is likely one you haven’t heard before: We are on a mission to unlock access to the world below us, to do for the world below ground what Google Maps did for the world above. By leveraging cutting-edge technology, we are mapping the subsurface infrastructure to make reliable, real-time utility data accessible to the construction industry - completely transforming a traditional industry. We’re a growing startup with 100 employees currently based in Tel Aviv, Israel, and Austin, Texas.
The Opportunity
We are seeking an experienced Software Engineer with a strong engineering background to become an integral member of our Data-Core team, tasked with processing, structuring, and analyzing hundreds of millions of data sources. Your role will be pivotal in building a unified, up-to-date, and accurate utilities map, along with the services and applications that accelerate our mapping operations. Your contributions will directly impact the success of our core product.
Responsibilities
- Collaborate with cross-functional teams to design, build, and maintain data processing pipelines while contributing to our common codebase.
- Contribute to designing and implementing data architecture, ensuring effective data storage and retrieval.
- Develop and optimize complex Python-based applications and services to enable more efficient data processing and orchestration, enhancing the quality and accuracy of our datasets.
- Implement geospatial data processing techniques and contribute to the creation of our unified utilities map, enhancing the product's geospatial features.
- Drive the scalability and performance optimization of data systems, addressing infrastructure challenges as data volume and complexity grow.
- Create and manage data infrastructure components, including ETL workflows, data warehouses and databases, supporting seamless data flow and accessibility.
- Design and implement CI/CD processes for data processing, model training, releasing, testing and monitoring, ensuring robustness and consistency.
Requirements
- 5+ years of proven experience as a backend/software engineer with a strong Python background.
- Experience in deploying a diverse range of cloud-based technologies to support mission-critical projects, including expertise in writing, testing, and deploying code within a Kubernetes environment.
- Proven experience in building scalable online services.
- Experience with tools such as Airflow, Docker, and Kubernetes (K8s) for building data processing and exploration pipelines, along with the ML infrastructure that powers our intelligence.
- Experience in AWS/Google cloud environments.
- Experience working with both SQL and NoSQL databases such as Postgres, MySQL, Redis, or DynamoDB.
- Experience as a Data Infrastructure Engineer or in a similar role managing and processing large-scale datasets - a significant advantage.
Diverse Perspectives
We know that innovation thrives on teams where diverse points of view come together to solve hard problems. As such, we explicitly seek people who bring diverse life experiences, educational backgrounds, cultures, and work experiences. Please be prepared to share with us how your perspective will bring something unique and valuable to our team.
Tags: Airflow Architecture AWS CI/CD Docker DynamoDB ETL GCP Google Cloud Kubernetes Machine Learning ML infrastructure Model training MySQL NoSQL Pipelines PostgreSQL Python SQL Testing
Perks/benefits: Startup environment