Data Engineer

Tel Aviv-Yafo, Tel Aviv District, IL

LSports

LSports provides an innovative sports betting data API and is a leading provider of high-quality live sports data feeds for the sports betting industry.

Description

LSports is the leading global provider of sports data, dedicated to revolutionizing the industry through innovative solutions. We excel in sports data collection and analysis, advanced data management, and cutting-edge services like AI-based sports tips and high-quality sports visualization. As the sports data industry continues to grow, LSports remains at the forefront, delivering real-time solutions. 

If you share our love of sports and tech and have the passion and drive to better the sports-tech and data industries, join the team! We are looking for a highly motivated Data Engineer.

About the team: Trade360

Trade360 group is the powerhouse behind LSports' exceptional portfolio of customer-facing products, blending innovation, functionality, and user experience to deliver industry-leading solutions that redefine customer engagement and satisfaction.

Responsibilities:

  • Building production-grade data pipelines and services
  • Taking ownership of major projects from inception to deployment
  • Architecting simple yet flexible solutions, and then scaling them as we grow
  • Collaborating with cross-functional teams to ensure data integrity, security, and optimal performance across various systems and applications
  • Staying current with emerging technologies and industry trends to recommend and implement innovative solutions that enhance data infrastructure and capabilities

Requirements

  • 4+ years of experience as a Data Engineer
  • 2+ years of experience using PySpark
  • Experience building and maintaining production-grade data pipelines and working in distributed architectures
  • Experience with SQL and NoSQL databases
  • Familiarity with modern CI environments and tooling: Git, Docker, Kubernetes (K8s)
  • Experience with ETL/orchestration tools such as AWS Glue, Apache Airflow, or Prefect
  • Experience with Databricks or a similar platform

Advantages:

  • MLOps experience
  • Experience with Kafka
  • Experience designing and implementing data lakes/warehouses

Category: Engineering Jobs

Tags: Airflow Architecture AWS AWS Glue Databricks Data management Data pipelines Docker ETL Git Kafka Kubernetes MLOps NoSQL Pipelines PySpark Security SQL

Perks/benefits: Flex hours

Region: Middle East
Country: Israel
