Full Stack Engineer

Ramallah, PS

Freightos

Freightos makes global trade frictionless with the world's online marketplace for the trillion-dollar international shipping industry.


Description

Have you ever checked the price of something before you bought it? Yep, thought so (and if you haven’t, please start!). Either way, for millions of organizations (literally!) around the world, getting a read on global freight prices is way, way harder than checking Amazon.

International freight moves the world in a way that most people never think about. Unless you’re living in a rural hunter-gatherer society 2,300 years ago (Hi, Og!), almost every single thing you own relies on international air and ocean freight working. And for it to work well, the companies that ship need to understand exactly what it costs on a day-to-day basis.


Don’t worry, we have the solution.

Freightos Terminal is the world’s leading market intelligence solution for global freight. It gives the largest companies in the world - retailers, manufacturers, car makers, freight companies, and universities - a real-time pulse on global freight. It’s been cited in the New York Times, the Washington Post, The Information, CNBC, FOX, and the Guardian. But, more importantly, it is used by the people who run global freight at tens of thousands of companies worldwide.

And for it to work, the numbers have to add up - and the platform needs to tick like clockwork.

Sure, we could call it a full stack engineer role, but it’s so much more.


And we know what you’re thinking - no, the role doesn’t come with a cape and a mask… but you can wear one if you like. This role combines the precision of data engineering with software development to keep the bits and bytes moving… so that the companies that move the boxes can do the same.

Responsibilities


  • Design and implement scalable data pipelines capable of handling large datasets efficiently. 
  • Optimize SQL queries for performance and scalability, ensuring minimal latency in data retrieval.
  • Build and maintain APIs and back-end services that enable seamless access to large datasets. 
  • Develop and maintain user interfaces using React, ensuring efficient data integration and user-friendly interaction with large datasets. 
  • Mentor and guide junior engineers in best practices related to data engineering, SQL optimization, and React development. 
  • Work closely with product and data teams to define and deliver technical solutions aligned with business goals. 
  • Stay up to date with the latest tools and frameworks to enhance the efficiency of data processing and front-end development.
  • Troubleshoot issues across the full stack, including front-end (React) and back-end (SQL, data pipeline) bottlenecks.
  • You’ll make sure that the bytes behind the container boxes are shipshape, leading the collection, validation, and analysis of data behind global shipping rates.
  • Large organizations can ship well over half a million containers a year, so one decimal in the wrong place is kind of a big deal. You’ll help make sure that doesn’t happen, implementing QA protocols to ensure the accuracy, integrity, and general awesomeness of our indexes and other data products.

Spidey senses are good for nailing down villains. You’ll rely on more conventional (and slightly more realistic) methods for anomaly detection, data validation, and correction - something like the sketch below.
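
As a taste (a minimal sketch, not a description of our actual pipeline): flagging a rate that lands several standard deviations away from its recent history would catch the classic misplaced-decimal mistake. The DataFrame schema, column names, and threshold below are hypothetical.

    import pandas as pd

    def flag_rate_anomalies(rates: pd.DataFrame, window: int = 7, threshold: float = 3.0) -> pd.DataFrame:
        # Hypothetical schema: columns 'lane', 'date', 'rate_usd'.
        rates = rates.sort_values(["lane", "date"]).copy()
        grouped = rates.groupby("lane")["rate_usd"]
        # Rolling median and spread of each lane's recent rates.
        center = grouped.transform(lambda s: s.rolling(window, min_periods=3).median())
        spread = grouped.transform(lambda s: s.rolling(window, min_periods=3).std())
        # A rate far from its rolling median is suspect - e.g. $23,000 keyed in as $2,300.
        rates["is_anomaly"] = (rates["rate_usd"] - center).abs() > threshold * spread
        return rates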

Requirements

Basic Requirements

  • 3+ years of experience in data engineering and data analysis (or, as you probably know it: SELECT * FROM candidates WHERE career_start_date <= DATEADD(year, -3, GETDATE());)
  • Degree in Data Science, Computer Science, Statistics, or a related field. Sorry, Scarf Knitting doesn’t work.
  • Strong proficiency in SQL and solid Python programming experience.
  • Experience with ETL and data orchestration tools like Airflow (see the minimal DAG sketch after this list). Actual music orchestra experience is cool too but totally unrelated.
  • Understanding of RESTful APIs.
  • Excellent problem-solving skills and attention to detail.
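
If you’re wondering what we mean by orchestration, here’s the basic shape - a minimal sketch using Airflow’s TaskFlow API, with hypothetical task names and a stubbed source, not our actual pipeline:

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def freight_rates_pipeline():
        @task
        def extract() -> list[dict]:
            # Pull the day's raw rate quotes from an upstream source (stubbed here).
            return [{"lane": "CNSHA-USLAX", "rate_usd": 2300.0}]

        @task
        def validate(rows: list[dict]) -> list[dict]:
            # Keep only rows that pass basic sanity checks.
            return [r for r in rows if r["rate_usd"] > 0]

        @task
        def load(rows: list[dict]) -> None:
            # Write validated rows to the warehouse (stubbed here).
            print(f"loading {len(rows)} rows")

        load(validate(extract()))

    freight_rates_pipeline()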


Preferred Requirements

  • Experience with Google Cloud Platform (GCP).
  • Knowledge of statistical analysis, regression models, and forecasting techniques.
  • Familiarity with data visualization tools (e.g., Tableau, Power BI) and big data technologies (e.g. Hadoop, Spark).
  • Experience with logistics is nice. Hey, a company can dream, right?