Senior Market Data Engineer

Dubai, United Arab Emirates

BHFT

Innovative algorithmic trading company

Company Description

We are a proprietary algorithmic trading firm. Our team manages the entire trading cycle, from software development to designing and coding trading strategies and algorithms. We have a team of 200+ professionals, with a strong emphasis on technology: 70% of our team is made up of technical specialists.

We operate as a fully remote organization, fostering a culture of transparency, clarity, and open communication. We are expanding into new markets and technologies, continuously innovating in the world of algorithmic trading.

Job Description

The Data Engineering team is responsible for designing, building, and maintaining the Data Lake infrastructure, including ingestion pipelines, storage systems, and internal tooling for reliable, scalable access to market data.

Key Responsibilities

  • Ingestion & Pipelines: Architect batch and streaming pipelines (Airflow, Kafka, dbt) for diverse structured and unstructured market data. Provide reusable SDKs in Python and Go for internal data producers.

  • Storage & Modeling: Implement and tune S3-based, column-oriented, and time-series data storage for petabyte-scale analytics; own partitioning, compression, TTL, versioning, and cost optimization.

  • Tooling & Libraries: Develop internal libraries for schema management, data contracts, validation, and lineage; contribute to shared libraries and services that internal data consumers use for research, backtesting, and real-time trading.

  • Reliability & Observability: Embed monitoring, alerting, SLAs, SLOs, and CI/CD; champion automated testing, data quality dashboards, and incident runbooks.

  • Collaboration: Partner with Data Science, Quant Research, Backend, and DevOps to translate requirements into platform capabilities and evangelize best practices.

Qualifications

  • 6+ years of experience building and maintaining production-grade data systems, with proven expertise in architecting and launching data lakes from scratch.
  • Expert-level Python development skills (Go and C++ nice to have).
  • Hands-on experience with modern orchestration tools (Airflow) and streaming platforms (Kafka).
  • Advanced SQL skills including complex aggregations, window functions, query optimization, and indexing.
  • Experience designing high-throughput APIs (REST/gRPC) and data access libraries.
  • Solid fundamentals in Linux, containerization (Docker), and cloud object storage solutions (AWS S3, GCS).
  • Strong knowledge of diverse data formats, both structured and unstructured, with experience optimizing storage strategies such as partitioning, compression, and cost management.
  • English at C1 level: confident communication, documentation, and collaboration within an international team.

Additional Information

What we offer:

  • Working in a modern international technology company without bureaucracy, legacy systems, or technical debt.
  • Excellent opportunities for professional growth and self-development.
  • We work remotely from anywhere in the world, with a flexible schedule.
  • We offer compensation for health insurance, sports activities, and professional training.