Senior Data Engineer

Dubai, United Arab Emirates

BHFT

Innovative algorithmic trading company

Company Description

We are a proprietary algorithmic trading firm. Our team owns the entire trading cycle, from software development to designing and implementing trading strategies and algorithms. We are 200+ professionals with a strong emphasis on technology: 70% of the team are technical specialists.

We operate as a fully remote organization, fostering a culture of transparency, clarity, and open communication. We are expanding into new markets and technologies, continuously innovating in the world of algorithmic trading.

Job Description

Key Responsibilities

  • Ingestion & Pipelines: Architect batch and streaming pipelines (Airflow, Kafka, dbt) for diverse structured and unstructured market data (a minimal sketch follows this list). Provide reusable SDKs in Python and Go for internal data producers.

  • Storage & Modeling: Implement and tune S3-based, column-oriented, and time-series data storage for petabyte-scale analytics; own partitioning, compression, TTL, versioning, and cost optimization.

  • Tooling & Libraries: Develop internal libraries for schema management, data contracts, validation, and lineage; contribute to shared libraries and services used by internal data consumers for research, backtesting, and real-time trading.

  • Reliability & Observability: Embed monitoring, alerting, SLAs, SLOs and CI/CD; champion automated testing, data quality dashboards and incident runbooks.

  • Collaboration: Partner with Data Science, Quant Research, Backend, and DevOps to translate requirements into platform capabilities and evangelize best practices.
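
To give candidates a concrete feel for the pipeline work, here is a minimal sketch of a daily batch-ingestion DAG using Airflow's TaskFlow API (assuming Airflow 2.4+). It is an illustration only: the bucket, paths, and quality checks are hypothetical placeholders, not BHFT internals.

```python
# A minimal sketch, not BHFT's actual pipeline: a daily Airflow DAG that
# stages raw market data, gates it on a data-quality check, then loads it.
# All names, paths, and checks below are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    tags=["market-data", "batch"],
)
def daily_market_data_ingest():
    @task
    def extract(ds=None) -> str:
        # Hypothetical: pull the day's raw vendor files into a staging prefix.
        return f"s3://example-staging/trades/{ds}/"

    @task
    def validate(staging_path: str) -> str:
        # Data-quality gate: fail the run (triggering retries and alerts)
        # rather than let bad data reach downstream consumers. A real check
        # might assert schemas, row counts, and monotonic timestamps.
        if not staging_path:
            raise ValueError("empty staging path")
        return staging_path

    @task
    def load(validated_path: str) -> None:
        # Hypothetical: copy validated files into the partitioned store.
        print(f"loading {validated_path}")

    load(validate(extract()))


daily_market_data_ingest()
```

In a setup like this, the validation task is the natural place to attach the SLAs, alerting, and data-quality dashboards described under Reliability & Observability.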

Qualifications

Required Skills & Experience

  • 7+ years building production-grade data systems.

  • Familiarity with market data formats (e.g., MDP, ITCH, FIX, proprietary exchange APIs) and market data providers.

  • Expert‑level Python (Go and C++ nice to have).

  • Hands‑on with modern orchestration (Airflow) and event streams (Kafka).

  • Strong SQL proficiency: aggregations, joins, subqueries, window functions (first, last, candle, histogram), indexes, query planning, and optimization (see the sketch after this list).

  • Designing high‑throughput APIs (REST/gRPC) and data access libraries.

  • Strong Linux fundamentals, containers (Docker) and cloud object storage (AWS S3 / GCS).

  • Proven track record of mentoring, code reviews and driving engineering excellence.
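
As a concrete illustration of the window-function fluency listed above, here is a small self-contained sketch using Python's standard-library sqlite3 driver (window functions require SQLite 3.25+). The trades table and prices are made-up sample data, not real market data.

```python
# Sketch: first/last-style window functions over tick data, building a
# per-symbol "candle-like" summary (open, close, low, high). Uses only
# the standard library; the table and rows are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (symbol TEXT, ts INTEGER, price REAL);
INSERT INTO trades VALUES
  ('BTCUSD', 1, 100.0), ('BTCUSD', 2, 101.5), ('BTCUSD', 3, 99.8),
  ('ETHUSD', 1, 20.0),  ('ETHUSD', 2, 20.4);
""")

# open = first price in the partition, close = last price; the frame must
# span the whole partition or LAST_VALUE would stop at the current row.
rows = conn.execute("""
SELECT DISTINCT
  symbol,
  FIRST_VALUE(price) OVER w AS open,
  LAST_VALUE(price)  OVER w AS close,
  MIN(price)         OVER w AS low,
  MAX(price)         OVER w AS high
FROM trades
WINDOW w AS (
  PARTITION BY symbol ORDER BY ts
  ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING
)
""").fetchall()

for symbol, open_, close, low, high in rows:
    print(symbol, open_, close, low, high)
```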

Additional Information

What we offer:

  • Working in a modern international technology company without bureaucracy, legacy systems, or technical debt.
  • Excellent opportunities for professional growth and self-development.
  • We work remotely from anywhere in the world, with a flexible schedule.
  • We reimburse health insurance, sports activities, and professional training.