Senior Market Data Engineer

Dubai, United Arab Emirates

BHFT

Innovative algorithmic trading company



Company Description

We are a proprietary algorithmic trading firm. Our team manages the entire trading cycle, from software development to the design and implementation of trading strategies and algorithms. We have a team of 200+ professionals with a strong emphasis on technology: 70% of the team are technical specialists.

We operate as a fully remote organization, fostering a culture of transparency, clarity, and open communication. We are expanding into new markets and technologies, continuously innovating in the world of algorithmic trading.

Job Description

  • Historical Data Capture and Storage: Design, develop, and maintain systems for the acquisition, storage, and retrieval of historical market data from multiple financial exchanges, brokers, and market data vendors.

  • Data Integrity and Accuracy: Ensure the integrity and accuracy of historical market data, including implementing data validation, cleansing, and normalization processes (a short sketch follows this list).

  • Data Architecture Development: Build and optimize data storage solutions, ensuring they are scalable, high-performance, and capable of managing large volumes of time-series data.

  • Versioning and Reconciliation: Develop systems for data versioning and reconciliation to ensure that changes in exchange formats or corrections to past data are properly handled.

  • Data Source Integration: Implement robust integrations with various market data providers, exchanges, and proprietary data sources to continuously collect and store historical data.

  • Data Access Tools: Build internal tools to provide easy access to historical data for research and analysis, ensuring performance, ease of use, and data integrity.

  • Collaborate with Trading and Research Teams: Work closely with quantitative researchers and traders to understand their data requirements and optimize the systems for data retrieval and analysis for backtesting and strategy development.

  • Performance and Scalability: Develop scalable solutions to handle growing volumes of historical market data, including ensuring efficient queries and data retrieval for research and backtesting needs.

  • Optimize Storage Costs: Balance cost-efficiency with performance in data storage solutions, ensuring that large datasets are managed effectively.

  • Compliance and Auditing: Ensure historical market data systems comply with regulatory requirements and assist in data retention, integrity, and reporting audits.
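
To make the validation and cleansing work above concrete, here is a minimal Python sketch of the kind of step involved. The record shape, field names, and symbol normalization rule are invented for illustration; a production pipeline would apply per-venue rules and far richer checks.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Tick:
        symbol: str   # venue-specific symbol, e.g. "btc-usdt" (hypothetical format)
        ts_ns: int    # event timestamp, nanoseconds since epoch
        price: float
        qty: float

    def clean_ticks(raw: list[Tick]) -> list[Tick]:
        """Validate, normalize, and de-duplicate one batch of ticks.

        Rejects non-positive prices/quantities, maps symbols to a single
        canonical form, drops exact duplicates (e.g. redundant feed
        deliveries), and returns the batch sorted by timestamp so the
        downstream store sees a monotonic series.
        """
        seen: set[tuple[str, int, float, float]] = set()
        out: list[Tick] = []
        for t in raw:
            if t.price <= 0 or t.qty <= 0:
                continue  # obviously corrupt record: drop it
            norm = Tick(t.symbol.upper().replace("-", ""), t.ts_ns, t.price, t.qty)
            key = (norm.symbol, norm.ts_ns, norm.price, norm.qty)
            if key in seen:
                continue  # duplicate delivery from a redundant feed
            seen.add(key)
            out.append(norm)
        return sorted(out, key=lambda t: t.ts_ns)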

Qualifications

Required Skills and Experience

  • Commercial experience with financial instruments and markets (equities, futures, options, forex, etc.), particularly an understanding of how historical data is used in algorithmic trading.

  • Familiarity with market data formats (e.g., MDP, ITCH, FIX, SWIFT, proprietary exchange APIs) and market data providers.

  • Strong programming skills in Python (Go or Rust is a nice-to-have).

  • Familiarity with ETL (Extract, Transform, Load) processes (or other data pipeline architectures) and tools to clean, normalize, and validate large datasets.

  • Commercial experience in building and maintaining large-scale time-series or historical market data systems in the financial services industry.

  • Strong SQL proficiency: aggregations, joins, subqueries, window functions (first, last, candle, histogram), indexes, query planning, and optimization (a short sketch follows this list).

  • Strong problem-solving skills and attention to detail, particularly in ensuring data quality and reliability.

  • Bachelor’s degree in Computer Science, Engineering, or related field.
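
As a rough illustration of the window-function work mentioned above, here is a minimal sketch that builds one-minute OHLC candles from raw ticks with standard SQL window functions, run through Python's built-in sqlite3 module. It assumes an SQLite build with window-function support (3.25+); the table, columns, and sample data are invented for the example.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE ticks (ts INTEGER, price REAL, qty REAL);
    INSERT INTO ticks VALUES
        (0, 100.0, 1), (15, 101.5, 2), (42, 99.8, 1),
        (61, 100.2, 3), (90, 102.0, 1), (118, 101.1, 2);
    """)

    # One-minute OHLC candles: bucket ticks into 60-second windows, use
    # FIRST_VALUE/LAST_VALUE over each bucket for open/close, and plain
    # MIN/MAX/SUM over the same frame for high/low/volume.
    rows = conn.execute("""
    SELECT DISTINCT
        (ts / 60) * 60            AS bucket,
        FIRST_VALUE(price) OVER w AS open,
        MAX(price)         OVER w AS high,
        MIN(price)         OVER w AS low,
        LAST_VALUE(price)  OVER w AS close,
        SUM(qty)           OVER w AS volume
    FROM ticks
    WINDOW w AS (
        PARTITION BY ts / 60 ORDER BY ts
        ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING
    )
    ORDER BY bucket
    """).fetchall()

    for bucket, o, h, l, c, v in rows:
        print(bucket, o, h, l, c, v)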

Preferred Qualifications

  • Experience in a proprietary trading firm or buy-side environment working with historical market data and its vendors.

  • Experience with data governance and compliance related to financial data storage and retrieval.

  • Experience working with distributed data systems and tools such as Hadoop, Kafka, Spark, or similar technologies.

  • Proficiency with containerization and orchestration tools such as Docker, Airflow, and SLURM.

  • Linux/Unix expertise, particularly in managing and optimizing systems for data storage and processing.

  • Experience with cloud-based storage solutions such as AWS S3, Google Cloud Storage, or Azure, and the ability to optimize for performance and cost.

  • Familiarity with machine learning and data science workflows to support quantitative research teams.

Additional Information

What we offer:

  • Working in a modern international technology company without bureaucracy, legacy systems, or technical debt.
  • Excellent opportunities for professional growth and self-development.
  • We work remotely from anywhere in the world, with a flexible schedule.
  • We offer compensation for health insurance, sports activities, and professional training.