Data Engineer

London, Singapore

GSR

GSR is the global leader in crypto trading and market-making. We specialize in providing liquidity, trading and risk management solutions.

About us:

Founded in 2013, GSR is a leading market-making and programmatic trading company in the fast-evolving world of cryptocurrency trading. With more than 200 employees in 5 countries, we provide billions of dollars of liquidity to cryptocurrency protocols and exchanges every day. We build long-term relationships with cryptocurrency communities and traditional investors by offering exceptional service, expertise and trading capabilities tailored to their specific needs.

GSR works with token issuers, traders, investors, miners, and more than 30 cryptocurrency exchanges around the world. In volatile markets we are a trusted partner to crypto-native builders and to those exploring the industry for the first time.

Our team of veteran finance and technology executives from Goldman Sachs, Two Sigma, and Citadel, among others, has developed one of the world’s most robust trading platforms, designed to navigate issues unique to the digital asset markets. We have continuously improved our technology throughout our history, allowing our clients to scale and execute their strategies with the highest level of efficiency.

Working at GSR is an opportunity to be deeply embedded in every major sector of the cryptocurrency ecosystem. 

About the role:

This role sits within GSR’s global Data Engineering team, where you’ll contribute to the design and development of scalable data systems that support our trading and business operations. You’ll work closely with stakeholders across the firm to build and maintain pipelines, manage data infrastructure, and ensure data is reliable, accessible, and secure.

It’s a hands-on engineering position with scope to shape the way data is handled across the business, working with modern tools in a fast-moving, high-performance environment.

 

Your responsibilities may include:

Data Pipeline Development

  • Build and maintain scalable, efficient ETL/ELT pipelines for both real-time and batch processing.
  • Integrate data from APIs, streaming platforms, and legacy systems, with a focus on data quality and reliability.

Infrastructure & Architecture

  • Design and manage data storage solutions, including databases, warehouses, and lakes.
  • Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads.

Operations & Tooling

  • Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency.

  • Implement data governance, access controls, and security measures in line with best practices and regulatory standards.

  • Develop observability and anomaly detection tools to support Tier 1 systems.

Collaboration & Continuous Improvement

  • Work with engineers and business teams to gather requirements and translate them into technical solutions.
  • Maintain documentation, follow coding standards, and contribute to CI/CD processes.
  • Stay current with new technologies and help improve the team’s tooling and infrastructure.

 

What We’re Looking For

  • 8+ years of experience in data engineering or a related field.
  • Strong programming skills in Java, Python and SQL; familiarity with Rust is a plus.
  • Proven experience designing and maintaining scalable ETL/ELT pipelines and data architectures.
  • Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services.
  • Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch.
  • Strong understanding of data governance, security, and best practices for data quality.
  • Effective communicator with the ability to work across technical and non-technical teams.

Additional Strengths

  • Experience with orchestration tools like Apache Airflow.
  • Knowledge of real-time data processing and event-driven architectures.
  • Familiarity with observability tools and anomaly detection for production systems.
  • Exposure to data visualization platforms such as Tableau or Looker.
  • Relevant cloud or data engineering certifications.

 

What we offer: 

  • A collaborative and transparent company culture founded on Integrity, Innovation and Performance. 
  • Competitive salary with two discretionary bonus payments a year.
  • Benefits such as Healthcare, Dental, Vision, Retirement Planning, 30 days holiday and free lunches when in the office. 
  • Regular Town Halls, team lunches and drinks. 
  • A Corporate and Social Responsibility program as well as charity fundraising matching and volunteer days.   

 

GSR is proudly an Equal Employment Opportunity employer. We do not discriminate based upon any applicable legally protected characteristics such as race, religion, colour, country of origin, sexual orientation, gender, gender identity, gender expression or age. We operate as a meritocracy: all aspects of people engagement, from hiring and promotion decisions to our performance management process, are based on business needs and on individual merit and competence in the role. Learn more about us at www.gsr.io.
