Full Stack Data Engineer
London, United Kingdom
Orbital
Move Money Faster. Unifying stablecoin speed & traditional payments. Move money in minutes, not days, all with the highest security standards.
What is our mission?
Orbital is on an exciting mission to revolutionise global cross-border payments by combining traditional fiat banking rails with stablecoins over blockchain rails for a variety of use cases. Our class-leading B2B payments platform offers multi-currency e-money accounts (corporate IBANs) combined with a suite of digital asset services. Our company sits at the frontier of payments and fintech, intersecting blockchain and traditional (fiat) financial services, and is leading the way in bridging those two worlds for corporate enterprises globally.
We believe blockchain technology is firmly here to stay, and we want to be the first to bring a combined offering of fiat & crypto payment services under one exciting platform. Learn more about our team and company story here.
What is the purpose of this role in the delivery of our mission?
We’re looking for a Full-Stack Data Engineer who can design, build, and optimize modern data systems from the ground up. You’ll own the full data lifecycle—from architecting databases to building ETL pipelines, writing advanced queries, and enabling data-driven decision-making through powerful insights.
This role blends traditional data engineering with a strong analytics mindset. You’ll collaborate closely with engineering, product, and compliance teams to ensure clean, accessible, and scalable data flows across our platform.
What are the key responsibilities of the role?
Design and develop scalable, reliable data architectures and storage solutions (SQL, NoSQL, etc.)
Build and maintain robust ETL/ELT pipelines to ingest, transform, and enrich data from multiple sources (a brief illustrative sketch follows this list)
Write performant SQL queries for reporting, dashboards, and ad-hoc analysis
Develop and optimize data models for both operational and analytical use
Collaborate with analysts and stakeholders to define metrics, KPIs, and data definitions
Implement data validation, monitoring, and observability across pipelines
Support data visualization efforts via BI tools (Metabase, Power BI or custom dashboards)
Ensure data security, governance, and compliance across all systems
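To give a flavour of the pipeline and validation work described above, here is a minimal sketch of an extract-transform-load step with basic data-quality checks. It uses only the Python standard library, and the file, database and table names are hypothetical; in practice these pipelines target cloud platforms such as Redshift rather than a local SQLite file.

import csv
import sqlite3
from datetime import datetime, timezone

# Hypothetical source export and local database, for illustration only.
SOURCE_CSV = "payments_export.csv"
WAREHOUSE_DB = "warehouse.db"

def extract(path):
    """Read raw payment rows from a CSV export."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def validate(row):
    """Basic data-quality checks: required fields present, amount numeric and positive."""
    if not row.get("payment_id") or not row.get("currency"):
        return False
    try:
        return float(row["amount"]) > 0
    except (TypeError, ValueError):
        return False

def transform(row):
    """Normalise types and stamp the load time."""
    return (
        row["payment_id"],
        row["currency"].upper(),
        round(float(row["amount"]), 2),
        datetime.now(timezone.utc).isoformat(),
    )

def load(rows, db_path):
    """Idempotent load into a staging table, keyed on payment_id."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS stg_payments ("
        "payment_id TEXT PRIMARY KEY, currency TEXT, amount REAL, loaded_at TEXT)"
    )
    con.executemany("INSERT OR REPLACE INTO stg_payments VALUES (?, ?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    clean_rows = (transform(r) for r in extract(SOURCE_CSV) if validate(r))
    load(clean_rows, WAREHOUSE_DB)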
What is the scope of accountability for the role?
Design, develop, deploy and maintain mission-critical data applications
Delivery of various data-driven applications
Develop and own data models, dashboards and reporting
Business analysis and database querying
What are the essential skills, qualifications and experience required for the role?
3+ years of experience in data engineering or similar roles
Strong SQL skills and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server)
Experience with cloud data platforms (AWS Redshift, Stitch, Airbyte, Athena, Glue, S3)
Proficient in Python or another data scripting language
Experience with orchestration tools such as Airflow, Prefect or Dagster (see the DAG sketch after this list)
Familiarity with data warehousing, data lakes, and stream processing (Kafka, Spark, etc.)
Understanding of data modelling techniques (e.g., star/snowflake schema, normalization)
Ability to communicate complex data concepts to non-technical stakeholders
Strong analytical, organisational and prioritisation skills, and a belief in writing documentation as part of writing code
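For the orchestration experience listed above, the sketch referenced at that bullet is below: a small Airflow DAG wiring extract, transform and load tasks into a daily run. It assumes Airflow 2.4 or later and uses hypothetical task functions and a hypothetical DAG name; it illustrates the pattern rather than any production pipeline.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical callables standing in for real pipeline steps.
def extract():
    print("pull raw payment events from the source systems")

def transform():
    print("clean, validate and model the data")

def load():
    print("write modelled tables to the warehouse")

with DAG(
    dag_id="payments_daily_etl",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency: extract, then transform, then load
    t_extract >> t_transform >> t_load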
What are the desirable skills, qualifications and experience that would be beneficial for the role?
Data ingestion with AWS Kinesis/Firehose (see the brief sketch after this list)
Data transformation with dbt (Data Build Tool)
Familiarity with DevOps/data infrastructure tools (Git/Bitbucket, AWS CloudFormation, AWS ECS)
Exposure to analytics or dashboard tools (Metabase and/or Power BI)
Prior work in a startup, SaaS, or data-intensive environment
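As an illustration of the Kinesis/Firehose ingestion mentioned in the list above, here is a minimal boto3 sketch that pushes a JSON event onto a delivery stream. The stream name and record shape are hypothetical, and the snippet assumes AWS credentials are already configured in the environment.

import json

import boto3

# Hypothetical delivery stream name, for illustration only.
DELIVERY_STREAM = "payments-events-firehose"

firehose = boto3.client("firehose")

def send_event(event: dict) -> None:
    """Push a single newline-delimited JSON record onto the Firehose delivery stream."""
    firehose.put_record(
        DeliveryStreamName=DELIVERY_STREAM,
        Record={"Data": (json.dumps(event) + "\n").encode("utf-8")},
    )

if __name__ == "__main__":
    send_event({"payment_id": "abc-123", "currency": "USDC", "amount": 250.0})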
Perks/benefits: Startup environment