Senior Data Engineer
Remote - UK or Europe
Location: Remote - UK
Auros is a leading algorithmic trading and market-making firm specialising in digital asset liquidity provision. We trade across 10+ global locations, facilitating 3-4% of global daily volumes, with connectivity to over 50 venues.
We’re proud of the strong reputation we’ve built by combining our systematic approach, sophisticated pricing models, and state-of-the-art execution capabilities to provide robust, reliable trading performance and bring liquidity to crypto markets worldwide.
What sets us apart, though, is our culture. Our flat structure means you’ll have autonomy and plenty of opportunity to bring your ideas to life and help shape the systems that will power our business into the future.
Summary of Position:
This is a rare opportunity for an experienced Data Engineer to become both steward and champion of the firm's market and trading data archives and internal data products.
You will work with our existing data pipelines and databases while designing and implementing the next generation of Auros data and analytic capabilities. You’ll enjoy taking on responsibilities where you’ll have the opportunity to make a substantial impact on business outcomes through the work you do every day.
You’ll learn from our experienced trading team and help develop and support systems that execute millions of trades on crypto exchanges across the globe.
Job responsibilities include:
- Develop, test and maintain high throughput, high volume distributed data architectures
- Analyze, define and automate data quality improvements
- Build and improve trading analytics systems
- Create tools to automate the configuration, deployment, building and troubleshooting of the data pipelines
- Develop strategies to make our data pipeline efficient, timely and robust in a 24/7 trading environment
- Implement monitoring that measures the completeness and accuracy of captured data
- Manage the impact that changes to trading systems and upstream protocols have on the data pipeline
- Collaborate with traders and trading system developers to understand our data analysis requirements, and to continue to improve the quality of our stored data
- Develop tools, APIs and screens to provide easy access to the archived data
Position Requirements:
- Experience with Python, tick databases (e.g. ClickHouse and/or Vertica) and Amazon S3
- Experience developing real-time, large-scale data collection pipelines (petabytes of data)
- Experience with compute cluster management in AWS (Ray, Dask, etc.)
- Experience building research pipelines over large data sets
- Extensive experience conducting data analysis and building ad hoc tooling to analyse time series and other large data sets
- A bachelor's degree (or above) in Computer Science, Software Engineering or similar, with excellent results.
And the following are highly desirable:
- Experience with data lakes, Amazon S3 or similar
- Experience developing in C++ on Linux
- Protocol-level network analysis experience
- Experience with Terraform
- Experience with ClickHouse
- Experience with technologies such as Hive, Hadoop, Snowflake, Presto or similar.