Senior Data Engineer
UK - 135 Bishopsgate - London, United Kingdom
TP ICAP Group
The TP ICAP Group is a world-leading provider of market infrastructure.
Our purpose is to provide clients with access to global financial and commodities markets, improving price discovery, liquidity, and distribution of data, through responsible and innovative solutions.
Through our people and technology, we connect clients to superior liquidity and data solutions.
The Group is home to a stable of premium brands. Collectively, TP ICAP is the largest interdealer broker in the world by revenue, the number one Energy & Commodities broker in the world, the world’s leading provider of OTC data, and an award-winning all-to-all trading platform.
Founded in London in 1866, the Group operates from more than 60 offices in 27 countries. We are 5,200 people strong. We work as one to achieve our vision of being the world’s most trusted, innovative liquidity and data solutions specialist.
Role Overview
This Senior Data Engineer role sits within the Brokerage & Pricing team in the TP ICAP Technology division. The Senior Data Engineer will join an Agile team alongside other engineers, working on the next generation of strategic back-office applications and ensuring solutions deliver maximum value to users. The Brokerage & Pricing team's focus is to optimise the management of the brokerage data and calculations that drive all broking activity in our £1 billion+ revenue Global Broking organisation, and to carry out commercial analysis on that data to understand revenues and drive client commercial agreements.

The Senior Data Engineer is responsible for designing, developing, and maintaining data pipelines and ETL processes to support data integration and analytics. The role requires a deep understanding of data structures and content, ensuring high-quality data through rigorous testing and validation. The engineer collaborates with system owners and stakeholders to understand data requirements and deliver reliable, efficient data solutions. Attention to detail and a commitment to data quality are paramount in maintaining the integrity and reliability of our data.
Role Responsibilities
Design, develop, and maintain data pipelines and ETL processes to support data integration and analytics.
Code primarily in Python to build and optimise data workflows.
Implement and manage workflows using Apache Airflow (MWAA).
Ensure high-quality data through rigorous testing and validation processes.
Produce data quality reports to monitor and ensure the integrity of data.
Conduct thorough data exploration and analysis to understand data structure and content before developing ETL pipelines.
Collaborate with system owners and stakeholders to understand data requirements and deliver solutions.
Monitor and troubleshoot data pipelines to ensure reliability and track performance.
Maintain detailed documentation of data processes, workflows, and system configurations.
Experience / Competences
Strong experience as a Data Engineer, preferably in the finance sector.
Strong understanding of ETL processes and data pipeline design.
Extensive experience coding in Python.
Hands-on experience with Apache Airflow (MWAA) for workflow management.
Experience with AWS Athena/PySpark (Glue) for data querying and processing.
Strong SQL and PL/SQL skills, particularly with MS SQL Server and Oracle databases, with experience across both relational and NoSQL databases.
Attention to detail and the ability to stay focused under pressure, even when working with complex data.
Excellent problem-solving skills and the ability to think critically and creatively.
Strong collaboration skills and the ability to communicate effectively with team members and stakeholders.
Passion for data quality and a commitment to maintaining high standards of data engineering.
Proficiency in using AWS cloud services for data processing.
Familiarity with data lakes, operational databases/data stores and their architecture.
Fluency with the AWS CDK for Python.
Familiarity with version control systems (e.g., Git) and backlog management tools (e.g., JIRA).
Ability to write clear and concise documentation.
Strong communication skills, both written and verbal.
Ability to work effectively as part of a team and independently when required.
Job Band & Level
Manager / Level 6
#LI-Hybrid #LI-MID
Not The Perfect Fit?
Concerned that you may not meet the criteria precisely? At TP ICAP, we wholeheartedly believe in fostering inclusivity and cultivating a work environment where everyone can flourish, regardless of your personal or professional background. If you are enthusiastic about this role but find that your experience doesn't align perfectly with every aspect of the job description, we strongly encourage you to apply. You may be the ideal candidate for this position or another opportunity within our organisation. Our dedicated Talent Acquisition team is here to help you recognise how your unique skills and abilities can make a valuable contribution. Don't hesitate to take the leap and explore the possibilities. Your potential is what truly matters to us.
Company Statement
We know that the best innovation happens when diverse people with different perspectives and skills work together in an inclusive atmosphere. That's why we're building a culture where everyone plays a part in making people feel welcome, ready, and willing to contribute. TP ICAP Accord - our Employee Network - is central to this. As well as representing specific groups, TP ICAP Accord helps increase awareness and collaboration, shares best practice, and holds our firm to account for driving continuous cultural improvement.
Location
UK - 135 Bishopsgate - London