Principal Data Engineer

UK - London

Capco

Capco is a global management and technology consultancy dedicated to the financial services and energy industries.



Why Join Capco?

Capco is a global technology and business consultancy, focused on the financial services sector. We are passionate about helping our clients succeed in an ever-changing industry.

You will work with some of the largest banks in the world on engaging projects that will transform the financial services industry.

We are/have:

  • Experts across the Capital Markets, Insurance, Payments, Retail Banking and Wealth & Asset Management domains.
  • Deep knowledge in various financial services offerings including Finance, Risk and Compliance, Financial Crime, Core Banking etc.
  • Committed to growing our business and hiring the best talent to help us get there.
  • Focused on maintaining our nimble, agile and entrepreneurial culture.

Why Join Capco as a Data Engineer?

  • You will work with some of the largest banks in the world on engaging projects that will transform the financial services industry.
  • You’ll be part of a digital engineering team that develops new financial and data solutions and enhances existing ones, with the opportunity to work on exciting greenfield projects as well as established Tier 1 bank applications used by millions of users.
  • You’ll be involved in digital and data transformation processes through a continuous delivery model.
  • You will work on automating and optimising data engineering processes, developing robust, fault-tolerant data solutions and enhancing security standards across both cloud and on-premise deployments.
  • You’ll be able to work across different data, cloud and messaging technology stacks.
  • You’ll have an opportunity to learn and work with specialised data and cloud technologies to widen your skill set.

As a Principal Data Engineer at Capco you will:

  • Demonstrate practical experience of engineering best practices, while being obsessed with continuous improvement
  • Have expertise in a set of the team's domains, including the breadth of services, how they interact, and the data flows between systems
  • Be able to work individually or with teams, drawing on experience to recommend tooling and solutions aligned with organisational strategies, and influence organisation-wide testing strategy
  • Architect services and systems using well-accepted design patterns that allow for iterative, autonomous development and future scaling, guiding teams in anticipation of future use cases and helping them make design decisions that minimise the cost of future changes
  • Actively contribute to security designs based on the organisation's security strategy, fostering a security-first mindset across teams and leading by example, with advanced knowledge of key security technologies, protocols and techniques (e.g. TLS, OAuth, encryption, networks)
  • Be comfortable managing engineers, tracking the team's efficiency and quality of work and regularly adjusting processes and timelines to ensure high-quality work is delivered
  • Have personally made valuable contributions to products, solutions and teams, and be able to articulate their value to customers
  • Have played a role in the delivery of critical business applications, ideally customer-facing applications
  • Have the ability to communicate complex ideas to non-experts with eloquence and confidence
  • Have an awareness and understanding of new technologies used in finance and other industries, and love to experiment
  • Have a passion for being part of the engineering team that is shaping the future of finance

Skills & Expertise: 

Essentials

Event Streaming

  • Able to build near-real-time data streaming pipelines using technologies such as Kafka, Kafka Connect, Spark Streaming and Google Pub/Sub.
  • Knowledge and experience of using Change Data Capture (CDC) and associated technologies.
  • Experience with one or more of the following: Apache Flink, Apache Beam, Apache Storm, Spark Streaming and Kafka Streams (KStreams).

Databases

  • Hands-on experience with schema design using semi-structured and structured data. Experienced in data modelling and data warehouse design.
  • Strong experience with SQL, RDBMS databases and NoSQL databases, with a good understanding of the differences and trade-offs between them.
  • Hands-on experience building both ETL- and ELT-based solutions. Experience using low-code/no-code ETL platforms.
  • Previous experience in cloud migration projects, with exposure to data lake formation and data warehousing on the cloud, and to shifting data from on-premise to CSP databases such as BigQuery, Redshift and Snowflake.

Big Data

  • Experience with traditional Big Data technologies such as Hadoop, Hive, Spark, Pig, Sqoop, Flume, Cloudera, Airflow and Oozie.

Development Languages

  • Solid development experience using Python, Scala and Java

Desirable

Cloud Environments

  • Strong cloud provider experience with GCP, with exposure to AWS and Azure

DevOps

  • Experience using a version control tool such as Git
  • Experience designing, building and maintaining CI/CD pipelines on Jenkins or CircleCI
  • Exposure to and/or experience with Ansible
  • Experience building observability using tools such as Prometheus, Grafana, Elastic and Splunk

Infrastructure

  • Experience using IaC tools such as Terraform to deploy data pipeline infrastructure to cloud environments
  • Exposure to DevOps or DataOps, with experience of productionising data pipelines which can handle high availability and disaster scenarios. Experience building the supporting orchestration, monitoring and alerting features which enable robust and reliable data pipelines.
  • A good understanding of high availability and disaster recovery

We offer:

  • A work culture focused on innovation and building lasting value for our clients and employees.
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise.
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients.
  • A diverse, inclusive, meritocratic culture.
  • Enhanced and competitive family friendly benefits, including maternity / adoption / shared parental leave and paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement.

Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table – so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form or you can mention it directly to your recruiter at any stage and they will be happy to help.
