KR127B Lead Data Engineer

London

Cybernetic Controls Limited

We are a trusted UK recruitment services provider, offering expert staffing solutions across various sectors.


KR127B Lead Data Engineer

Department: Database Engineering

Employment Type: Permanent

Location: London

Reporting To: CTO/CDO


Description

Overview
CCL builds high-quality automation frameworks and bespoke software solutions for clients in the finance industry, with the aim of helping organisations improve their operational processes, reduce overall risk, and increase efficiency. Another key objective of CCL is to enable greater transparency for firms and regulators. CCL is also passionate about removing waste and making things simple. We want to help inspire a new generation of robotics and automation projects across industry, focused on delivering a highly skilled virtual workforce with cognitive and robotic capabilities. The CCL team is focused on helping firms improve operational productivity and reduce challenges. Read more on the Cybernetic Controls website.

Kaizen is seeking an experienced Lead Data Engineer to join our growing team. The successful candidate will lead the design, development, and maintenance of our data infrastructure, pipelines, and platforms that power our organisation's data-driven initiatives. You will be responsible for managing and mentoring the data engineering team, establishing best practices, and collaborating with cross-functional stakeholders to ensure our data architecture meets both current and future business needs.

Key Responsibilities

  • Lead a team of engineers responsible for the design, implementation, and optimisation of scalable data pipelines, ETL processes, and data infrastructure on AWS, making data available with robustness, maintainability, efficiency, scalability, availability, and security.
  • Architect and develop reliable data solutions that support analytics, reporting, machine learning, and data science initiatives.
  • Define and implement data engineering best practices, coding standards, and architectural guidelines.
  • Provide technical leadership to a team of data engineers, including mentoring, code reviews, and professional development.
  • Collaborate with product owners, data analysts, scientists, and business stakeholders to understand data requirements and deliver solutions.
  • Share responsibility with product owners and scrum masters for managing the team's tasks through agile principles, with ultimate accountability for the team's delivery.
  • Support all members within the team with a focus on development and learning, technical skills and delivery.
  • Lead complex data projects from conception to implementation and production deployment.
  • Implement data quality monitoring, testing frameworks, and observability solutions to ensure data integrity & efficiency.
  • Develop strategies for data storage, transmission, retention, security, and compliance.
  • Implement and embed the use of AI in the team’s daily work to increase efficiency and integrity.
  • Evaluate and recommend new data technologies and tools to enhance our capabilities.
  • Partner with IT and DevOps teams to ensure infrastructure reliability and scalability.
  • Liaise with other development teams & leads to ensure the integrity of Kaizen’s products and services – with a focus on effective cross team working to achieve business goals.
  • Provide regular reports and key performance indicators on the operation of the team as agreed with management.

Skills, Knowledge and Expertise

Skills
  • Ability to provide technical leadership, direction and manage/mentor engineers of varying levels
  • Exceptional analytical and problem-solving abilities for complex data challenges
  • Strong written and verbal communication skills to explain technical concepts to diverse audiences
  • Demonstrated ability to collaborate and work cross-functionally with various stakeholders
  • Experience managing technical projects and prioritising competing demands
  • Strategic thinking balancing immediate needs with long-term architectural vision
  • Excellent Python and PySpark programming (including Pandas/PySpark dataframes, web and database connections)
  • Excellent understanding of ETL processes within Amazon Web Services (AWS)
  • Excellent understanding of serverless computing (AWS Lambda, API Gateway, SQS, SNS, EventBridge, S3, etc.)
  • Apache Spark, AWS Glue, Athena, S3, Step Functions, Lake Formation, Infrastructure as code (CloudFormation)
  • Proficiency with data processing frameworks
  • Strong understanding of software engineering practices (version control with Git/GitHub, testing, CI/CD) and test-driven development
  • Strong understanding of agile principles, processes and tools
  • Strong SQL coding skills (Spark-SQL, Presto SQL, MySQL, etc.) and experience of SQL and NoSQL database design and management (DynamoDB, MySQL)
Experience 
  • Extensive experience with Amazon Web Services (AWS)
  • Extensive experience with SQL and database design (both relational and NoSQL)
  • Extensive experience in designing, deploying and managing complex production data pipelines that interact with a range of data sources (file systems, web, database, users)
  • Experience with infrastructure-as-code tools
  • Experience with data mesh or data fabric architectural patterns
  • Experience with real-time data processing and streaming architectures
  • Experience implementing data governance and security measures
  • Familiarity with machine learning workflows and MLOps
  • 7+ years of professional experience in data engineering roles
  • 2+ years of leadership or technical lead experience
  • Certifications in relevant cloud platforms or data technologies
Knowledge 
  • Data modelling, data pipeline & ETL architecture, Big Data implementation, container orchestration systems
  • Software development lifecycle best practices
  • Financial knowledge would be an asset
Qualifications/Training 
  • Bachelor’s degree or equivalent in Computer Science, Data Engineering or a related subject

Benefits

  • Company pension (5% of your base salary)
  • 25 days' annual leave, plus UK bank holidays and your birthday
  • Employee Assistance Programme
  • Annual pay review
  • Discretionary bonus (paid across three years, as per contract schedule)
  • Private medical insurance 
  • Discount on private health assessment
  • Upfront cost of an Apple Watch covered via the health scheme
  • Medical cashback plan 
  • Wellbeing allowance of up to £50 per month
  • Up to £1,000 per year towards learning any new skill of your choosing
  • Salary sacrifice electric car scheme 
  • Salary sacrifice bike scheme 
  • Childcare contribution of up to £3,000 per year for dependants aged five and under
  • Various retail discounts via Shop st.