Lead Analytics Engineer
London, England, GB
Description
Curve was founded with a rebellious spirit and a lofty vision: to truly simplify your finances, so you can focus on what matters most in life.
That’s why Curve puts your finances simply at your fingertips, so you can make smart choices about how to spend, send, see and save your money. We help you control your financial life, so you can go out and live the life you want to live.
With Curve you can spend from all your accounts, track your spending behaviour, get insights, and stay protected from fraud. For the first time, you have clear insights into, and control of, all your money in one beautiful place.
We’re developing a ground-breaking product with our customers at its core. Our user base is growing rapidly and we have exceptional metrics. We have funding from leading names in tech investment, and a visionary leadership team who want everyone who joins this remarkable adventure to have the autonomy to masterfully develop their expertise.
Welcome to Curve. On a mission to help you live inspired.
We’re looking for a capable Lead Analytics Engineer to join our central data team. Our mission is to build a robust, scalable platform that transforms raw data into clean, modelled datasets using dbt, empowering stakeholders across the company with reliable, accessible data. Your focus will be on developing and optimising data models that enable efficient and impactful analytics and reporting. You will work closely with both data engineers and analysts, bridging the gap between raw data collection and actionable insights.
This role is ideal for individuals passionate about data transformation, data modelling, and building scalable, maintainable workflows using dbt in a cloud-based environment. If you thrive on solving complex data challenges, care deeply about data quality, and are excited about implementing best practices in analytics engineering, this is the perfect role for you.
Key Accountabilities:
- Write production-quality ELT code with an emphasis on performance, maintainability, and scalability using dbt.
- Transform, maintain, and model clean datasets within the data warehouse for broader consumption by business teams.
- Apply software engineering best practices such as version control (e.g., Git) and CI/CD pipelines for analytics code deployment.
- Design, develop, and maintain dashboards and reports that communicate key performance indicators (KPIs), particularly for Curve Credit, enabling data-driven decision-making.
- Build and maintain strong, collaborative relationships with cross-functional teams, including product, marketing, finance, and operations.
- Partner with data engineering to develop tools, infrastructure, and data pipelines that improve data accessibility and enable data self-service across the organisation.
- Implement automated data quality checks and monitoring to ensure the integrity and reliability of transformed datasets.
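To illustrate the kind of automated data quality check described above, here is a minimal, hypothetical sketch in Python (not Curve's actual tooling; in a dbt workflow the built-in `not_null` and `unique` tests would typically cover this):

```python
# Hypothetical sketch of an automated data quality check on a
# transformed dataset: verifies that a key column is non-null and
# unique, mirroring dbt's built-in not_null and unique tests.

def check_column_quality(rows, column):
    """Return a list of human-readable quality failures for `column`."""
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        value = row.get(column)
        if value is None:
            failures.append(f"row {i}: {column} is null")
        elif value in seen:
            failures.append(f"row {i}: duplicate {column} = {value!r}")
        else:
            seen.add(value)
    return failures


if __name__ == "__main__":
    # Example transformed dataset with two deliberate defects.
    transactions = [
        {"txn_id": "t1", "amount": 10.0},
        {"txn_id": "t2", "amount": 5.5},
        {"txn_id": "t2", "amount": 7.0},   # duplicate key
        {"txn_id": None, "amount": 3.2},   # missing key
    ]
    for failure in check_column_quality(transactions, "txn_id"):
        print(failure)
```

In practice a check like this would run as part of the pipeline (e.g. after a dbt build) and fail the run, or alert, when the failure list is non-empty.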
Skills & Experience:
- 5+ years of experience in data/analytics engineering, focusing on data transformation and modelling, particularly using dbt.
- Expertise in ELT processes, transforming raw data into well-structured, high-quality data models in a data warehouse (e.g., BigQuery).
- Strong proficiency with SQL for data modelling and performance optimisation.
- Experience applying software engineering best practices such as version control (Git), CI/CD, and modular code design to analytics workflows.
- Experience building dashboards and reports to track business-critical metrics, with a focus on driving business insights.
- Ability to collaborate effectively with cross-functional teams and communicate technical solutions to non-technical stakeholders.
- Familiarity with orchestration tools like Airflow or Composer and cloud infrastructure, particularly GCP (BigQuery).
- Strong attention to detail with a focus on data accuracy, quality, and process improvement.
- Strong understanding of data quality principles.
- A track record of learning new technologies and tools.
- Experience with database technologies, including best practices, performance optimisation, and fault finding.
Nice to haves:
- Experience with real-time and streaming data pipelines.
- Familiarity with infrastructure as code (Terraform) or container orchestration (Kubernetes).
- Experience mentoring or supporting junior team members in analytics engineering.
- Fintech, Finance, Payments, or Retail Banking industry experience.
Benefits:
- 25 days of annual leave plus bank holidays
- Bonus days off for Learning & Development, Mental Wellbeing, Birthday, Moving House & Christmas
- Working abroad policy (up to 60 calendar days per year)
- Bupa Health Insurance (YuLife)
- Life insurance powered by AIG (5x Annual Salary)
- Pension Scheme powered by “People’s Pension” (4% Matched)
- EAP (Mental health & wellbeing support, Life coach, Career coach)
- 24/7 GP access (Smart Health via YuLife)
- Annual subscriptions to Meditopia & FIIT for your mind and body (via YuLife)
- Discounted shopping vouchers (via YuLife)
- Enhanced parental leave
- Ride to work scheme & Season ticket loan
- Electric car scheme
- Six nights of Night Nanny for new parents
- Free Curve Metal subscription for you and your +1