Data Engineer, VP

Bangalore, India


NatWest Group




Join us as a Data Engineer

  • You’ll be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making
  • We’ll look to you to drive the build of effortless, digital-first customer experiences
  • If you’re ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you’re looking for
  • We're offering this role at vice president level

What you'll do

As a Data Engineer, you’ll simplify our organisation by developing innovative, data-driven solutions through data pipelines, modelling and ETL design, aspiring to be commercially successful while keeping our customers, and the bank’s data, safe and secure.

You’ll drive customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tool to gather and build data solutions. You’ll support our strategic direction by engaging with the data engineering community to deliver opportunities, along with carrying out complex data engineering tasks to build a scalable data architecture.

Your responsibilities will also include:

  • Building advanced automation of data engineering pipelines through removal of manual stages
  • Embedding new data techniques into our business through role modelling, training, and experiment design oversight
  • Delivering a clear understanding of data platform costs to meet your department’s cost-saving and income targets
  • Sourcing new data using the most appropriate tooling for the situation
  • Developing solutions for streaming data ingestion and transformations in line with our streaming strategy (an illustrative sketch follows this list)
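
For illustration only, and not part of the role description, a streaming ingestion and transformation step of the kind described in the last responsibility above might look like the PySpark Structured Streaming sketch below. The Kafka broker, topic, event schema and S3 paths are hypothetical placeholders, not NatWest systems.

# Hypothetical sketch of a streaming ingestion step: read events from a
# Kafka topic, parse and lightly transform them, and write Parquet to S3
# with checkpointing.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("stream-ingest-example").getOrCreate()

# Hypothetical event schema
event_schema = StructType([
    StructField("customer_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "customer-events")             # placeholder topic
       .load())

events = (raw
          .select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*")
          .filter(col("event_type").isNotNull()))

query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://example-bucket/events/")  # placeholder output path
         .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
         .outputMode("append")
         .start())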

The skills you'll need

To thrive in this role, you’ll need twelve years’ experience, a strong understanding of data usage and dependencies, and experience of extracting value and features from large-scale data. You’ll also bring practical experience of programming languages alongside knowledge of data and software engineering fundamentals.

Additionally, you’ll need:

  • Experience of data engineering toolsets such as Airflow, RDBMS tools such as PGSQL, Oracle or DB2, Snowflake, S3, EMR or Databricks, and data pipelines (an illustrative sketch follows this list)
  • Proficiency in Python, PySpark, SQL, CI/CD pipelines and Git version control
  • Experience in reporting tools such as QuickSight would be an added advantage
  • A good understanding of database, data warehouse and ETL concepts
  • Strong communication skills with the ability to proactively engage and manage a wide range of stakeholders
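
To illustrate the Airflow and Python skills above, a minimal daily batch pipeline might be orchestrated with a DAG like the sketch below, assuming a recent Airflow 2.x installation. The DAG id, schedule and task bodies are hypothetical placeholders for real extract, transform and load logic, for example PySpark jobs or SQL run against Snowflake.

# Hypothetical Airflow DAG: a daily extract-transform-load pipeline with
# three placeholder Python tasks chained in order.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    # Placeholder: pull data from a source system (e.g. an RDBMS or S3)
    print("extracting source data")


def transform(**_):
    # Placeholder: apply business transformations (e.g. PySpark or SQL)
    print("transforming data")


def load(**_):
    # Placeholder: load curated output into the warehouse (e.g. Snowflake)
    print("loading to warehouse")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load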

Hours

45

Job Posting Closing Date:

29/07/2025



