Data Engineer

United Kingdom

Department for Business and Trade

Export support for UK businesses – great.gov.uk



About us

The Department for Business and Trade (DBT) has a clear mission: to grow the economy. Our role is to help businesses invest, grow and export to create jobs and opportunities right across the country. We do this in three ways.

Firstly, we help to build a strong, competitive business environment, where consumers are protected and companies are rewarded for treating their employees properly.

Secondly, we open international markets and ensure resilient supply chains, whether through Free Trade Agreements, trade facilitation or multilateral agreements.

Finally, we work in partnership with businesses every day, providing advice, finance and deal-making support to those looking to start up, invest, export and grow.

The Digital, Data and Technology (DDaT) directorate develops and operates tools and services to support us in this mission. The team has been nominated three times in a row for 'Best Public Sector Employer' at the Women in Tech Awards!

About the role

Data engineers at DBT are pivotal to our mission of driving economic growth by placing data at the heart of decision-making. We use leading cloud and open-source technology to build scalable, business-critical data pipelines, products and services, ensuring our users (whether within DBT, across government or the public) have access to the right data in the optimal format.

Our cross-functional, agile teams collaborate on a range of projects, from internal AI agents to our data science and analytics platform to the system for managing the UK Tariff, and beyond. Data engineers also belong to a vibrant community, both within DBT and across government, that shares and promotes best practice. We are seeking a collaborative problem-solver to join our profession and help deliver more for our users.
You will be confident in using languages like SQL and Python to deliver robust data products, and have some experience supporting more junior members of the team. If successful, you'll get the opportunity to work with a range of technology including:
  • Airflow and a range of AWS services to build robust, scalable data pipelines
  • Terraform to manage our Postgres-backed data platform
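As a rough illustration only (not DBT's actual codebase), the kind of pipeline step that Airflow would schedule is an ingest-and-transform task: read raw records, clean them with SQL, and expose an aggregated table. The table and column names below are hypothetical, and SQLite stands in for the Postgres-backed platform so the sketch is self-contained:

```python
import sqlite3

# Hypothetical pipeline step of the sort an Airflow DAG might orchestrate:
# transform raw export records into a cleaned, aggregated table.
# All table and column names are illustrative only.

def run_transform(conn: sqlite3.Connection) -> int:
    """Build a cleaned table from raw rows; return its row count."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS exports_clean AS
        SELECT
            company,
            SUM(value_gbp) AS total_value_gbp
        FROM exports_raw
        WHERE value_gbp > 0           -- drop invalid rows
        GROUP BY company
    """)
    # Row count is handy for pipeline monitoring and alerting.
    return conn.execute("SELECT COUNT(*) FROM exports_clean").fetchone()[0]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE exports_raw (company TEXT, value_gbp REAL)")
    conn.executemany(
        "INSERT INTO exports_raw VALUES (?, ?)",
        [("Acme Ltd", 100.0), ("Acme Ltd", 50.0), ("Beta plc", -1.0)],
    )
    print(run_transform(conn))  # -> 1 (only Acme Ltd has valid rows)
```

In a real deployment, this function body would live inside an Airflow task and write to Postgres rather than an in-memory database.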
Main responsibilities

You will:
  • Design, build, and test data products: Develop robust data products by integrating feeds from multiple systems, utilising a variety of storage technologies and access methods.
  • Implement resilient and scalable solutions: Select and deploy the most suitable technologies to create data solutions that are resilient, scalable, and future proof.
  • Problem-solving expertise: Use your understanding of common issues in databases, data processes, products, and services, and apply standard solutions effectively.
  • Stakeholder collaboration: Work closely with a diverse range of stakeholders to understand their requirements, establish project goals, and define deliverables.
  • User support and issue resolution: Assist users in utilising data tools, resolve issues promptly, and escalate when necessary.
  • Promote best practices: Develop and advocate for best practices within the team.
  • Team building and development: Organise initiatives to foster team building and professional development.
  • Mentorship and management: Provide line management and support to junior team members, guiding their growth and development.
  • Continuous improvement: Stay informed about the latest trends and best practices in data engineering to drive continuous improvement.
Skills and experience

It is essential that you have:
  • The ability to write optimised and maintainable SQL and Python for data ingestion and transformation (LEAD)
  • Demonstrable experience in developing and maintaining enterprise-level data solutions
  • Experience in quality assurance and implementing testing, for example using pytest
  • Proficiency with the command line and Git
  • The ability to apply automation and use continuous integration / continuous deployment workflows, for example using GitHub Actions
  • Experience using agile methodologies on data projects
  • Some experience in direct/indirect line management such as developing and coaching others
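To illustrate the testing criterion above (a hypothetical sketch, not DBT code): pytest-style unit tests are plain functions named `test_*` that use bare `assert` statements, here wrapped around a small, pure transformation function:

```python
# Illustrative only: a small transformation function and a pytest-style
# unit test for it. The function name and behaviour are hypothetical.

def normalise_company_name(name: str) -> str:
    """Normalise a company name for deduplication: trim surrounding
    whitespace, collapse internal runs of spaces, and title-case."""
    return " ".join(name.split()).title()

# pytest discovers functions named test_*; plain `assert` is all it needs.
def test_normalise_company_name():
    assert normalise_company_name("  acme   ltd ") == "Acme Ltd"
    assert normalise_company_name("ACME LTD") == "Acme Ltd"
```

Keeping transformations pure like this makes them easy to test in isolation, before they are wired into a pipeline.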
 It is desirable that you have experience of working with:
  • Cloud technologies, especially AWS, and infrastructure-as-code tools such as Terraform
  • Docker and container-based workflows
How to apply

As part of the application process you will be asked to upload a two-page CV and complete a 500-word personal statement outlining how you meet the essential skills and experience listed above. You can use bullet points and subheadings if you prefer.

If we receive a high volume of applications, we will conduct a 'short sift' and read one element of your application. For this campaign, a short sift would be based on the first essential criterion: 'The ability to write optimised and maintainable SQL and Python for data ingestion and transformation'.

The sift will take place from the week commencing 16/04/2025. Interviews will take place from the week commencing 30/04/2025. Please note these dates are indicative and may be subject to change.

How we interview

At the interview stage for this role, you will be asked to demonstrate relevant Technical Skills and Behaviours from the Success Profiles framework specific to the role.

Technical Skills

 

  • Data modelling. You understand the concepts and principles of data modelling and can produce, maintain and update relevant data models for specific business needs. You know how to reverse-engineer data models from a live system. 
  • Problem resolution (data). You respond to problems in databases, data processes, data products and services as they occur. You initiate actions, monitor services and identify trends to resolve problems. You determine the appropriate remedy and assist with implementation of it as well as preventative measures.
  • Programming and build (data engineering). You can design, code, test, correct and document programs or scripts using Python and SQL. You can use source control to manage your code.
  • Testing. You can review requirements, specifications and define test conditions. You can identify issues and risks associated with work while being able to analyse and report test activities and results.
  • Data development process. You can design, build and test data products based on feeds from multiple systems using a range of different data exchange, storage and access technologies. You can create testable, repeatable and reusable products. 

 

Behaviours 
  • Delivering at Pace
  • Changing and Improving
  • Working Together
You will be asked to complete a technical task, details of which will be provided closer to the time.

How we offer

Offers will be made in merit order based on location preferences. If you pass the bar at interview but are not the highest-scoring candidate, you will be held on a 12-month reserve list in case a role becomes available. If you are judged a near miss at interview, you may be offered a post at the grade below the one you applied for.

This role requires SC clearance. DBT's requirement for SC clearance is to have been present in the UK for at least 3 of the last 5 years. Failure to meet this requirement will result in your application being rejected and your offer being withdrawn.

Checks will also be made against:
  • departmental or company records (personnel files, staff reports, sick leave reports and security records) 
  • UK criminal records covering both spent and unspent criminal records 
  • your credit and financial history with a credit reference agency 
  • security services record 
  • location details 
Benefits

If you join us, you will get:
  • learning and development tailored to your role
  • a flexible, hybrid working environment with options like condensed hours
  • a culture encouraging inclusion and diversity
  • a Civil Service pension with an average employer contribution of 28.97%   
  • annual leave starting at 25 days rising to 30 days with service
  • three paid volunteering days a year 
  • an employee benefits programme including cycle to work
More about us

This role can only be worked from within the UK, not overseas. If you are based in London, you will receive London weighting. DBT employees work in a hybrid pattern, spending 2-3 days a week (pro rata) in the office. Travel to your primary office location will not be paid for by DBT, but costs for travel to an office which is not your main location will be covered.

You can find out more about our office locations, how we calculate salaries, our diversity statement and reasonable adjustments, the Recruitment Principles, the Civil Service Code and our complaints procedure on our website. Find out more about life at DBT, our benefits and meet the team by watching our video or reading our blog!
