Specialist Data Engineer

Absa, 270 Republic Road


Empowering Africa’s tomorrow, together…one story at a time.

With over 100 years of rich history and strongly positioned as a local bank with regional and international expertise, a career with our family offers the opportunity to be part of this exciting growth journey, to reset our future and shape our destiny as a proudly African group.

Job Summary

The Data Engineer is crucial to the Insider Trust Program, delivering data solutions by developing and maintaining data pipelines that support data-driven risk initiatives.

Job Description

Data Architecture & Data Engineering

  • Understand the technical landscape and bank-wide architecture that is connected to or dependent on the supported business area in order to effectively design & deliver data solutions (architecture, pipelines etc.)
  • Translate / interpret the data architecture direction and associated business requirements & leverage expertise in analytical & creative problem solving to synthesise data solution designs (build a solution from its components) beyond the analysis of the problem
  • Participate in design thinking processes to successfully deliver data solution blueprints
  • Leverage state-of-the-art relational and NoSQL databases as well as integration and streaming platforms to deliver sustainable, business-specific data solutions.
  • Design data retrieval, storage & distribution solutions (and/or components thereof), contributing to all phases of the development lifecycle e.g. the design process
  • Develop high quality data processing, retrieval, storage & distribution design in a test driven & domain driven / cross domain environment
  • Build analytics tools that utilize the data pipeline by quickly producing well-organised, optimized, and documented source code & algorithms to deliver technical data solutions
  • Create & maintain sophisticated CI/CD pipelines (authoring & supporting CI/CD pipelines in Jenkins or similar tools and deploying to multi-site environments – supporting and managing your applications all the way to production)
  • Automate tasks through appropriate tools and scripting technologies e.g. Ansible, Chef
  • Debug existing source code and polish feature sets.
  • Assemble large, complex data sets that meet business requirements & manage the data pipeline
  • Build infrastructure to automate extremely high volumes of data delivery
  • Create data tools for analytics and data science teams that assist them in building and optimizing data sets for the benefit of the business
  • Ensure designs & solutions support the technical organisation principles of self-service, repeatability, testability, scalability & resilience
  • Apply general design patterns and paradigms to deliver technical solutions
  • Inform & support the infrastructure build required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • Support the continuous optimisation, improvement & automation of data processing, retrieval, storage & distribution processes
  • Ensure the quality assurance and testing of all data solutions aligned to the QA Engineering & broader architectural guidelines and standards of the organisation
  • Implement & align to the Group Security standards and practices to ensure the indisputable separation, security & quality of the organisation’s data
  • Meaningfully contribute to & ensure solutions align to the design & direction of the Group Architecture & in particular data standards, principles, preferences & practices. Short term deployment must align to strategic long term delivery.
  • Meaningfully contribute to & ensure solutions align to the design and direction of the Group Infrastructure standards and practices e.g. OLA’s, IAAS, PAAS, SAAS, Containerisation etc.
  • Monitor the performance of data solutions designs & ensure ongoing optimization of data solutions
  • Stay ahead of the curve on data processing, retrieval, storage & distribution technologies & processes (global best practices & trends)
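As an illustration of the pipeline work described above, here is a minimal extract-transform-load sketch. The source records, schema and field names are hypothetical, chosen only to show the shape of a cleaned, validated load step:

```python
import sqlite3

# Hypothetical example: a tiny extract-transform-load (ETL) step.
# Records, schema and field names are illustrative only.

def extract():
    """Simulate pulling raw trade events from a source system."""
    return [
        {"trade_id": "T1", "amount": "1200.50", "desk": "fx "},
        {"trade_id": "T2", "amount": "not-a-number", "desk": "rates"},
        {"trade_id": "T3", "amount": "98.00", "desk": "FX"},
    ]

def transform(rows):
    """Normalise the desk name; drop rows whose amount fails to parse."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip (or quarantine) malformed rows
        clean.append({"trade_id": row["trade_id"],
                      "amount": amount,
                      "desk": row["desk"].strip().upper()})
    return clean

def load(rows, conn):
    """Write the cleaned rows into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades (trade_id TEXT, amount REAL, desk TEXT)")
    conn.executemany(
        "INSERT INTO trades VALUES (:trade_id, :amount, :desk)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
loaded = conn.execute(
    "SELECT trade_id, desk FROM trades ORDER BY trade_id").fetchall()
```

In a production pipeline the same extract/transform/load separation would sit behind an orchestrator and a CI/CD pipeline; the sketch only shows the structure.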


People

  • Coach & mentor other engineers
  • Conduct peer reviews, testing, problem solving within and across the broader team
  • Build data science team capability in the use of data solutions


Risk & Governance

  • Identify technical risks and mitigate these (pre, during & post deployment)
  • Update / design all application documentation aligned to the organisation’s technical standards and risk / governance frameworks
  • Create business cases & solution specifications for various governance processes (e.g. CTO approvals)
  • Participate in incident management & DR activity – applying critical thinking, problem solving & technical expertise to get to the bottom of major incidents
  • Deliver on time & on budget (always)

Experience/Skills

  • SQL Proficiency: Advanced knowledge of SQL for querying, managing, and manipulating databases.
  • Knowledge of Big Data Technologies: Experience with big data frameworks and tools such as Apache Hadoop, Spark, Kafka, and Hive.
  • Understanding of Cloud Platforms: Familiarity with cloud services like AWS, Azure, or Google Cloud Platform, specifically their data services.
  • Data Modeling and Database Design: Skills in creating efficient database schemas and designing robust data models that support business requirements.
  • ETL/ELT Processes: Expertise in building and managing Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) pipelines.
  • Data Governance and Security: Knowledge of data governance practices, data privacy laws, and ensuring data security and compliance.
  • Experience with Data Integration Tools: Proficiency in using data integration tools such as Apache NiFi, Talend, Denodo, or Informatica.
  • Problem-Solving and Analytical Skills: Strong analytical abilities to diagnose issues, optimize performance, and ensure data quality and reliability.
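To give a concrete sense of the "advanced SQL" called for above, the following sketch runs a window-function query (a common deduplication pattern) against a hypothetical table; the table and data are illustrative only:

```python
import sqlite3

# Illustrative only: a hypothetical logins table, used to demonstrate
# the window-function style of SQL the role calls for.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE logins (user_id TEXT, login_at TEXT);
    INSERT INTO logins VALUES
        ('u1', '2024-01-01'), ('u1', '2024-01-03'),
        ('u2', '2024-01-02');
""")

# Latest login per user via ROW_NUMBER() over a per-user partition.
latest = conn.execute("""
    SELECT user_id, login_at FROM (
        SELECT user_id, login_at,
               ROW_NUMBER() OVER (
                   PARTITION BY user_id ORDER BY login_at DESC) AS rn
        FROM logins
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()
```

The same pattern transfers directly to warehouse engines such as Spark SQL or BigQuery, where window functions are routine for deduplicating event streams.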

Qualifications

  • Must have:
  • Bachelor’s Degree in Computer Science, Information Technology, Data Science, or a related field
  • Minimum of 5 years in a similar role

Optional Certifications (not required)

  • Certified Data Professional (CDP)
  • Google Cloud Professional Data Engineer
  • AWS Certified Data Analytics – Specialty
  • Microsoft Certified: Azure Data Engineer Associate
  • Databricks Certified Data Engineer Associate

Education

Bachelor's Degree: Information Technology

Absa Bank Limited is an equal opportunity, affirmative action employer. In compliance with the Employment Equity Act 55 of 1998, preference will be given to suitable candidates from designated groups whose appointments will contribute towards achievement of equitable demographic representation of our workforce profile and add to the diversity of the Bank.

Absa Bank Limited reserves the right not to make an appointment to the post as advertised.
