Engineer, Data

Johannesburg, South Africa

Standard Bank Group

The Standard Bank Group is a leading financial services provider that supports Africa’s growth and development.



Company Description

Standard Bank Group is a leading Africa-focused financial services group and an innovative player on the global stage, offering a variety of career-enhancing opportunities plus the chance to work alongside some of the sector’s most talented, motivated professionals. Our clients range from individuals and businesses of all sizes to high-net-worth families and large multinational corporates and institutions. We’re passionate about creating growth in Africa, bringing true, meaningful value to our clients and the communities we serve, and creating a real sense of purpose for you.

Job Description

To develop and maintain complete data architecture across several application platforms and provide data capability across those platforms. To design, build, operationalise, secure and monitor data pipelines and data stores in line with applicable architecture, solution designs, standards, policies and governance requirements, making data accessible for evaluation, optimisation and downstream use-case consumption. To execute data engineering duties according to standards, frameworks and roadmaps.

  • Acquire datasets that align with business needs and requirements to enable useful and actionable information, providing feedback on the clarity and completeness of data requirements
  • Analyse data elements and systems, data flows, dependencies and relationships to inform conceptual, logical and physical data models
  • Apply subject matter expertise to decisions relating to data engineering and data integration, and educate internal stakeholders on data engineering and data integration perspectives on new approaches
  • Build the infrastructure required for optimal extraction, transformation and loading of data from various data sources using various technologies (e.g. AWS, Azure and SQL technologies)
  • Build, manage and optimise data pipelines and move them into production, enabling data consumers to utilise data for reporting purposes

Qualifications

Degree - Business Commerce/Information Studies/Information Technology (Required)

Additional Information

Experience Required

5-7 years: Experience in building databases, warehouses, reporting and data integration solutions. Experience building and optimising big data pipelines, architectures and data sets. Experience in creating and integrating APIs. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Deep understanding of cloud technologies and platforms such as AWS, Azure or GCP. Experience in programming languages such as Python, Java or C++.

8-10 years: Deep understanding of data pipelining and performance optimisation, data principles, and how data fits within an organisation, including customer, product and transactional information. Knowledge of integration patterns, styles, protocols and systems theory.

8-10 years: Experience in database programming languages including SQL, PL/SQL and Spark, and/or appropriate data tooling. Experience with data pipeline and workflow management tools. Experience in using big data platforms such as Hadoop.



Category: Engineering Jobs

Tags: APIs Architecture AWS Azure Big Data Data pipelines Engineering GCP Hadoop Java Pipelines Python Spark SQL

Perks/benefits: Startup environment

Region: Africa
Country: South Africa
