Head, Data Engineering
Ebene, Mauritius
Standard Bank Group
The Standard Bank Group is a leading financial services provider that supports Africa's growth and development.
Company Description
Standard Bank Group is a leading Africa-focused financial services group and an innovative player on the global stage, offering a variety of career-enhancing opportunities plus the chance to work alongside some of the sector's most talented, motivated professionals. Our clients range from individuals and businesses of all sizes to high-net-worth families and large multinational corporates and institutions. We're passionate about creating growth in Africa, bringing true, meaningful value to our clients and the communities we serve, and creating a real sense of purpose for you.
Job Description
To develop and maintain complete data architecture and provide capability across several application platforms. To design, build, operationalise, secure and monitor data pipelines and data stores in line with applicable architecture, solution designs, standards, policies and governance requirements, making data accessible for evaluation and optimisation by downstream use cases. To execute data engineering duties according to standards, frameworks and roadmaps.
Acquire datasets that align with business needs and requirements to enable useful, actionable information, providing feedback on the clarity and completeness of data requirements. Analyse data elements and systems, data flows, dependencies and relationships to ensure sound conceptual, physical and logical data models
Apply subject matter expertise to decisions relating to data engineering and data integration. Educate internal stakeholders on data engineering and data integration perspectives on new approaches
Build the infrastructure required for optimal extraction, transformation and loading of data from various data sources using various technologies (e.g. AWS and SQL technologies). Build, manage and optimise data pipelines and move them into production, enabling data consumers to use the data for reporting purposes
Collaborate with insights teams, Technology engineers and Data & Analytics leadership to identify opportunities for process improvements, recommend system modifications and give input to the development of policies for data governance
Contribute to automation initiatives and guide the identification, design and implementation of internal process improvements, such as automating manual processes, optimising data delivery and re-designing infrastructure for greater scalability
Design, document and implement processes for executing partner system alignments, communicating these to the relevant stakeholders, including Governance. Continually enhance these processes to deal with the dynamics of partner system alignment
Develop and provide capability across several application platforms; construct, test and maintain complete data architecture, enabling data-led and data-driven decision making
Enable and execute internal and external data migrations across different databases, applications and servers, and define and implement data stores based on system requirements and business consumer requirements
Execute the design, definition and development of Application Programming Interfaces (APIs), aligning to relevant frameworks and guidelines. Create data tooling that enables data consumers to build and optimise data consumption, taking integration and usage patterns into account
Guide and manage a small team to develop and provide capability across several application platforms
Identify, design and implement internal process improvements, including re-designing infrastructure for greater scalability, optimising data delivery and automating manual processes
Liaise and collaborate with technology colleagues and segment/country data teams to understand viable data solutions within architectural guidelines
Make data accessible by interpreting business requirements and defining and providing technology solutions that collect, manage and convert raw data into usable information to be interpreted by data consumers and insights teams. Take part in, and ensure, the deployment of solutions in accordance with applicable architecture, solution designs, standards, policies and governance requirements
Participate in the vetting and promotion of content created in the business for business reuse
Partner with Information security teams to safeguard and ensure compliance with data governance and data security requirements while creating, improving and operationalising integrated and reusable data pipelines
Proactively analyse data and evaluate databases to identify and recommend optimisations and improvements
Research data integration, warehousing and reporting best practices, ensuring applicable standards are adhered to and applied in the solution design
Shape data traceability standards, guidelines and processes to ensure data quality, for example by tracking data sources and setting up appropriate data version control processes and systems, ensuring alignment to the defined data architecture
Qualifications
Type of Qualification: First Degree
Field of Study: Business Commerce, Information Studies or Information Technology
Experience Required
Software Engineering
Technology
5-7 years
Experience in building databases, warehouses, reporting and data integration solutions. Experience building and optimising big data pipelines, architectures and data sets. Experience in creating and integrating APIs. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
8-10 years
Deep understanding of data pipelining and performance optimisation, data principles and how data fits in an organisation, including customer, product and transactional information. Knowledge of integration patterns, styles, protocols and systems theory
8-10 years
Experience in database programming languages, including SQL, PL/SQL, Spark and/or appropriate data tooling. Experience with data pipeline and workflow management tools
Additional Information
Behavioural Competencies:
- Adopting Practical Approaches
- Articulating Information
- Checking Things
- Developing Expertise
- Documenting Facts
- Embracing Change
- Examining Information
- Interpreting Data
- Managing Tasks
- Producing Output
- Taking Action
- Team Working
Technical Competencies:
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- Data Quality
- IT Knowledge
- Stakeholder Management (IT)