Software Developer/Data Engineer

Altron Campus Cape Town (Plattekloof), South Africa

Altron

At Altron, we harness the power of data, technology and human ingenuity to solve real-world problems – from the everyday to the epic.


Title

Software Developer/Data Engineer

Job Description

We are looking for a strong Data Engineer to join our MAC (Margin Assurance Cloud) project. MAC is a bespoke, in-house developed AWS serverless, cloud-native application.

Its core functionalities are:

  • Data
    • Extract, load and transform data using familiar scripting languages such as SQL and Python, to produce automated, near-real-time Assurance Controls for the MA Team
  • Self-Service
    • Access to Athena using workgroups and federated roles, allowing end users (the MA Team) to run their own queries for investigations
  • Reporting & Alerting
    • Capability to create reports from Athena views and send them via email to the business customer
    • Alerting and monitoring capabilities on file ingestion
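As a rough illustration of the reporting capability above, a Lambda function might assemble the parameters for an Athena query run. This is a minimal sketch only; the workgroup, database, and output bucket names are invented assumptions, not the project's actual configuration:

```python
# Hypothetical sketch: build the keyword arguments a Lambda function could
# pass to Athena's StartQueryExecution API (via boto3) to run a report query.
# The database, workgroup, and bucket names below are illustrative only.

def build_athena_query_params(sql: str,
                              database: str = "mac_reports",
                              workgroup: str = "ma_team",
                              output_bucket: str = "s3://mac-athena-results/") -> dict:
    """Return keyword arguments for athena_client.start_query_execution()."""
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        "WorkGroup": workgroup,
        "ResultConfiguration": {"OutputLocation": output_bucket},
    }

# Example: a daily assurance-control summary query
params = build_athena_query_params(
    "SELECT control_id, status, COUNT(*) AS n "
    "FROM assurance_controls GROUP BY control_id, status"
)
```

In a real deployment this dict would be passed to `boto3.client("athena").start_query_execution(**params)`, with the results then formatted and emailed to the business customer.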

Main Components (AWS Services used): S3, Athena, Lambda Functions, Glue, EC2

Main coding languages: Python, PySpark and SQL

In support of our client's IT strategy, we are undertaking a major migration of the current on-prem Assurance system, used by the Margin Assurance business and residing in FINOPS, to the MAC cloud environment.

We need a passionate and skilled Data Engineer who can help the team drive the project to fruition.

The scope of the project build includes:

Data Ingestion and Transformation; S3 Loader Output Build; Athena SOX Validation Reports Build; Athena SOX Aggregation Reports Build.
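The ingestion side of this scope is typically event-driven: an upload to S3 triggers a Lambda that routes the file to the right loader. The sketch below shows that routing idea in plain Python; the key prefixes and loader names are hypothetical, chosen only to echo the SOX validation and aggregation builds listed above:

```python
# Hypothetical sketch of an ingestion trigger: a Lambda handler reads the
# object key from an S3 event and decides which loader should process the
# file. Prefixes and loader names are invented for illustration.

def route_ingested_file(event: dict) -> list:
    """Map each S3 object in the event to a loader name by key prefix."""
    routes = {
        "sox/validation/": "sox_validation_loader",
        "sox/aggregation/": "sox_aggregation_loader",
    }
    decisions = []
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]
        loader = next((name for prefix, name in routes.items()
                       if key.startswith(prefix)), "default_loader")
        decisions.append(loader)
    return decisions

# Example S3 put event with one uploaded file
event = {"Records": [{"s3": {"object": {"key": "sox/validation/2024-05-01.csv"}}}]}
# route_ingested_file(event) → ["sox_validation_loader"]
```

In practice the chosen loader would be a Glue job or another Lambda started with the bucket and key as arguments.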

Delivery will follow an agile approach managed in monthly sprints. Altron will deliver the requirements as listed and prioritised in the backlog by the Vodacom Technical Delivery Lead and agreed in each Sprint Planning ceremony.

KEY RESPONSIBILITIES:

  • Implement scalable data pipelines and architectures using technologies like PySpark/Python/SQL
  • Build out distributed data pipelines and compute tier that operates on AWS Lambda and Glue
  • Serve as a technical resource for team members and mentor junior engineers
  • Collaborate with team to deliver high-quality solutions that meet business requirements
  • Ensure that code is well-designed, maintainable, and adheres to best practices and standards
  • Play a key role in shaping the direction of engineering practices through working on a scrum-size team empowered to organize and ensure sprint deliverables are met as committed during sprint planning ceremonies
  • Use project development tools such as JIRA, Confluence and Git
  • Assist the DevOps engineer with CI/CD automation practices
  • Evaluate and recommend new technologies and approaches to improve the performance, scalability, and reliability of our software systems.
  • Must have worked on automation with CloudFormation
  • Must have worked with Git
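The "complex transformations using loader specifications" work can be pictured as applying a spec of rename-and-cast rules to incoming rows. Below is a simplified pure-Python sketch; the spec format and field names are invented for illustration (the real MAC loaders run as PySpark jobs in Glue):

```python
# Hypothetical sketch: apply a loader specification (rename + cast rules)
# to an ingested row. Spec format and field names are invented examples;
# a BA-provided loader spec would drive the real transformation.

LOADER_SPEC = {
    "msisdn":  {"source": "MSISDN",     "cast": str},
    "revenue": {"source": "REV_AMOUNT", "cast": float},
    "events":  {"source": "EVENT_CNT",  "cast": int},
}

def apply_loader_spec(row: dict, spec: dict = LOADER_SPEC) -> dict:
    """Rename and cast one raw row according to the loader spec."""
    return {target: rule["cast"](row[rule["source"]])
            for target, rule in spec.items()}

raw = {"MSISDN": "27820000000", "REV_AMOUNT": "12.50", "EVENT_CNT": "3"}
clean = apply_loader_spec(raw)
# clean == {"msisdn": "27820000000", "revenue": 12.5, "events": 3}
```

The same rename/cast pattern translates directly to PySpark (`withColumnRenamed` plus `cast`) when the rows arrive as DataFrames rather than dicts.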

Skills / Requirements:

  • 4 to 8 years of Data Engineering or Software Development experience working on data-driven ecosystems
  • Required to code complex transformations using loader specifications provided by the BA
  • Able to work with big data sets; very knowledgeable in understanding and solving data problems
  • Able to automate ingestion by building ingestion pipelines using Lambda or Glue
  • Highly skilled in PySpark, Python and SQL
  • Must have worked on AWS, with a focus on the following services: S3, Athena, Lambda Functions, Glue, EC2
  • AWS experience and an AWS certification required
  • Experience with data modelling and data architecture design required

Educational Qualifications:

  • BSc Comp Sci/BEng

Professional Qualifications

  • AWS Professional Certification

Years of Experience

  • 4 to 8 years of Data Engineering or Software Development experience working on data-driven ecosystems

Education

Bachelor's Degree: Computer and Information Science (Required)

Languages

English
