Senior / Data Engineer (DataOps)

StarHub Green

StarHub

JOB PURPOSE

  • The Data Platform Team is responsible for designing, implementing, and managing a modern data platform that embraces the principles of data mesh, empowering teams to create and manage their own data products. Our mission is to deliver high-quality, scalable data solutions that drive business value across the organization.
  • As a key member of this team, you will play a critical role in ensuring the reliability, scalability, and efficiency of our data infrastructure. Your focus will be on enabling seamless data flow and dependable analytics engines, supporting the creation and management of data products that align with business needs.
  • In this role, you will work closely with data engineers and data stewards to design, automate, and optimize data operations, with a strong emphasis on cloud infrastructure and containerization. Your mission is to ensure seamless and scalable data services by implementing robust pipelines, leveraging cloud platforms, and managing containerized environments. You will drive efficiency through automation and continuous improvement, integrating the latest tools and practices to support analytics and GenAI use cases.

 

KEY RESPONSIBILITIES

  • Collaborate with solution architects and the infrastructure team to design and implement cloud-based and on-prem architectures, managing infrastructure across AWS and OpenShift Container Platform (OCP).
  • Design and maintain scalable data pipelines and services, along with the frameworks and workflows for data ingestion and ETL processes. Implement data orchestration to ensure seamless data flow across the platform.
  • Establish and manage CI/CD pipelines for continuous integration, testing, and deployment of data platform components across hybrid environments.
  • Maintain comprehensive documentation of data infrastructure and processes, ensuring all procedures are well-documented and accessible.
  • Offer training and support to internal and external team members on DataOps practices, tools, and processes to ensure consistent and effective use of the data platform.

 

Qualifications

Requirements:

  • Degree in IT, Computer Science, Data Analytics, or a related field.
  • 2 to 4 years of experience in Data Engineering, DevOps, or related fields.
  • Proven experience working in a mature, DevOps-enabled environment with well-established cloud practices, demonstrating the ability to operate in a high-performing, agile team.
  • Familiarity with cloud platforms (AWS, AliCloud, GCP) and experience managing infrastructure across public cloud and on-prem environments, particularly with OpenShift Container Platform (OCP).
  • Knowledge of automation tools such as Ansible and Terraform, and of CLI tooling across hybrid cloud environments.
  • Competence in designing and implementing data ingestion, ETL frameworks, dashboard ecosystems, and data orchestration.
  • Hands-on experience with Linux systems, object storage, Python, SQL, and the Spark and Presto query engines.
  • Working knowledge of CI/CD best practices, with experience in setting up and managing CI/CD pipelines for continuous integration, testing, and deployment.
  • Experience in implementing security measures, including IAM roles, policies, Security Groups, and Network ACLs.
  • Good problem-solving and communication skills, especially in explaining technical concepts to non-technical data users, and the ability to collaborate effectively with data engineers, data stewards, and other stakeholders.
  • Ability to maintain clear documentation of data infrastructure and processes, with some experience in providing training on DataOps practices to team members.

Preferred:

  • Certifications in cloud technology platforms (such as cloud architecture, container platforms, systems, and/or network virtualization).
  • Knowledge of telecom networks, including mobile and fixed networks, will be an added advantage.
  • Familiarity with data fabric and data mesh concepts, including their implementation and benefits in distributed data environments, is a bonus.
     
