Senior / Lead DataOps Engineer

StarHub Green

StarHub

JOB PURPOSE

  • The Data Platform Team is responsible for designing, implementing, and managing a modern data platform that embraces the principles of data mesh, empowering domain teams to create and manage their own data products. Our mission is to deliver high-quality, scalable data solutions that drive business value across the organization and support analytics and GenAI use cases.
  • As a key member of this team, you will play a critical role in ensuring the reliability, scalability, and efficiency of our data infrastructure. You will work closely with data engineers and data stewards to design, automate, and optimize data operations, with a strong emphasis on cloud infrastructure and containerization.

KEY RESPONSIBILITIES

  • Collaborate with data engineers to design and implement cloud-based and on-prem architectures, managing infrastructure across AWS and OpenShift Container Platform (OCP) with automation tools such as Jenkins and Ansible.
  • Establish and manage CI/CD pipelines and data orchestration for testing, deployment, and monitoring of data platform components across hybrid environments (an illustrative smoke-test sketch follows this list).
  • Maintain comprehensive documentation of data infrastructure and processes. Offer training and support to internal and external team members on DataOps practices.
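
For illustration only: a minimal sketch, in Python, of the kind of post-deployment check a CI/CD stage (for example, a Jenkins job) might run against a data platform component. The endpoint URL, response fields, and freshness threshold are hypothetical placeholders rather than actual StarHub systems.

    """Minimal post-deployment smoke test for a data platform component.

    Hypothetical sketch: the endpoint URL, response fields, and threshold
    below are placeholders, not real StarHub services.
    """
    import json
    import sys
    import urllib.request

    HEALTH_URL = "https://data-platform.example.internal/health"  # placeholder
    MAX_STALENESS_HOURS = 24  # placeholder freshness threshold


    def main() -> int:
        try:
            with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
                payload = json.load(resp)
        except OSError as exc:
            print(f"Health check failed: {exc}")
            return 1

        # The component is assumed to report its status and data freshness.
        if payload.get("status") != "ok":
            print(f"Component unhealthy: {payload}")
            return 1
        if payload.get("hours_since_last_load", float("inf")) > MAX_STALENESS_HOURS:
            print("Latest load is stale; failing the deployment gate.")
            return 1

        print("Smoke test passed.")
        return 0


    if __name__ == "__main__":
        sys.exit(main())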

QUALIFICATIONS

  • Degree in IT, Computer Science, Data Analytics, or a related field
  • 2 to 4 years of experience in Data Engineering, DevOps, or related fields.
  • Proven experience working in a mature, DevOps-enabled environment with well-established cloud practices
  • Familiarity with data platforms such as Databricks, Snowflake, or IBM watsonx, and experience managing infrastructure across public cloud and on-prem environments, particularly with OpenShift Container Platform (OCP).
  • Knowledge of automation tools such as Ansible, Terraform, and CLI tools across hybrid cloud environments.
  • Competence in designing and implementing data ingestion, ETL frameworks, dashboard ecosystems, and data orchestration, with hands-on coding skills in Python, Spark, and SQL (a minimal PySpark sketch follows this list).
  • Working knowledge of CI/CD best practices, with experience in setting up and managing CI/CD pipelines for continuous integration, testing, and deployment.
  • Ability to maintain clear documentation of data infrastructure and processes, with experience in providing training on DataOps practices.
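
For illustration only: a minimal PySpark sketch of the ingest-transform-load pattern referenced above, assuming a hypothetical raw CSV source and curated Parquet target. All paths, column names, and the partition date are placeholders.

    """Minimal PySpark ETL sketch: ingest raw CSV, clean it, write curated Parquet.

    Hypothetical sketch: all paths and column names are placeholders.
    """
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-usage-etl").getOrCreate()

    # Ingest: read one day's raw usage records (placeholder path).
    raw = spark.read.option("header", True).csv("s3a://raw-zone/usage/dt=2024-01-01/")

    # Transform: normalise types, drop incomplete rows, aggregate per subscriber.
    clean = (
        raw.withColumn("usage_mb", F.col("usage_mb").cast("double"))
           .dropna(subset=["subscriber_id", "usage_mb"])
    )
    daily = clean.groupBy("subscriber_id").agg(F.sum("usage_mb").alias("total_mb"))

    # Load: overwrite one partition of the curated data product (placeholder path).
    daily.write.mode("overwrite").parquet("s3a://curated-zone/usage_daily/dt=2024-01-01/")

    spark.stop()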

Preferred:

  • Certifications in cloud technology platforms (such as cloud architecture, container platforms, systems, and/or network virtualization).
  • Knowledge of telecom networks, including mobile and fixed networks, will be an added advantage.
  • Familiarity with data fabric and data mesh concepts, including their implementation and benefits in distributed data environments, is a bonus.

Region: Asia/Pacific
Country: Singapore
