Senior Data & AI Platform Engineer
Singapore, Singapore
Singtel
The Singtel Group, Asia's leading communications group, provides a diverse range of services including fixed, mobile, data, internet, TV, infocomms technology (ICT) and digital solutions. An empowering career at Singtel begins with a Hello. Our purpose, to Empower Every Generation, connects people to the possibilities they need to excel. Every "hello" at Singtel opens doors to new initiatives, growth, and BIG possibilities that take your career to new heights. So, when you say hello to us, you are really empowered to say… "Hello BIG Possibilities".
Be a Part of Something BIG!
• Design and implement an AI-ready data architecture that provides access to data from various sources, including cloud and on-premises systems, for AI use cases
• Design, build and operate a data virtualization layer to minimize data transfer and improve visibility and control
• Contribute to design conversations for AIDA’s new data and AI platform
• Collaborate with platform, analytics, and governance teams to deliver high-quality, secure, and well-documented data assets
• Lead a team of data engineers to ensure the timely availability of accurate, well-documented data for AI use cases
Make An Impact By
- Design and implement batch and streaming data ingestion pipelines from diverse sources (e.g., files, APIs, Kafka, databases)
- Develop real-time and near-real-time data workflows using tools such as Apache Flink and Kafka Streams
- Optimize performance for high-volume and high-velocity datasets
- Implement data quality checks and automated monitoring to ensure data is consistently accurate and available
- Design and manage storage solutions such as Microsoft Fabric, Delta Lake, and Databricks
- Design and implement data virtualization layer to centralize visibility and control of data across diverse sources
- Apply best practices for schema design, partitioning, and data lifecycle management
- Support data discovery and cataloguing in coordination with governance tools
- Work with data scientists, analysts, and business users to understand data needs and translate them into technical solutions
- Partner with DevSecOps and platform engineers to automate deployment and orchestration of data pipelines
- Document data flows, transformations, and quality checks in accordance with governance standards
Skills to Succeed
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- 5 years of experience in data engineering
- 1 year in a leadership or senior technical role delivering data engineering projects
- Deep expertise in Spark, Databricks, and data processing frameworks
- Strong knowledge of streaming technologies such as Apache Kafka, Apache Flink, or Azure Event Hubs
- Experience working with data lake and/or lakehouse architectures such as Hadoop, Delta Lake, Iceberg and Microsoft OneLake
- Familiarity with data virtualization tools and architectures involving diverse sources
- Proficient in Python and SQL
- Familiar with workflow orchestration (e.g., Apache Airflow) and CI/CD principles
- Analytical mindset with a focus on data quality, performance, and maintainability
- Able to work independently and collaboratively in a dynamic environment
- Strong communication and documentation skills to support cross-functional collaboration
- Knowledge of data types across network and IT domains in a telco environment