Cloud Analytics Engineer
PSA | Kuala Lumpur - Menara Prudential @ TRX 15F, Malaysia
Prudential plc
Prudential plc provides life and health insurance and asset management, with a focus on Asia and Africa. We help people get the most out of life by making healthcare affordable and accessible and by promoting financial inclusion. Prudential's purpose is to be partners for every life and protectors for every future. Our purpose encourages everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people's career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.
This role will be part of the Group-Wide Information Security team, under the Security Metrics and Analytics function, with primary responsibility for developing and implementing an in-house data analytics and security automation program aimed at strengthening overall information security control capabilities through automated discovery of measurable security risks, automated orchestration of risk-reduction exercises, and visualization of risk reduction / risk exposure. The scope of automation opportunities covers all aspects of information security, including but not limited to data security, vulnerability management, network security, threat analysis, and software development.
To be successful in this role, candidates must have a strong passion for both IT security automation and machine learning / data analytics programming using Python, Keras, and TensorFlow.
We are seeking a highly motivated Cloud Analytics Engineer to join our Cloud Economics and Intelligence team within the Group Technology Infrastructure Engineering organization. This role blends data engineering, analytics, and emerging AI technologies to drive insights and build intelligent systems that optimize cloud investments and business outcomes.
You will primarily build data pipelines with Azure Data Factory and Azure Databricks, develop analytical models, and contribute to Gen-AI agentic chatbot / LLM initiatives that improve operational efficiency and support internal use cases.
Key Responsibilities:
- Design, build, and optimize data pipelines and analytics workflows using Azure Databricks, Spark, and Delta Lake.
- Develop and operationally support data pipelines, workflows, schedulers (Airflow DAGs), and data schemas/tables that feed into dashboards, reports, and data models supporting cloud cost optimization and the operational stability of cloud economics.
- Collaborate with AI/ML engineers to integrate Gen-AI capabilities into analytics tools, chatbots, and forecasting.
- Contribute to the development of agentic AI chatbots for internal use cases (e.g., cloud cost Q&A, forecasting assistants).
- Ensure data quality, governance, and performance across our data pipelines and analytics platform.
- Translate business requirements into scalable data and AI solutions.
Required Skills & Experience:
Technical Skills:
- 4+ years of experience in data engineering and data analytics, with at least 2 years in a cloud environment (Azure and GCP).
- Proficiency in Azure Data Factory, Azure Databricks, PySpark, SQL, GCP Cloud Functions, and BigQuery.
- Proficiency with CI/CD and version control (GitHub, Azure DevOps), and in developing best practices, standards, and guidelines.
- Strong programming skills in Python for data extraction, manipulation and automation.
- Experience supporting containerization services (Kubernetes) and provisioning infrastructure using Terraform.
- Exposure to Azure Bot Service, generative AI, and agentic chatbot frameworks (e.g. LangChain, Mosaic AI, MS AutoGen).
- Familiarity with LLMs, embeddings, and prompt engineering preferred.
- Familiarity with data visualization tools (e.g. Power BI) and MS Fabric pipelines is a plus.
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- Highly self-driven, demonstrates critical thinking, and is a team player and fast learner.
- Strong analytical and problem-solving skills, with the ability to work both independently and effectively as part of a team.
Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability or part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.
Tags: Airflow Azure BigQuery Chatbots CI/CD Computer Science Data Analytics Databricks Data pipelines Data quality Data visualization DevOps Economics Engineering GCP Generative AI GitHub Keras Kubernetes LangChain LLMs Machine Learning Pipelines Power BI Prompt engineering PySpark Python Security Spark SQL TensorFlow Terraform
Perks/benefits: Career development Health care