AI Engineer
PCALT | San Lien Office, Taiwan
Prudential plc
Prudential’s purpose is to be partners for every life and protectors for every future. Our purpose encourages everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people’s career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.
To organize data in a way that makes it easier for data consumers (Data analysts, Data scientists, System architects, and Business leaders) and other systems to use.
Responsibilities:
- Current Context:
- Design, develop, and maintain the backend infrastructure for our AI platform, focusing on a multi-agent system architecture.
- Develop and optimize prompts for large language models (LLMs) and other AI models.
- Implement and manage MLOps/LLMOps pipelines for model training, deployment, and monitoring.
- Implement and manage relational (RDB) and NoSQL databases as needed for AI model training and data storage.
- Integrate AI models and services with other platform components and applications.
- Troubleshoot and resolve AI backend issues, ensuring performance, scalability, and reliability.
- Collaborate with Data Engineers, Data Scientists, and other stakeholders to define and deliver AI-powered solutions.
- Partner with cross-functional teams globally, communicating platform updates effectively.
- Role in Brief: Multi-Agent Systems, Prompt Engineering, LLMOps/MLOps, AI Model Integration, RDB, NoSQL, Backend Development
- Expectations for Three Months: Become familiar with our existing technology stacks, not only within your specific role but across the broader data platform ecosystem.
- Expectations Within One Year: Contribute to the development of key components of our AI platform, demonstrating expertise in multi-agent systems and prompt engineering. Specific contributions can be discussed.
Who We're Looking For:
- Non-Technical Skills & Mindset:
- Impact-Driven & Results Focused:
- Value-Oriented: Focused on delivering solutions that generate significant business value (millions of USD in impact).
- Impact Conscious: Prioritizes work with the greatest technical and business impact. A focus on enabling data consumption through API creation is a plus.
- Growth & Learning Mindset:
- Cross-Functional Learner: Eager to learn and understand cross-functional knowledge beyond core expertise.
- Technology Agnostic Learner: Willing to learn new technologies and adapt to evolving landscapes.
- Efficient Learner: Able to leverage AI tools to maximize productivity and accelerate learning.
- Best Practice Pragmatist: Loves to follow best practices but understands trade-offs and works around limitations when necessary. Demonstrated proactiveness through contributions to open-source projects is highly valued.
- Collaborative & Global Communicator:
- Team Player: Collaborates effectively in global team environments. Adaptable and comfortable working within an Agile environment.
- Excellent Communicator (English & Chinese): Fluent in both English and Chinese (Mandarin) to effectively communicate with global teams and stakeholders.
- Technical Concepts: We're looking for candidates with a strong grasp of:
- Fundamental computer science knowledge
- Root cause finding methodologies
- Systematic/architectural thinking
- Clean code/clean architecture principles and an aversion to over-design
- Technical Skills:
- Python: Proficient in Python, with experience in AI/ML backend development.
- SQL: Solid SQL skills for data management and querying.
- Cloud Development: Hands-on experience with GCP, including hybrid environments with on-premises data centers. Experience with AWS or Azure is also acceptable.
- AI/ML: Experience with multi-agent systems and prompt engineering, plus basic knowledge of machine learning and deep learning.
Tech Stacks:
- Compute & Hosting: GKE & GCE (Red Hat), GCP Cloud Run & Cloud Functions
- Data Orchestration: GCP Cloud Composer (Airflow)
- Data Lakehouse: BigQuery
- Data Streaming: Kafka Ecosystem (Confluent Cloud, Debezium, Qlik)
- Monitoring & Observability: GCP Monitoring/Logging/Metrics, OpenTelemetry
- CI/CD: GitHub Actions, Jenkins
- Infrastructure as Code: Terraform
- Security: VPC SC & Policy Tags, Customer-Managed Encryption Keys (CMEK), Vault
- Containers: Docker, Kubernetes
- Data Governance: Collibra
- Data Visualization: Power BI
Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability, part-time / fixed-term work, or any other status protected by applicable law. We expect the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.
Perks/benefits: Career development, Health care