Senior Data Engineer
Malaysia - KL Eco City
FWD Insurance
We’re FWD. A different kind of insurer with a vision to change the way people feel about insurance. Discover our story.

About FWD Group
FWD Group is a pan-Asian life and health insurance business with more than 12 million customers across 10 markets, including some of the fastest-growing insurance markets in the world. The company was established in 2013 and is focused on changing the way people feel about insurance. FWD’s customer-led and digitally enabled approach aims to deliver innovative propositions, easy-to-understand products and a simpler insurance experience.
For more information, please visit www.fwd.com
FWD Technology and Innovation Malaysia Sdn. Bhd., known as FWD TIM, was established in late 2019. Strategically located in Kuala Lumpur, FWD TIM serves as a pivotal shared service location within FWD Group, providing services to multiple markets across the Group. FWD TIM houses a diverse and talented workforce focused on essential business and technology services such as information security, cloud operations, IT solutions delivery, digital and data, actuarial, finance, investments, and customer service, among many others. FWD TIM is dedicated to driving and delivering operational excellence and efficiency, fostering innovation, ensuring regulatory compliance across all business functions, and maintaining a competitive edge in the market.
PURPOSE
A role responsible for the system design, development and implementation of regional frontend systems, and for providing maintenance and support for production systems.
KEY ACCOUNTABILITIES
- Design, develop, document and implement end-to-end data pipelines and data integration processes, both batch and real-time. This includes data analysis, data profiling, data cleansing, data lineage, data mapping, data transformation, developing ETL/ELT jobs and workflows, and deployment of data solutions (an illustrative sketch follows this list).
- Monitor, recommend, develop and implement ways to improve data quality (including reliability, efficiency and cleanliness) and to optimize and fine-tune ETL/ELT processes.
- Recommend, execute and deliver best practices in data management and data lifecycle processes, including modular development of ETL/ELT processes, coding and configuration standards, error handling and notification standards, auditing standards, and data archival standards.
- Prepare test data and assist in creating and executing test plans, test cases and test scripts.
- Collaborate with Data Architect, Data Modeler, IT team members, SMEs, vendors and internal business stakeholders, to understand data needs, gather requirements and implement data solutions to deliver business goals.
- Provide BAU support for data issues and change requests; document all investigations, findings, recommendations and resolutions.
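As a purely illustrative sketch of the end-to-end pipeline work described in the first accountability above (not an FWD system), a minimal batch ETL step in PySpark might look like the following; the paths, column names and the policies dataset are hypothetical placeholders.

# Illustrative only: a minimal batch ETL step in PySpark.
# Paths, columns and the "policies" dataset are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy-batch-etl").getOrCreate()

# Extract: read raw source data (hypothetical path).
raw = spark.read.option("header", True).csv("/mnt/raw/policies/")

# Transform: profile-driven cleansing -- deduplicate, trim strings,
# standardise dates, and flag rows that fail a basic quality check.
clean = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("customer_name", F.trim(F.col("customer_name")))
       .withColumn("issue_date", F.to_date("issue_date", "yyyy-MM-dd"))
       .withColumn("is_valid", F.col("policy_id").isNotNull())
)

# Load: write curated output partitioned for downstream consumption.
(clean.filter("is_valid")
      .write.mode("overwrite")
      .partitionBy("issue_date")
      .parquet("/mnt/curated/policies/"))

The same pattern extends to real-time work by swapping the batch read for a streaming source such as Kafka.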
KEY PERFORMANCE INDICATORS
- Maturity of technical skills in managing and developing on the big data platform, including data model design, data transformation, big data programming and performance tuning
- Effectiveness of project contributions, from technical support, project management and team collaboration perspectives, in delivering the data lake system for assigned local markets
- Level of understanding of business requirements and ability to translate them into technical solutions
- Variety of new technologies (such as Azure and AWS big data solutions, Power BI, Tableau and Python) and techniques learnt that can be applied to solution delivery
EXTERNAL & INTERNAL CONTACTS
- Group Infrastructure Team, Security Teams and Operations Team
- Group Project Team
- Local country Project Team (IT and User)
- IT Vendors and/or Service Providers
QUALIFICATIONS / EXPERIENCE
- Bachelor's degree in IT, Computer Science or Engineering.
- At least 3-5 years of experience with big data technologies such as Azure and AWS big data solutions, Hadoop, Hive, HBase, Spark, Sqoop, Kafka and Spark Streaming.
- Minimum 5 years of professional experience in data warehouse, operational data store, and large-scale data architecture implementations in Unix and/or Windows environments.
- At least 3 years of solid hands-on ETL development experience transforming complex data structures across multiple data source environments.
- At least 5 years of data model (relational and/or data warehouse) and data mart design and implementation.
- Minimum 3-5 years of ETL programming in languages such as Python, Scala, Java or R.
- Experience with Azure Databricks for ETL/ELT development and big data analytics programming in Python (see the sketch after this list).
- Strong experience with various ETL/ELT frameworks, data warehousing concepts, data management frameworks and data lifecycle processes.
- Solid understanding of Azure data management solutions, including Azure Data Factory, Azure Databricks, Azure Blob Storage (Gen2) and Azure Synapse.
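As a hedged illustration of the Azure Databricks experience listed above, the following PySpark snippet shows a common ELT read-aggregate-write pattern against ADLS Gen2 and a Delta table; the storage account, container, column and table names are hypothetical, and the input is assumed to already be in Delta format.

# Illustrative only: an ELT aggregation in a Databricks-style PySpark job.
# Account, container, columns and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks this reuses the notebook session

# Hypothetical ADLS Gen2 container addressed via an abfss:// URI.
src = "abfss://raw@examplelake.dfs.core.windows.net/claims/"
claims = spark.read.format("delta").load(src)  # assumes Delta-formatted input

# Aggregate claims by month and publish to a curated Delta table.
monthly = (
    claims.groupBy(F.trunc("claim_date", "month").alias("claim_month"))
          .agg(F.sum("claim_amount").alias("total_claims"))
)
monthly.write.format("delta").mode("overwrite").saveAsTable("curated.monthly_claims")

In practice a job like this would typically be orchestrated from Azure Data Factory and surfaced to analysts through Azure Synapse or Power BI.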
KNOWLEDGE & TECHNICAL SKILLS
- Familiar with ETL/ELT frameworks, data warehousing concepts, data management frameworks and data lifecycle processes.
- Experienced in handling and processing different types of data (structured, semi-structured and unstructured).
- Strong knowledge in various database technologies (RDBMS, NoSQL and columnar).
- A good understanding of data analytics and data visualization is preferred, ideally with Power BI and Tableau.
- Strong understanding of programming languages such as Python, Scala, Java, R, Shell and PL/SQL.
- Good understanding of Master Data Management (MDM) and data governance tools, preferably Informatica technologies.
- Experience working in the insurance industry will be an added advantage.
- Ability to communicate and present technical information in a clear and unambiguous manner.
- Strong ability to work independently and to cooperate with diverse teams in a multi-stakeholder environment.
- Strong sense of work ownership, a high affinity for all things data, and a desire for continuous improvement.
- Experience in API development and integration, e.g. SOAP and REST.
- Ability to build and modify RESTful APIs and programs using Python and Java (a minimal sketch follows this list).
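To illustrate the RESTful API skill noted in the last two items, here is a minimal, hypothetical Python (Flask) endpoint sketch; the /policies resource and its data are invented for the example and do not describe any FWD API.

# Illustrative only: a minimal RESTful service in Flask.
# The /policies resource and its data are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for a real data store.
POLICIES = {"P001": {"holder": "Jane Tan", "status": "active"}}

@app.get("/policies/<policy_id>")
def get_policy(policy_id):
    # Return the policy as JSON, or a 404 if it does not exist.
    policy = POLICIES.get(policy_id)
    if policy is None:
        return jsonify(error="not found"), 404
    return jsonify(policy)

@app.post("/policies/<policy_id>")
def upsert_policy(policy_id):
    # Create or replace a policy from the JSON request body.
    POLICIES[policy_id] = request.get_json(force=True)
    return jsonify(POLICIES[policy_id]), 201

if __name__ == "__main__":
    app.run(port=8080)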