Vice President, Data Engineer, Group Technology
Singapore (City Area), SG, 048624
About UOB
United Overseas Bank Limited (UOB) is a leading bank in Asia with a global network of more than 500 branches and offices in 19 countries and territories in Asia Pacific, Europe and North America. In Asia, we operate through our head office in Singapore and banking subsidiaries in China, Indonesia, Malaysia and Thailand, as well as branches and offices. Our history spans more than 80 years. Over this time, we have been guided by our values – Honorable, Enterprising, United and Committed. This means we always strive to do what is right, build for the future, work as one team and pursue long-term success. It is how we work, consistently, be it towards the company, our colleagues or our customers.
About the Department
Group Technology and Operations (GTO) provides software and system development, information technology support services and banking operations.
We have centralized and standardized the technology components into Singapore, creating a global footprint which can be utilized for supporting our regional subsidiaries and the branches around the world. We operate and support 19 countries with this architecture to provide a secure and flexible banking infrastructure.
Our Operations divisions provide transactional customer services for our businesses while also focusing on cost efficiency through process improvements, automation and straight through processing.
Job Responsibilities
You will be responsible for end-to-end software development and support for all work related to Enterprise Data Warehouse (EDW) projects, quarterly change requests and L3 production fixes. This includes software product implementation and administration, as well as application design, development, implementation, testing and support. You will be expected to work in the Data team.
You will also be responsible for quality assurance of the team’s delivery in conformance with the Bank-defined software delivery methodology and tools. You will partner with other technology functions to help deliver required technology solutions.
- Create frameworks and technical features that enable faster operationalisation of data models, analytical models (including AI/ML) and user-generated content (dashboards, reports, etc.)
- Effectively partner with citizen data scientists to enable faster adoption of AI/ML model-based systems
- Independently install, customise and integrate software packages and programs
- Carry out POCs involving new data technologies
- Design and develop application frameworks for data integration
- Create technical documents, such as solution designs and program specifications, for target solutions
- Design and develop applications including, but not limited to, software applications, data integration, user interfaces and automation
- Maintain and recommend software improvements to ensure platform-centric management of software applications
- Carry out performance tuning
- Work with production support team members to conduct root cause analysis of issues, review new and existing code and/or perform unit testing
- Perform tasks as part of a cross functional development team using agile or other methodologies and utilising project management software
Job Requirements
- Hands-on experience implementing large-scale data warehouse, data mart and analytics platforms in the financial services industry, with good functional knowledge of products and services offered in Retail Banking / Wholesale / Global Markets, covering some of the following analytics domains:
- Experience in Data Modeling, Data mapping for Data Warehouse and Data Marts solutions
- Expertise in FSLM or similar industry models
- Expertise in Reference data management – Tools experience such as MDM (Master Data Management)
- Experience in functional domains - Retail, Wholesale, Compliance, Digital
- Experience in analytics - Retail Analytics
- Expertise in the design of role-based, fine-grained access control
- Experience designing cloud-ready data solutions and data virtualisation
- Technical skill sets - ML model operationalisation, building data pipelines and Hadoop-based data marts
- Expertise in implementing Big Data frameworks using multiple SQL engines such as Spark, Impala, Hive, etc.
- Expertise in implementing Hadoop-based data marts using Spark-based frameworks (Java, Scala, PySpark)
- Ability to leverage LLMs to build intelligent data applications (e.g. natural-language-based SQL generation)
- Experience across the end-to-end AI/ML life cycle (data pipelines, model development, training, deployment, fine-tuning, etc.)
- Expertise in building data federation solutions (Trino, Presto, Dremio, QueryGrid) along with data caching/indexing
- Experience working with data catalog tools and automating data quality checks using frameworks
- Good working experience in core technical areas using Python, Java, PySpark and Scala
- Expertise in Cloudera CDP components
- Good knowledge of developing Spark-based ingestion frameworks (Java, Scala, PySpark)
- Experience in building and operationalising feature pipelines to support AI/ML model execution, and data pipelines supporting large-scale data warehouses/data marts
- Additional requirements - 2 to 3 technical certifications from the list below:
- Cloudera Hadoop distribution – Hive, Impala, Spark, Kudu, Kafka, Flume
- Teradata – BTEQ, QueryGrid, GCFR, MDM, TAS, Data Mover, BAR
- Informatica Data Integration – PC, IDR, BDM, MM, IDQ, EDC
- Data modelling tools (Erwin)
- QlikSense
- Microsoft – R
- Data science workbenches – Cloudera Machine Learning, Jupyter, DataRobot, H2O.ai, IBM DSX
- Data Virtualization tool (Denodo, Dremio)
- AS400
- Languages – SQL, Java, Python, Scala, PySpark
- Automation / scripting – Control-M, Shell Scripting, Groovy
Be a part of UOB Family
UOB is an equal opportunity employer. UOB does not discriminate on the basis of a candidate's age, race, gender, color, religion, sexual orientation, physical or mental disability, or other non-merit factors. All employment decisions at UOB are based on business needs, job requirements and qualifications. If you require any assistance or accommodations to be made for the recruitment process, please inform us when you submit your online application.
Apply now and make a difference.
Competencies
1. Strategise
2. Engage
3. Execute
4. Develop