Data Analytics Engineer - Associate - Chief Data Office - IT

HK-TWO ES 8/F

HKEX


Company Introduction:

We’re home to Asia's most dynamic and vibrant capital markets.
Connecting capital, ideas, inspiration and innovation for deeper, more diverse and liquid global capital markets; providing greater choice and opportunity for our customers, each and every day.

HKEX is a purpose-driven company. Our commitment to the long-term development of our business and our markets is articulated in our purpose: "To Connect, Promote and Progress our Markets and the Communities they support for the prosperity of all."

Job Summary:

The Data Analytics Engineer, in the Chief Data Office of the Information Technology Division, plays an important role in developing data analytics capabilities and enhancing our enterprise data platform.

Job Duties:

The Data Analytics Engineer is responsible for the execution and delivery of data analytics engineering projects that incrementally build out our enterprise data platform. Key duties focus on applying engineering skills across end-to-end data analytics capabilities, from data ingestion, data modelling and data governance to data visualization and analytics solution implementation. The role requires a strong technical background in cloud and data domain technologies, as well as excellent problem-solving skills, to ensure solutions are scalable, secure and efficient.

Key Responsibilities:

  • Work with cross-functional teams to deliver data engineering projects, from development and deployment through to support, ensuring high availability and performance.
  • Design and implement scalable and secure data analytics solutions for data ingestion, data processing, data visualization, etc.
  • Collaborate with stakeholders to deliver data platform capabilities, prioritising requirements based on business benefits and protection against data risks.
  • Promote compliant, high-quality reuse of data on the enterprise platform by incrementally delivering functionality across the data lifecycle, including data access control, data lineage, data modelling, data quality and data analytics.
  • Apply cloud-native skills such as containers and object storage to enable effective data analytics, data management and data governance.
  • Manage the development pipeline through DevSecOps, CI/CD tooling and Infrastructure as Code.
  • Effectively use hybrid project management techniques, applying agile tools within SDLC phases.
  • Coordinate with other IT functions on the efficient use of technical tools, internal resources and skills, optimising technical synergies and efficiency across initiatives.
  • Manage third-party data products and vendors for deployment onto the data platform.

Requirements:

  • Bachelor’s degree in computer science, technology, data or a related discipline
  • 5 years of working experience in systems development, preferably with a track record in large enterprises. The AVP position requires stronger technology and management experience.
  • Well versed in SDLC project disciplines and experienced in Agile methodologies within cross-functional teams
  • Good understanding of enterprise data architecture and technology stack, covering batch processing and streaming data for analytics usage.
  • Familiar with cloud technology and operations in AWS, Azure or China clouds, and with data processing in a cloud environment.
  • Solid technical background with hands-on experience in software development and data related technologies.
  • Proficiency in programming languages such as Python, Java, C++ or Scala.
  • Hands-on experience with big data technologies and ecosystems, including Hadoop, Spark, and Kafka.
  • Knowledge of database systems, both relational and NoSQL.
  • Familiar with big data technologies, e.g., Redshift, HDFS, HBase, Hive etc.
  • Experience in data security and data management tools, e.g., LDAP, OAuth, Keycloak, Lake Formation etc.
  • Experienced in DevSecOps tooling, CI/CD pipelines, Terraform, etc.
  • Excellent problem-solving skills and the ability to work in a fast-paced environment.

HKEX is committed to being an Equal Opportunity Employer. Diversity is one of our core values, and we strive to support and respect diverse perspectives, abilities, cultures and experiences within our workplace.

Location:

HKEX - Exchange Square

Shift:

Standard - 40 Hours (Hong Kong SAR)

Scheduled Weekly Hours:

40

Worker Type:

Permanent


Region: Asia/Pacific
Country: Hong Kong
