Data Engineer - Data Lakehouse

Bangkok, Bangkok, Thailand

ZILO

Learn how the ZILO™ global transfer agency system is purpose-built for custodians, asset managers, transfer agents, ACDs, and their customers.



ZILO is focused on transforming global transfer agency by providing a single global solution that replaces legacy technology and systems. Our mission is to create sustainable value for firms and the customers they serve, all while putting people first. With a design-driven approach and a commitment to innovation, our team of experts has unified the full breadth of global transfer agency into a single, streamlined solution.

At ZILO, we value collaboration, innovation, and continuous improvement. Our team of experienced professionals is dedicated to achieving our goal of being the market-leading solution in global transfer agency. If you are a motivated individual with a passion for technology and a desire to make a real impact in the financial services industry, we would love to hear from you.

About this role

We are seeking a skilled Data Engineer – Data Lakehouse to join our team. The ideal candidate will be experienced in designing and building data pipelines, performing ETL operations, and managing databases, and will have hands-on knowledge of AWS services for data processing, storage, and analysis.

Responsibilities

  • Design, build and maintain efficient and scalable data pipelines to support data processing and analysis.
  • Develop and implement ETL processes to ensure data quality and integrity.
  • Manage and optimize databases, including designing schemas and indexing strategies, data partitioning, and data archival.
  • Work closely with data scientists and business analysts to understand data requirements and translate them into technical solutions.
  • Collaborate with DevOps and IT teams to deploy, monitor, and troubleshoot data systems.
  • Develop and maintain documentation on data infrastructure and processes.
  • Stay current with the latest data engineering technologies, tools and practices.

Qualifications

  • Bachelor's degree in computer science, engineering or related field.
  • Minimum of 3 years of experience in data engineering roles.
  • Strong proficiency in SQL, ETL processes, and database management systems (e.g., MySQL, PostgreSQL, MongoDB).
  • Hands-on experience with AWS services for data processing, storage, and analysis (e.g., S3, Redshift, EMR, Glue).
  • Familiarity with programming languages such as Python or Java.
  • Understanding of data warehousing concepts and data modeling techniques.
  • Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.

Benefits

- 23 annual holidays (fixed at 23 days)

- 15 public holidays

- Provident fund up to 12%

- Health Insurance (including immediate family)

- And more

