Associate Data Engineer
Singapore
Razer
Joining Razer will place you on a global mission to revolutionize the way the world games. Razer is a place to do great work, offering you the opportunity to make an impact globally while working with a team located across 5 continents. Razer is also a great place to work, providing you the unique, gamer-centric #LifeAtRazer experience that will accelerate your growth, both personally and professionally.
Job Responsibilities:
This role builds and maintains high-volume big data processing systems that enable the organization to collect, manage, and convert raw data into usable information for data scientists and business analysts, and that enable the use of Artificial Intelligence (AI) capabilities. He/She is responsible for developing and maintaining data pipelines, data storage systems, and cloud infrastructure, while working closely with data scientists, data analysts, and internal stakeholders to put data to use for analytics and AI.
Essential Duties and Responsibilities
- Develop and maintain data systems and data pipelines that enable the organization to store, process, and analyze large volumes of data. This involves building data pipelines, implementing data storage systems, and ensuring that data is integrated effectively to support AI applications.
- Manage data lakes and data warehouses by populating and operationalizing them. This involves creating and managing table schemas, views, and materialized views, including tokenization and vectorization techniques for generative AI.
- Develop template data pipelines that enable extract, transform, and load (ETL) operations from various sources. This involves using cloud computing tools to build streaming and batch processing pipelines that handle large volumes of data and that can also serve AI applications (a minimal sketch of such a pipeline follows this list).
- Monitor and troubleshoot data workflows, ensuring timely resolution of failures and rerunning failed jobs to ensure data completeness.
- Leverage modern build tools to enhance automation, data quality, testing, and deployment of data pipelines.
- Develop and implement cloud infrastructure that is in line with the company's security policies and practices, as well as its cost optimization practices.
- Collaborate with data scientists, data analysts, and business users to understand the data needs of stakeholders across the organization and implement appropriate solutions.
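For illustration only (not part of the job description), a minimal sketch of the kind of batch ETL pipeline described above, written with Apache Airflow's TaskFlow API (Airflow 2.4+); the DAG name, schedule, and record fields are hypothetical:

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def daily_events_etl():
        @task
        def extract():
            # Pull raw event records from a hypothetical upstream source.
            return [{"user_id": 1, "event": "login"}]

        @task
        def transform(records):
            # Clean and enrich the records before loading.
            return [{**r, "processed_at": datetime.utcnow().isoformat()} for r in records]

        @task
        def load(records):
            # In practice this step would write to a warehouse such as Redshift or
            # BigQuery via the relevant provider hook; printing keeps the sketch
            # self-contained and runnable.
            print(f"loading {len(records)} records")

        load(transform(extract()))


    daily_events_etl()

Results are passed between tasks via Airflow's XCom mechanism, so each step stays small, testable, and reusable as a pipeline template.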
Pre-Requisites:
Requirements
- Bachelor’s or Master’s degree in computer science, engineering, mathematics, or a similar discipline.
- Ability to write clean, maintainable, scalable, and robust code using Python, SQL, and Java.
- Experience building and maintaining ETL/ELT pipelines in Python and writing optimized SQL (see the sketch after this list).
- Exposure to cloud data services such as AWS Redshift, Snowflake, BigQuery, or Azure Data Lake.
- Experience using pipeline orchestration tools such as Airflow and containerization tools such as Docker.
- Strong command of various data processing techniques (streaming, batch, event-based) and of data storage optimization (data lakes, data warehouses, vector data stores, and SQL and NoSQL databases).
- Experience using data transformation tools such as dbt (Data Build Tool).
- Experience with version control (Git) and CI/CD pipelines for data engineering workflows.
- Experience using Infrastructure as Code tools such as Terraform and container orchestration platforms such as Kubernetes.
- Excellent problem-solving and analytical skills, with an understanding of AI technologies and their applications.
- Excellent written and verbal communication skills for collaboration across teams.
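As a hedged illustration of the Python, testing, and CI/CD expectations above, a minimal pytest-style data-quality check; the transform_orders function and its columns are hypothetical:

    import pandas as pd


    def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
        # Hypothetical transformation: drop rows that are missing the key column.
        return raw.dropna(subset=["order_id"]).reset_index(drop=True)


    def test_transform_orders_drops_null_keys():
        raw = pd.DataFrame({"order_id": [1, None, 3], "amount": [10.0, 5.0, 7.5]})
        result = transform_orders(raw)
        assert len(result) == 2
        assert result["order_id"].notna().all()

A check like this would typically run in the CI/CD pipeline on every Git push, before a pipeline change is deployed.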
Are you game?
Perks/benefits: Startup environment