Data Engineer (Intern)

Singapore

Razer




Joining Razer will place you on a global mission to revolutionize the way the world games. Razer is a place to do great work, offering you the opportunity to make an impact globally while working with a team spread across five continents. Razer is also a great place to work, providing the unique, gamer-centric #LifeAtRazer experience that will accelerate your growth, both personally and professionally.

Job Responsibilities:

The data engineer intern will play a key role in designing databases and storage, improving existing data pipelines, and building new pipelines that data analysts and data scientists rely on to perform analytics and build AI and ML products. By studying and understanding the data and pipelines, the intern will carry out data profiling, data processing, and automation to ensure that data is accurate and available to our end users. Throughout the process, the intern will learn to document this work properly.

Pre-requisites:

  • Creative and innovative mindset and the ability to work independently
  • Ability to use one or more programming languages (Python, SQL, etc.)
  • Familiarity with cloud technologies (Amazon Web Services, Google Cloud Platform)
  • Familiarity with orchestration tools (e.g., Airflow)
  • Interest and experience in Data Engineering and Analytics

Learning Objectives

The intern will learn data operations concepts and data engineering skills, and gain an understanding of the wider realm of data engineering and its applications in AI and analytics. Knowledge of different cloud computing platforms can also be picked up along the way.

Learning Outcomes

  • Industry best practices in Data Warehousing
  • Industry use of CI/CD for code repository integration and deployment
  • Pipeline design and boilerplating
  • SQL database schema and table structure design
  • Data transformation tools – dbt
  • Cloud computing technology – AWS
  • Infrastructure as code – Terraform
  • Data pipeline orchestration – Airflow
  • Technical requirement gathering and translation into end-to-end data pipelines

Are you game?

Category: Engineering Jobs

Tags: Airflow AWS CI/CD DataOps Data pipelines Data Warehousing dbt Engineering GCP Google Cloud Machine Learning Pipelines Python SQL Terraform

Perks/benefits: Career development

Region: Asia/Pacific
Country: Singapore
