Senior Data Engineer

Ho Chi Minh City, Vietnam

Razer

Joining Razer will place you on a global mission to revolutionize the way the world games. Razer is a place to do great work, offering you the opportunity to make an impact globally while working with a global team located across five continents. Razer is also a great place to work, providing you the unique, gamer-centric #LifeAtRazer experience that will put you on an accelerated growth path, both personally and professionally.

Job Responsibilities:

As a Data Engineer, you'll be part of a cross-functional team responsible for the full software development life cycle, from conception to deployment. You will design, build, and maintain the systems and infrastructure that enable the collection, processing, storage, and analysis of large volumes of data.

Responsibilities:

  • Data Pipeline Development: design, build, and maintain data pipelines to collect, process, and transform data from various sources into a usable format for analysis and reporting.
  • Data Integration: integrate data from different sources, including databases, APIs, and third-party services, ensuring data consistency and accuracy.
  • Database Management: design and manage databases, both relational (RDBMS) and non-relational (NoSQL), optimizing for performance and scalability.
  • Data Warehousing: develop and maintain data warehouses and data lakes, ensuring that data storage solutions are efficient and support analytical needs.
  • ETL Processes: implement Extract, Transform, Load (ETL) processes to move and transform data, ensuring that data is clean, accurate, and accessible for analytics.
  • Data Quality and Governance: monitor and maintain data quality, implementing data governance practices to ensure data integrity and compliance with standards and regulations.
  • Performance Optimization: optimize data processing and storage for performance and cost-efficiency, including indexing, partitioning, and query optimization.
  • Data Security: implement and enforce security measures to protect sensitive data from unauthorized access and breaches.
  • Documentation and Reporting: document data processes, architectures, and workflows, and create reports on data pipeline performance and data quality.
  • Improvement: continuously improve data systems and processes by optimizing pipelines, enhancing data quality, scaling infrastructure, automating workflows, and maintaining best practices.
  • Collaboration: work with data scientists, analysts, and other stakeholders to understand data needs and provide the necessary infrastructure and support for data analysis.

Pre-Requisites:

Preferred Skills and Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • At least 6 years of experience as a Data Engineer.
  • Strong experience in database design, data integration, data pipeline development, big data, or business intelligence applications.
  • Strong understanding of data lake, data warehouse, and data mesh concepts.
  • Experience with real-time data processing tools (Kafka, Spark Streaming, etc.).
  • Strong understanding of SQL and NoSQL databases, such as PostgreSQL and MongoDB.
  • Expertise in programming languages such as Python or Scala.
  • Familiarity with AWS services, including Glue (Console, Studio, Crawlers, Jobs, Data Catalog, DataBrew) and Athena.
  • Familiarity with Airflow, Airbyte, and ClickHouse.

Plus:

  • AWS certification
  • Working experience with a CDP (Customer Data Platform)

Are you game?
