Digital Compute-in-Memory (CIM) Based Transformer Accelerator for Embedded Devices

Lund, Sweden

Huawei Consumer Business Group

HUAWEI is a global market leader in telecommunications with a broad product range including smartphones, tablets, wearables, broadband devices, and smart home devices.



Location: Lund, Sweden

Preferred starting date: Jan. 2025

Extent: 1-2 students, 30 hp.

Thesis description

Transformer models have set new benchmarks in various domains, including natural language processing and computer vision. However, their extensive use of matrix multiplications leads to significant data movement and computational demands, resulting in high latency and energy consumption. Recently, digital compute-in-memory (CIM) has emerged as a promising solution to minimize data movement while maintaining high accuracy. This technique integrates parallel multiply-and-accumulate (MAC) operations directly into static random-access memory (SRAM). This thesis investigates the design of a CIM-based accelerator for Transformer algorithms.
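For a sense of the workload, the sketch below shows scaled dot-product attention, the MAC-dominated kernel at the core of Transformer layers, in plain NumPy. The dimensions and function name are illustrative assumptions, not part of the thesis specification.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d) arrays; returns (seq_len, d)."""
    d = Q.shape[-1]
    # First matmul: seq_len x seq_len score matrix, ~seq_len^2 * d MACs.
    scores = Q @ K.T / np.sqrt(d)
    # Row-wise softmax over the scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Second matmul: another ~seq_len^2 * d MACs.
    return weights @ V

# Illustrative example: a 512-token sequence with 64-dimensional heads already
# requires roughly 2 * 512 * 512 * 64 MACs per head for the two matmuls above.
Q = np.random.randn(512, 64)
K = np.random.randn(512, 64)
V = np.random.randn(512, 64)
print(scaled_dot_product_attention(Q, K, V).shape)  # (512, 64)
```

It is exactly these dense MAC blocks, and the weight/activation traffic they generate, that a CIM-in-SRAM design aims to keep close to memory.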

Objectives:

- Gain a comprehensive understanding of Transformer architectures, focusing on their computational and memory requirements.
- Study the principles of compute-in-memory (CIM) technology, particularly its integration with SRAM and its potential for accelerating Transformer models.
- Develop a hardware architecture that leverages CIM to efficiently accelerate Transformer computations.
- Implement the designed CIM-based accelerator on a suitable simulation platform.
- Assess the performance of the CIM-based accelerator in terms of area, power consumption, and latency compared to traditional hardware implementations.
- Investigate and implement local attention mechanisms to further reduce computational complexity and enhance performance (see the sketch after this list).
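As a rough illustration of the last objective, the sketch below shows local (windowed) attention in NumPy: each query attends only to a fixed neighbourhood, reducing the score computation from O(n^2) to O(n * w) MACs. The window size, shapes, and function name are arbitrary assumptions; an actual CIM mapping would differ.

```python
import numpy as np

def local_attention(Q, K, V, window=32):
    """Q, K, V: (n, d) arrays; each query attends to +/- window neighbours."""
    n, d = Q.shape
    out = np.zeros_like(V)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        # At most (2*window + 1) scores per query instead of n.
        scores = Q[i] @ K[lo:hi].T / np.sqrt(d)
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        out[i] = weights @ V[lo:hi]
    return out

Q = np.random.randn(512, 64)
print(local_attention(Q, Q, Q, window=32).shape)  # (512, 64)
```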

Qualifications

Master's student in Computer Science, Electrical Engineering, or an equivalent programme.

Theoretical background in areas such as computer architecture, embedded systems, machine learning, and digital system design.

Understanding of CPU/GPU architectures, RISC-V, ISA design, in-memory compute, or dataflow architectures. Some experience with CIM architectures. Hands-on experience with Python, C/C++, and ML libraries.

Contact person

Johan Hokfelt (Johan.Hokfelt@huawei.com)



