Digital Compute-in-Memory (CIM) Based Transformer Accelerator for Embedded Devices
Lund, Sweden
Huawei Consumer Business Group
HUAWEI is a global leader in telecommunications with a broad product portfolio including smartphones, tablets, wearables, broadband devices, and smart home devices.
Location: Lund, Sweden
Preferred starting date: Jan. 2025
Extent: 1-2 students, 30 hp.
Thesis description
Transformer models have set new benchmarks in various domains, including natural language processing and computer vision. However, their extensive use of matrix multiplications leads to significant data movement and computational demands, resulting in high latency and energy consumption. Recently, digital compute-in-memory (CIM) has emerged as a promising solution to minimize data movement while maintaining high accuracy. This technique integrates parallel multiply-and-accumulate (MAC) operations directly into static random-access memory (SRAM). This thesis investigates the design of a CIM-based accelerator for Transformer algorithms.
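To make the mapping concrete, below is a minimal NumPy sketch of how a Transformer matrix multiply could be tiled onto weight-stationary digital CIM macros. The 64x64 array size, operand ranges, and tiling scheme are illustrative assumptions for this sketch, not part of the thesis specification.

```python
import numpy as np

# Behavioral model of a weight-stationary digital CIM macro (illustrative
# assumptions: 64x64 SRAM array, signed 8-bit operand range, wide accumulators).
ROWS, COLS = 64, 64

def cim_macro_matmul(activations, weights):
    """Map a matrix multiply onto CIM macro tiles.

    activations: (N, K) input matrix, streamed into the macros
    weights:     (K, M) weight matrix, held stationary in SRAM tiles
    Returns the (N, M) result; each tile models one macro computing all
    COLS dot products of a ROWS-wide input slice in parallel.
    """
    N, K = activations.shape
    K2, M = weights.shape
    assert K == K2, "inner dimensions must match"
    out = np.zeros((N, M), dtype=np.int64)
    for k0 in range(0, K, ROWS):        # reduction-dim slice held by one macro
        for m0 in range(0, M, COLS):    # output columns mapped to one macro
            w_tile = weights[k0:k0 + ROWS, m0:m0 + COLS]   # stays in SRAM
            a_tile = activations[:, k0:k0 + ROWS]          # streamed in
            # One "CIM operation" per input row: all columns MAC in parallel.
            out[:, m0:m0 + COLS] += a_tile.astype(np.int64) @ w_tile.astype(np.int64)
    return out

# Sanity check against a plain matrix multiply.
rng = np.random.default_rng(0)
A = rng.integers(-128, 128, size=(16, 256))
W = rng.integers(-128, 128, size=(256, 128))
assert np.array_equal(cim_macro_matmul(A, W), A @ W)
```

A functional model along these lines can serve as a golden reference when evaluating an RTL or simulator implementation of the accelerator.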
Objectives:
Gain a comprehensive understanding of Transformer architectures, focusing on their computational and memory requirements.
Study the principles of Compute-In-Memory (CIM) technology, particularly its integration with SRAM and its potential for accelerating Transformer models.
Develop a hardware architecture that leverages CIM to efficiently accelerate Transformer computations.
Implement the designed CIM-based accelerator on a suitable simulation platform.
Assess the performance of the CIM-based accelerator in terms of area, power consumption, and latency compared to traditional hardware implementations.
Investigate and implement local attention mechanisms to further reduce computational complexity and enhance performance (see the sketch after this list).
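As an illustration of the last objective, the following is a minimal NumPy sketch of sliding-window (local) attention; the window size and single-head layout are illustrative assumptions, not a prescribed design.

```python
import numpy as np

def sliding_window_attention(Q, K, V, window=32):
    """Local (sliding-window) attention: each query attends only to keys
    within +/- `window` positions, so the score computation scales as
    O(n * window) instead of O(n^2) for sequence length n."""
    n, d = Q.shape
    out = np.zeros_like(V)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = Q[i] @ K[lo:hi].T / np.sqrt(d)   # scores for the local window only
        weights = np.exp(scores - scores.max())    # numerically stable softmax
        weights /= weights.sum()
        out[i] = weights @ V[lo:hi]
    return out

# Example: 512 tokens, 64-dimensional head.
rng = np.random.default_rng(0)
Q = rng.standard_normal((512, 64))
K = rng.standard_normal((512, 64))
V = rng.standard_normal((512, 64))
Y = sliding_window_attention(Q, K, V, window=32)
print(Y.shape)  # (512, 64)
```

Restricting attention to a local window shrinks both the MAC count and the on-chip buffering needed for the score matrix, which is one way such a mechanism could ease the mapping onto CIM hardware.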
Qualifications
Master's student in Computer Science, Electrical Engineering, or equivalent.
Theoretical background in areas such as computer architecture, embedded systems, machine learning, and digital system design.
Understanding of CPU/GPU architecture, RISC-V, ISA design, in-memory computing, or dataflow architectures. Some experience with CIM architectures. Hands-on experience with Python, C/C++, and ML libraries.
Contact person
Johan Hokfelt (Johan.Hokfelt@huawei.com)