PlumX Metrics

Scalable Transformer Accelerator with Variable Systolic Array for Multiple Models in Voice Assistant Applications

Electronics (Switzerland), ISSN: 2079-9292, Vol: 13, Issue: 23
2024
  • Citations: 0
  • Usage: 0
  • Captures: 2
  • Mentions: 2
  • Social Media: 0

Metrics Details

  • Captures: 2
  • Mentions: 2
    • Blog Mentions: 1
    • News Mentions: 1

Most Recent News

New Electronics Study Findings Have Been Reported from Sejong University (Scalable Transformer Accelerator with Variable Systolic Array for Multiple Models in Voice Assistant Applications)

2024 DEC 13 (NewsRx) -- By a News Reporter-Staff News Editor at Electronics Daily -- Data detailed on electronics have been presented. According to news

Article Description

The Transformer is a deep learning model that has quickly become fundamental in natural language processing (NLP) and other machine learning tasks. Transformer hardware accelerators are usually designed for a specific model, such as Bidirectional Encoder Representations from Transformers (BERT) or the Vision Transformer (ViT). In this study, we propose a Scalable Transformer Accelerator Unit (STAU) that supports multiple models, enabling efficient handling of the various Transformer models used in voice assistant applications. A design centered on a Variable Systolic Array (VSA), with control and data preprocessing handled by embedded processors, enables matrix operations of varying sizes. In addition, we propose an efficient variable structure and a row-wise data input method for natural language processing, where the word count varies. The proposed scalable Transformer accelerator accelerates the text summarization, audio processing, image search, and generative AI workloads used in voice assistants.
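The core idea of a systolic array handling matrix operations of varying sizes can be illustrated with a small cycle-level simulation. The sketch below is purely illustrative and not taken from the paper: it models an output-stationary array of M×N processing elements in which rows of A and columns of B are skewed in time so each operand pair reaches its PE one cycle after its neighbors, and the array dimensions simply follow the operand shapes, loosely analogous to the "variable" sizing the abstract describes.

```python
import numpy as np

def systolic_matmul(A, B):
    """Cycle-level sketch of an output-stationary systolic array.

    A is M x K and B is K x N, mapped onto an M x N grid of PEs.
    Row i of A enters the array i cycles late and column j of B
    enters j cycles late, so operands sweep through as a wavefront.
    """
    M, K = A.shape
    K2, N = B.shape
    assert K == K2, "inner dimensions must match"
    C = np.zeros((M, N), dtype=A.dtype)
    # K MAC steps per PE, plus the skew across both array dimensions.
    for t in range(K + M + N - 2):
        for i in range(M):
            for j in range(N):
                k = t - i - j  # operand index reaching PE(i, j) at cycle t
                if 0 <= k < K:
                    C[i, j] += A[i, k] * B[k, j]
    return C
```

Because the loop bounds are derived from the operand shapes, the same routine handles, say, a 4×64 activation tile against a 64×64 weight tile or a single-token 1×64 row, mirroring how a variable-size array can serve sequences whose word count changes.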
