Hardware-Friendly Activation Function Designs and Its Efficient VLSI Implementations for Transformer-Based Applications

Yu Hsiang Huang*, Pei Hsuan Kuo, Juinn Dar Huang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

The activation function is one of the key elements in modern machine learning algorithms. However, some widely used activation functions are exceptionally complex, e.g., GELU in Transformer-based models, which makes their precise yet efficient VLSI implementation extremely difficult. In this paper, two series of hardware-friendly activation function designs, DNR and PWL, together with their VLSI implementations, are proposed. Both are specifically designed to replace GELU, which is prevalent in Transformer-related applications. Instead of relying on traditional lookup-table (LUT)-based approximation methods, this paper introduces new activation functions that are not only hardware-friendly but also successfully alleviate the dying-neuron issue. In addition, each series includes a number of members that can be freely selected through programming to best fit a given application. Experimental results indicate that the proposed activation functions achieve model accuracy comparable to, or even better than, that of GELU. Moreover, the highly efficient and flexible VLSI implementations support 16 different Q-formats to maximize output precision under various input scales. Compared with approximation-based implementation strategies, the proposed activation function designs and the corresponding LUT-free hardware implementations achieve significant improvements in speed, area, and power.
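The abstract does not give the concrete DNR/PWL definitions, but the two ideas it describes can be illustrated with a minimal sketch: a piecewise-linear, LUT-free GELU substitute that keeps a small nonzero slope on the negative side (mitigating dying neurons), and a programmable Qm.n fixed-point output format. Everything below is hypothetical: the breakpoints, segment slopes, and the Q16.0–Q1.15 reading of the "16 Q-formats" are illustrative assumptions, not the coefficients or formats from the paper.

```python
import numpy as np

def gelu_tanh(x):
    # Standard tanh approximation of GELU, used here only as the
    # reference curve that the PWL stand-in is fitted against.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi)
                                    * (x + 0.044715 * x ** 3)))

def make_pwl_gelu(num_segments=8, lo=-4.0, hi=4.0, neg_tail_slope=0.01):
    # Build a piecewise-linear GELU substitute by sampling the reference
    # curve at uniform breakpoints. The small nonzero slope on the
    # far-negative tail keeps gradients alive (the dying-neuron fix the
    # abstract alludes to); all constants here are illustrative.
    xs = np.linspace(lo, hi, num_segments + 1)
    ys = gelu_tanh(xs)
    def pwl(x):
        x = np.asarray(x, dtype=np.float64)
        y = np.interp(x, xs, ys)               # linear inside [lo, hi]
        y = np.where(x < lo, ys[0] + neg_tail_slope * (x - lo), y)
        y = np.where(x > hi, ys[-1] + 1.0 * (x - hi), y)  # ~identity tail
        return y
    return pwl

def quantize_qformat(x, frac_bits, word_bits=16):
    # Round-and-saturate to a signed Qm.n fixed-point value, where
    # n = frac_bits and m + n = word_bits. Sweeping frac_bits from 0 to
    # 15 yields 16 formats (Q16.0 ... Q1.15) -- one plausible reading of
    # the "16 different Q-formats" in the abstract.
    scale = 2.0 ** frac_bits
    q = np.clip(np.round(x * scale),
                -2 ** (word_bits - 1), 2 ** (word_bits - 1) - 1)
    return q / scale  # the value the hardware word actually represents

pwl_gelu = make_pwl_gelu()
x = np.linspace(-6.0, 6.0, 7)
print(np.max(np.abs(pwl_gelu(x) - gelu_tanh(x))))   # PWL fit error
print(quantize_qformat(pwl_gelu(x), frac_bits=12))  # Q4.12 outputs
```

The point of the sketch is that a PWL curve needs only comparators, multipliers, and adders (no LUT of transcendental values), and that the fractional-bit count can be made a runtime parameter so one datapath serves many input scales.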

Original language: English
Title of host publication: AICAS 2023 - IEEE International Conference on Artificial Intelligence Circuits and Systems, Proceeding
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350332674
State: Published - 2023
Event: 5th IEEE International Conference on Artificial Intelligence Circuits and Systems, AICAS 2023 - Hangzhou, China
Duration: 11 Jun 2023 – 13 Jun 2023

Publication series

Name: AICAS 2023 - IEEE International Conference on Artificial Intelligence Circuits and Systems, Proceeding

Conference

Conference: 5th IEEE International Conference on Artificial Intelligence Circuits and Systems, AICAS 2023
Country/Territory: China
City: Hangzhou
Period: 11/06/23 – 13/06/23

Keywords

  • GELU
  • dying neuron issue
  • efficient VLSI implementation
  • hardware-friendly activation function design

