A 1.86mJ/Gb/query bit-plane payload machine learning processor in 90nm CMOS

Fang Ju Ku, Tung Yu Wu, Yen Chin Liao, Hsie-Chia Chang, Wing Hung Wong, Chen-Yi Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

This paper presents an energy-efficient bit-plane payload design for a machine learning processor. The proposed architecture provides high parallelism and high data bandwidth, thereby shortening the model learning/training time of machine learning algorithms. By assembling multiple bits into a bit-plane and enlarging query parallelism with a central compare-flag updater, data-processing parallelism is increased. Bayesian sequential partition (BSP), a fast density estimation algorithm capable of handling high-dimensional data sets, is realized on the processor. Fabricated in a 90nm 1P9M CMOS process, the chip achieves a processing rate of 16.9 Gb/sec with 8 queries for data dimension D=2^10. The test chip integrates 64 counting cells and provides 5 operating modes, with an energy consumption of 1.86mJ/Gb per query.
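The bit-plane idea in the abstract can be illustrated in software: packing the same bit position of many samples into one machine word lets a single bitwise operation refine a compare flag for all of those samples at once, and a BSP-style axis-aligned partition cell then reduces to a bit-prefix match evaluated plane by plane. The sketch below is illustrative only; the function names, the 8-sample toy data set, and the pure-Python bitwise model are assumptions for exposition, not the paper's hardware design.

```python
# Illustrative software model of bit-plane counting (not the paper's RTL).
# Plane k holds bit k of every sample; bit i of a plane belongs to sample i.

def pack_bit_planes(samples, n_bits):
    """Pack unsigned-integer samples into n_bits bit-planes."""
    planes = [0] * n_bits
    for i, s in enumerate(samples):
        for k in range(n_bits):
            planes[k] |= ((s >> k) & 1) << i
    return planes

def count_in_cell(planes, n_samples, prefix_bits, n_bits):
    """Count samples whose top len(prefix_bits) bits equal prefix_bits,
    i.e. samples landing in one dyadic (BSP-style) partition cell.
    A single flag word is refined MSB-first, one plane per step."""
    flags = (1 << n_samples) - 1  # every sample starts as a candidate
    for j, b in enumerate(prefix_bits):
        k = n_bits - 1 - j        # walk from MSB plane downward
        flags &= planes[k] if b else ~planes[k]
    return bin(flags).count("1")  # surviving flags = samples in the cell

# Toy usage: 8 four-bit samples, count how many fall in each half/quarter.
samples = [0, 1, 2, 3, 8, 9, 12, 15]
planes = pack_bit_planes(samples, 4)
upper_half = count_in_cell(planes, 8, [1], 4)       # values 8..15 -> 4
top_quarter = count_in_cell(planes, 8, [1, 1], 4)   # values 12..15 -> 2
```

Each `count_in_cell` call touches one machine word per plane regardless of how many samples it covers, which is the parallelism argument the abstract makes for the hardware's shared compare-flag updater.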

Original language: English
Title of host publication: 2018 International Symposium on VLSI Design, Automation and Test, VLSI-DAT 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-4
Number of pages: 4
ISBN (Electronic): 9781538642603
DOIs
State: Published - 5 Jun 2018
Event: 2018 International Symposium on VLSI Design, Automation and Test, VLSI-DAT 2018 - Hsinchu, Taiwan
Duration: 16 Apr 2018 - 19 Apr 2018

Publication series

Name: 2018 International Symposium on VLSI Design, Automation and Test, VLSI-DAT 2018

Conference

Conference: 2018 International Symposium on VLSI Design, Automation and Test, VLSI-DAT 2018
Country/Territory: Taiwan
City: Hsinchu
Period: 16/04/18 - 19/04/18

Keywords

  • Bayesian sequential partition
  • big data analysis
  • bit-plane
  • hardware architecture
