A CLUSTERING-BASED ML SCHEME FOR CAPACITY APPROACHING SOFT LEVEL SENSING IN 3D TLC NAND

Li Wei Liu*, Yen Ching Liao*, Hsie Chia Chang*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In a 3D TLC solid-state storage system, LDPC decoding performance is significantly affected by the quality of soft-level sensing. Inspired by the capacity-approaching maximum mutual-information method, this work presents a data-driven approach to collect the optimal 2-bit soft-read level pairs over 3D TLC NAND. Owing to data-transmission latency and limited configuration resources, a clustering method is proposed to extract representative soft-read level pairs from the experimental data. Under a channel condition of 3K program/erase cycles and 228-hour data retention at 85°C, the proposed soft-read level pairs provide an additional 73-error-bit tolerance in a 2K LDPC decoder.
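The abstract's pipeline can be illustrated with a toy model. The sketch below is NOT the authors' implementation: it assumes a simplified two-state Gaussian cell-voltage channel, grid-searches the soft-read level pair that maximizes mutual information (the capacity-approaching criterion named above), and then clusters the pairs collected over jittered channel realizations with a minimal k-means as a stand-in for the paper's clustering step. All parameter values (means, sigma, grid ranges) are illustrative.

```python
import numpy as np
from math import erf, sqrt, log2

def gauss_cdf(x, mu, sigma):
    """P(V <= x) for a Gaussian cell-voltage state N(mu, sigma^2)."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def mi_2bit(levels, mus=(0.0, 1.0), sigma=0.35):
    """I(X;Y) in bits for equiprobable states X read with the sorted
    thresholds `levels` (3 thresholds -> 4 regions -> a 2-bit output)."""
    edges = [-np.inf] + sorted(levels) + [np.inf]
    p_y_given_x = np.array([
        [gauss_cdf(edges[j + 1], mu, sigma) - gauss_cdf(edges[j], mu, sigma)
         for j in range(len(edges) - 1)]
        for mu in mus])
    p_y = p_y_given_x.mean(axis=0)                    # uniform prior on X
    h = lambda p: -sum(pi * log2(pi) for pi in p if pi > 1e-15)
    return h(p_y) - 0.5 * (h(p_y_given_x[0]) + h(p_y_given_x[1]))

def best_pair(mus, sigma, t_hard=0.5,
              deltas=np.linspace(0.05, 0.45, 15)):
    """Grid-search the soft-level pair (t_hard-d1, t_hard+d2) that
    maximizes mutual information for one channel realization."""
    best_mi, best = -1.0, None
    for d1 in deltas:
        for d2 in deltas:
            mi = mi_2bit([t_hard - d1, t_hard, t_hard + d2], mus, sigma)
            if mi > best_mi:
                best_mi, best = mi, (t_hard - d1, t_hard + d2)
    return best

def kmeans(points, k=2, iters=50, seed=0):
    """Tiny k-means over the collected optimal pairs (illustrative
    stand-in; the paper's exact clustering algorithm may differ)."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points)
    centers = pts[rng.choice(len(pts), k, replace=False)]
    for _ in range(iters):
        dist = np.linalg.norm(pts[:, None] - centers[None], axis=2)
        lbl = dist.argmin(axis=1)
        centers = np.array([pts[lbl == j].mean(axis=0) if (lbl == j).any()
                            else centers[j] for j in range(k)])
    return centers

# Collect optimal pairs over jittered channels (loosely mimicking
# P/E-cycle- and retention-induced threshold-voltage drift), then
# cluster them into a few representative soft-read level pairs.
rng = np.random.default_rng(1)
pairs = [best_pair(mus=(rng.normal(0.0, 0.02),
                        1.0 - abs(rng.normal(0.05, 0.02))),
                   sigma=0.35 + abs(rng.normal(0.0, 0.02)))
         for _ in range(12)]
reps = kmeans(pairs, k=2)
print(np.round(reps, 3))
```

Because refining a read quantization never decreases mutual information, the 2-bit soft output always scores at least as high as the single hard read in this model, which is the motivation for soft-level sensing in the first place.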

Original language: English
Title of host publication: 2022 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4078-4082
Number of pages: 5
ISBN (Electronic): 9781665405409
DOIs
State: Published - 2022
Event: 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Virtual, Online, Singapore
Duration: 23 May 2022 - 27 May 2022

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2022-May
ISSN (Print): 1520-6149

Conference

Conference: 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022
Country/Territory: Singapore
City: Virtual, Online
Period: 23/05/22 - 27/05/22

Keywords

  • 3D NAND
  • Clustering
  • Machine Learning
  • Maximum Mutual Information
