Massive Figure Extraction and Classification in Electronic Component Datasheets for Accelerating PCB Design Preparation

Kuan Chun Chen, Chou Chen Lee, Po-Hung Lin, Yan Jhih Wang, Yi Ting Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

Before starting printed-circuit-board (PCB) design, PCB and system designers usually spend considerable time reviewing a large number of electronic component datasheets in order to determine the best integration of electronic components for the target electronic system. Each datasheet may contain over a hundred figures and tables, and these figures and tables usually present the most important electronic component specifications. This paper categorizes the various figures, including tables, in electronic component datasheets and proposes the ECS-YOLO model for massive figure extraction and classification in order to accelerate the PCB design preparation process. The experimental results show that, compared with a state-of-the-art object detection model, the proposed ECS-YOLO consistently achieves better accuracy for figure extraction and classification in electronic component datasheets.
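The abstract implies a pipeline of rendering each datasheet page, running an object detector over it, and keeping labeled bounding boxes for figures and tables. A minimal sketch of that pipeline follows, using off-the-shelf tools rather than the paper's ECS-YOLO (which is not publicly released); the checkpoint name and class labels are hypothetical stand-ins for a detector fine-tuned on labeled datasheet pages.

```python
# Hypothetical sketch: detect figure/table regions in a datasheet PDF.
# A generic YOLO detector stands in for the paper's ECS-YOLO; the weights
# file "datasheet_figures.pt" and the class labels are assumptions.
from pdf2image import convert_from_path   # pip install pdf2image (needs poppler)
from ultralytics import YOLO              # pip install ultralytics

# A detector fine-tuned on annotated datasheet pages would go here.
model = YOLO("datasheet_figures.pt")

def extract_figures(pdf_path: str, conf_threshold: float = 0.5):
    """Render each page as an image and return detected figure/table regions."""
    regions = []
    for page_idx, page in enumerate(convert_from_path(pdf_path, dpi=200)):
        result = model(page)[0]           # one Results object per page image
        for box in result.boxes:
            if float(box.conf) < conf_threshold:
                continue                  # drop low-confidence detections
            x1, y1, x2, y2 = box.xyxy[0].tolist()
            regions.append({
                "page": page_idx,
                "label": result.names[int(box.cls)],  # e.g. "table", "waveform"
                "bbox": (x1, y1, x2, y2),
            })
    return regions

if __name__ == "__main__":
    for region in extract_figures("example_datasheet.pdf"):
        print(region)
```

The cropped regions returned here could then be routed by label, e.g. tables to a table parser and characteristic curves to an image archive, which is the kind of triage the paper's classification step is meant to enable.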

Original language: English
Title of host publication: 2021 ACM/IEEE 3rd Workshop on Machine Learning for CAD, MLCAD 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665431668
DOIs
State: Published - 30 Aug 2021
Event: 3rd ACM/IEEE Workshop on Machine Learning for CAD, MLCAD 2021 - Raleigh, United States
Duration: 30 Aug 2021 → 3 Sep 2021

Publication series

Name: 2021 ACM/IEEE 3rd Workshop on Machine Learning for CAD, MLCAD 2021

Conference

Conference: 3rd ACM/IEEE Workshop on Machine Learning for CAD, MLCAD 2021
Country/Territory: United States
City: Raleigh
Period: 30/08/21 → 3/09/21

