TY - GEN
T1 - Smart self-checkout carts based on deep learning for shopping activity recognition
AU - Chi, Hong Chuan
AU - Sarwar, Muhammad Atif
AU - Daraghmi, Yousef Awwad
AU - Lin, Kuan Wen
AU - Ik, Tsi-Ui
AU - Li, Yih-Lang
N1 - Publisher Copyright:
© 2020 KICS.
PY - 2020/9/22
Y1 - 2020/9/22
N2 - Fast and reliable communication plays a major role in the success of smart shopping applications. In a 'Just Walk Out' shopping scenario, a video camera is installed on the cart to monitor shopping activities and transmit images to the cloud for processing so that items in the cart can be tracked and checked out. This paper proposes a prototype of a smart shopping cart based on image-based action recognition. First, deep learning networks, namely Faster R-CNN, YOLOv2, and YOLOv2-Tiny, are used to analyze the content of each video frame. Frames are classified into three classes: No Hand, Empty Hand, and Holding Items. The classification accuracy of Faster R-CNN, YOLOv2, and YOLOv2-Tiny ranges from 90.3% to 93.0%, and the processing speed of the three networks reaches up to 5 fps, 39 fps, and 50 fps, respectively. Second, based on the sequence of frame classes, the timeline is divided into No Hand, Empty Hand, and Holding Items intervals. The accuracy of action recognition is 96%, and the average time error is 0.119 s. Finally, events are categorized into four cases: No Change, Placing, Removing, and Swapping. Even when the correctness of item recognition is taken into account, the accuracy of shopping event detection is 97.9%, which exceeds the minimal requirement for deploying such a system in a smart shopping environment. A demo of the system and a link to download the data set used in the paper are available at the Smart Shopping Cart Prototype page: https://hackmd.io/abEiC83rQoqxz7zpL4Kh2w.
AB - Fast and reliable communication plays a major role in the success of smart shopping applications. In a 'Just Walk Out' shopping scenario, a video camera is installed on the cart to monitor shopping activities and transmit images to the cloud for processing so that items in the cart can be tracked and checked out. This paper proposes a prototype of a smart shopping cart based on image-based action recognition. First, deep learning networks, namely Faster R-CNN, YOLOv2, and YOLOv2-Tiny, are used to analyze the content of each video frame. Frames are classified into three classes: No Hand, Empty Hand, and Holding Items. The classification accuracy of Faster R-CNN, YOLOv2, and YOLOv2-Tiny ranges from 90.3% to 93.0%, and the processing speed of the three networks reaches up to 5 fps, 39 fps, and 50 fps, respectively. Second, based on the sequence of frame classes, the timeline is divided into No Hand, Empty Hand, and Holding Items intervals. The accuracy of action recognition is 96%, and the average time error is 0.119 s. Finally, events are categorized into four cases: No Change, Placing, Removing, and Swapping. Even when the correctness of item recognition is taken into account, the accuracy of shopping event detection is 97.9%, which exceeds the minimal requirement for deploying such a system in a smart shopping environment. A demo of the system and a link to download the data set used in the paper are available at the Smart Shopping Cart Prototype page: https://hackmd.io/abEiC83rQoqxz7zpL4Kh2w.
KW - Action recognition
KW - Faster R-CNN
KW - Frame classification
KW - Smart shopping cart
KW - YOLOv2
KW - YOLOv2-Tiny
UR - http://www.scopus.com/inward/record.url?scp=85096986250&partnerID=8YFLogxK
U2 - 10.23919/APNOMS50412.2020.9237053
DO - 10.23919/APNOMS50412.2020.9237053
M3 - Conference contribution
AN - SCOPUS:85096986250
T3 - APNOMS 2020 - 2020 21st Asia-Pacific Network Operations and Management Symposium: Towards Service and Networking Intelligence for Humanity
SP - 185
EP - 190
BT - APNOMS 2020 - 2020 21st Asia-Pacific Network Operations and Management Symposium
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 21st Asia-Pacific Network Operations and Management Symposium, APNOMS 2020
Y2 - 22 September 2020 through 25 September 2020
ER -