TY - GEN
T1 - Patch-Based Prototypical Cross-Scale Attention Network for Anomaly Detection
AU - Wang, Tung Lin
AU - Hsieh, Jun Wei
AU - Hsieh, Yi Kuan
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
PY - 2025
Y1 - 2025
N2 - Anomaly detection and localization play crucial roles in industrial manufacturing, helping to maintain product quality and minimize defects. However, anomalies are rare and challenging to collect, leading to imbalanced data that yield biased models sensitive to noisy or irrelevant features. In addition, anomalies are often subtle, diverse, and change over time, making them difficult to differentiate and further complicating the detection and localization tasks. To address these challenges, we propose a new Patch-based Prototypical Cross-Scale Attention Network (PPCA-Net) that effectively identifies anomalous regions by learning residual features across different scales and sizes, distinguishing abnormal from normal patterns. It consists of two key components: the Scale-Aware Channel Attention Module (SACAM) and the Patch-based Cross-Scale Attention Module (PCSAM). These modules facilitate interactive feature inference across multiple scales, significantly enhancing the ability to capture abnormal features of various sizes in diverse environments. Furthermore, we incorporate diverse anomaly generation strategies, including multi-scale prototypes, to better represent feature disparities between abnormal and normal patterns, thereby enhancing overall effectiveness. Through extensive experiments on the challenging MVTec AD [1] benchmark, PPCA-Net demonstrates superior performance over both unsupervised and supervised methods, highlighting its effectiveness in anomaly identification.
AB - Anomaly detection and localization play crucial roles in industrial manufacturing, helping to maintain product quality and minimize defects. However, anomalies are rare and challenging to collect, leading to imbalanced data that yield biased models sensitive to noisy or irrelevant features. In addition, anomalies are often subtle, diverse, and change over time, making them difficult to differentiate and further complicating the detection and localization tasks. To address these challenges, we propose a new Patch-based Prototypical Cross-Scale Attention Network (PPCA-Net) that effectively identifies anomalous regions by learning residual features across different scales and sizes, distinguishing abnormal from normal patterns. It consists of two key components: the Scale-Aware Channel Attention Module (SACAM) and the Patch-based Cross-Scale Attention Module (PCSAM). These modules facilitate interactive feature inference across multiple scales, significantly enhancing the ability to capture abnormal features of various sizes in diverse environments. Furthermore, we incorporate diverse anomaly generation strategies, including multi-scale prototypes, to better represent feature disparities between abnormal and normal patterns, thereby enhancing overall effectiveness. Through extensive experiments on the challenging MVTec AD [1] benchmark, PPCA-Net demonstrates superior performance over both unsupervised and supervised methods, highlighting its effectiveness in anomaly identification.
KW - Anomaly Detection
KW - Anomaly Segmentation
KW - Defect Inspection
UR - http://www.scopus.com/inward/record.url?scp=85211946211&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-78166-7_24
DO - 10.1007/978-3-031-78166-7_24
M3 - Conference contribution
AN - SCOPUS:85211946211
SN - 9783031781650
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 366
EP - 381
BT - Pattern Recognition - 27th International Conference, ICPR 2024, Proceedings
A2 - Antonacopoulos, Apostolos
A2 - Chaudhuri, Subhasis
A2 - Chellappa, Rama
A2 - Liu, Cheng-Lin
A2 - Bhattacharya, Saumik
A2 - Pal, Umapada
PB - Springer Science and Business Media Deutschland GmbH
T2 - 27th International Conference on Pattern Recognition, ICPR 2024
Y2 - 1 December 2024 through 5 December 2024
ER -