A Coarse-Precise Refinement Learning-Based Knowledge Distillation Network for Anomaly Detection
Authors:
Chaoyang Li, Haozheng Zhang, Fei Wang, Chengkun Li, and Yanhong Yang
Conference:
ICIC 2025 Posters, Ningbo, China, July 26-29, 2025
Pages:
272-283
Keywords:
Anomaly detection, Knowledge distillation, Dual-learning, Feature reconstruction
Abstract
Anomaly detection plays a crucial role in large-scale industrial manufacturing. Knowledge distillation (KD)-based approaches have demonstrated excellent performance, yet their efficacy is constrained by identical, symmetric teacher-student structures. In this study, we propose an enhanced KD-based architecture with a dual-learning mechanism, called DLKD, to precisely characterize normal samples and improve detection performance. Specifically, we first introduce a coarse decoder into the student network to preliminarily reconstruct the teacher features, in which an SSM-based global feature reconstruction block (GFRB) and a CNN-based local feature reconstruction block (LFRB) effectively model global and local information. A precise refinement learner is subsequently introduced to finely tune the coarsely reconstructed features. Extensive experiments on two publicly available anomaly detection datasets demonstrate the effectiveness and potential of the proposed DLKD. This work further explores KD-based methods for anomaly detection and provides a unique yet robust baseline for the community.
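The sketch below illustrates the coarse-to-precise dual-learning idea described in the abstract: a frozen pretrained teacher provides target features, a coarse student decoder (global plus local branches) reconstructs them, and a refinement learner fine-tunes the coarse reconstruction, with the anomaly score taken as the feature discrepancy. It is a minimal illustration only: the GFRB/LFRB internals here are plain convolutional stand-ins rather than the authors' SSM/CNN designs, and the ResNet-18 teacher, channel widths, and cosine-distance score are assumptions not taken from the paper.

    # Minimal sketch of the coarse-to-precise dual-learning KD idea (not the authors' implementation).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from torchvision.models import resnet18, ResNet18_Weights
    from torchvision.models.feature_extraction import create_feature_extractor


    class ConvBlock(nn.Module):
        """Placeholder residual block; stands in for the paper's GFRB/LFRB/refiner internals."""
        def __init__(self, ch):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
                nn.Conv2d(ch, ch, 3, padding=1), nn.BatchNorm2d(ch),
            )

        def forward(self, x):
            return F.relu(x + self.body(x))


    class DLKDSketch(nn.Module):
        def __init__(self, feat_ch=256, embed_ch=64):
            super().__init__()
            # Frozen pretrained teacher; its layer3 features are the distillation target (assumption).
            backbone = resnet18(weights=ResNet18_Weights.IMAGENET1K_V1)
            self.teacher = create_feature_extractor(backbone, {"layer3": "feat"})
            for p in self.teacher.parameters():
                p.requires_grad_(False)
            # Student: compress teacher features, coarsely reconstruct them with a
            # global branch (stand-in for the SSM-based GFRB) and a local branch
            # (stand-in for the CNN-based LFRB), then refine the coarse result.
            self.embed = nn.Conv2d(feat_ch, embed_ch, 1)
            self.coarse_global = ConvBlock(embed_ch)
            self.coarse_local = ConvBlock(embed_ch)
            self.project = nn.Conv2d(embed_ch, feat_ch, 1)
            self.refiner = ConvBlock(feat_ch)  # precise refinement learner

        def forward(self, x):
            with torch.no_grad():
                t = self.teacher(x)["feat"]                      # teacher features (target)
            z = self.embed(t)                                    # compressed embedding
            coarse = self.project(self.coarse_global(z) + self.coarse_local(z))
            refined = self.refiner(coarse)                       # precise refinement
            return t, coarse, refined


    def anomaly_map(t, s):
        """Per-location anomaly score as 1 - cosine similarity over channels (assumed scoring rule)."""
        return 1.0 - F.cosine_similarity(t, s, dim=1)


    if __name__ == "__main__":
        model = DLKDSketch().eval()
        x = torch.randn(1, 3, 256, 256)
        t, coarse, refined = model(x)
        # Training on normal samples only would minimize both reconstruction discrepancies.
        loss = anomaly_map(t, coarse).mean() + anomaly_map(t, refined).mean()
        print(t.shape, float(loss))

At test time, anomalies would be flagged where the teacher features cannot be reconstructed well, i.e. where anomaly_map(t, refined) is large; this scoring convention follows common reconstruction-based KD practice and is not taken from the paper.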
BibTeX Citation:
@inproceedings{ICIC2025,
author = {Chaoyang Li and Haozheng Zhang and Fei Wang and Chengkun Li and Yanhong Yang},
title = {A Coarse-Precise Refinement Learning-Based Knowledge Distillation Network for Anomaly Detection},
booktitle = {Proceedings of the 21st International Conference on Intelligent Computing (ICIC 2025)},
month = {July},
date = {26-29},
year = {2025},
address = {Ningbo, China},
pages = {272--283},
note = {Poster Volume I}
}