Computer and Modernization ›› 2025, Vol. 0 ›› Issue (02): 58-63. doi: 10.3969/j.issn.1006-2475.2025.02.008

• Artificial Intelligence •

Anomaly Detection Algorithm Based on Bidirectional Multi-scale Knowledge Distillation

  1. (School of Mechanical Engineering, Sichuan University, Chengdu 610065, China)
  • Online: 2025-02-28  Published: 2025-02-28


Abstract:

Aiming at the problem that current knowledge distillation-based anomaly detection algorithms suffer from low detection and localization accuracy because the anomalous-feature representations of the teacher and student models differ too little, an anomaly detection algorithm based on bidirectional multi-scale knowledge distillation is proposed. An asymmetric teacher-student network composed of a teacher model, a forward-distillation student model and a reverse-distillation student model is employed to suppress the student's generalization to anomalous features. A feature fusion residual module is introduced between the two student models to fuse multi-scale features and reduce anomalous interference. An attention module is embedded within the forward-distillation student model to strengthen the learning of important features. In the testing phase, anomaly assessment is performed by fusing multi-scale anomaly maps. Experimental results on the public MVTec AD dataset show that, with ResNet18 as the backbone, the proposed algorithm achieves a pixel-level score of 97.7% and an image-level score of 98.8% under the area under the receiver operating characteristic curve (AUROC) metric, effectively improving on current knowledge distillation algorithms.
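
As an illustration only (not the authors' implementation), the following minimal PyTorch sketch shows the kind of test-stage multi-scale anomaly map fusion described above: per-scale cosine-distance maps between matching teacher and student features are computed, upsampled to a common resolution, and averaged into one pixel-level anomaly map. Taking the maximum of that map as the image-level score is one common choice and is an assumption here, as are all function and variable names.

import torch
import torch.nn.functional as F

def multiscale_anomaly_map(teacher_feats, student_feats, out_size=(256, 256)):
    # teacher_feats / student_feats: lists of matching-stage feature maps,
    # each of shape [B, C_i, H_i, W_i], from the frozen teacher and the
    # trained student (names and the additive fusion rule are assumptions).
    fused = torch.zeros(teacher_feats[0].shape[0], 1, *out_size,
                        device=teacher_feats[0].device)
    for t, s in zip(teacher_feats, student_feats):
        # 1 - cosine similarity per spatial location: large where the student
        # fails to reproduce the teacher's features, i.e. likely anomalous regions.
        amap = 1.0 - F.cosine_similarity(t, s, dim=1, eps=1e-6)   # [B, H_i, W_i]
        amap = F.interpolate(amap.unsqueeze(1), size=out_size,
                             mode="bilinear", align_corners=False)
        fused = fused + amap            # additive fusion across scales
    fused = fused / len(teacher_feats)  # pixel-level anomaly map [B, 1, H, W]
    image_score = fused.flatten(1).max(dim=1).values  # one score per image
    return fused, image_score

Cosine distance is a natural choice here because distillation drives the student to match the teacher only on normal regions, so the residual dissimilarity between their features concentrates on anomalous regions.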

Key words: anomaly detection, knowledge distillation, feature fusion, attention, deep learning

CLC Number: