Computer and Modernization ›› 2024, Vol. 0 ›› Issue (10): 93-99. doi: 10.3969/j.issn.1006-2475.2024.10.015


Object Detection Model Distillation Technique for Industrial Deployment

  

  (1. School of Physics and Technology, Wuhan University, Wuhan 430072, China; 2. State Grid Information & Telecommunication Co., Ltd., Beijing 102211, China; 3. Fujian Yirong Information Technology Co., Ltd., Fuzhou 350003, China)
  Online: 2024-10-29    Published: 2024-10-30

Abstract: Deep learning object detection models are applied in a wide range of scenarios. However, the detection accuracy of deployed models is often limited by the performance of the deployment devices. To improve the performance of detection models, this paper proposes an efficient dynamic distillation training method. The method introduces a dynamic sample assignment strategy that selects high-quality outputs of the teacher model and pairs it with dynamic weighting of the distillation loss, improving the traditional distillation algorithm used for object detection models. Experimental results on an electrical grid safety construction dataset show that, compared with direct training, the method raises the Average Precision (AP) of the YOLOv6-n model by an average of 2.63 percentage points. The proposed distillation method does not affect the inference speed of the deployed model and helps improve the detection performance of object detection models in various industrial scenarios.
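The abstract describes the method only at a high level. The following PyTorch sketch is a minimal illustration of what a distillation loss with teacher-quality-based sample selection and dynamic loss weighting could look like; the function name, the quality_threshold parameter, and the weighting scheme are illustrative assumptions and are not taken from the paper.

import torch
import torch.nn.functional as F

def dynamic_distillation_loss(student_logits, teacher_logits, teacher_scores,
                              quality_threshold=0.5, temperature=2.0):
    """Hypothetical dynamic distillation loss (illustrative, not the paper's exact method).

    student_logits, teacher_logits: (N, C) classification logits for N candidate
        predictions (e.g., anchors or grid cells) from the student and teacher.
    teacher_scores: (N,) per-prediction quality scores of the teacher output
        (e.g., objectness or IoU with the matched ground truth).
    """
    # Dynamic sample assignment: distill only where the teacher output is
    # of high quality, so low-quality teacher signals are ignored.
    keep = teacher_scores > quality_threshold
    if keep.sum() == 0:
        return student_logits.new_zeros(())

    s = student_logits[keep] / temperature
    t = teacher_logits[keep] / temperature

    # Soft-label KL divergence between student and teacher distributions.
    kl = F.kl_div(F.log_softmax(s, dim=-1), F.softmax(t, dim=-1),
                  reduction="none").sum(dim=-1)

    # Dynamic loss weighting: scale each retained sample by its teacher
    # quality score, so more reliable teacher outputs contribute more.
    weights = teacher_scores[keep]
    return (weights * kl).sum() / weights.sum() * temperature ** 2

In a training loop, a loss of this kind would typically be added to the student's regular detection loss; because it changes only the training objective, the deployed student model and its inference speed are unchanged, which matches the claim in the abstract.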

Key words: deep learning, object detection, knowledge distillation

CLC Number: