Acta Agriculturae Zhejiangensis ›› 2023, Vol. 35 ›› Issue (9): 2250-2264. DOI: 10.3969/j.issn.1004-1524.20221193

• Biosystems Engineering •

A light-weight model for plant disease identification based on model pruning and knowledge distillation

LIU Yuanyuan, WANG Dingkun, WU Lei, HUANG Dechang, ZHU Lu

  School of Information Engineering, East China Jiaotong University, Nanchang 330013, China
  Received: 2022-08-14; Online: 2023-09-25; Published: 2023-10-09

Abstract:

The emergence of deep learning has provided a new approach to plant disease identification, but current deep learning models contain many parameters and are difficult to deploy on edge devices, such as smartphones or embedded sensor nodes, with limited storage and computing resources. In the present study, plant leaves were taken as the research object, and knowledge distillation and model pruning were combined to construct a light-weight model for plant disease identification. The ResNet model was improved by introducing one or more teaching assistant networks into the knowledge distillation process. After sparse training, a light-weight student network was obtained by model pruning; the student network was then retrained using the teaching assistant network together with learning rate rewinding, which reduced the model size while effectively preserving model performance. The experimental results show that, on a dataset covering 38 disease categories across 14 plant species, the accuracy of the model after 90% pruning was 97.78%, an increase of 1.49 percentage points over the original model. On a dataset of 5 categories of apple leaves, the accuracy after 70% pruning was 91.94%, 4.85 percentage points higher than the original model. The proposed light-weight model can be ported to the Android platform and runs effectively, providing a new solution for accurate plant disease identification on embedded terminals.
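The teacher-assistant distillation described above rests on a temperature-scaled distillation loss: the student (or assistant) is trained to match the teacher's softened output distribution. The following is a minimal pure-Python sketch of that loss; the function names, temperature value, and logit vectors are illustrative assumptions, not values from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL divergence between the softened teacher and student outputs,
    # scaled by T^2 so gradient magnitudes stay comparable across T
    # (as in standard knowledge distillation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# In a teacher-assistant chain, this loss would be applied twice:
# first distilling the large teacher into the assistant, then the
# assistant into the pruned student, bridging the capacity gap.
```

In practice this soft-label term is combined with the ordinary cross-entropy on ground-truth labels; the sketch omits that weighting for brevity.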

Key words: disease identification, model pruning, knowledge distillation, learning rate rewinding, residual network
