浙江农业学报 (Acta Agriculturae Zhejiangensis) ›› 2023, Vol. 35 ›› Issue (6): 1462-1472. DOI: 10.3969/j.issn.1004-1524.2023.06.23

• Biosystems Engineering •

Plant disease identification based on pruning

ZHU Dongqin1, FENG Quan1,*, ZHANG Jianhua2

  1. School of Mechanical and Electrical Engineering, Gansu Agricultural University, Lanzhou 730070, China
    2. Agricultural Information Institute, Chinese Academy of Agricultural Sciences, Beijing 100081, China
  • Received: 2022-07-04  Online: 2023-06-25  Published: 2023-07-04
  • Corresponding author: *FENG Quan, E-mail: fquan@sina.com
  • About the author: ZHU Dongqin (1995—), female, a native of Dingxi, Gansu Province, is a master's degree candidate mainly engaged in research on model compression. E-mail: 2339488750@qq.com
  • Funding:
    National Natural Science Foundation of China (32160421); National Natural Science Foundation of China (31971792); Industrial Support Project of the Department of Education of Gansu Province (2021CYZC-57)

Abstract:

In order to detect plant diseases automatically and in real time, disease identification models need to be deployed on edge or mobile devices. However, the deep convolutional neural networks with superior performance in disease identification cannot be deployed directly because of limits on model size and computing resources. To solve this problem, a disease identification method based on pruning was proposed, which used the γ coefficients of the batch normalization (BN) layers for channel pruning to compress the Vgg16, ResNet164 and DenseNet40 networks. The three networks were compressed on the PlantVillage plant disease dataset. The experimental results showed that the average accuracies of the compressed Vgg16-80%, ResNet164-80% and DenseNet40-80% were 97.46%, 99.12% and 99.68%, respectively; DenseNet40-80% achieved the highest accuracy with the fewest parameters, only 0.27×10⁶. Vgg16-80% showed the strongest compression effect, with 97.83% of its parameters and 96.77% of its computation pruned away, and its computation after pruning was the smallest, only 0.01×10⁹. The accuracies of the pruned Vgg16-80% and DenseNet40-80% were higher than those of the original models. Therefore, the proposed method can alleviate the over-parameterization of large neural networks, reduce computing costs, and offers a way to deploy existing large networks on small devices.
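
The channel pruning step described above (selecting channels by the magnitude of the BN scale factor γ) can be sketched briefly. The following Python/PyTorch code is a minimal illustration under assumptions, not the authors' implementation: the framework, the function names, and the use of a single global threshold at an 80% prune ratio are all assumptions made for this sketch.

# Minimal sketch of BN-γ channel pruning (network-slimming style).
# PyTorch is assumed; the paper does not state its implementation framework.
import torch
import torch.nn as nn

def bn_l1_penalty(model: nn.Module, lam: float = 1e-4) -> torch.Tensor:
    """Sparsity penalty on BN scale factors (γ), added to the task loss
    during training so that unimportant channels are driven toward zero."""
    penalty = torch.zeros(())
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            penalty = penalty + m.weight.abs().sum()
    return lam * penalty

def global_gamma_threshold(model: nn.Module, prune_ratio: float = 0.8) -> float:
    """Collect |γ| from every BN layer and return the value below which
    the given fraction of channels would be pruned (0.8 mirrors the
    Vgg16-80% / ResNet164-80% / DenseNet40-80% settings)."""
    gammas = torch.cat([m.weight.detach().abs().flatten()
                        for m in model.modules()
                        if isinstance(m, nn.BatchNorm2d)])
    k = int(prune_ratio * gammas.numel())
    return torch.sort(gammas)[0][k].item()

def build_channel_masks(model: nn.Module, threshold: float) -> dict:
    """Per-BN-layer boolean masks marking the channels to keep; a compact
    network is then rebuilt by copying only the kept channels' weights."""
    return {name: m.weight.detach().abs() > threshold
            for name, m in model.named_modules()
            if isinstance(m, nn.BatchNorm2d)}

# Hypothetical usage with a BN-equipped VGG-16 (an assumption for this sketch):
#   model = torchvision.models.vgg16_bn(num_classes=38)   # 38 classes in the standard PlantVillage release
#   loss = criterion(model(x), y) + bn_l1_penalty(model)  # sparsity-regularized training
#   thr = global_gamma_threshold(model, 0.8)
#   masks = build_channel_masks(model, thr)               # then rebuild the compact network and fine-tune

A single global threshold lets heavily over-parameterized layers shed proportionally more channels, which is consistent with the large reduction reported for Vgg16-80% (97.83% of parameters and 96.77% of computation removed); in this style of pruning the compact network is typically fine-tuned afterwards to recover accuracy.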

Key words: convolutional neural network, disease identification, pruning, model compression

CLC number: