Acta Agriculturae Zhejiangensis ›› 2021, Vol. 33 ›› Issue (9): 1730-1739. DOI: 10.3969/j.issn.1004-1524.2021.09.17
JI Xunsheng 1, JIANG Xiaowei 1,*, XIA Shengkui 2
* Corresponding author: JIANG Xiaowei, E-mail: 6191905023@stu.jiangnan.edu.cn
Received: 2020-11-19
Online: 2021-09-25
Published: 2021-10-09
Contact: JIANG Xiaowei
Abstract:
The laying rate is one of the key indicators for evaluating the laying performance of hens. Because it is time-dependent and nonlinear, and is affected by many variables with complex coupling relationships, accurate prediction is difficult. Since the memoryless prediction process of traditional neural networks handles time-series problems poorly, this paper proposes an LSTM-Kalman method for predicting the laying rate of hens: principal component analysis is used to extract the key variables affecting the laying rate, an LSTM neural network predicts the laying rate, and a Kalman filter dynamically adjusts the LSTM output to give the final prediction. Data analysis showed that the mean square error, mean absolute error and Pearson correlation coefficient of the LSTM-Kalman model were 0.312 8, 0.435 3 and 0.975 2, respectively, clearly better than traditional prediction methods such as the BP neural network and the extreme learning machine. In cross-testing on production data from two hen coops, the prediction accuracy of the model reached 97.14% and 98.71%, indicating strong generalization ability. The model meets the practical needs of laying-rate prediction and can serve as a reference for precise regulation of the environment in laying-hen houses.
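The abstract describes a three-stage pipeline: principal component analysis extracts the key variables, an LSTM network predicts the laying rate, and a Kalman filter dynamically adjusts the LSTM output to give the final prediction. The Python sketch below illustrates the first two stages on synthetic stand-in data; the window length, training settings and variable names are illustrative assumptions, not the authors' code. The Kalman adjustment step is sketched separately after Tables 6 and 7.

```python
# Minimal sketch of the pipeline described in the abstract, on synthetic
# stand-in data: 1) PCA extracts the key variables, 2) an LSTM predicts the
# laying rate from a sliding window of past days. The final Kalman correction
# of `lstm_pred` is sketched separately after Tables 6-7.
import numpy as np
from sklearn.decomposition import PCA
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

rng = np.random.default_rng(0)
records = rng.normal(size=(300, 5))                      # 5 environment/feeding variables
laying_rate = 90 + np.cumsum(rng.normal(scale=0.1, size=300))

# 1) PCA: keep the components explaining ~90% of the variance (cf. Table 2).
features = PCA(n_components=0.90).fit_transform(records)

# 2) LSTM: a 3-day window of component scores predicts the next day's rate.
time_step = 3
X = np.stack([features[i:i + time_step] for i in range(len(features) - time_step)])
y = laying_rate[time_step:]

model = Sequential([LSTM(50, input_shape=(time_step, features.shape[1])), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=10, verbose=0)
lstm_pred = model.predict(X, verbose=0).ravel()

# 3) Kalman filter: dynamically adjust lstm_pred (see the sketch after Tables 6-7).
```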
JI Xunsheng, JIANG Xiaowei, XIA Shengkui. Research on prediction of laying rate by hens based on LSTM-Kalman model[J]. Acta Agriculturae Zhejiangensis, 2021, 33(9): 1730-1739.
Table 1 Breeding data of hen coops
Hen coop | Initial quantity | Ending quantity | Death rate/% | Average temperature/℃ | Average humidity/% | Average feed intake per hen/g | Average water intake per hen/mL |
---|---|---|---|---|---|---|---|
No.5 | 58 570 | 55 298 | 5.75 | 23.34~29.85 | 62.69~92.12 | 79.69~120.85 | 126.48~365.61 |
No.6 | 59 320 | 57 059 | 3.89 | 23.02~29.66 | 67.82~90.99 | 77.27~118.59 | 110.92~490.66 |
Fig.2 Structure of the LSTM memory cell. xt is the current input; ht and ht-1 are the outputs at the current and previous time steps; Ct and Ct-1 are the memory contents at the current and previous time steps; σ() is the sigmoid function; Ot represents the saved historical information; Ct' indicates the information to be updated.
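For reference, the symbols in the Fig. 2 caption correspond to the commonly used LSTM cell equations (with a forget gate), written below. The forget gate ft, input gate it and the weight matrices W, U, b are conventional symbols not named in the caption; in this standard formulation Ot is the output-gate activation.

```latex
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
C_t' &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
C_t &= f_t \odot C_{t-1} + i_t \odot C_t' \\
O_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
h_t &= O_t \odot \tanh(C_t)
\end{aligned}
```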
Table 2 Principal component variance contribution rate and cumulative variance contribution rate/%
Component | Variance contribution rate | Cumulative variance contribution rate |
---|---|---|
1 | 65.855 9 | 65.855 9 |
2 | 14.086 9 | 79.942 8 |
3 | 11.154 9 | 91.097 8 |
4 | 6.432 8 | 97.530 6 |
5 | 2.469 4 | 100 |
Table 3 Component matrix
Variable | Principal component 1 | Principal component 2 | Principal component 3 |
---|---|---|---|
Average feed intake | 0.118 698 | -0.237 459 | 0.487 089 |
Temperature difference | 0.166 156 | 0.522 511 | 0.167 772 |
Average humidity | 0.302 565 | -0.071 124 | 0.143 339 |
Average temperature | 0.238 405 | -0.119 832 | -0.111 712 |
Average water intake | -0.174 172 | 0.049 097 | 0.090 088 |
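Tables 2 and 3 are the standard outputs of a principal component analysis: the explained-variance ratios and the loadings of each variable on the retained components. A minimal sketch of how such values are obtained with scikit-learn is shown below; the placeholder matrix `data` and the standardization step are assumptions about preprocessing not detailed on this page.

```python
# Sketch: computing variance contribution rates (Table 2) and the component
# matrix (Table 3) with scikit-learn. `data` is a placeholder (n_days x 5)
# matrix of the five input variables.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

data = np.random.default_rng(1).normal(size=(200, 5))    # stand-in data
scaled = StandardScaler().fit_transform(data)            # zero mean, unit variance

pca = PCA()
scores = pca.fit_transform(scaled)

# Variance contribution rate and cumulative rate, in percent (cf. Table 2).
ratio = pca.explained_variance_ratio_ * 100
print(np.round(ratio, 4))
print(np.round(np.cumsum(ratio), 4))

# Loadings of each variable on the first three components (cf. Table 3):
# components_ has one row per component, so its transpose gives one row per
# variable and one column per component.
print(np.round(pca.components_.T[:, :3], 6))
```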
Table 4 Comparison of LSTM model parameters in the five tests
Test | Number of nodes | Learning rate | Batch size | Time step/d |
---|---|---|---|---|
1 | 50 | 0.010 | 10 | 1 |
2 | 50 | 0.010 | 10 | 2 |
3 | 50 | 0.010 | 10 | 3 |
4 | 50 | 0.010 | 50 | 3 |
5 | 50 | 0.001 | 10 | 3 |
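The settings in Table 4 map directly onto a standard LSTM configuration. The sketch below shows Test 3 (50 hidden nodes, learning rate 0.010, batch size 10, time step 3 d) in Keras; the number of input features (three retained principal components) and the epoch count are assumptions, since they are not listed in the table.

```python
# Sketch: an LSTM configured with the Test 3 settings from Table 4.
# Assumes 3 input features (the retained principal components); the number
# of training epochs is not reported and is chosen arbitrarily here.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.optimizers import Adam

time_step, n_features = 3, 3                          # 3-day window, 3 components

model = Sequential([
    LSTM(50, input_shape=(time_step, n_features)),    # 50 hidden nodes
    Dense(1),                                         # laying-rate output
])
model.compile(optimizer=Adam(learning_rate=0.010), loss="mse")

# model.fit(X_train, y_train, batch_size=10, epochs=100)   # batch size 10
```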
Table 5 Accuracy analysis of the LSTM prediction model
Test | Mean square error MSE | Mean absolute error MAE | Pearson correlation coefficient r | Accuracy/% |
---|---|---|---|---|
1 | 0.561 8 | 0.583 7 | 0.522 4 | 60.47 |
2 | 0.364 3 | 0.464 3 | 0.807 9 | 75.38 |
3 | 0.119 4 | 0.283 5 | 0.953 4 | 94.21 |
4 | 0.168 8 | 0.329 9 | 0.905 6 | 89.17 |
5 | 0.233 7 | 0.396 1 | 0.898 7 | 85.39 |
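The columns of Tables 5-7 can be computed from an observed and a predicted laying-rate series with standard library calls, as sketched below. The exact definition of the accuracy column is not stated on this page, so the form used here (100% minus the mean relative error) is only an assumption.

```python
# Sketch: the evaluation metrics reported in Tables 5-7.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import mean_squared_error, mean_absolute_error

def evaluate(y_true, y_pred):
    mse = mean_squared_error(y_true, y_pred)
    mae = mean_absolute_error(y_true, y_pred)
    r, _ = pearsonr(y_true, y_pred)
    # Assumed accuracy definition: 100% minus the mean relative error.
    acc = 100.0 * (1.0 - np.mean(np.abs(y_true - y_pred) / y_true))
    return mse, mae, r, acc

# Example with dummy laying-rate series (percent of hens laying per day).
y_true = np.array([92.1, 91.8, 92.4, 93.0, 92.7])
y_pred = np.array([91.9, 92.0, 92.1, 92.8, 92.9])
print(evaluate(y_true, y_pred))
```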
Table 6 Precision analysis of forecast results for each model
Model | Mean square error MSE | Mean absolute error MAE | Pearson correlation coefficient r | Accuracy/% |
---|---|---|---|---|
BP neural network | 1.284 8 | 0.860 8 | 0.776 7 | 85.47 |
Extreme learning machine | 1.093 4 | 0.802 2 | 0.792 6 | 87.34 |
LSTM | 0.827 1 | 0.698 9 | 0.893 4 | 92.68 |
LSTM-Kalman | 0.312 8 | 0.435 3 | 0.975 2 | 98.71 |
Table 7 Precision analysis of forecast results for each model
Model | Mean square error MSE | Mean absolute error MAE | Pearson correlation coefficient r | Accuracy/% |
---|---|---|---|---|
BP neural network | 1.557 1 | 1.054 3 | 0.701 6 | 81.54 |
Extreme learning machine | 1.135 9 | 0.875 4 | 0.802 7 | 88.16 |
LSTM | 0.795 4 | 0.702 9 | 0.900 7 | 91.76 |
LSTM-Kalman | 0.235 7 | 0.457 2 | 0.974 3 | 97.14 |
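The LSTM-Kalman rows in Tables 6 and 7 come from passing the LSTM output through a Kalman filter. The sketch below is one plausible, minimal formulation: a one-dimensional random-walk state with the daily LSTM prediction treated as a noisy measurement; the state-space model and the noise parameters q and r are assumptions rather than the authors' reported settings.

```python
# Sketch: Kalman-filter adjustment of the LSTM predictions (the LSTM-Kalman
# rows in Tables 6-7). A one-dimensional random-walk state model is assumed;
# each day's LSTM output is treated as a noisy measurement of the true laying
# rate. The noise parameters q (process) and r (measurement) are assumptions.
import numpy as np

def kalman_adjust(lstm_pred, q=1e-3, r=0.25):
    x, p = float(lstm_pred[0]), 1.0        # initial state estimate and covariance
    adjusted = np.empty_like(lstm_pred, dtype=float)
    for k, z in enumerate(lstm_pred):
        p = p + q                          # predict: random-walk state transition
        g = p / (p + r)                    # Kalman gain
        x = x + g * (z - x)                # update with the LSTM "measurement"
        p = (1.0 - g) * p
        adjusted[k] = x
    return adjusted

# Example: smooth a noisy prediction series.
pred = 92 + np.random.default_rng(2).normal(scale=0.5, size=30)
print(np.round(kalman_adjust(pred), 3))
```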