Technology of automated monitoring of the vineyard condition
https://doi.org/10.32634/0869-8155-2023-368-3-109-116
Abstract
Relevance. Proactive management of the processes that realize the varietal potential of grapes requires innovative digital technologies for automated monitoring of heterogeneous data sources characterizing agro-climatic conditions and the degradation of the biological state of plants. The viticulture and winemaking industry is currently undergoing steady digitalization. This creates a whole complex of scientific, practical, technical and technological tasks related to introducing digital technologies that collect the necessary information, aggregate it and pre-process it for multifactorial data analysis and subsequent use in decision support systems. Solving these system-level tasks requires scientific and methodological foundations for intelligent, adaptive automated monitoring of the diverse objects and processes of agricultural enterprises.
Methods. The proposed technology combines computer vision methods, neural network classification and detection of grape leaves, assessment of the training quality of the neural network algorithms, and video recording from unmanned aerial vehicles (UAVs).
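The article itself does not publish code or model details. As a purely illustrative sketch of the detection step described above, the fragment below scans UAV video frame by frame with a generic torchvision detector and flags frames containing boxes of a hypothetical "affected leaf" class. The model choice, class index and confidence threshold are assumptions, not the authors' implementation; the reference list points to YOLO-family detectors as one possible alternative.

# Illustrative sketch only: the detector, class id and threshold below are
# assumptions, not the implementation described in the article.
import cv2                                   # UAV video frame extraction
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

AFFECTED_LEAF_CLASS = 1      # hypothetical class id for "affected leaf"
CONF_THRESHOLD = 0.5         # hypothetical detection confidence cut-off

# A generic pretrained detector stands in for the trained network from the article.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def detect_affected_leaves(video_path: str):
    """Yield (frame index, number of suspect boxes) for frames with detections."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # OpenCV reads frames as BGR
        with torch.no_grad():
            out = model([to_tensor(rgb)])[0]           # dict with boxes, labels, scores
        keep = (out["scores"] > CONF_THRESHOLD) & (out["labels"] == AFFECTED_LEAF_CLASS)
        hits = int(keep.sum())
        if hits:
            yield idx, hits                            # frame shows possible deterioration
        idx += 1
    cap.release()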
Results. The paper presents an information technology for automated neural network detection of signs of deterioration in grape plantations, intended for proactive management of the processes that realize the varietal potential of grapes. The technology allows vineyard service personnel to promptly receive information about signs of deterioration in the condition of grape plantations from UAV video of grape plants recorded in static and dynamic modes. Testing of the accuracy of detecting affected leaves showed that the mAP of the trained neural network is at least 91%, which is sufficient to identify problem areas.
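For reference, the mAP (mean average precision) value quoted above is the standard object-detection metric: the average precision of each class is the area under its precision–recall curve, and mAP averages these values over the N detected classes (written in LaTeX notation):

\mathrm{AP}_c = \int_0^1 p_c(r)\,dr, \qquad \mathrm{mAP} = \frac{1}{N}\sum_{c=1}^{N}\mathrm{AP}_c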
About the Authors
P. N. Kuznetsov
Russian Federation
Pavel Nikolaevich Kuznetsov, Candidate of Technical Sciences
33 Universitetskaya Str., Sevastopol, 299053, Russian Federation
31 Kirova Str., Yalta, 298600, Russian Federation
D. Yu. Kotelnikov
Russian Federation
Dmitry Yurievich Kotelnikov, Junior Researcher
33 Universitetskaya Str., Sevastopol, 299053, Russian Federation
31 Kirova Str., Yalta, 298600, Russian Federation
D. Yu. Voronin
Russian Federation
Dmitry Yurievich Voronin, Candidate of Technical Sciences, Associate Professor
33 Universitetskaya Str., Sevastopol, 299053, Russian Federation
References
1. Liu Y., Wang X. Promoting competitiveness of green brand of agricultural products based on agricultural industry cluster. Wireless Communications and Mobile Computing. 2022. https://doi.org/10.1155/2022/7824638
2. Basso B., Antle J. Digital agriculture to design sustainable agricultural systems. Nature Sustainability. 2020; 3: 254–256. https://doi.org/10.1038/s41893-020-0510-0
3. Tian H., Wang T., Liu Y., Qiao X., Li Y. Computer vision technology in agricultural automation — A review. Information Processing in Agriculture. 2020; 7(1): 1–19. https://doi.org/10.1016/j.inpa.2019.09.006
4. Rao R.N., Sridhar B. IoT based smart crop-field monitoring and automation irrigation system. 2nd International Conference on Inventive Systems and Control (ICISC). 2018; 478–483.
5. Jha K., Doshi A., Patel P., Shah M. A comprehensive review on automation in agriculture using artificial intelligence. Artificial Intelligence in Agriculture. 2019; 2: 1–12.
6. Raeva P. L., Šedina J., Dlesk A. Monitoring of crop fields using multispectral and thermal imagery from UAV. European Journal of Remote Sensing. 2019; 52(1): 192–201.
7. Kapania S., Saini D., Goyal S., Thakur N., Jain R. Multi object tracking with UAVs using deep SORT and YOLOv3 RetinaNet detection framework. Proceedings of the 1st ACM Workshop on Autonomous and Intelligent Mobile Systems. 2020; 1–6.
8. Pereira R., Carvalho G., Garrote L., Nunes U.J. Sort and Deep-SORT based multi-object tracking for mobile robotics: evaluation with new data association metrics. Applied Sciences. 2022; 12(3): 1319.
9. Süzen A.A., Duman B., Şen B. Benchmark analysis of Jetson TX2, Jetson Nano and Raspberry Pi using Deep-CNN. 2nd IEEE International Congress on Human-Computer Interaction, Optimization and Robotic Applications. Turkey. 2020; 1–5.
10. Zoev I.V., Markov N.G., Ryzhova S.E. Intelligent computer vision system for unmanned aerial vehicles for monitoring technological objects of oil and gas industry. Bulletin of the Tomsk Polytechnic University. Geo Assets Engineering. 2019; 330(11): 34–49. https://doi.org/10.18799/24131830/2019/11/2346 (In Russian).
11. Aposporis P. Object detection methods for improving UAV autonomy and remote sensing applications. 2020 IEEE/ACM. International Conference on Advances in Social Networks Analysis and Mining (ASONAM). 2020; 845–853.
12. Russakovsky O. et al. ImageNet large scale visual recognition challenge. International Journal of Computer Vision. 2015; 115(3): 1–43.
13. Kuznetsov P.N., Kotelnikov D.Yu. Technology of automated monitoring of the photovoltaic modules of a solar power plant. Monitoring. Science and Technology. 2022; 2(52): 65–72. https://doi.org/10.25714/MNT.2022.52.008 (In Russian).
14. Kuznetsov P.N., Kotelnikov D.Yu. Automated technological complex for monitoring and diagnostics of vineyard. Don Agrarian Science Bulletin (Vestnik agrarnoj nauki Dona). 2021; 4(56): 16–23. eLIBRARY ID: 47806373 (In Russian).
For citations:
Kuznetsov P.N., Kotelnikov D.Yu., Voronin D.Yu. Technology of automated monitoring of the vineyard condition. Agrarian science. 2023;(3):109-116. (In Russ.) https://doi.org/10.32634/0869-8155-2023-368-3-109-116