with the collaboration of Iranian Society of Mechanical Engineers (ISME)

Document Type: Research Article

Authors

1 BSc Student, Department of Geomatics, Babol Noshirvani University of Technology, Babol, Iran

2 Department of Geomatics, Babol Noshirvani University of Technology, Babol, Iran

Abstract

Introduction
Rice is one of the most important staple food sources in Iran and worldwide. Correctly identifying the type of pest in the early stages, when preventive action is still possible, plays a significant role in reducing crop damage. Traditional methods are not only time-consuming but also yield inaccurate results; as a result, precision agriculture and its associated technology systems have emerged. Precision agriculture uses information technologies such as GPS, GIS, remote sensing, and machine learning to implement technical measures within and between farms and to achieve better marginal benefits for the economy and the environment. Machine learning is a branch of artificial intelligence that improves automatically based on accumulated experience. Deep learning is a subfield of machine learning that models concepts using deep neural networks with several layers of high-level abstraction; this capability has attracted considerable attention in agricultural management. Diagnosing disease and predicting the time of crop damage with the help of artificial intelligence has been the subject of much research in precision agriculture. This article presents, as a first step, a model of the Chilo suppressalis pest trained on data received from smartphones and validated against expert opinion. As a second step, we introduce the developed smartphone-based system. Using this system, farmers can share images of pests over the Internet, learn which pest is present on their farm, and take the necessary control measures. The developed artificial intelligence performs this operation quickly and efficiently. In the remainder of the article, the second section introduces the materials and methods, the third section presents the results, and the fourth section discusses and concludes the research.
Materials and Methods
Chilo suppressalis is one of the most important rice pests in the temperate and subtropical regions of Asia. The conventional approach villagers use to gather the pest is to set up a light source above a pan of water infused with a pesticide. At sunset, the insects are attracted to the light and fall into the water; this method is known as optical trapping. After the pests are caught, they are collected from the water surface and photographed with a mobile phone at the location of the optical trap.
The proposed method in this research consists of three main steps. First, the farmer uses the developed software, known as Smart Farm, to capture an image of the Chilo suppressalis pest and send it, along with its location, to the system. The Smart Farm software carries out image processing and pest-region detection, and the user then verifies the accuracy of the detection. In the second step, the submitted images are processed by the pre-trained model within the system, which analyzes them and determines whether the pest is present. Finally, after the pest type is identified, the results, along with recommended pest-control methods, are sent back to the farmer.
In summary, farmers use the Smart Farm software to capture and transmit images of the Chilo suppressalis pest; the images then undergo image processing and pest-region detection, and the results, including the pest identification and control methods, are returned to the farmer.
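The three-step workflow above can be sketched as a minimal pipeline. This is an illustrative sketch only: the function names, report fields, and control advice below are hypothetical and do not reflect the actual Smart Farm API; the classifier is a stub standing in for the pre-trained deep model.

```python
def capture_report(image_bytes, lat, lon):
    """Step 1: the farmer's app packages the photo with its GPS location."""
    return {"image": image_bytes, "location": (lat, lon)}

def classify_pest(report):
    """Step 2: the server-side pre-trained model scores the image.
    A real system would run a deep neural network here; this stub
    returns a fixed label for illustration."""
    return "Chilo suppressalis"

def respond(pest_label):
    """Step 3: return the identification plus control advice to the farmer."""
    advice = {"Chilo suppressalis": "optical trapping and approved pesticides"}
    return {"pest": pest_label,
            "control": advice.get(pest_label, "consult an expert")}

# End-to-end: capture -> classify -> respond
report = capture_report(b"<jpeg bytes>", 36.55, 52.68)
result = respond(classify_pest(report))
```

In a deployed system the classification step would run server-side on the uploaded image, so the farmer's phone only needs a camera and an Internet connection.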
Results and Discussion
The model was designed with 400 artificial neural network processing units, achieving accuracy values of 88% and 92%. For a more detailed evaluation of the proposed model, the statistical criteria of recall and F-score were used. Based on the calculations, the trained model achieved a recall of 91%, indicating that it identified a large proportion of the instances it was expected to identify. In addition, an F-score of 88% confirmed the accuracy of the trained model.
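The recall and F-score above follow the standard definitions: recall = TP / (TP + FN) and F-score = 2PR / (P + R), the harmonic mean of precision and recall. The sketch below computes them from confusion counts; the counts are purely illustrative, chosen only so the computed values match the reported 91% recall and 88% F-score, and are not the study's actual data.

```python
def precision_recall_f1(tp, fp, fn):
    """Standard binary-classification metrics from confusion counts."""
    precision = tp / (tp + fp)   # fraction of positive predictions that are correct
    recall = tp / (tp + fn)      # fraction of actual positives that were found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

# Illustrative counts (not the study's data): 91 true positives,
# 16 false positives, 9 false negatives.
precision, recall, f1 = precision_recall_f1(tp=91, fp=16, fn=9)
# recall = 0.91 and f1 rounds to 0.88, implying a precision near 0.85
```

Reporting recall alongside the F-score is useful here because missed pests (false negatives) are costlier to the farmer than false alarms.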
Conclusion
Researchers have always been highly interested in the valuable data freely provided by farmers for their studies and analyses. In this study, an intelligent system was designed to identify pests such as worms and stem borers, automatically determining the pest type from a farmer-submitted image using artificial intelligence and deep learning. With the developed system, farmers can learn which pest is present on their farm in the shortest possible time and with minimal software training.


©2022 The author(s). This article is licensed under Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source.
