
Document Type : Research Article

Authors

Department of Mechanics of Biosystem, Faculty of Engineering & Technology, College of Agriculture & Natural Resources, University of Tehran, Karaj, Iran

Abstract

Introduction
Increasing production efficiency is an important goal in precision farming. Precision farming, however, requires a great deal of manual labor, and because agricultural operations can be hazardous, it is not advisable for humans to perform them directly. Agricultural operations should therefore be carried out automatically, which is why the application of robotics in agricultural environments, especially greenhouses, is increasing. The first step toward automatic farming is autonomous navigation. For autonomous navigation, a robot must have the ability to perceive its environment and recognize its position. In other words, a robot must be able to build a map of an unknown environment, localize itself on this map, and finally plan a path. This problem is addressed by Simultaneous Localization and Mapping (SLAM). SLAM is a recursive estimation process: as a robot moves through an unknown environment, mapping and localization errors accumulate, and a recursive estimation process is used to keep both errors bounded.
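As background (a standard formulation, sketched here rather than quoted from the paper), the full-SLAM posterior that such a recursive estimator tracks can be written as

    p(x_{1:t}, m \mid z_{1:t}, u_{1:t}) \propto p(z_t \mid x_t, m)\, p(x_t \mid x_{t-1}, u_t)\, p(x_{1:t-1}, m \mid z_{1:t-1}, u_{1:t-1}),

where x_{1:t} is the robot trajectory, m the map, z_{1:t} the observations, and u_{1:t} the control inputs. Graph-based SLAM, in particular, builds a graph whose nodes are robot poses and whose edges encode measurement constraints e_{ij} with information matrices \Omega_{ij}, and then seeks the trajectory

    x^{*} = \arg\min_{x} \sum_{(i,j)} e_{ij}(x_i, x_j)^{\top}\, \Omega_{ij}\, e_{ij}(x_i, x_j).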
Materials and Methods
In this research, two Microsoft webcams with a resolution of 960×544 were connected to a computer via USB 2.0 to form a parallel stereo camera. The study was conducted in a greenhouse located in Arak, Iran. Before the stereo images were taken, a camera path, which could include straight and curved segments, was designed and implemented in the greenhouse. The stereo camera traversed a total path of 32.7 m, along which 150 stereo image pairs were captured. The graph-SLAM algorithm was used for simultaneous localization and mapping in the greenhouse. Using the ROS framework, the SLAM algorithm was built as a set of nodes together with the network of topics connecting them.
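To illustrate the stereo setup, the following minimal Python/OpenCV sketch reads the two USB webcams and computes a disparity map. This is not the authors' code; the device indices and block-matching parameters are assumptions, and a calibrated rig is needed before disparities can be turned into metric depth.

    import cv2

    # Open the two USB webcams forming the parallel stereo rig
    # (device indices 0 and 1 are assumptions; adjust for your machine).
    left_cam = cv2.VideoCapture(0)
    right_cam = cv2.VideoCapture(1)
    for cam in (left_cam, right_cam):
        cam.set(cv2.CAP_PROP_FRAME_WIDTH, 960)
        cam.set(cv2.CAP_PROP_FRAME_HEIGHT, 544)

    ok_l, left = left_cam.read()
    ok_r, right = right_cam.read()
    if ok_l and ok_r:
        gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
        gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
        # Semi-global block matching; parameter values are illustrative only.
        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
        # OpenCV returns fixed-point disparities scaled by 16.
        disparity = matcher.compute(gray_l, gray_r).astype("float32") / 16.0
        # With a calibrated rig: depth = focal_length * baseline / disparity.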
Results and Discussion
For evaluation, the stereo camera location at every step was measured manually and compared with the location estimated by the graph-SLAM algorithm. The position error was calculated as the Euclidean distance (DE) between the estimated points and the actual points. The results showed that the proposed algorithm achieved a mean error of 0.0679412, a standard deviation of 0.0456431, and a root mean square error (RMSE) of 0.0075569 for camera localization.
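These metrics can be reproduced from paired ground-truth and estimated positions; the short NumPy sketch below uses the textbook definitions (the array names and the 2-D example values are hypothetical, and the paper's exact conventions may differ).

    import numpy as np

    # Each row is a camera position: 'actual' measured by hand,
    # 'estimated' produced by graph-SLAM. Values here are made up.
    actual = np.array([[0.00, 0.00], [1.00, 0.10], [2.00, 0.20]])
    estimated = np.array([[0.02, 0.01], [0.98, 0.12], [2.05, 0.18]])

    # Euclidean distance between each estimated point and its ground truth.
    de = np.linalg.norm(estimated - actual, axis=1)
    print("mean error:", de.mean())
    print("std dev:   ", de.std())
    print("RMSE:      ", np.sqrt((de ** 2).mean()))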
In this research, only a stereo camera was used to map the environment, whereas other studies have relied on combinations of multiple sensors. Another advantage of this work over others is the creation of a 3D map (point cloud) of the environment together with loop closure detection. In the 3D map, in addition to determining the exact location of each plant, the height of the plant can also be estimated. Plant height estimation is important in agricultural operations such as spot spraying, harvesting, and pruning.
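As a sketch of how plant height could be read off such a point cloud (the column layout, bounding box, and flat-ground assumption are mine, not taken from the paper):

    import numpy as np

    # points: N x 3 array of (x, y, z) map coordinates with z pointing up.
    points = np.random.rand(1000, 3)  # stand-in for a real point cloud

    # Keep the points inside one plant's ground-plane bounding box
    # (hypothetical limits in map units).
    x_min, x_max, y_min, y_max = 0.2, 0.4, 0.2, 0.4
    mask = ((points[:, 0] >= x_min) & (points[:, 0] <= x_max) &
            (points[:, 1] >= y_min) & (points[:, 1] <= y_max))
    plant = points[mask]

    # Height = top of the plant minus the local ground level,
    # assuming the lowest point in the box lies on the ground.
    if plant.size:
        height = plant[:, 2].max() - plant[:, 2].min()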
Conclusion
Because agricultural activities can be hazardous, the use of robotics is essential, and autonomous navigation is one of its branches. Autonomous navigation requires a map of the environment and localization within that map. The purpose of our research was to provide simultaneous localization and mapping (SLAM) in agricultural environments. ROS is a powerful framework for solving the SLAM problem, since the problem can be decomposed into and solved by a combination of ROS nodes. The method depended only on information from the stereo camera, because the stereo camera provides accurate distance information. We believe this study will contribute to the field of autonomous robot applications in agriculture. In future studies, an actual robot equipped with various sensors could be used in the greenhouse for SLAM and path planning.

Keywords

Open Access

©2020 The author(s). This article is licensed under Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source.
