with the collaboration of Iranian Society of Mechanical Engineers (ISME)

Document Type : Research Article

Authors

Mechanics of Agricultural Machinery Dept., Faculty of Agricultural Engineering and Technology, University of Tehran, Karaj, Iran

Abstract

Introduction
Stereo vision is the capability of extracting depth by analyzing two images of one scene taken from different angles. The result of stereo vision is a collection of three-dimensional points that describes the details of the scene at a level proportional to the resolution of the captured images.
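For a rectified parallel-camera pair, the depth of a point follows directly from its disparity via the standard triangulation relation Z = f·B/d. A minimal sketch, using illustrative values for the focal length and baseline (not the calibration from this study):

```python
# Minimal sketch of stereo triangulation for an ideal rectified
# parallel-camera pair. Focal length and baseline are hypothetical.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 0.1 m baseline, 35 px disparity
z = depth_from_disparity(35.0, 700.0, 0.1)
print(z)  # 2.0 (metres)
```

Larger disparities correspond to closer points, which is why near obstacles such as platform edges produce strong, easily measured disparities.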
Automatic vehicle steering and crop growth monitoring are two important operations in precision agriculture. The essential aspects of automated steering are the position and orientation of the agricultural equipment relative to the crop rows, detection of obstacles, and path planning between the crop rows. A developed map can provide this information in real time. Machine vision has the capability to perform these tasks in order to execute operations such as cultivation, spraying, and harvesting.
In a greenhouse environment, it is possible to develop a map and perform automatic control by detecting and localizing the cultivation platforms as the main obstacles to movement. The current work was carried out to develop a stereo-vision-based method for detecting and localizing the platforms and then producing a two-dimensional map of the cultivation platforms in the greenhouse environment.
Materials and Methods
In this research, two Microsoft webcams with a resolution of 960×544 pixels were connected to a computer via USB 2.0 to form a parallel stereo camera.
Due to the structure of the cultivation platforms, the number of points in the point cloud can be reduced by extracting only the upper and lower edges of each platform. The method proposed in this work extracts these edges based on the depth-discontinuity features in the region of the platform edge.
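A depth discontinuity appears in the disparity map as an abrupt jump between neighbouring pixels. A minimal sketch of flagging such jumps, with an assumed threshold and a toy disparity map (not the paper's data):

```python
import numpy as np

# Sketch of depth-discontinuity edge extraction on a disparity map.
# The jump threshold (in disparity units) is an assumed value.
def discontinuity_edges(disparity, threshold=5.0):
    """Mark pixels where disparity jumps between vertical neighbours."""
    edges = np.zeros_like(disparity, dtype=bool)
    jump = np.abs(np.diff(disparity, axis=0)) > threshold  # row-to-row jumps
    edges[:-1, :] |= jump
    return edges

# Toy disparity map: a near "platform" (high disparity) over far ground
d = np.array([[10., 10., 10.],
              [40., 40., 40.],   # platform surface: large jump above/below
              [40., 40., 40.],
              [12., 12., 12.]])
print(discontinuity_edges(d).sum())  # pixels flagged at the two jumps
```

Keeping only these flagged pixels discards the flat interior of the platform, which is what shrinks the point cloud as described above.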
By computing the disparity image of the platform edges from the rectified stereo images and translating its data to 3D space, the point cloud model of the environment is constructed. Then, by projecting the points onto the XZ plane and stitching the local maps together based on visual odometry, the global map of the environment is constructed.
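The two steps above, reprojecting disparities to 3D and dropping the height axis to obtain a 2D map, can be sketched as follows; the focal length, baseline, and principal point are assumed values, not the study's calibration:

```python
import numpy as np

# Sketch of reprojecting edge pixels (u, v, disparity) to 3D points,
# then projecting the cloud onto the XZ plane for a 2D map.
# Camera intrinsics below are hypothetical.
def reproject(us, vs, ds, f=700.0, b=0.1, cx=480.0, cy=272.0):
    """Pinhole reprojection of pixel (u, v) with disparity d to (X, Y, Z)."""
    ds = np.asarray(ds, dtype=float)
    zs = f * b / ds
    xs = (np.asarray(us, dtype=float) - cx) * zs / f
    ys = (np.asarray(vs, dtype=float) - cy) * zs / f
    return np.stack([xs, ys, zs], axis=1)

pts = reproject([480.0, 550.0], [272.0, 300.0], [35.0, 35.0])
xz = pts[:, [0, 2]]   # drop Y: project the point cloud onto the XZ plane
print(xz)
```

Accumulating such XZ slices, each shifted by the pose estimated from visual odometry, yields the global map described in the text.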
To evaluate the accuracy of the algorithm in estimating the positions of the corners, the Euclidean distances between the corner coordinates measured by a Leica total station and those obtained from the local maps were computed.
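The per-corner error is simply the Euclidean distance between a surveyed corner and its mapped counterpart. A small sketch with illustrative coordinates (not the paper's measurements):

```python
import math

# Sketch of the corner-accuracy check: Euclidean distance between a
# corner surveyed by total station and the same corner in the local map.
# Coordinates below are illustrative only.
def corner_error(surveyed, mapped):
    return math.dist(surveyed, mapped)  # Euclidean norm, Python 3.8+

err = corner_error((2.50, 7.30), (2.55, 7.25))
print(round(err, 4))  # about 0.0707 m
```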
Results and Discussion
Results showed that the lower edges were detected with better accuracy than the upper ones. The upper edges were not extracted as well because of their proximity to the pots; in contrast, owing to the distance between the lower edge and the ground surface, the lower edges were extracted with higher quality. Since the upper and lower edges of a platform run in the same direction, only the lower edges were used to produce an integrated map of the greenhouse environment. The total length of the cultivation platform edges was 106.6 m, of which 94.79% was detected by the proposed algorithm. Some regions of the platform edges were not detected because they did not lie within the field of view of the stereo camera.
With the proposed algorithm, 83.33% of the cultivation platforms' corners were detected, with an average error of 0.07309 m and a mean squared error of 0.0076. The undetected corners did not lie within the camera's field of view. The maximum and minimum localization errors, measured as Euclidean distances, were 0.169 and 0.0001 m, respectively.
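The reported statistics, average error and mean squared error, follow directly from the per-corner Euclidean distances. A sketch using hypothetical sample errors (not the paper's data set):

```python
# Sketch of the reported error statistics over per-corner Euclidean
# distances. The sample errors below are illustrative, not the
# paper's measurements.
errors = [0.169, 0.0001, 0.05, 0.08]            # metres, hypothetical
mean_err = sum(errors) / len(errors)            # average error
mse = sum(e * e for e in errors) / len(errors)  # mean squared error
print(mean_err, mse)
```

Note that the MSE weights large outliers more heavily than the mean error, which is why both figures are worth reporting together.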
Conclusion
Stereo vision is the perception of 3D depth from the disparity between two images. In navigation, stereo vision is used to localize obstacles to movement, and cultivation platforms are the main such obstacles in greenhouses. Therefore, it is possible to build an integrated map of the greenhouse environment and perform automatic control by localizing the cultivation platforms. In this research, the depth-discontinuity feature at the edge locations was used to localize the platform edges. Using this feature, the number of points required to establish the point cloud model, and hence the associated processing time, decreased, resulting in improved accuracy in determining the coordinates of the platforms' corners.

Keywords

1. Bay, H., A. Ess, T. Tuytelaars, and L. Van Gool. 2008. Speeded-up robust features (SURF). Computer Vision and Image Understanding 110: 346-359.
2. Benson, E., J. Reid, and Q. Zhang. 2003. Machine vision–based guidance system for an agricultural small–grain harvester. Transactions of the ASAE 46: 1255-1264.
3. Bhatti, A. 2011. Global 3D Terrain Maps for Agricultural Applications. Pages 227-242 in Rovira-Mas F, ed. Advances in theory and applications of stereo vision. InTech. Croatia.
4. Bradski, G., and A. Kaehler. 2008. Learning OpenCV: Computer vision with the OpenCV library. O'Reilly Media, Inc. Sebastopol, CA.
5. Brand, C., M. J. Schuster, H. Hirschmüller, and M. Suppa. 2014. Stereo-vision based obstacle mapping for indoor/outdoor SLAM. In IEEE/RSJ International Conference on Intelligent Robots and Systems. Chicago, IL, USA.
6. Canton, J., J. Donaire, and J. Sanchez-Hermosilla. 2012. Stereovision based software to estimate crop parameters in greenhouses. In Information Technology, Automation and Precision Farming. International Conference of Agricultural Engineering-CIGR-AgEng: Agriculture and Engineering for a Healthier Life. Valencia, Spain.
7. Civera, J., O. G. Grasa, A. J. Davison, and J. Montiel. 2009. 1-point RANSAC for EKF-based structure from motion. In IEEE/RSJ International Conference on Intelligent Robots and Systems. St. Louis, USA.
8. Craig, J. J. 2005. Introduction to robotics: mechanics and control. Pearson Prentice Hall. Upper Saddle River, New Jersey, USA.
9. Cyganek, B., and J. P. Siebert. 2009. An Introduction to 3D Computer Vision Techniques and Algorithms. John Wiley & Sons, Ltd. United Kingdom.
10. Hirschmuller, H. 2005. Accurate and efficient stereo processing by semi-global matching and mutual information. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05). San Diego, CA, USA.
11. Kise, M., and Q. Zhang. 2008. Development of a stereovision sensing system for 3D crop row structure mapping and tractor guidance. Biosystems Engineering 101: 191-198.
12. Kise, M., Q. Zhang, and F. R. Mas. 2005. A stereovision-based crop row detection method for tractor-automated guidance. Biosystems Engineering 90: 357-367.
13. Kitt, B., A. Geiger, and H. Lategahn. 2010. Visual odometry based on stereo image sequences with RANSAC-based outlier rejection scheme. In Intelligent Vehicles Symposium. University of California, San Diego, CA, USA.
14. Milella, A., B. Nardelli, D. Di Paola, and G. Cicirelli. 2009. Robust Feature Detection and Matching for Vehicle Localization in Uncharted Environments. In Proceedings of the IEEE/RSJ IROS Workshop Planning, Perception and Navigation for Intelligent Vehicles. Saint Louis, USA.
15. Peyman, S. H., A. B. Ziaratgahi, and A. Jafari. 2016. Exploring the possibility of using digital image processing technique to detect diseases of rice leaf. Journal of Agricultural Machinery 6 (1): 69-79. (In Farsi).
16. Rosell, J., and R. Sanz. 2012. A review of methods and applications of the geometric characterization of tree crops in agricultural activities. Computers and Electronics in Agriculture 81: 124-141.
17. Rovira-Mas, F., Q. Zhang, and J. Reid. 2005. Creation of three-dimensional crop maps based on aerial stereoimages. Biosystems Engineering 90: 251-259.
18. Rovira-Mas, F., Q. Zhang, and J. F. Reid. 2008. Stereo vision three-dimensional terrain maps for precision agriculture. Computers and Electronics in Agriculture 60: 133-143.
19. So, G. J., S. H. Kim, and J. Y. Kim. 2014. The Extraction of Depth Discontinuities Using Disparity Map for Human Visual Fatigue. International Journal of Computer Theory and Engineering 6: 330-335.
20. Trucco, E., and A. Verri. 1998. Introductory techniques for 3-D computer vision. Prentice Hall. Englewood Cliffs, New Jersey, USA.
21. Xia, C., Y. Li, T. S. Chon, and J. M. Lee. 2009. A stereo vision based method for autonomous spray of pesticides to plant leaves. In Industrial Electronics, ISIE. IEEE International Symposium on. Seoul Olympic Parktel, Seoul, Korea.
22. Yang, L., and N. Noguchi. 2012. Human detection for a robot tractor using omni-directional stereo vision. Computers and Electronics in Agriculture 89: 116-125.
23. Yeh, Y. H. F., T. C. Lai, T. Y. Liu, C. C. Liu, W. C. Chung, and T. T. Lin. 2014. An automated growth measurement system for leafy vegetables. Biosystems Engineering 117: 43-50.
24. Zhang, Z. 1999. Flexible camera calibration by viewing a plane from unknown orientations. In Computer Vision. The Proceedings of the Seventh IEEE International Conference on. Kerkyra, Greece.