MATEC Web Conf.
Volume 68, 2016 — The 3rd International Conference on Industrial Engineering and Applications (ICIEA 2016)
Number of page(s): 6
Section: Design and Development of Robots
Published online: 01 August 2016
Objects Classification for Mobile Robots Using Hierarchic Selective Search Method
1 Information Science and Engineering, Hunan University, 410082 Changsha, China
2 Civil Engineering, Hunan University, 410082 Changsha, China
Classification with the bag-of-words model has proved effective for determining the category of near-duplicate/planar images, but does it still work well on images captured by mobile robots against complex backgrounds? In this paper, a method named hierarchical selective search is proposed. Based on an improved merging criterion, it hierarchically extracts complementary features to form a combined, environment-adaptable similarity measurement for segmentation, yielding a small set of high-quality regions. These regions, rather than the whole image, are then used for classification. As a result, classification accuracy is clearly improved and the bag-of-words model remains effective for object classification on mobile robots. Experiments on hierarchical selective search show better performance than selective search on two mobile-robot task datasets, and classification experiments show that samples drawn from the extracted regions outperform the original whole images. The advantage of fewer, higher-quality object regions from hierarchical selective search is especially prominent in specialized mobile-robot tasks where training data are scarce.
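The region-proposal idea sketched in the abstract — greedily merging the most similar neighbouring regions under a combined similarity measure, and keeping every intermediate region as a candidate — can be illustrated with a minimal, self-contained sketch. Everything here (the `Region` class, `combined_similarity`, `hierarchical_merge`, the fixed cue weights, and the toy initial over-segmentation) is an illustrative assumption, not the authors' implementation; the paper's environment-adaptable weighting is stood in for by a fixed weight pair.

```python
import numpy as np

class Region:
    """A candidate region: a normalized colour histogram, a pixel count,
    and the ids of its spatial neighbours."""
    def __init__(self, hist, size, neighbors):
        self.hist = hist
        self.size = size
        self.neighbors = set(neighbors)

def combined_similarity(a, b, image_size, w=(0.5, 0.5)):
    """Weighted sum of two complementary cues: colour (histogram
    intersection) and size (favouring merges of small regions).
    The weights w are a stand-in for an environment-adaptable weighting."""
    s_colour = np.minimum(a.hist, b.hist).sum()
    s_size = 1.0 - (a.size + b.size) / image_size
    return w[0] * s_colour + w[1] * s_size

def hierarchical_merge(regions, image_size):
    """Greedily merge the most similar pair of neighbouring regions until
    one region covers the image; every region produced along the way is
    kept, so the result is a hierarchy of proposals at all scales."""
    active = dict(enumerate(regions))
    proposals = list(regions)          # initial regions are proposals too
    next_id = len(regions)
    while len(active) > 1:
        best, pair = -1.0, None
        for i, r in active.items():
            for j in r.neighbors:
                if j in active and j > i:   # each adjacent pair once
                    s = combined_similarity(r, active[j], image_size)
                    if s > best:
                        best, pair = s, (i, j)
        if pair is None:                    # no adjacent pairs left
            break
        i, j = pair
        a, b = active.pop(i), active.pop(j)
        merged = Region(
            # size-weighted average keeps the histogram normalized
            (a.size * a.hist + b.size * b.hist) / (a.size + b.size),
            a.size + b.size,
            (a.neighbors | b.neighbors) - {i, j},
        )
        for k in merged.neighbors:
            if k in active:
                active[k].neighbors.add(next_id)
        active[next_id] = merged
        proposals.append(merged)
        next_id += 1
    return proposals
```

Starting from n initial regions, each merge removes two regions and adds one, so a full pass yields 2n − 1 proposals; in the paper's setting these proposals, rather than the whole image, would then be fed to the bag-of-words classifier.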
© The Authors, published by EDP Sciences, 2016
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.