Spatiotemporal saliency detection using border connectivity (29 August 2016)
This paper proposes a border connectivity-based spatiotemporal saliency model for videos with complicated motion and complex scenes. Based on the superpixel segmentation of video frames, feature extraction is performed to obtain three features: a motion orientation histogram, a motion amplitude histogram, and a color histogram. Border connectivity is then exploited to evaluate the importance of the three features for distance fusion. Finally, background-weighted contrast and saliency optimization are utilized to generate superpixel-level spatiotemporal saliency maps. Experimental results on a public benchmark dataset demonstrate that the proposed model outperforms state-of-the-art saliency models in saliency detection performance.
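The border connectivity prior at the core of the abstract can be sketched in code. The sketch below follows the standard boundary-connectivity definition from robust background detection, BndCon(p) = Len_bnd(p) / sqrt(Area(p)), computed from geodesic distances over the superpixel graph; the toy adjacency, feature distances, and border flags are illustrative assumptions, not the paper's actual motion/color features.

```python
import numpy as np

def boundary_connectivity(adj, feat_dist, is_border, sigma=1.0):
    """Boundary connectivity per superpixel, following the definition this
    line of work builds on: BndCon(p) = Len_bnd(p) / sqrt(Area(p)), where
    both terms are soft sums of geodesic-distance similarities.

    adj       : (n, n) bool adjacency matrix of the superpixel graph
    feat_dist : (n, n) feature distances between adjacent superpixels
    is_border : (n,) bool, True if a superpixel touches the frame border
    """
    n = adj.shape[0]
    INF = 1e18
    # Edge cost = feature distance on adjacent pairs, infinity otherwise.
    geo = np.where(adj, feat_dist, INF)
    np.fill_diagonal(geo, 0.0)
    for k in range(n):  # Floyd-Warshall all-pairs geodesic distances
        geo = np.minimum(geo, geo[:, k:k + 1] + geo[k:k + 1, :])
    sim = np.exp(-geo ** 2 / (2.0 * sigma ** 2))
    area = sim.sum(axis=1)                   # soft "spanning area" of each region
    len_bnd = sim[:, is_border].sum(axis=1)  # soft length along the frame border
    return len_bnd / np.sqrt(area)

# Hypothetical toy graph: a chain of 4 superpixels; node 1 is an "object"
# region with a large feature distance to its neighbours, the rest background.
adj = np.zeros((4, 4), dtype=bool)
for a, b in [(0, 1), (1, 2), (2, 3)]:
    adj[a, b] = adj[b, a] = True
feat_dist = np.zeros((4, 4))
feat_dist[0, 1] = feat_dist[1, 0] = 5.0
feat_dist[1, 2] = feat_dist[2, 1] = 5.0
feat_dist[2, 3] = feat_dist[3, 2] = 0.1
is_border = np.array([True, False, False, True])

bc = boundary_connectivity(adj, feat_dist, is_border)
# The object superpixel (node 1) gets near-zero boundary connectivity, so a
# background measure built from it marks node 1 as salient.
```

Low boundary connectivity flags a region as likely foreground; the paper then uses such background weights in the contrast and optimization stages.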
Tongbao Wu, Zhi Liu, and Junhao Li
"Spatiotemporal saliency detection using border connectivity," Proc. SPIE 10033, Eighth International Conference on Digital Image Processing (ICDIP 2016), 1003344 (29 August 2016); https://doi.org/10.1117/12.2244867