Paper
Best-next-view algorithm for three-dimensional scene reconstruction using range images
3 October 1995
Abstract
The primary focus of the research detailed in this paper is to develop an intelligent sensing module capable of automatically determining the optimal next sensor position and orientation during scene reconstruction. To address this problem, we have assembled a system for reconstructing a 3D model of an object or scene from a sequence of range images. Candidates for the best-next-view position are determined by detecting and measuring occlusions of the range camera's view in an image. The candidate that will reveal the greatest amount of unknown scene information is then selected as the best-next-view position. Our algorithm uses ray tracing to determine how much new information a given sensor perspective will reveal. We have tested our algorithm successfully on several synthetic range data streams and found the system's results to be consistent with an intuitive human search. The models recovered by our system from range data compared well with the ideal models. In essence, we have demonstrated that range information of physical objects can be employed to automatically reconstruct a satisfactory dynamic 3D computer model at minimal computational expense. This has obvious implications for robot navigation, manufacturing, and hazardous materials handling. The algorithm we developed requires no a priori information about the scene to find the best-next-view position.
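The selection loop the abstract describes, scoring candidate viewpoints by how many unknown regions their rays would reveal, can be sketched as follows. This is not the paper's implementation; it is a minimal illustrative sketch assuming a hypothetical voxel occupancy grid with UNKNOWN/EMPTY/OCCUPIED states, a fixed ray-marching step, and hand-picked candidate sensor positions.

```python
# Illustrative best-next-view sketch (hypothetical data structures, not the
# paper's code): ray-trace from each candidate sensor position through a voxel
# grid and pick the position whose rays reveal the most UNKNOWN voxels.
import numpy as np

UNKNOWN, EMPTY, OCCUPIED = 0, 1, 2

def ray_score(grid, origin, target, step=0.25):
    """March a ray from origin toward target, counting UNKNOWN voxels
    encountered before the ray is blocked by an OCCUPIED voxel."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(target, dtype=float) - origin
    length = np.linalg.norm(direction)
    if length == 0:
        return 0
    direction /= length
    seen, score, t = set(), 0, 0.0
    while t <= length:
        idx = tuple(np.floor(origin + t * direction).astype(int))
        if all(0 <= i < s for i, s in zip(idx, grid.shape)) and idx not in seen:
            seen.add(idx)
            if grid[idx] == OCCUPIED:
                break              # view blocked: an occlusion
            if grid[idx] == UNKNOWN:
                score += 1         # new scene information revealed
        t += step
    return score

def best_next_view(grid, candidates):
    """Score each candidate sensor position by the total UNKNOWN voxels its
    rays reach; return the highest-scoring candidate and all scores."""
    # Aim rays at the centers of all non-empty voxels (a simple heuristic).
    targets = np.argwhere(grid != EMPTY) + 0.5
    scores = [sum(ray_score(grid, c, tgt) for tgt in targets)
              for c in candidates]
    return candidates[int(np.argmax(scores))], scores
```

As a usage example, with an occupied wall at x = 2 and an unknown region behind it at x = 3, a candidate at x = 5 (facing the unknown region directly) scores higher than a candidate at x = -1 (whose rays are all occluded by the wall), so `best_next_view` selects the former.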
© (1995) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
J. E. Banta, Yu Zhien, X. Z. Wang, G. Zhang, M. T. Smith, and Mongi A. Abidi "Best-next-view algorithm for three-dimensional scene reconstruction using range images", Proc. SPIE 2588, Intelligent Robots and Computer Vision XIV: Algorithms, Techniques, Active Vision, and Materials Handling, (3 October 1995); https://doi.org/10.1117/12.222691
Proceedings, 12 pages

