Paper
4 March 2019
Endoscopic orientation by multimodal data fusion
Proceedings Volume 10931, MOEMS and Miniaturized Systems XVIII; 1093114 (2019) https://doi.org/10.1117/12.2508470
Event: SPIE OPTO, 2019, San Francisco, California, United States
Abstract
To improve the feasibility of endoscopic inspection processes, we developed a system that provides online information about the position, orientation, and viewing direction of an endoscope, supporting the analysis of endoscopic images and easing the operational handling of the equipment. The setup is based on an industrial endoscope and combines a camera, various MEMS sensors, and multimodal data fusion. The software contains algorithms for feature and geometric-structure recognition as well as Kalman filters. To track the distal end of the endoscope and to generate 3D point-cloud data in real time, the optical and photometric characteristics of the system are registered, and the movement of the endoscope is reconstructed using image-processing techniques.
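The abstract mentions Kalman filtering as the fusion mechanism for the camera and MEMS sensor data. The paper itself does not specify the filter's state model, so the following is only a minimal illustrative sketch of the general predict/update cycle such a filter performs, here for a 1-D constant-velocity state estimated from noisy position measurements; the matrices F, H, Q, and R and all noise values are assumptions for the example, not taken from the paper:

```python
import numpy as np

# State x = [position, velocity]; we observe noisy position only.
F = np.array([[1.0, 1.0],   # state transition (time step dt = 1)
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])  # measurement model: position only
Q = 1e-4 * np.eye(2)        # process noise covariance (assumed)
R = np.array([[0.25]])      # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict: propagate state and covariance through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct the prediction with the measurement z.
    y = z - H @ x                   # innovation
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a target moving at constant velocity 1.0 from noisy readings.
rng = np.random.default_rng(0)
x, P = np.zeros(2), np.eye(2)
for t in range(1, 50):
    z = np.array([t * 1.0 + rng.normal(0, 0.5)])
    x, P = kalman_step(x, P, z)
print(x)  # estimated [position, velocity], close to [49, 1]
```

In the paper's setting the state would instead hold the endoscope tip's pose (position, orientation, viewing direction) and the measurements would come from the MEMS sensors and the image-processing pipeline, but the predict/update structure is the same.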
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Silvio Pulwer, Richard Fiebelkorn, Christoph Zesch, Patrick Steglich, Claus Villringer, Francesco Villasmunta, Egbert Gedat, Jan Handrich, Sigurd Schrader, and Ralf Vandenhouten "Endoscopic orientation by multimodal data fusion", Proc. SPIE 10931, MOEMS and Miniaturized Systems XVIII, 1093114 (4 March 2019); https://doi.org/10.1117/12.2508470
CITATIONS
Cited by 1 scholarly publication and 1 patent.
KEYWORDS
Endoscopes
Cameras
Endoscopy
Filtering (signal processing)
Sensors
Visualization
Data fusion