The measurement of breathing biomechanics, such as tidal volume, can be used to assess both the breathing performance and the respiratory health of individuals. State-of-the-art methods like spirometry or body plethysmography require a mouthpiece or facemask, which can be uncomfortable for the test person. As an alternative, we propose to use the change of the geometric shape of the subject’s torso while breathing. By acquiring 3D point clouds of the person with a real-time near-infrared (NIR) 3D scanner, we measure those changes in a comfortable, irritation-free, and contact-free manner. Accordingly, two continuously measuring structured-light 3D sensors, using a GOBO-based aperiodic sinusoidal pattern projector at a wavelength of 850 nm, simultaneously capture the upper front and side torso of the subject at a frame rate of 200 Hz. Both 3D scanners are calibrated and operated in a sensor-network fashion, yielding a unified data stream within a global coordinate system. This results in increased coverage and reduced occlusion of the patient’s body shape, enabling robust measurements even in the presence of loose clothing and varying body figures. We collected data from 16 healthy participants in an upright sitting position, wearing everyday clothing during the measurements. For reference, we simultaneously recorded spirometry readings. An algorithm (“OpTidal”) tracks the volume of the subject’s torso from the 3D data. Comparison with the reference data shows high correlation and low mean error for the absolute tidal volume readings. As such, our method is a viable, safe, and accurate alternative to spirometry and plethysmography.
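The abstract does not disclose the internals of the “OpTidal” algorithm. As a minimal sketch of the underlying idea of tracking torso volume from 3D data, the following hypothetical code integrates per-pixel prisms of a depth map against a fixed reference plane behind the subject; all function names and parameters are illustrative assumptions, not the published method.

```python
import numpy as np

def torso_volume(depth_map, ref_plane_z, pixel_area):
    """Approximate the volume enclosed between the torso surface and a fixed
    reference plane by summing one prism per valid pixel.
    depth_map: 2D array of z-coordinates in metres (NaN where no 3D point)."""
    height = ref_plane_z - depth_map          # prism height per pixel
    height = np.nan_to_num(height, nan=0.0)   # ignore holes in the scan
    height = np.clip(height, 0.0, None)       # points behind the plane count as 0
    return float(np.sum(height) * pixel_area)

def tidal_volume(v_end_inspiration, v_end_expiration):
    """Tidal volume as the difference of torso volumes between the breathing extremes."""
    return v_end_inspiration - v_end_expiration
```

In a real system, the reference plane and pixel footprint would follow from the sensor calibration, and the volume signal would be tracked over the 200 Hz stream.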
Pattern projection-based stereo 3D sensors are widely used for contactless, non-destructive optical 3D shape measurements. In previous works, we have shown 3D measurement systems based on stereo matching between two cameras with GOBO-projected aperiodic fringe patterns. We have also demonstrated a low-latency 3D reconstruction algorithm (BICOS), which can be used for real-time 3D measurements. We showed an optimization method for the projected aperiodic fringe patterns with the purpose of making the measurements more robust and reducing the pattern sequence length without sacrificing 3D model completeness. In this contribution, we demonstrate a sensor for a medical application which aggregates these developments. Our sensor is used to monitor patient movement during radiation therapy. In this application, low measurement latency is of high importance. A significant part of this latency is caused by image acquisition. We show that we can reduce the number of required image pairs to 6 by optimizing the projected aperiodic fringe patterns. In combination with our BICOS algorithm, we achieve total measurement latencies below 80 ms at an accuracy of 355 μm in a measurement field of 1 m × 2 m.
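BICOS itself is not reproduced in this abstract; as an illustration of the temporal correlation principle behind matching GOBO-projected aperiodic fringe sequences, the sketch below finds, for each pixel of a rectified image stack, the disparity that maximizes the normalized temporal correlation along the epipolar row. The brute-force search and array shapes are simplifying assumptions.

```python
import numpy as np

def temporal_correlation_disparity(left_seq, right_seq, max_disp):
    """left_seq, right_seq: rectified image stacks of shape (T, H, W),
    one frame per projected aperiodic fringe pattern."""
    def normalize(seq):
        s = seq - seq.mean(axis=0, keepdims=True)       # zero-mean in time
        n = np.linalg.norm(s, axis=0, keepdims=True)
        return s / np.where(n == 0, 1.0, n)             # unit-norm in time
    L, R = normalize(left_seq), normalize(right_seq)
    T, H, W = left_seq.shape
    disparity = np.zeros((H, W), dtype=int)
    best = np.full((H, W), -np.inf)
    for d in range(max_disp + 1):
        # correlate left pixel x with right pixel x - d on the same row
        corr = np.sum(L[:, :, d:] * R[:, :, :W - d], axis=0)
        better = corr > best[:, d:]
        best[:, d:][better] = corr[better]
        disparity[:, d:][better] = d
    return disparity
```

Shorter pattern sequences (fewer frames T) reduce acquisition latency but make spurious correlation maxima more likely, which is why the pattern optimization mentioned above matters.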
Optical 3D measurement using active pattern projection is well known for its high precision and high 3D point density. Recently, increasing the reconstruction frame rate and the number of active sensors in a simultaneous and continuous operation regime, as used in sensor networks, has become more important. Traditionally, light modulators such as LCoS, DMD, or GOBO (GOes Before Optics) have been used, which generate the projected pattern by blocking the light at dark areas of the pattern. In order to further increase the measurement speed and/or the number of time-sequential continuously active sensors, brighter light sources must be chosen to achieve sufficiently short exposure times. Alternatively, as we show in this paper, a more efficient pattern modulator can be used. By applying an optical freeform element to generate an aperiodic sinusoidal fringe pattern, up to 100 % of the available light can be utilized. In our prototype, we show how to employ a freeform element moved in a linear bearing to create a compact, low-cost, high-speed projection unit. Furthermore, to reduce the computational burden of processing numerous simultaneous image streams, we have implemented the rectification step of the 3D reconstruction pipeline on the field-programmable gate array (FPGA) of the sensor module. Both approaches enable us to use structured light sensors for continuous high-speed 3D measurement tasks in industrial quality control. The presented prototype utilizes a single irritation-free near-infrared (NIR) LED to illuminate and reconstruct within a measurement field of approximately 300 mm × 300 mm at a measurement distance of 500 mm.
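Moving rectification onto the sensor's FPGA is practical because, once the rectification maps are precomputed from the calibration, rectification reduces to a fixed per-pixel lookup with interpolation. The sketch below shows that lookup in software form; the map arrays and function name are assumptions for illustration, not the FPGA design.

```python
import numpy as np

def remap_bilinear(image, map_x, map_y):
    """Rectify an image by sampling it at precomputed (map_x, map_y)
    positions with bilinear interpolation, one lookup per output pixel."""
    x0 = np.clip(np.floor(map_x).astype(int), 0, image.shape[1] - 1)
    y0 = np.clip(np.floor(map_y).astype(int), 0, image.shape[0] - 1)
    x1 = np.clip(x0 + 1, 0, image.shape[1] - 1)
    y1 = np.clip(y0 + 1, 0, image.shape[0] - 1)
    fx = map_x - np.floor(map_x)              # fractional parts for interpolation
    fy = map_y - np.floor(map_y)
    top = (1 - fx) * image[y0, x0] + fx * image[y0, x1]
    bot = (1 - fx) * image[y1, x0] + fx * image[y1, x1]
    return (1 - fy) * top + fy * bot
```

Because the maps are static per calibration, an FPGA can stream this as a pipelined memory lookup with fixed-point weights, freeing the downstream compute for the correlation and triangulation stages.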
For continuous, low-latency, irritation-free 3D measurements in large volumes, dot-pattern- or time-of-flight-based sensors have traditionally been used. However, their measurement accuracy and temporal stability limit the application in demanding medical or industrial contexts. Practical solutions also need to remain cost-effective. To meet these requirements, we started from a simple GOBO-based, aperiodic sinusoidal pattern projection 3D sensor (using a near-infrared (NIR) LED) for medium-sized measurement volumes. By tuning the system for large-volume operation, we were able to obtain a reasonable combination of measurement accuracy and speed. The current realization covers a volume of up to 4.0 m x 2.2 m x 1.5 m (width x height x depth). The 3D data is acquired at < 20 fps at resolutions of < 1000 x 500 px and true end-to-end latencies of < 140 ms. We present the system architecture consisting of GigE Vision cameras, a high-power LED-driven projection unit using a GOBO wheel, and the compute backend for the online GPU-based temporal pattern correlation 3D calculation and filtering. To compensate for the low pattern intensity due to the short exposure time, we operate the cameras in 2x2 binning. Furthermore, the optics are tuned for large apertures to maximize light throughput. We characterize the sensor system with respect to measurement quality by quantitative evaluations including probing error, sphere-spacing error, and flatness measurement error. By comparison with another 3D sensor as a baseline, we show the benefits of our approach. Finally, we present measurement examples from a human-machine interface (HMI) application.
With recent advances in high-speed 3D measurement sensor technologies, the focus shifts away from merely acquiring 3D sensor data quickly. An advanced application area is the fusion of multiple sensor streams into a complete object representation without occlusions. Even more challenging is processing the high-speed 3D streams online, instead of with the currently common offline processing approaches.
To this end, we combine our cost-effective GOBO slide-based pattern projector (GOes Before Optics) with commodity GigE Vision network sensors into a multi-sensor system with complete online monitoring capabilities. The targeted use case has to deal with partial occlusions and low-latency requirements for machine control. Specifically, three active NIR stereo 3D sensors are aggregated through a 10 Gb Ethernet switch and processed by a single GPU-assisted workstation. Thus, a combined continuous data stream of up to 78 million 3D points per second is calculated online from a raw 2D data stream of up to approximately 1250 Mb/s. The system's latency for simpler 3D analysis tasks, like movement tracking, is ≤ 200 ms.
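Fusing the three calibrated sensor streams into one occlusion-reduced cloud is, at its core, a rigid transform per sensor followed by concatenation. A minimal sketch, assuming each sensor's pose (R, t) in the global frame is known from calibration (the function names are illustrative):

```python
import numpy as np

def to_global(points, R, t):
    """Transform one sensor's point cloud (N, 3) into the shared world
    frame using its calibrated pose (rotation R, translation t)."""
    return points @ R.T + t

def fuse(streams, poses):
    """Concatenate the per-sensor clouds after mapping each into the global
    coordinate system; regions occluded in one view are covered by others."""
    return np.vstack([to_global(p, (R := pose[0]), pose[1]) for p, pose in zip(streams, poses)])
```

In the real system this merge must keep up with ~78 million points per second, so it would run on the GPU rather than as a per-frame host-side copy.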
High-resolution contactless optical 3D measurements are well suited for determining the state and position of gas turbine vane cooling holes during maintenance rework. The air flow through the cooling holes protects the turbine vanes from the high temperatures. However, the coating needs to be renewed during repair of the vanes. The renewal process can lead to partially or completely filled cooling holes. This paper describes a newly developed procedure to automatically detect and reopen such holes by laser drilling for an effective new repair process. The turbine vane is scanned by a fringe projection-based optical 3D scanner. The resulting 3D point cloud delivers sufficient detail to automatically detect the cooling holes. Poorly detected or undetected cooling holes are interpolated from properly detected neighboring cooling holes and reference default cooling holes. For the subsequent laser drilling, the precise orientation of the vane in its mount must be known. To this end, the position and orientation of the scanned vane in relation to the reference vane are determined. To validate the approach, numerous experiments regarding the cooling-hole extraction performance were satisfactorily conducted. Real drilling experiments confirmed those findings and were used to validate the entire process.
Different concepts of correspondence finding in contactless optical three-dimensional (3-D) measurement systems using fringe projection are analyzed concerning the accuracy of the 3-D point calculation. These concepts differ in how the triangulation procedure is performed to calculate the resulting 3-D points and in the use of geometric constraints versus a second projection sequence. Triangulation may alternatively be performed between camera pixels and the phase origin of the projection, between one camera pixel in the first camera and the image of the corresponding measured phase value in the second camera, or between the image points of certain raster phase values in the two observation cameras. Additionally, triangulation procedures can be distinguished concerning the use of two perpendicular projection directions of the fringes versus the use of geometric constraints, i.e., epipolar geometry instead of the second projection direction. Advantages and disadvantages of the different techniques are discussed. In addition, a theoretical analysis has been carried out in simulations with synthetic data, and experiments have been performed on real measurement data. Both simulations and real-data experiments confirm the theoretical assumptions concerning the magnitudes of the random errors occurring in 3-D point determination.
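For the variant that triangulates between corresponding image points in the two observation cameras, a common generic formulation is midpoint triangulation between the two viewing rays. The sketch below is a textbook version under that assumption, not the specific procedure evaluated in the paper:

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Midpoint triangulation: the 3D point closest to both viewing rays,
    each given by a camera centre c and a unit direction d."""
    # Solve for ray parameters (s, t) minimizing |c1 + s*d1 - (c2 + t*d2)|
    A = np.stack([d1, -d2], axis=1)          # 3x2 system
    b = c2 - c1
    st, *_ = np.linalg.lstsq(A, b, rcond=None)
    p1 = c1 + st[0] * d1                     # closest point on ray 1
    p2 = c2 + st[1] * d2                     # closest point on ray 2
    return 0.5 * (p1 + p2)
```

The random 3-D error analyzed in the paper can be seen in this picture as the sensitivity of the midpoint to small angular perturbations of the two rays, which grows as the rays become more parallel.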
Portable 3D scanners with low measurement uncertainty are ideally suited for capturing the 3D shape of objects
right in their natural environment. However, elaborate manual post-processing was usually necessary to build a
complete 3D model from several overlapping scans (multiple views), or expensive or complex additional hardware
(like trackers etc.) was needed. In contrast, the NavOScan project[1] aims at fully automatic multi-view
3D scan assembly through a Navigation Unit attached to the scanner.
This lightweight device combines an optical tracking system with an inertial measurement unit (IMU)
for robust relative scanner position estimation. The IMU provides robustness against swift scanner movements
during view changes, while the wide angle, high dynamic range (HDR) optical tracker focused on the measurement
object and its background ensures accurate sensor position estimations. The underlying software framework,
partly implemented in hardware (FPGA) for performance reasons, fuses both data streams in real time and
estimates the navigation unit’s current pose. Using this pose to calculate the starting solution of the Iterative
Closest Point registration approach allows for automatic registration of multiple 3D scans. After finishing the
individual scans required to fully acquire the object in question, the operator is readily presented with the finalized
complete 3D model.
The paper presents an overview of the NavOScan architecture, highlights key aspects of the registration
and navigation pipeline and shows several measurement examples obtained with the Navigation Unit attached
to a hand held structured-light 3D scanner.
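The registration step can be illustrated generically: a point-to-point ICP whose starting solution (R0, t0) stands in for the pose estimated by the Navigation Unit. The brute-force nearest-neighbour search and the Kabsch/SVD update below are simplifying assumptions for the sketch, not NavOScan's implementation:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch/SVD step used inside ICP: the rigid transform best aligning
    already-corresponded points src -> dst in the least-squares sense."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1., 1., np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def icp(src, dst, R0, t0, iters=10):
    """Minimal point-to-point ICP; (R0, t0) plays the role of the
    Navigation Unit's pose estimate used as the starting solution."""
    R, t = R0, t0
    for _ in range(iters):
        moved = src @ R.T + t
        # nearest-neighbour correspondences (brute force, fine for a sketch)
        nn = np.argmin(((moved[:, None] - dst[None]) ** 2).sum(-1), axis=1)
        R_d, t_d = best_rigid_transform(moved, dst[nn])
        R, t = R_d @ R, R_d @ t + t_d        # compose the incremental update
    return R, t
```

A good starting pose matters because plain ICP only converges to the nearest local minimum; the IMU/optical-tracker fusion supplies exactly that starting solution.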
Three different methods to realize point correspondences in 3D measurement systems based on fringe projection are
described and compared concerning accuracy, sensitivity, and handling. Advantages and disadvantages of the three
techniques are discussed. A suggestion is made to combine the principles in order to achieve an improved completeness
of the measurements.
The principle of a virtual image point raster, which is the basis of the combination of the methods, is explained. A model
to describe the random error of a 3D point measurement for the three methods is established and described. Simulations
and real measurements confirm this error model. Experiments are described and results are presented.
For a 360-deg 3D measurement of an object, the optical 3D sensor scans the object from different positions, and the
resulting single patches have to be transformed into a common global coordinate system so that these point clouds
can be patched together to generate the final complete 3D data set. Here we summarize and give some system
realizations of the method, which we call the "method of virtual landmarks" /1, 2/, realizing this local-global
coordinate transformation without accurate mechanical sensor handling, sensor tracking, markers fixed on the
object, or point-cloud based registration techniques.
For this, the calculation of the coordinates, the orientation of the sensor, and the local-global coordinate
transformation is done by bundle adjustment methods, whereby the pixels of the so-called connecting camera form
'virtual landmarks' for the registration of the single views in order to obtain a complete all-around image. This
flexibility makes the method useful for a wide range of system realizations, which will be shown in the paper, such
as robot-guided, handheld /3/, and tripod-based systems for the flexible measurement of complex and/or large objects.
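The bundle adjustment mentioned above minimizes reprojection residuals of the virtual landmarks over the sensor poses. The residual itself can be sketched as follows, assuming a generic pinhole model with intrinsics K; a real bundle adjustment would minimize this jointly over many landmarks and views, which is omitted here:

```python
import numpy as np

def reprojection_residual(X, R, t, K, uv):
    """Residual minimized in bundle adjustment: project world point X with
    camera pose (R, t) and intrinsic matrix K, then compare to the
    observed pixel position uv."""
    x_cam = R @ X + t            # world point in camera coordinates
    p = K @ x_cam                # homogeneous pixel coordinates
    return p[:2] / p[2] - uv     # 2D reprojection error
```

Stacking these residuals over all virtual landmarks and views and solving the resulting nonlinear least-squares problem yields the sensor poses, and hence the local-global transformation, without physical markers.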
Here, a new setup of a 3D scanning system for CAD/CAM in the dental industry is proposed. The system is designed for direct scanning of dental preparations within the mouth. The measuring process is based on a phase correlation technique in combination with fast fringe projection in a stereo arrangement. The novelty of the approach is characterized by the following features: A phase correlation between the phase values of the images of two cameras is used for the coordinate calculation. This stands in contrast to the use of only phase values (phasogrammetry) or classical triangulation (phase values and camera image coordinate values) for the determination of the coordinates. The main advantage of the method is that the absolute value of the phase at each point does not directly determine the coordinate. Thus, errors in the determination of the coordinates are prevented. Furthermore, using the epipolar geometry of the stereo-like arrangement, the phase unwrapping problem of fringe analysis can be solved.
The endoscope-like measurement system contains one projection channel and two camera channels for illumination and observation of the object, respectively. The new system has a measurement field of nearly 25 mm × 15 mm. The user can measure two or three teeth at one time, so the system can be used for scanning anything from a single tooth up to bridge preparations. In the paper, the first realization of the intraoral scanner is described.
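The core of the phase correlation step can be illustrated generically: given the phase observed at a pixel of the first camera, the corresponding point is located at subpixel precision along the epipolar line in the second camera where the same phase value occurs, and the 3D point is then triangulated between the two cameras. The sketch below assumes a rectified epipolar line with monotonically increasing unwrapped phase; it is an illustration, not the published algorithm:

```python
import numpy as np

def match_phase_on_line(phi, line_phase):
    """Find the subpixel position on a rectified epipolar line whose
    unwrapped phase equals phi, by linear interpolation between the two
    bracketing samples. line_phase must increase monotonically."""
    idx = np.searchsorted(line_phase, phi)
    if idx == 0 or idx >= len(line_phase):
        return None                      # phase value not observed on this line
    p0, p1 = line_phase[idx - 1], line_phase[idx]
    return (idx - 1) + (phi - p0) / (p1 - p0)
```

Note that the phase value acts purely as a correspondence label here; the coordinate itself comes from triangulating the matched camera positions, which is why absolute phase errors do not directly corrupt the 3D result.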
A new mobile optical 3D measurement system using phase correlation based fringe projection technique will be
presented. The sensor consists of a digital projection unit and two cameras in a stereo arrangement, both of
which are battery-powered. The data transfer to a base station is done via WLAN. This makes it possible to
use the system in complicated, remote measurement situations, which are typical in archaeology and architecture.
In the measurement procedure the sensor will be hand-held by the user, illuminating the object with a sequence
of less than 10 fringe patterns, within a time below 200 ms. This short sequence duration was achieved by a new
approach, which combines the epipolar constraint with robust phase correlation utilizing a pre-calibrated sensor
head, containing two cameras and a digital fringe projector.
Furthermore, the system can be utilized to acquire the all around shape of objects by using the phasogrammetric
approach with virtual landmarks introduced by the authors1, 2. This way, no matching procedures or
markers are necessary for the registration of multiple views, which makes the system very flexible in accomplishing
different measurement tasks. The realized measurement field is approx. 100 mm up to 400 mm in diameter.
The mobile character makes the measurement system useful for a wide range of applications in arts, architecture,
archaeology and criminology, which will be shown in the paper.
Here we propose a method for 3D shape measurement by means of phase correlation based fringe projection in a
stereo arrangement. The novelty of the approach is characterized by the following features. Correlation between the phase
values of the images of two cameras is used for the co-ordinate calculation. This work stands in contrast to the sole
usage of phase values (phasogrammetry) or classical triangulation (phase values and image co-ordinates - camera
raster values) for the determination of the co-ordinates. The method's main advantage is the insensitivity of the 3D coordinates
to the absolute phase values. Thus it prevents errors in the determination of the co-ordinates and
improves robustness in areas with interreflection artefacts and regions of inhomogeneous intensity. A technical
advantage is the fact that the accuracy of the 3D co-ordinates does not depend on the projection resolution. Thus the
achievable quality of the 3D co-ordinates can be selectively improved by the use of high quality camera lenses and
can benefit from improvements in modern camera technology.
The presented new solution of stereo-based fringe projection with phase correlation makes a flexible, error-tolerant
realization of measuring systems within different applications like quality control, rapid prototyping, design
and CAD/CAM possible. In the paper the phase correlation method will be described in detail. Furthermore, different
realizations will be shown, i.e. a mobile system for the measurement of large objects and an endoscopic like system
for CAD/CAM in dental industry.
3D measurement of the shape of rough structures can be realised with structured light illumination techniques. Several
problems can arise while measuring complex object geometries with these techniques. Complex objects are characterized,
e.g., by deep holes, walls, and concave and convex corner-like surface structures. When illuminating the object, one
part of the object can "illuminate" another one, yielding locally spurious fringe patterns. Due to these spurious fringe
patterns, the phase values are strongly distorted, significantly increasing the measurement noise locally. Here we propose
methods to detect and avoid these spurious fringe patterns. The idea is to use the redundant (over-determined) information which
is contained in the graycode and the sinusoidal intensity distribution. On the basis of this procedure, an operator is defined
which results in a mask operation. With this new method we can reduce the noise amplitude. In this paper, the detection and
reduction of the illumination effect using this operator will be demonstrated while measuring different object geometries.
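The paper's operator is not reproduced in this abstract, but one simple way to exploit the redundancy between the graycode and the sinusoidal coding can be sketched as follows: the finest graycode bit splits each fringe period into two halves, and the wrapped phase independently indicates which half a pixel lies in, so disagreement between the two flags pixels distorted by indirect illumination. This is an illustrative consistency check under those assumptions, not the authors' operator:

```python
import numpy as np

def spurious_mask(wrapped_phase, finest_gray_bit):
    """Flag pixels where the half-period indicated by the wrapped phase
    (phase < pi vs. phase >= pi) contradicts the finest graycode bit;
    indirect illumination typically corrupts the two codings differently."""
    half_from_phase = (wrapped_phase % (2 * np.pi)) >= np.pi
    return half_from_phase != finest_gray_bit.astype(bool)
```

Masked pixels would then be excluded from (or down-weighted in) the 3D point calculation, which reduces the local noise amplitude at the cost of some completeness.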