Study of ambient light influence for three-dimensional scanners based on structured light
Sophie Voisin, Sebti Foufou, Frédéric Truchetet, David L. Page, Mongi A. Abidi
Abstract
Ambient light in a scene can introduce errors into range data from most commercial three-dimensional range scanners, particularly scanners based on projected patterns and structured lighting. We study the effects of ambient light on a specific commercial scanner. We further present a method for characterizing the range accuracy as a function of ambient light distortions. After a brief review of related research, we first describe the capabilities of the scanner we used and define the experimental setup for our study. Then we present the results of the range characterization relative to ambient light. In these results, we note a systematic error source that appears to be an artifact of the projected structured light pattern. We conclude with a discussion of this error and the physical meaning of the results overall.

1. Introduction

Three-dimensional (3D) scanners are used in a growing number of applications, and range accuracy is a main challenge for manufacturers. Knowledge of this accuracy helps users better exploit the results and the information from the scanners. When the manufacturer does not provide enough information, the user has to characterize the scanner before using it. This letter describes preliminary results we have obtained during the characterization of a commercial 3D scanner based on structured light.1 More precisely, we focus on the study of range accuracy with respect to the ambient illuminant, using a reproducible experimental setup.

This letter is organized as follows: Section 2 presents related works. The scanner and experimental setup are described in Sec. 3. We present and discuss the results in Sec. 4. Conclusions and future work are given in Sec. 5.

2. Related Works

Two categories of methods emerge from the literature on range accuracy: methods based on the optical transfer function2, 3, 4 and methods based on the measurement of known objects.5, 6, 7, 8 We have also reviewed color accuracy characterization methods, which can likewise be classified into two categories: colorimetric-based camera characterization9 and spectral-based characterization.10, 11, 12 Building on this knowledge, we have investigated the influence of color on the range accuracy of a 3D scanner based on structured light. During our initial acquisitions, we also observed that ambient light influences the acquired color information, similar to what is observed in two-dimensional imaging experiments. So far this phenomenon has not been studied for 3D scanners based on structured light.

3. Experiments

Usually 3D scanners are chosen based on their ability to digitize under particular conditions. Notably, the scanner used in this study is designed to capture objects under ambient illumination and gives a 3D textured mesh as output. It is based on structured light; more details can be found in Refs. 13, 14. Basically, the projected pattern is composed of vertical spatiotemporally modulated stripes, and the field of view is a boxlike volume of 510×400×300 mm³ (W×H×D) whose center is 1 m from the front of the scanner.

The system setup was constant for all the experiments. It consisted of placing a Macbeth ColorChecker chart, a grid composed of 24 colored patches, in a light booth capable of displaying different illuminants. Following the manufacturer's recommended setup, the chart was placed in the light booth as perpendicular as possible to the scanner. In addition, we fixed the chart on a special support to keep it flat. The experiments were carried out sequentially without modifying the setup; the only varying parameter was the illuminant.

4. Results and Discussion

To evaluate the results for each patch, we manually selected the faces in the middle area of each patch (30×30 mm² of the full 40×40 mm²) to avoid human perception bias and possible inaccurate overlay of the color information. For a concise presentation, Fig. 1 shows the results for only 2 of the 7 acquisitions and a single patch of the 24.

Fig. 1. Graphical representations of the bluish-green patch of the Macbeth ColorChecker under two different illuminants: (a) dark night 1, (b) daylight.

For meaningful measurements of the range accuracy, we had to know the exact orientation of the chart with respect to the scanner. Due to the design of the commercial scanner we used, we could not know its relative position perfectly. Therefore, we statistically chose a reference patch and considered its orientation with respect to the scanner to be the same as that of the whole chart. From this reference patch, we obtained a reference plane P_R, which we used to compute the geometric deviation Δ for each patch under each illuminant using Eq. 1. In this equation, δ(P_R, M) is the signed distance between the reference plane P_R and a point M on the patch surface S. The deviation Δ captures the range accuracy of the scanner because it represents the deviation of the points from their theoretical positions; for a perfect scanner, Δ would equal 0. The results are shown in Table 1.

Eq. 1
\[
\Delta = \max_{M^{+} \in S}\bigl[0,\, \delta(P_R, M^{+})\bigr] - \min_{M^{-} \in S}\bigl[\delta(P_R, M^{-}),\, 0\bigr].
\]
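As an illustration, Eq. 1 can be evaluated directly on scanned vertices. The following is a minimal Python sketch, assuming the reference and patch vertices are available as (N, 3) NumPy arrays in millimeters; the least-squares plane fit via SVD and the synthetic test data are our assumptions, not the authors' implementation.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points: returns the centroid and a
    unit normal (the direction of least variance, from an SVD)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def deviation(reference_points, patch_points):
    """Deviation Delta of Eq. 1: spread of the signed distances
    delta(P_R, M) of the patch points to the reference plane P_R."""
    centroid, normal = fit_plane(reference_points)
    delta = (patch_points - centroid) @ normal      # signed distances
    return max(0.0, delta.max()) - min(0.0, delta.min())

# Hypothetical usage with synthetic, near-planar patches (units: mm).
rng = np.random.default_rng(0)
ref = rng.normal(scale=[10.0, 10.0, 0.05], size=(500, 3))    # flat reference patch
patch = rng.normal(scale=[10.0, 10.0, 0.30], size=(500, 3))  # noisier colored patch
print(f"Delta = {deviation(ref, patch):.2f} mm")
```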

Table 1. The deviation Δ of each patch under each illuminant with respect to the reference plane P_R (in millimeters).

Patch color    | Dark night 1 | Daylight | Cool white | UV   | U30   | Dark night 2 | Horizon
---------------|--------------|----------|------------|------|-------|--------------|--------
Dark skin      | 1.21         | 0.96     | 1.19       | 1.11 | 1.06  | 0.98         | 5.46
Light skin     | 0.65         | 0.75     | 0.64       | 0.55 | 0.50  | 0.46         | 247.42
Blue sky       | 0.53         | 0.54     | 0.65       | 0.57 | 0.57  | 0.33         | 55.57
Foliage        | 0.85         | 0.78     | 0.67       | 0.89 | 1.07  | 0.73         | 3.49
Blue flower    | 0.42         | 0.69     | 0.43       | 0.53 | 0.60  | 0.42         | 226.77
Bluish green   | 0.31         | 1.32     | 0.44       | 0.50 | 0.69  | 0.36         | 2.37
Orange         | 0.65         | 0.74     | 0.79       | 0.66 | 0.77  | 0.68         | 231.18
Purplish blue  | 0.58         | 0.59     | 0.65       | 0.54 | 0.65  | 0.70         | 3.00
Moderate red   | 0.58         | 0.61     | 0.58       | 0.48 | 0.71  | 0.60         | 105.70
Purple         | 0.76         | 0.71     | 0.80       | 0.72 | 0.91  | 0.69         | 10.43
Yellow green   | 0.45         | 0.56     | 0.59       | 0.46 | 0.70  | 0.48         | 9.41
Orange yellow  | 0.45         | 1.71     | 0.60       | 0.44 | 0.86  | 0.50         | 118.85
Blue           | 1.02         | 0.99     | 1.04       | 0.93 | 0.84  | 0.90         | 3.55
Green          | 0.65         | 0.60     | 0.78       | 0.50 | 0.60  | 0.57         | 2.53
Red            | 0.54         | 0.64     | 0.68       | 0.56 | 0.58  | 0.54         | 56.21
Yellow         | 0.38         | 1.04     | 0.84       | 0.28 | 0.89  | 0.36         | 173.68
Magenta        | 0.52         | 0.67     | 0.61       | 0.52 | 0.41  | 0.51         | 181.66
Cyan           | 0.59         | 0.81     | 0.77       | 0.60 | 0.47  | 0.62         | 1.98
White 9.5      | 1.17         | 23.24    | 16.14      | 1.13 | 12.13 | 1.23         | 232.39
Neutral 8      | 0.43         | 1.31     | 0.60       | 0.41 | 0.86  | 0.59         | 9.79
Neutral 6.3    | 0.29         | 0.44     | 0.49       | 0.35 | 0.40  | 0.55         | 1.29
Neutral 5      | 0.46         | 0.45     | 0.53       | 0.45 | 0.59  | 0.79         | 2.33
Neutral 3.5    | 0.90         | 0.84     | 0.89       | 0.82 | 0.95  | 0.95         | 3.39
Black 2        | 2.85         | 1.88     | 2.34       | 3.02 | 2.22  | 3.01         | 9.33
Range          | 2.56         | 22.80    | 15.71      | 2.74 | 11.73 | 2.68         | 246.13

We have observed that the influence of the illuminant on range accuracy depends on the original color. As Table 1 shows, the daylight illuminant produces a small deviation Δ of 0.44 mm for the neutral 6.3 patch but a large deviation of 23.24 mm for the white 9.5 patch. We have also observed that the deviation range varies with the illuminant. For instance, the dark night 1 illuminant induces a range of 2.56 mm, from 0.29 mm for the neutral 6.3 patch to 2.85 mm for the black 2 patch; the horizon illuminant induces a huge range of 246.13 mm, from 1.29 mm for the neutral 6.3 patch to 247.42 mm for the light skin patch; and the cool white illuminant yields a range of 15.71 mm, from 0.43 mm for the blue flower patch to 16.14 mm for the white 9.5 patch.
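For clarity, the "Range" row of Table 1 is simply the spread max Δ − min Δ of each illuminant column over the 24 patches. A small Python check, reproducing only a subset of each Table 1 column:

```python
# Spread (max - min) of the deviation Delta per illuminant column.
# Only a few of the 24 values per column are reproduced here, chosen to
# include that column's extremes; the full columns are in Table 1.
dark_night_1 = [1.21, 0.65, 0.29, 2.85]     # min 0.29, max 2.85
horizon = [5.46, 247.42, 1.29, 9.33]        # min 1.29, max 247.42

for name, column in (("dark night 1", dark_night_1), ("horizon", horizon)):
    print(f"{name}: range = {max(column) - min(column):.2f} mm")
# -> dark night 1: range = 2.56 mm; horizon: range = 246.13 mm
```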

In addition, we have observed that a systematic error appears for certain colors under certain illuminants, as shown in Fig. 1. By systematic error we denote the repetitive wavy effect visible along the patch surface in Fig. 1b, in distinct contrast to the quasi-flat appearance of the same colored patch under a different illuminant in Fig. 1a. This error seems to come from the projected pattern itself and to depend on the color, but more statistical studies are necessary to be conclusive. However, the phenomenon can be physically explained with the response model of the digital camera given in Eq. 2. This equation represents the digital camera response r_n^{i,j} for each channel n (in our case, the three red-green-blue channels) at the pixel (i,j) with respect to the spectral power distribution E(λ) (how the power of a light source is distributed across wavelengths), the surface reflectance S^{i,j}(λ) (the proportion of light reflected by the surface) at the pixel (i,j), and the spectral sensitivity R_n(λ) of the sensor (its sensitivity as a function of wavelength), integrated over the visible spectrum Λ. In our study, the spectral power distribution, Eq. 3, is composed of two components: the illuminant spectral power distribution E_I(λ) and the projector spectral power distribution E_P(λ):

Eq. 2
\[
r_n^{i,j} = \int_{\Lambda} E(\lambda)\, S^{i,j}(\lambda)\, R_n(\lambda)\, d\lambda,
\]

Eq. 3
\[
E(\lambda) = E_I(\lambda) + E_P(\lambda).
\]
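To make Eqs. 2 and 3 concrete, the sketch below numerically integrates the response model for one pixel. All spectra are invented placeholders (a flat ambient illuminant, a Gaussian projector stripe, a step-like reflectance, and Gaussian RGB sensitivities), not measured data from the scanner.

```python
import numpy as np

lam = np.linspace(400.0, 700.0, 301)     # visible spectrum Lambda, in nm
dlam = lam[1] - lam[0]

def gaussian(center, width):
    return np.exp(-0.5 * ((lam - center) / width) ** 2)

E_I = np.full_like(lam, 0.8)             # ambient illuminant E_I (flat, assumed)
E_P = gaussian(550.0, 40.0)              # projector stripe E_P (assumed)
S = 0.2 + 0.6 * (lam > 500.0)            # surface reflectance S (assumed)
R = {"r": gaussian(600.0, 30.0),         # RGB sensor sensitivities R_n (assumed)
     "g": gaussian(540.0, 30.0),
     "b": gaussian(460.0, 30.0)}

E = E_I + E_P                            # Eq. 3
# Eq. 2: response of channel n for this pixel, as a discrete integral.
response = {n: float(np.sum(E * S * R_n) * dlam) for n, R_n in R.items()}
print(response)   # changing E_I shifts all three channel responses
```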

As long as a scanner uses only one wavelength to reconstruct the 3D information, E(λ) is the same for every pixel under a given illuminant; this case was investigated by Clark and Robson.6 When different wavelengths are projected or, as in our case, vertical stripes of different gray levels, E(λ) varies nonlinearly from one pixel column to another. The correspondence between the wavelength λ_k and the angle θ_k then no longer matches the manufacturer's calibration. Therefore, the computation of the depth D, the distance between a point on the object surface and the sensor, is no longer accurate. For instance, D is computed with Eq. 4, which is used for standard triangulation:

Eq. 4
\[
D = \frac{B \sin\theta_k}{\sin(\alpha_{i,j} + \theta_k)}.
\]
To summarize, a false correspondence between the wavelength λ_k and the angle θ_k leads to an erroneous computation of the depth D.
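As a brief illustration of this sensitivity, the sketch below perturbs the angle in Eq. 4. The baseline and angle values are assumed for illustration only, since the scanner's actual calibration is proprietary.

```python
import math

def depth(B, theta_k, alpha):
    """Standard triangulation of Eq. 4: D = B*sin(theta_k)/sin(alpha + theta_k)."""
    return B * math.sin(theta_k) / math.sin(alpha + theta_k)

B = 200.0                          # baseline in mm (assumed)
alpha = math.radians(55.0)         # viewing angle alpha_{i,j} (assumed)
theta = math.radians(80.0)         # projection angle theta_k from calibration

D_true = depth(B, theta, alpha)
# An illuminant-induced shift of the perceived stripe color maps to the
# wrong theta_k; even half a degree noticeably corrupts the depth.
D_bad = depth(B, theta + math.radians(0.5), alpha)
print(f"D = {D_true:.1f} mm  vs  {D_bad:.1f} mm with a 0.5 deg angle error")
```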

5. Conclusion and Future Work

In this letter, we have shown that illumination strongly influences the range accuracy of structured light scanners, in addition to its well-known influence on color accuracy. We statistically estimated the orientation of the Macbeth ColorChecker with respect to the scanner in order to evaluate the range accuracy through the deviation Δ. We have also proposed a physical explanation for the systematic error observed on some colored patches. Future work will investigate this systematic error in more detail in order to identify ways to reduce or eliminate it.

Acknowledgments

This work is supported by the University Research Program in Robotics under Grant No. DOE-DE-FG52-2004NA25589 and by the DOD/RDECOM/NAC/ARC Program under Grant No. W56HZV-04-2-2001.

References

1. J. Salvi, J. Pages, and J. Batlle, "Pattern codification strategies in structured light systems," Pattern Recogn. 37(4), 827–849 (2004). https://doi.org/10.1016/j.patcog.2003.10.002

2. S. Dore and Y. Goussard, "Experimental determination of CT point spread function anisotropy and shift-variance," pp. 788–791 (1997).

3. M. Goesele, C. Fuchs, and H.-P. Seidel, "Accuracy of 3D scanners by measurement of the slanted edge modulation transfer function," pp. 37–44 (2003).

4. S. E. Reichenbach, S. K. Park, and R. Narayanswamy, "Characterizing digital image acquisition devices," Opt. Eng. 30(2), 170–177 (1991). https://doi.org/10.1117/12.55783

5. J.-A. Beraldin and M. Gaiani, "Evaluating the performance of close range 3D active vision systems for industrial design applications," Proc. SPIE 5665, 7–77 (2005).

6. J. Clark and S. Robson, "Accuracy of measurements made with a Cyrax 2500 laser scanner against surfaces of known colour," pp. 1031–1036 (2004).

7. S. El-Hakim, J.-A. Beraldin, and F. Blais, "A comparative evaluation of the performance of passive and active 3D vision systems," Proc. SPIE 2646, 14–25 (1995).

8. G. Sansoni, M. Carocci, and R. Rodella, "Calibration and performance evaluation of a 3D imaging sensor based on the projection of structured light," IEEE Trans. Instrum. Meas. 49(3), 628–636 (2000). https://doi.org/10.1109/19.850406

9. T. Johnson, "Methods for characterizing colour scanners and digital cameras," Displays 16(4), 183–192 (1996). https://doi.org/10.1016/0141-9382(96)01012-8

10. G. D. Finlayson, S. Hordley, and P. M. Hubel, "Recovering device sensitivities with quadratic programming," J. Imaging Sci. Technol. 6, 90–95 (1998).

11. J. Y. Hardeberg, H. Brettel, and F. Schmitt, "Spectral characterization of electronic cameras," pp. 100–109 (1998).

12. L. MacDonald and W. Ji, "Colour characterization of a high-resolution digital camera," J. Imaging Sci. Technol. 1, 433–437 (2002).

13. J. Geng, P. Zhuang, P. May, S. Yi, and D. Tunnell, "3D FaceCam™: a fast and accurate 3D facial imaging device for biometrics applications," Proc. SPIE 5404, 316–327 (2004).

14. Z. J. Geng, "Rainbow 3-dimensional camera: new concept of high-speed 3-dimensional vision system," Opt. Eng. 35(2), 376–383 (1996). https://doi.org/10.1117/1.601023
©(2007) Society of Photo-Optical Instrumentation Engineers (SPIE)