Open Access
6 June 2013
Three-dimensional imaging using fast micromachined electro-absorptive shutter
Yong-Hwa Park, Yong-Chul Cho, Jang-Woo You, Chang-Young Park, Hee-Sun Yoon, Sang-Hun Lee, Jong-Oh Kwon, Seung-Wan Lee, Byung Hoon Na, Gun Wu Ju, Hee Ju Choi, Yong Tak Lee
Abstract
A 20-MHz-switching high-speed light-modulating device for three-dimensional (3-D) image capturing and its system prototype are presented. For 3-D image capturing, the system utilizes the time-of-flight (TOF) principle by means of a 20-MHz high-speed micromachined electro-absorptive modulator, the so-called optical shutter. The high-speed modulation is obtained by utilizing the electro-absorption mechanism of a multilayer structure, which has an optical resonance cavity and light-absorbing epilayers grown by a metal organic chemical vapor deposition process. The optical shutter device is specially designed to have a small resistor–capacitor (RC) time constant to achieve the high-speed modulation. The optical shutter is positioned in front of a standard high-resolution complementary metal oxide semiconductor image sensor and modulates the incoming infrared image to acquire the depth image. The suggested novel optical shutter device enables capturing of a full high-definition (FHD) depth image, whereas previous depth-capturing technologies have been limited to video graphics array (VGA) resolution. The suggested 3-D image sensing device can have a crucial impact on 3-D-related businesses such as 3-D cameras, gesture recognition, user interfaces, and 3-D displays. This paper presents the micro-opto-electro-mechanical systems-based optical shutter design, fabrication, characterization, 3-D camera system prototype, and image evaluation.

1.

Introduction

Three-dimensional (3-D) imaging is an emerging differentiator that provides consumers with more realistic and immersive experiences in user interfaces, games, 3-D virtual reality, and 3-D displays. A 3-D image carries depth information (the distance from camera to object) together with the conventional color image, so that the full information of real objects that human eyes experience can be captured, recorded, and reproduced. So far, stereo-type two-lens 3-D cameras capturing two separate color images have been introduced in the market, especially for the application of stereo-vision displays. However, 3-D content will eventually expand to multi- and volumetric views, the so-called realistic 3-D contents. To capture a realistic 3-D scene, depth information and a high-definition color image should be captured simultaneously so that views from arbitrary directions can be generated, as a scene generated from real objects allows.

Depth-capturing devices have been developed for games, industry, automobiles, and military applications.1–13 Among the existing depth-capturing technologies, the structured light method (recently well known through Kinect)2,3 and TOF (time-of-flight) sensors based on silicon image sensor technology [charge-coupled device (CCD), complementary metal oxide semiconductor (CMOS) image sensor (CIS)]4–9 are commercialized or close to market release. The structured light method utilizes the well-known triangulation principle: depth is calculated by analyzing the image of a particular pattern projected onto the object. Because the projector is placed a certain distance apart from the image sensor, depth information can be extracted from the captured pattern image by triangulation.2,3 The advantage of this method is that it can be realized at relatively low cost. Its disadvantage is that it needs this baseline distance, normally several centimeters, which makes the overall system larger in the lateral direction than a normal one-lens camera. TOF sensors4–9 have a small form factor: they use infrared (IR) light sources placed around the imaging lens, so they show camera-like form factors. Their disadvantage compared to pattern projection is that a complex pixel structure is needed, such as a single-photon detector4 or a phase-demodulation pixel structure,5–9 which is relatively expensive and delivers a low resolution compared with standard image sensors. The depth image resolution (i.e., the number of pixels in the depth image) obtained by the above two technologies has been limited to video graphics array (VGA) because of either the limitation of the pattern (in structured light methods) or the complexity of the depth pixel structure (in TOF sensors).
Among other TOF technologies, high-speed modulation has been developed in the field of gesture recognition and professional studio applications of depth capturing.10–13 For example, a depth image with resolution up to high definition (HD) can be obtained utilizing high-speed light modulation with an image intensifier combined with an HD standard image sensor, but the image intensifier is very expensive and bulky.11,12

This work’s approach stems from the high-speed modulation method:10–13 a depth image of up to full high-definition (FHD) resolution is obtained utilizing a suggested novel image modulator that has a small form factor and is easily made by mass-production processes. A TOF sensor detects the phase delay of the reflected light relative to the emitted light, caused by the time of flight of the modulated light. For this purpose, the modulation speed should be sufficiently high, for example 20 MHz, to detect the phase delay of modulated light traveling up to 15 m (20-MHz amplitude modulation of light has a modulation wavelength of 15 m). In this example, a depth of 0 to 7.5 m (half the total travel distance of the light) can be detected with the full phase range of 0 to 360 deg, which maximizes the resulting depth accuracy. Otherwise, if 1-kHz low-speed modulation were used, for example, the depth range of 0 to 7.5 m would map to a partial phase range of only 0 to 0.018 deg, which results in low depth accuracy. The relevant detailed TOF principle is explained in the next section. This high-speed modulation of an image is prohibitively difficult to achieve with conventional image-modulating devices such as liquid crystals and mechanical camera shutters. In this paper, a novel TOF method using a micro-opto-electro-mechanical systems (MOEMS)-based high-speed light modulator, the so-called optical shutter, is presented for high-resolution depth image capturing. A novel multilayered film structure is designed and fabricated to realize 20-MHz light modulation for TOF operation.
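The phase-range argument above can be checked numerically. The short sketch below (plain Python, using only the values quoted in the text) computes the modulation phase span accumulated over a 0 to 7.5 m depth range for both modulation frequencies.

```python
import math

C = 3.0e8  # speed of light, m/s

def phase_span_deg(mod_freq_hz: float, max_depth_m: float) -> float:
    """Phase delay accumulated by light traveling out and back (2*d),
    expressed in degrees of the amplitude-modulation period."""
    round_trip = 2.0 * max_depth_m
    wavelength = C / mod_freq_hz          # modulation wavelength, m
    return 360.0 * round_trip / wavelength

# 20-MHz modulation: 15-m modulation wavelength, so a 7.5-m depth
# range spans the full 0 to 360 deg phase range.
print(phase_span_deg(20e6, 7.5))   # 360.0

# 1-kHz modulation: the same depth range spans only a tiny phase slice.
print(phase_span_deg(1e3, 7.5))    # 0.018
```

The 20-MHz case uses the entire inverse-tangent output range, which is why the higher modulation frequency directly translates into better depth accuracy.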

For commercialization, the 3-D image (color plus depth) should be easily captured by a camera-like system with high image quality and affordable price. For this purpose, a one-lens/two-sensor system architecture is prototyped in this work for simultaneous capturing of a 14-Mp color image and an FHD depth image. The optical shutter is positioned in front of a standard complementary metal oxide semiconductor (CMOS) image sensor to modulate the incoming IR images for depth image extraction. The optical shutter design, fabrication, characterization, 3-D camera prototype, and its image test are presented.

2.

TOF Principle

Depth capturing is based on the TOF principle as depicted in Fig. 1. The 3-D camera has an IR light source (e.g., 850-nm wavelength) with sinusoidal amplitude modulation (e.g., 20 MHz). It illuminates an object, and the reflected IR light comes back to the camera imaging lens. Because the IR light travels twice the distance between the object and the camera (the so-called depth, d), there is a time delay of the reflected light relative to the illuminated light (the so-called time of flight, t_TOF) such that

Eq. (1)

d = (c/2) · t_TOF,
where c is the speed of light. There is a corresponding phase delay of the modulated reflected light (φ_TOF), whose relationship with t_TOF is

Eq. (2)

t_TOF = φ_TOF / (2πf),
where f is the IR modulation frequency. Extraction of the phase delay is a key process of the TOF system. The phase delay of each pixel can be identified with the homodyne mixing technique developed by this group.14–16 A detailed description of the entire TOF process is omitted here for brevity: the reflected IR image is modulated by the optical shutter with the same modulation frequency (i.e., 20 MHz) before being captured by the CMOS image sensor. By applying additionally controlled phase shifts, for example (0, 90, 180, and 270 deg), between the IR light source and the optical shutter driving signal, four different IR images (I_0°, I_90°, I_180°, I_270°) are obtained sequentially. The phase delay due to TOF of each pixel can be identified from the four sequentially captured IR images as

Eq. (3)

φ_TOF = tan⁻¹[(I_270° − I_90°) / (I_0° − I_180°)].

Fig. 1

Schematic of time-of-flight (TOF) operation. An infrared (IR) source illuminates the object with sinusoidal intensity modulation. Phase shifts of (0, 90, 180, 270 deg) are applied to the IR modulation sequentially, and the optical shutter is modulated with a fixed phase shift (0 deg). Modulated IR images through the optical shutter are sequentially captured by a standard high-resolution CMOS image sensor (CIS), resulting in four images (I_0°, I_90°, I_180°, I_270°). The depth image is extracted from these four images as shown in Eqs. (3) and (4).


By combining Eqs. (1)–(3), we can finally get depth information of each pixel by utilizing the modulated IR images as

Eq. (4)

d = [c / (4πf)] · tan⁻¹[(I_270° − I_90°) / (I_0° − I_180°)].

It is notable that the resolution of the depth image (i.e., the number of pixels) is determined by that of the CMOS image sensor, which can have more than FHD resolution in current image sensor technology.
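Equation (4) maps the four sequentially captured IR intensities to a per-pixel depth. A minimal per-pixel sketch of that mapping follows (plain Python; the intensity values in the usage lines are made-up illustrations, not measured data):

```python
import math

C = 3.0e8        # speed of light, m/s
F_MOD = 20e6     # IR modulation frequency, Hz

def depth_from_phases(i0: float, i90: float, i180: float, i270: float) -> float:
    """Depth per Eq. (4): d = c/(4*pi*f) * arctan of the intensity ratio.
    atan2 resolves the quadrant from the signs of numerator and denominator,
    recovering the full 0..2*pi phase range (0 to 7.5 m at 20 MHz)."""
    phi = math.atan2(i270 - i90, i0 - i180)
    if phi < 0.0:                      # fold atan2's (-pi, pi] onto [0, 2*pi)
        phi += 2.0 * math.pi
    return C / (4.0 * math.pi * F_MOD) * phi

# Synthetic pixel whose differences give phi_TOF = pi/4:
print(depth_from_phases(2.0, 1.0, 1.0, 2.0))   # ~0.9375 m

# phi_TOF = pi corresponds to half the unambiguous range:
print(depth_from_phases(1.0, 2.0, 2.0, 2.0))   # ~3.75 m
```

Using `atan2` rather than a plain arctangent is one reasonable way to handle the quadrant ambiguity; the paper's actual pipeline may resolve phase wrap-around differently.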

3.

High-Speed Optical Shutter

3.1.

Design

To enable the depth capturing, the optical shutter should modulate the incoming IR image with a 20-MHz on-off speed, as explained in the previous section. Conventional image shutters, such as mechanical shutters or liquid crystals, cannot reach 20-MHz modulation speed because they rely on mechanically moving parts or slow molecular reorientation. To achieve this extraordinarily high speed, the optical shutter is composed of nonmoving solid-state multilayer films, which is the novel concept of this work. The core mechanism of the optical shutter is controllable electro-absorption in a multiple quantum well (MQW) combined with Fabry–Perot optical resonance.17–19 Figure 2 shows its complete layer structure: from the top side, the optical shutter consists of a p-doped electrode, a p-doped distributed Bragg reflector (DBR), an intrinsic MQW, an n-doped DBR, and an n-doped electrode. The shutter device is optically a Fabry–Perot narrow bandpass filter whose center wavelength is designed to match the wavelength of the IR light source of the 3-D camera (e.g., 850 nm).

Fig. 2

Layer structure of optical shutter.


The upper p-DBR and lower n-DBR mutually work as a pair of resonating mirrors, and the middle i-MQW works as a resonance cavity whose optical thickness is a multiple of half the center wavelength (850 nm).17–20 A control voltage is applied across the p- and n-electrodes in backward bias so that the light absorption of the i-MQW region is controlled by the well-known quantum confined Stark effect (QCSE),21,22 as shown in Fig. 3. As the control voltage increases, the maximum absorption peak, the so-called exciton peak, moves from 837 nm (at zero electric field) to 850 nm (at 8.1 V/μm). Consequently, at the 850-nm center wavelength of the optical shutter, the transmittance of the input IR image is variable by controlling the applied voltage, as the simulation in Fig. 4 shows.

Fig. 3

Light absorption characteristics of the i-multiple quantum well (i-MQW) region versus the control electric field across the p- and n-electrodes, governed by the quantum confined Stark effect (QCSE). Simulated in MATLAB with a theoretical model supplemented by empirical data.


Fig. 4

Simulated IR transmittance of the optical shutter versus applied voltages (Macleod multilayer diffractive optics solver used).


In the design process, the design parameters are the material composition, the thickness of each layer, and the number of layers, chosen to maximize the transmittance variation at the center wavelength of the optical shutter (850 nm). Figures 2 and 4 show the design result with 52% transmittance variation: when the applied electric field is 0 V/μm, there is only small light absorption at 850 nm, as shown in Fig. 3, and the transmittance of the optical shutter is maximum (67%); when the applied field is 8.1 V/μm, there is large absorption at 850 nm and the transmittance is minimum (15%). The result is a modulated 850-nm IR image through the optical shutter with 52% transmittance variation.
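A toy numerical picture of this voltage-to-transmittance mapping is sketched below (plain Python). Only the two exciton-peak endpoints (837 nm at zero field, 850 nm at 8.1 V/μm) and the transmittance extremes (67%/15%) come from the text; the linear peak shift and the Lorentzian absorption line shape with its 5-nm width are invented purely for illustration.

```python
def exciton_peak_nm(field_v_per_um: float) -> float:
    """Linear QCSE red shift fit to the two points quoted in the text:
    837 nm at zero field, 850 nm at 8.1 V/um. (Linearity is an assumption.)"""
    return 837.0 + (850.0 - 837.0) * field_v_per_um / 8.1

def transmittance_at_850(field_v_per_um: float,
                         t_max: float = 0.67,   # design maximum, from the text
                         t_min: float = 0.15,   # design minimum, from the text
                         linewidth_nm: float = 5.0) -> float:
    """Toy transmittance at the 850-nm resonance: a Lorentzian absorption
    line centered on the exciton peak eats into t_max as the field pushes
    the peak onto 850 nm. Line shape and width are invented parameters."""
    detune = 850.0 - exciton_peak_nm(field_v_per_um)
    absorption = 1.0 / (1.0 + (detune / linewidth_nm) ** 2)  # 0..1
    return t_max - (t_max - t_min) * absorption

print(transmittance_at_850(0.0))   # weak absorption at 850 nm -> near t_max
print(transmittance_at_850(8.1))   # exciton on resonance -> t_min (0.15)
```

The real device behavior follows the coupled-cavity simulation of Figs. 3 and 4; this sketch only shows why sweeping the field between the two quoted endpoints sweeps the 850-nm transmittance between its extremes.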

To achieve high speed and uniform control of transmittance over the transmitting area, the optical shutter was designed with a special p-electrode pattern that realizes low sheet resistance. Also, the whole device is divided into electrically separated cells to reduce the capacitance of each unit cell, and each cell is driven by an individual external voltage source.23,24 Figure 5 shows the device structure of the shutter with individually controllable cells, separated by 10-μm-wide, 4-μm-deep trenches. The p-electrode (metal) pattern should present small resistance over the top surface (favoring more metal coverage) while keeping a large fill factor, i.e., a large ratio of light-transmitting area to the total IR-receiving window (favoring less metal coverage). Because metal electrodes cast shadows, this is a trade-off, and the shape of the p-electrode was determined by design optimization.

Fig. 5

Optical shutter device: structural view with fish-bone p-electrodes pattern and multiple cells for reduction of capacitance of each cell.


For this, an electro-optic coupling analysis with 3-D mesh modeling was performed to find the optimum shape of the p-electrode, as shown in Fig. 6. The design parameter was the number of fingers of the fishbone-shaped metal electrode. By computing the fill factor and the modulation cutoff frequency of the cell structure, the optimum can be found as summarized in Table 1. As the number of fingers increases, the resistance decreases and the cutoff frequency therefore increases, whereas the fill factor of the active window decreases. As a result, 10 fingers was chosen as the optimum design because it shows a relatively uniform speed of 19.9 to 20.6 MHz over the entire cell with a good fill factor of 95%.

Fig. 6

Three-dimensional mesh modeling of the unit cell and electro-optic coupling simulation to find optimum design of p-metal electrode. Full model is developed in Silvaco to get good accuracy. Three-dimensional reduced model is built by COMSOL for computational efficiency.


Table 1

Design optimization result of the p-metal electrode. The design parameter is the number of fingers of the p-electrode. The fingers are metal, so increasing their number decreases the overall sheet resistance of the device and, in turn, increases the cutoff frequency, which is inversely proportional to the RC time constant.

Fingers (n) | Fill factor (%) | Cutoff frequency (MHz)
3           | 97              | 6.3 to 21.4
10          | 95              | 19.9 to 20.6
20          | 90              | 22.2 to 22.7
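The trade-off in Table 1 can be illustrated with a toy lumped-RC model (plain Python). The sheet-resistance base value, cell capacitance, and window width below are invented for illustration only and tuned so the 10-finger case lands near 20 MHz and 95%; the paper's actual numbers came from the 3-D Silvaco/COMSOL electro-optic models, and a real cell is a distributed network, not a single RC.

```python
import math

def cell_model(n_fingers: int,
               finger_width_um: float = 10.0,     # from the text
               window_width_um: float = 2000.0,   # invented
               sheet_r_base_ohm: float = 200.0,   # invented
               cap_pf: float = 400.0):            # invented
    """Toy lumped model of one shutter cell.

    Assumptions (illustrative only):
      - metal fingers short the p-sheet, so the effective series
        resistance scales roughly as 1/n_fingers;
      - each 10-um finger shadows part of the active window, so the
        fill factor drops linearly with total metal width.
    """
    r_eff = sheet_r_base_ohm / n_fingers                     # ohms
    fill = 1.0 - n_fingers * finger_width_um / window_width_um
    f_cut = 1.0 / (2.0 * math.pi * r_eff * cap_pf * 1e-12)   # Hz
    return fill, f_cut

for n in (3, 10, 20):
    fill, f_cut = cell_model(n)
    print(f"{n:2d} fingers: fill {fill:.1%}, cutoff {f_cut / 1e6:.1f} MHz")
```

The model reproduces the qualitative trend of Table 1: more fingers lower the resistance and raise the cutoff frequency while shrinking the fill factor, which is exactly the trade-off that makes 10 fingers the chosen optimum.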

3.2.

Fabrication

The fabrication process has three steps, as shown in Fig. 7. First, the multilayer films are epitaxially grown on a GaAs substrate by metal organic chemical vapor deposition (MOCVD); the thickness and composition of each layer are precisely controlled to obtain the 837-nm exciton peak and the 850-nm Fabry–Perot resonance peak. Second, cells are isolated by dry etching of trenches, and the p-electrode metal is patterned. Third, the GaAs substrate at the active window area, through which the IR image is transmitted, is opened by wet etching; to control the substrate etching depth accurately, an InGaP etch-stop layer between the GaAs substrate and the epitaxial multilayer films is utilized.24 As a result of the substrate etching, the multilayer film constitutes an electrically controllable IR-transmitting membrane (the so-called active window). Figure 8 shows the fabricated device. The device size is 7.3 × 5.5 mm, and the multilayer structure has an active window of 6 × 4.5 mm, which transmits and modulates IR images onto the grayscale CMOS image sensor (CIS). The entire device is divided into 56 cells with a fishbone metal width of 10 μm and 10 fingers per cell; the device fill factor is 95%.

Fig. 7

Fabrication process of the optical shutter.


Fig. 8

Fabricated optical shutter device with 95% fill-factor and 56 cells (top view, right: microscope picture). (a) Device design (mask). (b) Fabricated device structure (microscope).


3.3.

Characterization

The following are four characterization results of the optical shutter, which is a kind of large-area p–i–n diode operated in backward (reverse) bias. First, for stable operation of the shutter, the breakdown voltage should be sufficiently larger than the operating voltage range. Figure 9 shows the current–voltage (I–V) measurement for characterization of the breakdown voltage: an average of −27 V was measured, and the operating voltage range of −15 to 0 V lies well within the nonbreakdown range of −27 to 0 V.

Fig. 9

Current–voltage (I–V) measurement for characterization of the breakdown voltage of the optical shutter. An average of −27 V was measured. The operating range is −15 to 0 V.


Second, to obtain sufficient IR intensity and a good signal-to-noise ratio, the amount of IR light modulation should be sufficiently large. This is characterized by the difference between the maximum and minimum transmittance, the so-called transmittance variation. When depth is calculated as in Eq. (4), the transmittance variation, which determines the difference between the intensities of the IR images, directly influences the depth accuracy. Figure 10 shows the measured transmittance spectrum of the optical shutter for different control voltages. A maximum of 65% (at 0 V) and a minimum of 14% (at −9.3 V) transmittance were measured, giving a transmittance variation of 51% (target 50%). It is notable that the simulation (52% transmittance variation in Fig. 4) predicted the measurement (Fig. 10) very well.

Fig. 10

Transmittance measurement for characterization of the transmittance variation of the optical shutter. A maximum of 65% (0 V) and a minimum of 14% (−9.3 V) transmittance were measured. The transmittance variation is 51% (target 50%).


Third, the 20-MHz high-speed modulation of the optical shutter was evaluated by its electro-optic frequency response, in which the transmitted IR light is measured under a frequency-swept sinusoidal electric input. Figure 11 shows the frequency response of the optical shutter: a cutoff frequency (−3 dB attenuation) of 20.3 MHz was achieved (target 20 MHz), close to the simulation prediction summarized in Table 1.

Fig. 11

Frequency response measurement for characterization of the speed of the optical shutter. A cutoff (−3 dB) frequency of 20.3 MHz or larger was measured (target 20 MHz).


Fourth, the spatial resolution of the IR image transmitted through the optical shutter should be properly preserved to obtain a high-resolution depth image up to FHD (1920 × 1080, 2 Mp). Figure 12 shows the ISO 12233 chart used to evaluate the spatial resolution of the optical shutter. The IR image resolution is preserved at 14 Mp both before and after the optical shutter is placed in front of the 14-Mp CMOS image sensor (target: 1920 × 1080 FHD, 2 Mp), so the optical shutter does not degrade the spatial resolution of the optical system. However, the experiments showed that the optical shutter should be placed 1 to 3 mm away from the focal plane of the CMOS image sensor to blur out the shadow of the metal electrodes on the photosensitive area of the image sensor.

Fig. 12

Modulation transfer function (MTF) measurement for characterization of spatial resolution of the optical shutter using ISO 12233 resolution chart. The IR image resolution is preserved up to 14 Mp before and after optical shutter is placed in front of CMOS image sensor [target is 1920×1080 full high definition (FHD), 2 Mp].


4.

3-D Camera System

4.1.

Architecture

For commercialization, 3-D images should be easily captured by a camera-like system with high image quality and affordable price. For this purpose, a one-lens/two-sensor system architecture, shown in Fig. 13, was designed and prototyped for simultaneous capturing of 14-Mp color and FHD depth images. The 3-D capturing system is based on the TOF scheme and consists of illumination and imaging modules: the illumination module is composed of 850-nm IR laser diode (LD) sources with 20-MHz amplitude modulation and collimating optics for efficient IR beam shaping.25,26 The imaging module consists of an imaging lens set, a splitter, a depth channel (the optical shutter plus a black-and-white (BW) CMOS image sensor (CIS)), and a color channel (an RGB CMOS image sensor). The incoming color and IR images are separated by the splitter according to their wavelength bands (visible and IR) and redirected toward the color channel and depth channel simultaneously, as shown in Fig. 13. The transmittance spectrum of the fabricated splitter is plotted in Fig. 14: the visible band (400 to 700 nm) is reflected with about 98% reflectance (below 2% transmittance), and the IR band around 850 nm is transmitted with 99% transmittance. Since the wavelength division efficiency of the splitter is about 98% to 99% by the filter design shown in Fig. 14, very high separation efficiency of the color and depth images is achieved without a critical loss of light energy in either band.

Fig. 13

One-lens/two-sensor 3-D camera architecture: simultaneous capturing of 14-Mp color and FHD depth images by dividing visible and IR bands.


Fig. 14

Transmittance spectrum of splitter. Visible band (400–700 nm) is reflected with reflectance 98%, i.e., transmittance 2%, and IR band around 850 nm is transmitted with transmittance 99%.


In the depth channel, the optical shutter modulates the IR image at 20 MHz, and the resulting modulated image is recorded by the grayscale BW CIS. The depth image is calculated from the modulated images based on the novel homodyne-mixing technique14–16 explained in Sec. 2. It is notable that the resolution of the depth image can exceed FHD, since the system utilizes the full resolution of the standard grayscale CIS (FHD to 14 Mp). In the color channel, the color image is simultaneously captured by a standard color CIS. The color and depth images are processed by a unified processor, named the 3-D image signal processor, placed at the back end of the system. As a result, FHD depth and 14-Mp color images are captured and processed simultaneously.

4.2.

Prototype

The 3-D camera architecture designed in Fig. 13 was prototyped using a commercial lens/body, off-the-shelf CIS sets, driver electronics, and IR sources, plus the developed optical shutter device. Figure 15 shows the structure of the prototype. All components are integrated in a Samsung NX-body setup, and 14-Mp color and FHD IR images are captured simultaneously with the one-lens/two-sensor setup. The color and IR images are captured and displayed on two LCD displays.

Fig. 15

Three-dimensional camera prototype with one-lens/two-sensor architecture for capturing of 14-Mp color and FHD IR images.


5.

Depth Image Evaluation

The optical shutter approach was evaluated by capturing color and depth images of a series of test objects (Julian, Venus, and flowers, placed 2 to 3 m away from the 3-D camera) with the 3-D camera prototype, as shown in Fig. 16. An array of 850-nm IR sources with a total 500-mW optical output and 20-MHz amplitude modulation illuminates the objects. FHD IR images are captured via modulation by the optical shutter, and the depth image is extracted from the captured IR images under the suggested homodyne-mixing scheme14–16 described in Eq. (4). The resulting depth image shown in Fig. 17 successfully reaches full high-definition resolution (1920 × 1440), whereas competing technologies2–10 generate at most VGA depth images. The depth error (standard deviation) on the Julian face is 0.44 cm at a 2-m distance. The bit resolution of the BW CIS and the color CIS is 10 bits (1024 grayscale steps).

Fig. 16

Three-dimensional image capturing test setup: color and depth images of objects are captured by the suggested 3-D camera prototype with the optical shutter, and the color/depth images are converted to the multiview format of a 3-D display.


Fig. 17

Captured 14-Mp color (a) and FHD depth (b) images.


One of the technical challenges in TOF-based depth capturing is image stability under sunlight, i.e., in outdoor environments. The TOF principle works when sufficient IR illumination is provided to the captured objects, but under strong sunlight the captured IR image has a relatively small signal (IR illumination) to noise (sunlight) ratio. Suppression of sunlight is therefore critical for stable capturing of depth images outdoors. To this end, three novel sunlight-suppression techniques are applied in this work. First (wavelength-domain approach), the optical shutter itself is an 850-nm monochrome filter, as shown in the transmittance spectrum of Fig. 10, so the sunlight component outside the 850 ± 5 nm band is filtered out whereas most of the light energy of the IR source is transmitted. Second (frequency-domain approach), since sunlight is constant in time (DC) compared with the 20-MHz modulated IR signal, its influence on the modulated images (I_0°, I_90°, I_180°, I_270°) is almost equal, so the sunlight effect is canceled by the image subtractions in Eq. (4) under the novel homodyne-mixing scheme.14–16

Third (time-domain approach), the IR source illuminates intensively during a short time interval and is turned off for the rest of the time; the optical shutter, synchronized to the IR source, closes during the IR turn-off interval so that sunlight arriving in that interval is blocked. This approach, called synchronized burst IR-shutter mode, benefits from the developed optical shutter's ability to open and close the image plane globally,16 whereas most existing CMOS image sensors use rolling shutters.
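The frequency-domain claim, that a constant sunlight offset drops out of the Eq. (3) subtractions, can be verified with a small synthetic-pixel sketch (plain Python; the amplitudes, the sunlight level, and the sign convention of the cosine model are made up for illustration).

```python
import math

def four_phase_images(phi_tof, a=1.0, b=0.5, sun=0.0):
    """Synthetic demodulated intensities I_theta = a + b*cos(phi_tof + theta)
    plus a constant ambient (sunlight) offset 'sun'. The sign convention is
    chosen so that Eq. (3) returns +phi_tof for this model."""
    return [a + b * math.cos(phi_tof + math.radians(t)) + sun
            for t in (0, 90, 180, 270)]

def phase_estimate(i0, i90, i180, i270):
    # Eq. (3): any offset common to all four images cancels in both differences.
    return math.atan2(i270 - i90, i0 - i180)

phi = 1.0  # rad, true TOF phase for this synthetic pixel
dark = phase_estimate(*four_phase_images(phi, sun=0.0))
bright = phase_estimate(*four_phase_images(phi, sun=10.0))  # strong sunlight
print(dark, bright)  # both ~1.0: the DC sunlight term subtracts out
```

Because the same `sun` offset appears in all four images, both the numerator and denominator of Eq. (3) are unaffected, which is exactly why the homodyne differences suppress ambient light.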

By applying the above three sunlight-suppression approaches, depth images were captured and compared under a room light condition (0.3 klux) and a sunlight condition (27 klux, outdoor), as shown in Fig. 18. Such a sunlight condition usually causes overexposure or malfunction in depth cameras based on competing technologies. The depth error in the room light condition is 0.9% (2.3 cm at 2.5 m), including the noise-reduction image processing. This work's depth image under the sunlight condition remains stable, with a fair depth error of 1.2% (3.1 cm at 2.5 m) compared to the room light condition.

Fig. 18

FHD IR image (a), depth images captured under room light (under 0.5 klux) (b), and sunlight condition (under 27 klux) (c).


6.

Conclusion

A novel 20-MHz high-speed image-shuttering device and its application to a 3-D image capturing system were presented. To achieve the extraordinarily high speed of the image shutter, a solid-state multilayer structure combining an electro-absorption mechanism with an optical resonance cavity was designed, fabricated, and characterized. The cell structure and electrode shape were optimized with simulation-based modeling of the electro-optical mechanisms, and MOEMS-based etching and patterning technologies were utilized for device fabrication. As a result, a transmittance variation of 51% and a switching speed of 20 MHz were obtained, as required for time-of-flight operation of the 3-D camera. It is notable that the systematic modeling and simulation of the electro-optic mechanism predicts the real behavior of the optical shutter well.

A one-lens/two-sensor architecture for simultaneous capturing of color and depth images was prototyped using commercially available body/lens sets and CIS components. The suggested optical shutter approach enables capturing of an FHD-resolution depth image, the highest resolution among state-of-the-art depth camera technologies. In particular, the depth image is stable under sunlight conditions, which solves a critical technical challenge in the TOF depth-sensing field.

These deliverables can enable 3-D businesses such as 3-D image capturing, user interfaces, and 3-D displays, especially in the camera and display businesses. Graphic multiview generation using color and depth information is underway as an interface to stereo and multiview 3-D displays. Further optimization of depth image quality, such as application-specific integrated circuit (ASIC)-based noise cancellation, will be added to the current achievements toward commercialization in the near future.

Acknowledgments

This paper was previously published as a proceedings paper at SPIE MOEMS and Miniaturized Systems XI, San Francisco, CA, USA, 2013 as “Micro optical system based 3-D imaging for FHD depth image capturing.”1

References

1. 

Y. H. Parket al., “Micro optical system based 3-D imaging for full HD depth image capturing,” Proc. SPIE, 8252 82520X (2012). Google Scholar

2. 

L. XiaC. C. ChenJ. K. Aggarwal, “Human detection using depth information by Kinect,” in IEEE Comp. Soc. Conf. on Computer Vision and Pattern Recognition Workshops, 15 –22 (2011). Google Scholar

3. 

Primesense home page, (2012) http://www.primesense.com July ). 2012). Google Scholar

4. 

C. Niclasset al., “A 128×128 single-photon image sensor with column-level 10-bit time-to-digital converter array,” IEEE J. Solid-State Circuits, 43 (12), 2977 –2989 (2008). http://dx.doi.org/10.1109/JSSC.2008.2006445 IJSCBC 0018-9200 Google Scholar

5. 

R. LangeP. Seitz, “Solid state time-of-flight range camera,” IEEE J. Quantum Electron., 37 (3), 390 –397 (2001). http://dx.doi.org/10.1109/3.910448 IEJQA7 0018-9197 Google Scholar

6. 

T. Oggieret al., “Novel pixel architecture with inherent background suppression for 3-D time-of-flight imaging,” Proc. SPIE, 5665 1 –8 (2005). http://dx.doi.org/10.1117/12.586933 PSISDG 0277-786X Google Scholar

7. 

D. Stoppaet al., “A range image sensor based on 10-μm lock-in pixels in 0.18-μm CMOS imaging technology,” IEEE J. Solid-State Circuits, 46 (1), 248 –258 (2011). http://dx.doi.org/10.1109/JSSC.2010.2085870 IJSCBC 0018-9200 Google Scholar

8. 

S. J. Kimet al., “A CMOS image sensor based on unified pixel architecture with time-division multiplexing scheme for color and depth image acquisition,” IEEE J. Solid-State Circuits, 47 (11), 2834 –2845 (2012). http://dx.doi.org/10.1109/JSSC.2012.2214179 IJSCBC 0018-9200 Google Scholar

9. 

Mesa Imaging home page, (2012) http://www.mesa-imaging.ch July ). 2012). Google Scholar

10. 

G. YahavG. J. IddanD. Mandelboum, “3-D imaging camera for gaming application,” in Consumer Electronics, 2007. ICCE 2007. Digest of Technical Papers. International Conference on, 1 –2 (2007). Google Scholar

11. 

M. Kawakitaet al., “High-definition real-time depth-mapping TV camera; HDTV Axi-Vision camera,” Opt. Express, 12 (12), 2781 –2794 (2004). http://dx.doi.org/10.1364/OPEX.12.002781 OPEXFF 1094-4087 Google Scholar

12. 

T. Aidaet al., “High-speed depth-mapping Axi-Vision camera with compact optical system,” Proc. SPIE, 6805 680511 (2008). http://dx.doi.org/10.1117/12.767460 PSISDG 0277-786X Google Scholar

13. 

A. A. Dorringtonet al., “Achieving sub-millimetre precision with a solid-state full-field heterodyning range imaging camera,” Meas. Sci. Technol., 18 (9), 2809 –2816 (2007). http://dx.doi.org/10.1088/0957-0233/18/9/010 MSTCEP 0957-0233 Google Scholar

14. 

ParkY. H.YouJ. W., “Method and apparatus for calculating a distance between an optical apparatus and object,” KP2010-0005753, U.S. Patent 12/837,814 (2012).

15. 

ParkY. H.YouJ. W.ChoY. C., “Three-dimensional image acquisition apparatus and method of extracting depth information in the 3-D image acquisition apparatus,” KP2010-0133720, U.S. Patent 13/160,135 (2011).

16. 

Y. H. Park, J. W. You and H. S. Yoon, “3-D camera with ambient light suppression,” KP2011-0109431, U.S. Patent 13/594,094 (2012).

17. 

Y. C. Cho et al., “Optoelectronic shutter, method of operating the same and optical apparatus including the optoelectronic shutter,” KP2009-0049475, U.S. 2010-0308211-A1 (2010).

18. 

Y. C. Cho et al., “Optical modulator,” KP2010-0006052, U.S. 2011-0181936-A1 (2011).

19. 

Y. C. Cho et al., “Optical modulator using multiple Fabry–Perot resonant modes and apparatus for capturing 3-D image including the optical modulator,” KP2010-0137229, U.S. Patent 13/163,202 (2010).

20. 

G. R. Fowles, Introduction to Modern Optics, pp. 86–103, Dover Books, Mineola, NY (1989).

21. 

S. L. Chuang, Physics of Optoelectronic Devices, pp. 557–569, Wiley, New York (1995).

22. 

K. W. Goossen, J. E. Cunningham and W. Y. Jan, “Electroabsorption in ultranarrow-barrier GaAs/AlGaAs multiple quantum well modulators,” Appl. Phys. Lett. 64(9), 1071–1073 (1994). http://dx.doi.org/10.1063/1.110935

23. 

Y. C. Cho et al., “Optical image modulator and method of manufacturing the same,” KP2010-0122678, U.S. Patent 13/167,486 (2010).

24. 

S. H. Lee, C. Y. Park and J. H. Lee, “Method of manufacturing the optical image modulator,” KP2011-0096984, U.S. Patent 13/531,964 (2012).

25. 

Y. H. Park et al., “Illumination optical system and 3-D image acquisition apparatus including the same,” KP2010-0127866, U.S. Patent 13/244,329 (2010).

26. 

Y. H. Park et al., “Optical system having integrated illumination and imaging optical systems, and 3-D image acquisition apparatus including the optical system,” KP2010-0127867, U.S. Patent 3/156,789 (2010).

Biography

Yong-Hwa Park received BS, MS, and PhD degrees in mechanical engineering from the Korea Advanced Institute of Science and Technology in 1991, 1993, and 1999, respectively. In 1998, he was selected as a future frontier scientist by the Korea Research Foundation. In 2000, he joined the MEMS research group at the University of Colorado at Boulder as a research associate. From 2003 to 2005, he worked in the visual display division of Samsung Electronics Co., Ltd. In 2005, he joined the microsystems laboratory at the Samsung Advanced Institute of Technology as a principal researcher in optical MEMS design and its applications to imaging and display systems. His major research activities in the MEMS area include dynamic analysis and design of RF/optical MEMS, and imaging and display systems. He is a member of the Society for Information Display, SPIE, the Society of Automotive Engineers, the Society for Experimental Mechanics, and IEEE. Since 2012, he has served as a conference chair of MOEMS and Miniaturized Systems at SPIE Photonics West.

Yong-Chul Cho received MS and PhD degrees in mechanical engineering from the Korea Advanced Institute of Science and Technology in 1986 and 1992, respectively. From 1992 to 1998, he was a member of the research staff at Samsung Electronics Co., Ltd., working on various machine vision systems. In 1999, he joined the microsystems laboratory at the Samsung Advanced Institute of Technology, where he has been engaged in research and development of microsystems such as gyroscope sensors, optical scanners, and optical modulators. His major research activities in the MEMS area include micropackaging, reliability analysis, and characterization. He is a member of the Institute of Control, Robotics, and Systems.

Jang-Woo You received MS and PhD degrees in mechanical engineering from the Korea Advanced Institute of Science and Technology in 2002 and 2009, respectively. From 2005 to 2006, he joined the Wellman Center for Photomedicine in Boston, MA, as a research fellow, studying optical coherence tomography for in vivo human retinal imaging. In 2009, he joined the microsystems laboratory at the Samsung Advanced Institute of Technology as a researcher in optical device and system design. His major interest is developing advanced TOF cameras for user interfaces.

Chang-Young Park received MS and PhD degrees in optoelectronics from the Gwangju Institute of Science and Technology in 2004 and 2011, respectively. His research fields are the epitaxial growth and characterization of III–V compound semiconductor devices such as laser diodes, photodetectors, optical modulators, tandem solar cells, quantum dots, and nanorods. In 2011, he joined the microsystems laboratory at the Samsung Advanced Institute of Technology, where his major research activities are the design, simulation, fabrication, and characterization of high-speed optical modulators (optical shutters).

Hee-Sun Yoon received MS and PhD degrees in mechatronics from the Gwangju Institute of Science and Technology in 2006 and 2011, respectively. In 2011, he joined the microsystems laboratory at the Samsung Advanced Institute of Technology, where he has been engaged in research and development of 3-D camera systems. His major research activities are the design and analysis of imaging systems, and electric circuit design and characterization.

Sang-Hun Lee received BS, MS, and PhD degrees in electrical engineering from Seoul National University in 1991, 1993, and 1998, respectively. In 1998, he joined Samsung Electronics Co. and the Samsung Advanced Institute of Technology, where he has studied optical, inertial, RF, and bio-MEMS. His current research interests are the design and fabrication of micro/nano devices, especially sensors and optical devices.

Jong-Oh Kwon received BS and MS degrees in ceramic engineering from Yonsei University in 1995 and 1997, respectively. From 1997 to 2005, he was a packaging engineer at Samsung Electro-Mechanics Company, working in the wireless device division. In February 2005, he joined the microsystems laboratory at the Samsung Advanced Institute of Technology, where he has been engaged in research and development of microsystems packages such as wafer-level packages for RF devices. His major research activities in the packaging area include chip-scale package, wafer-level package, and system-in-package design, processing, and evaluation.

Seung-Wan Lee received an MS degree in electro-material engineering from Kyeong-Buk University in 1994. He has worked for Samsung Electronics Co., Ltd. since 1986. Until 2000, he helped develop new fiber-optic products, including special fibers and optical passive devices. At the Samsung Advanced Institute of Technology, he has been engaged in research and development of microsystems such as a MEMS camera for mobile phones and a fluxgate sensor. Since 2000, he has been in charge of optical system design for medical devices.

Byung Hoon Na received his MS degree in information and communications from the Gwangju Institute of Science and Technology (GIST), Gwangju, Korea, in 2008. Since 2008, he has been pursuing a PhD degree at the School of Information and Mechatronics, GIST, where he is a member of the optoelectronics laboratory. He is also working with the Samsung Advanced Institute of Technology. His current research interests include electro-optic modulators, molecular beam epitaxial growth, and their applications in optoelectronic devices.

Gun Wu Ju received a BS degree in electrical engineering from Kyungpook National University in 2009. He is now pursuing a PhD degree at the Gwangju Institute of Science and Technology (GIST), where he is a member of the optoelectronics laboratory. His research interests include the design, growth, and characterization of electro-absorption modulators for optical shutters, and RCEPDs and VCSELs for optical sensor applications.

Hee Ju Choi received a BS degree in semiconductors from Chonbuk National University in 2009, and an MS degree in photonics and applied physics from the Gwangju Institute of Science and Technology (GIST) in 2011. Since 2011, he has been with the department of photonics and applied physics at GIST, where he is pursuing a PhD degree and is a member of the optoelectronics laboratory. His research interests include the fabrication of semiconductor devices, nanotechnology, and biosensors.

Yong Tak Lee received a BS degree from Seoul National University, Korea, in 1975, and MS and PhD degrees (with honors) from the Korea Advanced Institute of Science and Technology in 1979 and 1990, respectively, all in applied physics. He joined ETRI in 1979, where he headed the optoelectronics section from 1987 to 1990 and the compound semiconductor department from 1991 to 1992. He was a visiting scientist with the Department of Electronic Engineering, University of Tokyo, Japan, from 1986 to 1987, and with the microelectronics laboratory, University of Illinois at Urbana-Champaign, USA, from 1993 to 1994. Since 1994, he has been a professor in the School of Information and Communications at the Gwangju Institute of Science and Technology, Gwangju, Korea, where he heads the optoelectronics laboratory. He has also been an adjunct professor with Edith Cowan University, Australia, since 2007, and with Southeast University, China, since 2009. His current research interests include semiconductor laser diodes, photodetectors, electro-optic modulators, imaging sensors, Opto-VLSI, SOAs, LEDs, solar cells, integrated photonic circuits, nanophotonic devices, biophotonics, chip-to-chip optical interconnects, and micro beam projection systems. He has authored or co-authored more than 103 patents (filed or applied for), 173 papers in SCI journals, and 288 conference proceedings. Professor Lee has received many awards from academic societies, including a top-100 national R&D performance award in 2011, a best poster presentation award at IUMRS-ICEM in 2010, the minister's award for research innovation at NANO-KOREA 2010, a presidential medal in science and technology, Republic of Korea, in 2009, a plaque of recognition for distinguished service from the Optical Society of Korea in 1997, a presidential citation for distinguished researcher, Republic of Korea, in 1986, a minister's citation for distinguished researcher from the Ministry of Communications in 1985, and a best paper of the year award from ETRI in 1981.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Yong-Hwa Park, Yong-Chul Cho, Jang-Woo You, Chang-Young Park, Hee-Sun Yoon, Sang-Hun Lee, Jong-Oh Kwon, Seung-Wan Lee, Byung Hoon Na, Gun Wu Ju, Hee Ju Choi, and Yong Tak Lee "Three-dimensional imaging using fast micromachined electro-absorptive shutter," Journal of Micro/Nanolithography, MEMS, and MOEMS 12(2), 023011 (6 June 2013). https://doi.org/10.1117/1.JMM.12.2.023011
Published: 6 June 2013
KEYWORDS: Camera shutters, 3D image processing, Infrared imaging, Modulation, 3D displays, Cameras, Image sensors
