Shack-Hartmann wavefront sensor with large dynamic range
1 March 2010
Mingliang Xia, Chao Li, Lifa Hu, ZhaoLiang Cao, QuanQuan Mu, Xuan Li
Abstract
A new spot centroid detection algorithm for a Shack-Hartmann wavefront sensor (SHWFS) is experimentally investigated. The algorithm is a dynamic tracking algorithm: it tracks and calculates the spot centroids of the current spot map from the spot centroids of the previous spot map, exploiting the strong correlation of the wavefront slope and the corresponding spot centroid between temporally adjacent SHWFS measurements. That is, between adjacent measurements, the spot centroid movement usually falls within some range. Using the algorithm, the dynamic range of an SHWFS can be expanded, compared with the conventional algorithm, by a factor of more than 3 in the measurement of tilt aberration, more than 1.3 times in the measurement of defocus aberration, and more than 2 times in the measurement of a mixture of spherical and coma aberration. The algorithm is applied in our SHWFS to measure the distorted wavefront of the human eye, and experimental results from an adaptive optics (AO) system for retina imaging demonstrate its feasibility for highly aberrated eyes.

1. Introduction

The Shack-Hartmann wavefront sensor (SHWFS) has been widely applied in adaptive optics (AO) systems, optical testing systems, and aberration measurements of the human eye.1, 2, 3, 4 Its performance determines the detection capability and the wavefront measurement precision of the whole system.5 Unlike interferometers, it does not require a reference wave during the measurement, and it is thus more stable and easier to use.6, 7, 8 The measurement sensitivity of an SHWFS mainly depends on the light-spot centroid detection accuracy, which is affected by many factors, such as the sampling and truncation error of the wavefront, the read-out noise of the detector, photon noise, and background noise.9, 10

There are several algorithms for finding light-spot centroid positions in an SHWFS.11 The conventional centroid detection algorithm is based on a center-of-weight calculation.12 Because it is strongly influenced by noise, the conventional algorithm has relatively high systematic and random errors, especially when the centroid of the spot is not at the center of the detection area.13 In fact, a spot can overstep its subaperture window when the atmospheric turbulence is very strong or the aberration of the human eye is very large, which results in an even larger detection error and makes the measurement invalid. Therefore, a new centroid detection algorithm that expands the dynamic range of an SHWFS is significant for measuring human eye aberration and atmospheric turbulence, especially when the AO technique is applied to retina imaging.14

In this work, a new spot centroid detection algorithm based on a dynamic tracking principle is presented. Its advantage is that it uses only software to attribute each spot to the correct lenslet and expand the dynamic range of the SHWFS, without additional hardware such as a spatial light modulator array15 or astigmatic microlenses.16 The principle of the dynamic tracking centroid detection algorithm is described in Sec. 2, the experimental results of the dynamic range test are discussed in Sec. 3, the wavefront correction for retina imaging with the new SHWFS is presented in Sec. 4, and conclusions are given in Sec. 5.

2. Operation Theory of the Shack-Hartmann Wavefront Sensor and the New Centroid Detection Algorithm

2.1. Principle of a Conventional Shack-Hartmann Wavefront Sensor

Figure 1 shows the principle of an SHWFS, which mainly comprises a lenslet array and a charge-coupled device (CCD) detector. The microlens array is composed of many small convex lenses with the same focal length and aperture size, and the CCD is located at the focal plane of the array. The SHWFS spatially divides the incident wavefront into many small subapertures with the lenslet array.17 Each subaperture forms a spot on the CCD, so a focused light-spot array pattern is obtained. The relative displacement of the centroid of each spot represents the local tilt of the incident wavefront over the corresponding subaperture.18 The centroid position of each spot can be calculated by the formula:19

Eq. 1

$$X_c = \frac{\sum_{i=1}^{M}\sum_{j=1}^{N} X_i\,I(X_i,Y_j)}{\sum_{i=1}^{M}\sum_{j=1}^{N} I(X_i,Y_j)}, \qquad Y_c = \frac{\sum_{i=1}^{M}\sum_{j=1}^{N} Y_j\,I(X_i,Y_j)}{\sum_{i=1}^{M}\sum_{j=1}^{N} I(X_i,Y_j)},$$

where $(X_c, Y_c)$ is the centroid position in the detection area of a certain microlens, $I(X_i, Y_j)$ is the intensity of pixel $(i, j)$, and $M$ and $N$ are the numbers of pixels along the X and Y directions.
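Equation 1 is the standard intensity-weighted center-of-mass computation over a detection window. As a minimal illustrative sketch (not the authors' implementation; the function name is hypothetical):

```python
import numpy as np

def spot_centroid(window):
    """Center-of-mass centroid of an intensity window, per Eq. 1."""
    window = np.asarray(window, dtype=float)
    total = window.sum()
    ys, xs = np.indices(window.shape)   # pixel coordinate grids (row, col)
    xc = (xs * window).sum() / total    # X_c: intensity-weighted mean column
    yc = (ys * window).sum() / total    # Y_c: intensity-weighted mean row
    return xc, yc
```

For a window containing a single bright pixel, the centroid coincides with that pixel; for an extended spot it is the intensity-weighted mean position.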

Fig. 1

Principle of the Shack-Hartmann wavefront sensor.


With the centroid positions obtained according to Eq. 1, the corresponding wavefront slope of every subaperture in the X and Y directions can be expressed as20:

Eq. 2

$$S_{ix} = \frac{\partial \Phi}{\partial x} = \frac{\Delta x_i}{f}, \qquad S_{iy} = \frac{\partial \Phi}{\partial y} = \frac{\Delta y_i}{f},$$

where $f$ is the focal length of the microlens array, $\Delta x_i = x_i^c - x_i^{c0}$ and $\Delta y_i = y_i^c - y_i^{c0}$ are the displacements of the spot centroid from its reference position, $\Phi$ is the phase of the incident wavefront, and $S_{ix}$ and $S_{iy}$ are the local slopes of the incident wavefront in the $i$'th subaperture in the X and Y directions, respectively.
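The slope computation of Eq. 2 is a per-subaperture division of the centroid displacement by the lenslet focal length. A minimal sketch (hypothetical names; centroids and f are assumed to be in the same length unit, so the slopes come out in radians):

```python
def local_slopes(centroids, references, f):
    """Local wavefront slopes per Eq. 2: S_ix = dx_i / f, S_iy = dy_i / f.

    centroids, references: lists of (x, y) spot centroids, measured and
    reference; f: focal length of the microlens array.
    """
    return [((xc - x0) / f, (yc - y0) / f)
            for (xc, yc), (x0, y0) in zip(centroids, references)]
```

The resulting slope pairs are what a modal or zonal reconstructor (e.g. Southwell's method, Ref. 18) consumes to estimate the wavefront.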

The conventional centroid detection algorithm is easily understood and applied. Assume a CCD with M1×M1 pixels and a microlens array with M2×M2 microlenses, so that Q×Q pixels are distributed to the subaperture of each microlens (Q = M1/M2). In the detection area, the centroid position of the spot is calculated by Eq. 1. However, the centroid detection error becomes very large in some cases, as shown in Fig. 2: a spot may leave its subaperture for a neighboring one, or a spot may overlap with an adjacent spot. Both cases result in a large centroid detection error with the conventional algorithm. Therefore, a centroid detection algorithm that expands the dynamic range of an SHWFS is very necessary.

Fig. 2

The spot patterns causing large centroid detection error.


2.2. Dynamic Tracking Centroid Detection Algorithm

The new centroid detection algorithm proposed in this work is based on the idea that the sampling time interval of a wavefront sensor is shorter than the correlation time of the measured objects, such as atmospheric disturbance and human eye aberration, so the wavefront slope and the centroid of the corresponding spot are strongly correlated between temporally adjacent SHWFS measurements. According to this statistical relationship, the movement of each spot centroid between adjacent measurements will usually fall within some range.17, 21, 22 Therefore, the current centroid is a good prediction of the spot location in the subsequent measurement. The steps of the algorithm are as follows.

Step 1

Within each (Q×Q) square area of pixels assigned to a subaperture, centered on the centroid of the standard reference wavefront, find the brightest pixel of that subaperture's spot. Centered on this brightest pixel, calculate the centroid by Eq. 1 in an (N×N) square centroid detection window. If more than one pixel outputs the maximum gray level (ADU) within a spot, center on one of those pixels, calculate an initial centroid position by Eq. 1 in an (N/2×N/2) square window, and then, centered on this initial centroid, calculate the final centroid by Eq. 1 in an (N×N) window. By adaptively centering the window in which the centroid is calculated, the algorithm includes the pixels with the highest signal and signal-to-noise ratio (SNR) and excludes pixels with negligible signal that contribute only noise, thus improving accuracy.23, 24 The purpose of this step is to calculate the spot centroids of the first spot map, which is the premise for dynamically tracking and calculating the spot centroids of subsequent spot maps. N is calculated from the following expressions:

Eq. 3

$$n_0\,\omega_0\,D_0 = n\,\omega\,D_0'.$$
Equation 3 is a Lagrange-Helmholtz invariant. From Eq. 3, we obtain:

Eq. 4

$$\omega = \frac{n_0\,\omega_0\,D_0}{n\,D_0'},$$

where $n_0$ is the refractive index in the entrance pupil, $\omega_0$ is the field angle in the entrance pupil, $D_0$ is the diameter of the entrance pupil, $n$ is the refractive index in the exit pupil (generally $n = 1$), $\omega$ is the field angle in the exit pupil, and $D_0'$ is the diameter of the exit pupil.

Eq. 5

$$l = \frac{1.22\,\lambda\,f}{d},$$

where $f$ is the focal length of the microlens array, $d$ is the diameter of the subaperture, and $l$ is the radius of the Airy disk. Using Eqs. 4, 5, we define:

Eq. 6

$$L = 2\omega f + 2l,$$

where the first term on the right-hand side gives the size of the image, and the second term gives the size of the Airy disk.

Eq. 7

$$N \geq \frac{L}{P},$$

where $P$ is the size of a CCD pixel, and $N$ is taken as the ceiling of $L/P$. The centroid detection window should be a little larger than the spot size so that the spot does not easily overstep the detection window.
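Equations 4-7 chain together into a simple window-size calculation. A sketch under assumed, purely illustrative parameter values (the function name and arguments are hypothetical; `Dp` stands for the exit-pupil diameter):

```python
import math

def detection_window_size(n0, w0, D0, Dp, wavelength, f, d, P, n=1.0):
    """Centroid-window side N from Eqs. 4-7."""
    w = n0 * w0 * D0 / (n * Dp)        # Eq. 4: field angle in the exit pupil
    l = 1.22 * wavelength * f / d      # Eq. 5: Airy-disk radius
    L = 2 * w * f + 2 * l              # Eq. 6: image size plus Airy-disk size
    return math.ceil(L / P)            # Eq. 7: pixels per window side
```

Per the text, the window should come out slightly larger than the spot itself; the ceiling guarantees an integer pixel count.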

However, the peak of each spot of the first spot map must not leave its centroid detection window and creep into an adjacent window. If this occurs, the centroid precision of the first spot map will be relatively low, and centroid tracking for the subsequent spot maps will fail.

Step 2

According to the theory of temporal correlation for atmospheric turbulence and human eye aberration, there is a statistical relationship between the spot centroids of the same subaperture in two continuous spot images sampled within the correlation time. That is, the relative spot movement of each subaperture between two continuous images is within a finite range, which can be calculated by22:

Eq. 8

$$d = \frac{1}{2}\,\sigma_{\Delta\alpha}(T)\,\frac{D_0}{D_0'}\,f,$$

where $\sigma_{\Delta\alpha}(T)$ is the standard deviation of a zero-mean Gaussian random process, which varies with the sampling time interval $T$; $D_0$ is the diameter of the entrance pupil, $D_0'$ is the diameter of the exit pupil, and $D_0/D_0'$ is the field angle magnification. Thus, the spot centroids of each frame will fall in the range $\{X_p - d \leq X \leq X_p + d,\; Y_p - d \leq Y \leq Y_p + d\}$, where $X_p$ and $Y_p$ are the spot centroid coordinates of the previous frame. Then the spot centroid of each subaperture in the second frame can be calculated from the first frame.

Based on each spot centroid of the first spot map, calculated in step 1, select an appropriately sized (M×M) square search area centered on it, and find the brightest pixel or calculate the initial centroid position in that area. M is calculated with the following expression:

Eq. 9

$$M \geq \frac{1}{P}\,\sigma_{\Delta\alpha}(T)\,\frac{D_0}{D_0'}\,f,$$

where $M$ is taken as the ceiling of the right-hand side.
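Equation 9 reduces to one line of arithmetic. A sketch with hypothetical names (`sigma` is $\sigma_{\Delta\alpha}(T)$ in radians, `Dp` is the exit-pupil diameter $D_0'$; the numeric values in the test are illustrative assumptions, not the paper's parameters):

```python
import math

def search_window_size(sigma, D0, Dp, f, P):
    """Search-window side M from Eq. 9: M >= sigma * (D0 / Dp) * f / P."""
    return math.ceil(sigma * (D0 / Dp) * f / P)
```

The search window only needs to cover the statistically expected frame-to-frame centroid motion, so it can be much smaller than the full subaperture.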

Centered on the position of the brightest pixel or the initial centroid, select an appropriately sized (N×N) square area as the centroid detection window, and calculate the centroid of each spot by Eq. 1 in the corresponding window; N is obtained from Eq. 7. The purpose of this step is to calculate the spot centroids of the second spot map and to prepare for dynamically tracking and calculating the spot centroids of the next spot map. Unlike in step 1, the peak of each spot of the second spot map may leave its original centroid detection window and creep into the adjacent window, and may even be located at the center of the adjacent window.

Step 3

Repeat step 2 to track and reconstruct the dynamic wavefront.
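The three steps above can be sketched in a few lines. This is an illustrative simplification (hypothetical names), which seeds the tracker with the reference centroids, uses the same M×M search window for every frame, and omits the (N/2×N/2) initial-centroid refinement and signal-threshold checks of the full algorithm:

```python
import numpy as np

def track_centroids(frames, ref_centroids, N, M):
    """Sketch of the dynamic tracking centroid algorithm (steps 1-3).

    frames: sequence of 2-D spot-map images; ref_centroids: (x, y)
    centroids of the standard reference wavefront; N: centroid-window
    side (Eq. 7); M: search-window side (Eq. 9).
    """
    def window(img, cx, cy, side):
        # Clip a side x side window centered near (cx, cy) to the image.
        h = side // 2
        x0, x1 = max(int(cx) - h, 0), min(int(cx) + h + 1, img.shape[1])
        y0, y1 = max(int(cy) - h, 0), min(int(cy) + h + 1, img.shape[0])
        return img[y0:y1, x0:x1], x0, y0

    def refine(img, cx, cy):
        # Brightest pixel in the M x M search window around the previous
        # centroid, then center of mass (Eq. 1) in an N x N window around it.
        win, x0, y0 = window(img, cx, cy, M)
        py, px = np.unravel_index(np.argmax(win), win.shape)
        win, x0, y0 = window(img, x0 + px, y0 + py, N)
        tot = win.sum()
        ys, xs = np.indices(win.shape)
        return x0 + (xs * win).sum() / tot, y0 + (ys * win).sum() / tot

    centroids = list(ref_centroids)   # step 1 is seeded by the reference
    history = []
    for frame in frames:              # steps 2 and 3: frame-to-frame tracking
        centroids = [refine(frame, cx, cy) for cx, cy in centroids]
        history.append(centroids)
    return history
```

Because each frame's search window is centered on the previous frame's centroid rather than on the fixed subaperture grid, a spot can drift out of its subaperture and still be attributed to the correct lenslet, which is the source of the extended dynamic range.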

Figure 3 shows the principle of the algorithm: the centroid position detection principle for the first spot map and for the subsequent spot maps. In Fig. 3, the crisscross star represents the centroid of the standard reference wavefront, the green solid pane represents the area of pixels distributed to each subaperture, the black solid pane represents the centroid detection window of each spot, and the black dashed pane represents the window in which the brightest pixel or the initial centroid position of each spot of the current spot map is found, based on the corresponding spot centroid of the previous spot map. The white-black disk represents the spot, and the red point in the disk represents the brightest pixel or the initial centroid position of each spot. (Color online only.)

Fig. 3

The principle figure of the new centroid detection algorithm.


3. Experiments and Analysis of Dynamic Range Test

To verify the dynamic range of this method, we developed new control software (in Visual C++) for an SHWFS made by our group, implementing both the new centroid detection algorithm and the conventional center-of-weight method. The sensor has a 15×15 microlens array and a 256×256-pixel CCD. The dynamic range measured with the new method is compared with that measured with the conventional algorithm. Figure 4 shows the schematic of the experimental setup.

Fig. 4

Experimental setup for measuring the dynamic range.


We translated the laser diode (LD) along the direction perpendicular to the optical axis, in steps of 1 mm, to generate tilt aberration and verify the dynamic range of our dynamic tracking centroid algorithm. The distance between the sensor and the LD was 243 mm, and the wavefronts were measured in relative mode. The theoretical peak-to-valley (PV) value of the distorted wavefront produced by this method was calculated by the following expression:

Eq. 10

$$PV = a \times \frac{l}{L},$$
where $a$ = 4.356 mm is the available width of the CCD panel, $l$ is the displacement of the LD, and $L$ is the distance between the sensor and the LD. The measurement results are shown in Fig. 5: the reference wavefront, the wavefronts measured by the dynamic tracking centroid algorithm, the wavefronts measured by the conventional algorithm, the corresponding PV values, and the wavefront measurement errors for both algorithms. The wavefront measurement error for the new algorithm is 0.063 μm versus 3.873 μm for the conventional algorithm when the LD moves 2 mm; when the LD moves 6 mm, the errors are 0.152 μm for the new algorithm and 64.496 μm for the conventional algorithm.
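Equation 10 is a one-line similar-triangles calculation. A sketch (hypothetical function name; all lengths in the same unit, here mm):

```python
def theoretical_pv(a, l, L):
    """Theoretical PV of the tilt wavefront per Eq. 10: PV = a * (l / L).

    a: available CCD width; l: LD displacement; L: LD-to-sensor distance.
    The PV is returned in the same length unit as the inputs.
    """
    return a * (l / L)
```

With the values given in the text (a = 4.356 mm, L = 243 mm), a 2-mm LD translation corresponds to a PV of about 0.036 mm, i.e. roughly 36 μm of tilt across the sensor.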

Fig. 5

The experiment results verifying the dynamic range of the new centroid detection algorithm.


In addition, we translated the LD along the direction parallel to the optical axis, toward the SHWFS, to generate defocus aberration and verify the dynamic range of the new centroid algorithm. The measurement results are shown in Fig. 6: the wavefronts measured by the new centroid algorithm, the wavefronts measured by the conventional algorithm, the corresponding radii of the spherical wavefronts, and the wavefront measurement errors for both algorithms. The wavefront measurement error for the new algorithm is 0.104 μm versus 0.930 μm for the conventional algorithm when the LD moves 27.5 cm; when the LD moves 32.5 cm, the errors are 0.147 μm for the new algorithm and 30.632 μm for the conventional algorithm.

Fig. 6

The experiment results verifying the dynamic range of the new centroid detection algorithm.


From the results shown in Figs. 5 and 6, we can see that the wavefronts measured by the conventional algorithm became badly distorted as the LD moved, while those measured by the dynamic tracking centroid algorithm remained quite close to the theoretical ones. The large error of the wavefronts measured by the conventional algorithm arises because some spots leave their subapertures and land in neighboring ones.

To verify that the new algorithm is suitable for aspherical wavefront measurements, we used a liquid crystal spatial light modulator (LCSLM) to generate a mixture of spherical aberration plus coma aberration and measured it with the SHWFS using both the new centroid algorithm and the conventional algorithm. The LCSLM used in the experiment, from the BNS Company (Middletown, Rhode Island), had 512×512 pixels with a 2π phase modulation depth, a pixel size of 15 μm, and a response frequency of 200 Hz. The experimental configuration is shown in Fig. 7.

Fig. 7

Experimental configuration for measuring the aspherical wavefronts.


The control of the LCSLM is based on Zernike polynomials.25 The wavefronts measured by the SHWFS using the new algorithm and the conventional algorithm are shown in Fig. 8 for Zernike coefficients Z2 = Z5 = 1, Z2 = Z5 = 2, …, Z2 = Z5 = 9 (piston is omitted; Z0, x-tilt; Z1, y-tilt; Z2, defocus; Z3, x-astigmatism; Z4, y-astigmatism; Z5, x-coma; and so on through Z35; Zm = i means the m'th Zernike coefficient is set to i and the other coefficients are set to 0). The corresponding PV values of the wavefronts and the wavefront measurement errors for both algorithms are also shown in Fig. 8. The wavefront measurement error for the new algorithm is 0.097 μm versus 0.481 μm for the conventional algorithm when Z2 = Z5 = 5; when Z2 = Z5 = 8, the errors are 0.165 μm for the new algorithm and 6.456 μm for the conventional algorithm; and when Z2 = Z5 = 9, they are 4.736 and 8.609 μm, respectively. The large wavefront measurement error for the new algorithm in the last case occurs because two or more spots come together.

Fig. 8

The experiment results of measuring the aspherical wavefronts.


From these experimental results, we can see that the new algorithm expands the dynamic range of an SHWFS by a factor of more than 3 compared with the conventional algorithm in the measurement of tilt aberration, more than 1.3 times in the measurement of defocus aberration, and more than 2 times in the measurement of the mixture of spherical aberration plus coma aberration. The centroid detection error of the conventional algorithm becomes very large when the tails of a spot creep into the adjacent window, while the new algorithm still performs well when the spot is near the edge of the window of pixels associated with a lenslet, where the peak of the spot is within the window but the tails creep into the adjacent window. Therefore, the wavefront can be reconstructed by the new algorithm when some spots leave their subapertures for neighboring ones, but the algorithm is not suitable for the situation where two or more spots come together under very strong aberrations, as shown in Fig. 2.

Finally, to verify whether the algorithm also performs well for dynamic aberration measurements, a series of gray maps with simulated and real eye aberrations was sent to the aforementioned LCSLM successively at a frequency of 10 Hz, and the wavefronts were measured by the SHWFS with the new centroid algorithm. The simulated eye aberrations were a mixture of defocus, astigmatism, coma, and spherical aberration (Zernike coefficients Z2 = Z3 = Z4 = Z5 = Z6 = Z7 = i, with i stepped from 0 up to 1 in increments of 0.1 and back down to 0). The real eye aberrations were those of subject NZ's eye. The theoretical wavefront maps and the wavefront maps measured by the SHWFS with the new centroid algorithm for the simulated eye dynamic aberrations are shown in Video 1, and the wavefront measurement error is shown in Fig. 9. Video 2 shows the corresponding maps for the real eye dynamic aberrations, with the wavefront measurement error shown in Fig. 10. From these experimental results, we can see that the new algorithm performs well for dynamic aberration measurements.

Fig. 9

The wavefront error between the theoretical wavefronts and the wavefronts measured by the SHWFS with the new centroid algorithm for simulated eye dynamic aberrations.


Fig. 10

The wavefront error between the theoretical wavefronts and the wavefronts measured by the SHWFS with the new centroid algorithm for real eye dynamic aberrations.


Video 1

The theoretical wavefront maps and the wavefront maps measured by the SHWFS with the new centroid algorithm for simulated eye dynamic aberrations. The left maps are the theoretical wavefront maps, and the right ones are the maps measured by the SHWFS with the new centroid algorithm (MPEG, 815 KB).

10.1117/1.3369810.1

Video 2

The theoretical wavefront maps and the wavefront maps measured by the SHWFS with the new centroid algorithm for real eye dynamic aberrations. The left maps are the theoretical wavefront maps, and the right ones are the maps measured by the SHWFS with the new centroid algorithm (MPEG, 963 KB).

10.1117/1.3369810.2

4. Human Eye Aberration Correction Experiments

To verify the feasibility of the new algorithm for highly aberrated eyes, an AO system for retina imaging correction was implemented. Figure 11 shows the layout of the AO system, which consists of a wavefront sensor, a wavefront corrector (the LCSLM), and a control computer; an LD with a wavelength of 808 nm was used for illumination. The LCSLM and the SHWFS were the same ones used in the previous experiments, with the SHWFS running our control software based on the new algorithm. The wavefront correction was performed by a modal closed-loop algorithm using direct wavefront compensation. Using this system, three subjects with different myopias and astigmatisms were measured in the laboratory.

Fig. 11

The layout of the AO system for retina imaging correction.


First, subject DL with 3-D myopia was tested. Figure 12 shows the wavefront maps and fundus images before and after correction: a wavefront distortion of 3.745 μm (PV) was corrected to 0.085 μm (PV), and the corrected fundus image reached the diffraction-limited resolution. Then subject CL with 5-D myopia was tested; the PV wavefront distortion before and after correction was 5.444 and 0.134 μm, respectively, and the wavefront maps and fundus images are shown in Fig. 13. Finally, subject LMX with 5-D myopia and 2-D astigmatism was tested; the PV wavefront distortion before and after correction was 9.509 and 0.216 μm, respectively, with the wavefront maps and fundus images shown in Fig. 14. From these results, we see that the algorithm is feasible for highly aberrated eyes.

Fig. 12

The wavefront map and retina image of DL (a) and (c) before and (b) and (d) after correction with the SHWFS using the dynamic tracking centroid algorithm.


Fig. 13

The wavefront map and retina image of CL (a) and (c) before and (b) and (d) after correction with the SHWFS using the dynamic tracking centroid algorithm.


Fig. 14

The wavefront map and retina image of LMX (a) and (c) before and (b) and (d) after correction with the SHWFS using the dynamic tracking centroid algorithm.


5. Conclusions

A dynamic tracking spot centroid detection algorithm for an SHWFS has been presented and investigated experimentally. It uses each spot centroid of the previous spot map to track and calculate the corresponding spot centroid of the current spot map, exploiting the strong correlation of the wavefront slope and the corresponding spot centroid between temporally adjacent SHWFS measurements; that is, between adjacent measurements, the spot centroid movement usually falls within some range. The dynamic range test results indicate that the dynamic range of an SHWFS can be expanded by a factor of more than 3 in the measurement of tilt aberration compared with the conventional algorithm, more than 1.3 times in the measurement of defocus aberration, and more than 2 times in the measurement of the mixture of spherical aberration plus coma aberration. The experimental results also show that the new algorithm performs well for dynamic aberration measurements. The wavefront can be reconstructed by the new algorithm when some spots leave their subapertures for neighboring ones, but the algorithm is not suitable for the situation where two or more spots come together, as when some kinds of aberrations are very strong.

In addition, an AO system for retina imaging correction was implemented to prove the feasibility of the algorithm for highly aberrated eyes. Using this system, three subjects with differing myopias and astigmatisms were tested in the laboratory. For the subjects with 3-D myopia, 5-D myopia, and 5-D myopia with 2-D astigmatism, the wavefront distortions in PV before and after correction were 3.754 and 0.085 μm, 5.444 and 0.134 μm, and 9.509 and 0.216 μm, respectively, and clear retina images were obtained.

Acknowledgments

The work is supported by the National Natural Science Foundation (numbers 60578035, 50473040, and 60736042), and the Science Foundation of Jilin Province (numbers 20050520 and 20050321-2).

References

1. R. K. Tyson, Principles of Adaptive Optics, Academic Press, Boston (1991).

2. T. L. Bruno, A. Wirth, and A. J. Jankevics, "Applying Hartmann wavefront sensing technology to precision optical testing of the Hubble space telescope correctors," Proc. SPIE 1920, 328–336 (1993). https://doi.org/10.1117/12.152677

3. J. Liang and B. Grimm, "Objective measurement of wave aberrations of the human eye with the use of a Hartmann-Shack wave-front sensor," J. Opt. Soc. Am. A 11(7), 1949–1957 (1994). https://doi.org/10.1364/JOSAA.11.001949

4. A. Larichev, P. Ivanov, I. Irochnikov, S. C. Nemeth, A. Edwards, and P. Soliz, "High speed measurement of human eye aberrations with Shack-Hartmann sensor," Invest. Ophthalmol. Visual Sci. 42, 897 (2001).

5. A. Zhang, C. Rao, Y. Zhang, and W. Jiang, "Performance analysis of Shack-Hartmann wavefront sensor with variable subaperture pixels," Proc. SPIE 5490, 1268–1277 (2004). https://doi.org/10.1117/12.549403

6. N. Lindlein and J. Pfund, "Experimental results for expanding the dynamic range of a Shack-Hartmann sensor using astigmatic microlenses," Opt. Eng. 41(2), 529–533 (2002). https://doi.org/10.1117/1.1430724

7. J. Pfund, N. Lindlein, J. Schwider, R. Burow, T. Blumel, and K. E. Elssner, "Absolute sphericity measurement: a comparative study of the use of interferometry and a Shack-Hartmann sensor," Opt. Lett. 23(10), 742–744 (1998). https://doi.org/10.1364/OL.23.000742

8. J. A. Koch, R. W. Presta, R. A. Sacks, R. A. Zacharias, E. S. Bliss, M. J. Dailey, M. Feldman, A. A. Grey, F. R. Holdener, J. T. Salmon, L. G. Seppala, J. S. Toeppen, L. V. Atta, B. M. V. Wonterghem, W. T. Whistler, S. E. Winters, and B. W. Woods, "Experimental comparison of a Shack-Hartmann sensor and a phase-shifting interferometer for large-optics metrology applications," Appl. Opt. 39(25), 4540–4546 (2000). https://doi.org/10.1364/AO.39.004540

9. R. Irwan and R. G. Lane, "Analysis of optimal centroid estimation applied to Shack-Hartmann sensing," Appl. Opt. 38(32), 6737–6743 (1999). https://doi.org/10.1364/AO.38.006737

10. Z. Jiang, S. Gong, and Y. Dai, "Monte-Carlo analysis of centroid detected accuracy for wavefront sensor," Opt. Laser Technol. 37(7), 541–546 (2005). https://doi.org/10.1016/j.optlastec.2004.08.009

11. P. Arulmozhivarman, L. Praveen Kumar, and A. R. Ganesan, "Measurement of moments for centroid estimation in Shack-Hartmann wavefront sensor - a wavelet-based approach and comparison with other methods," Optik 117(2), 82–87 (2006).

12. X. Yu, D. Zhao, and C. Li, "Adaptation of adaptive optics system," Proc. SPIE 3126, 432–440 (1997). https://doi.org/10.1117/12.279053

13. H. Li, H. Song, C. Rao, and X. Rao, "Accuracy analysis of centroid calculated by a modified center detection algorithm for Shack-Hartmann wavefront sensor," Opt. Commun. 281(4), 750–755 (2008). https://doi.org/10.1016/j.optcom.2007.10.108

14. Q. Mu, Z. Cao, C. Li, B. Jiang, L. Hu, and L. Xuan, "Accommodation-based liquid crystal adaptive optics system for large ocular aberration correction," Opt. Lett. 33(24), 2898–2900 (2008). https://doi.org/10.1364/OL.33.002898

15. N. Lindlein, J. Pfund, and J. Schwider, "Algorithm for expanding the dynamic range of a Shack-Hartmann sensor by using a spatial light modulator array," Opt. Eng. 40(5), 837–840 (2001). https://doi.org/10.1117/1.1357193

16. N. Lindlein, J. Pfund, and J. Schwider, "Expansion of the dynamic range of a Shack-Hartmann sensor by using astigmatic microlenses," Opt. Eng. 39(8), 2220–2225 (2000). https://doi.org/10.1117/1.1304846

17. T. J. Kane, B. M. Welsh, and C. S. Gardner, "Wavefront detector optimization for laser guided adaptive telescope," Proc. SPIE 1114, 160–171 (1989).

18. W. H. Southwell, "Wave-front estimation from wave-front slope measurements," J. Opt. Soc. Am. 70(8), 998–1006 (1980). https://doi.org/10.1364/JOSA.70.000998

19. W. Jiang, H. Xian, and F. Shen, "Detecting error of Shack-Hartmann wavefront sensor," Chin. J. Quantum Electron. 15(2), 218–227 (1998).

20. Y. Dai, F. Li, X. Cheng, Z. Jiang, and S. Gong, "Analysis on Shack-Hartmann wave-front sensor with Fourier optics," Opt. Laser Technol. 39(7), 1374–1379 (2007). https://doi.org/10.1016/j.optlastec.2006.10.014

21. H. Hofer, P. Artal, B. Singer, J. L. Aragón, and D. R. Williams, "Dynamics of the eye's wave aberration," J. Opt. Soc. Am. A 18(3), 497–506 (2001). https://doi.org/10.1364/JOSAA.18.000497

22. B. Li, X. Yu, and X. Hu, "A new centroid computation method using dynamic tracking," J. Beijing Instit. Technol. 22(1), 101–104 (2002).

23. W. Quan, Z. Wang, C. Zhang, and G. Mu, "The use of template matching for Hartmann sensor spot centroid detection window," J. Optoelectron. Laser 13(11), 1148–1151 (2002).

24. X. Ma, C. Rao, and H. Zheng, "Error analysis of CCD-based point source centroid computation under the background light," Opt. Express 17(10), 8525–8541 (2009). https://doi.org/10.1364/OE.17.008525

25. G. D. Love, "Wave-front correction and production of Zernike modes with a liquid-crystal spatial light modulator," Appl. Opt. 36(7), 1517–1524 (1997). https://doi.org/10.1364/AO.36.001517
©(2010) Society of Photo-Optical Instrumentation Engineers (SPIE)
Mingliang Xia, Chao Li, Lifa Hu, ZhaoLiang Cao, QuanQuan Mu, and Xuan Li "Shack-Hartmann wavefront sensor with large dynamic range," Journal of Biomedical Optics 15(2), 026009 (1 March 2010). https://doi.org/10.1117/1.3369810
KEYWORDS
Wavefronts; detection and tracking algorithms; wavefront sensors; eye; monochromatic aberrations; adaptive optics; algorithms
