NIRCam Coronagraphy was declared ready for science in early summer 2022. Several impactful science results have since been obtained with the NIRCam coronagraphs, mainly on known exoplanetary systems. In this contribution we give an update on the improvements we have implemented to make this mode more efficient and better performing. Given tight timing constraints during commissioning, we focused on the long-wavelength occulter MASK335R. Here we describe how we improved target acquisition for all five masks, the distortion correction and global alignment, the absolute flux calibration, and more. Mid-Cycle 1 we also made dual-channel operations (simultaneous short- and long-wavelength imaging) the default. While not trivial, this new capability improves the efficiency and the impact NIRCam Coronagraphy can have in the field of exoplanets. We discuss the current on-sky contrast and astrometric performance, which are now better understood and can be compared to other high-contrast facilities. We demonstrate that NIRCam Coronagraphy is transformative not only in characterizing known objects but also in discovering colder and/or more mature exoplanets.
KEYWORDS: Coronagraphy, Stars, James Webb Space Telescope, Point spread functions, Distortion, Telescopes, Signal to noise ratio, Calibration, Target acquisition, Exoplanets, Astronomical imaging, Near infrared, Direct methods, Astronomical instrumentation
In a cold and stable space environment, the James Webb Space Telescope (JWST or "Webb") reaches unprecedented sensitivities at wavelengths beyond 2 microns, serving most fields of astrophysics. It also extends the parameter space of high-contrast imaging in the near- and mid-infrared. Launched in late 2021, JWST underwent a six-month commissioning period. In this contribution we focus on the NIRCam Coronagraphy mode, which was declared "science ready" on July 10, 2022, the last of the 17 JWST observing modes. Essentially, this mode enables the detection of fainter/redder/colder (less massive for a given age) self-luminous exoplanets, as well as other faint astrophysical signals in the vicinity of any bright object (stars or galaxies). Here we describe some of the steps and hurdles the commissioning team went through to achieve excellent performance. Specifically, we focus on the Coronagraphic Suppression Verification activity. We were able to produce firm detections at 3.35 µm of the white dwarf companion HD 114174 B, which is at a separation of ≃0.5″ and a contrast of ≃10 magnitudes (10^4 fainter than the K ∼ 5.3 host star). We compare these first on-sky images with our latest, most informed and realistic end-to-end simulations processed through the same pipeline. Additionally, we provide information on how we succeeded with target acquisition with all five NIRCam focal plane masks and their four corresponding wedged Lyot stops.
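As a point of reference for the quoted numbers, the ≃10 magnitude contrast maps onto a flux ratio through the standard Pogson relation; the short sketch below (a hypothetical helper, not part of the commissioning pipeline) simply restates that ≃10 mag corresponds to a companion roughly 10^4 times fainter than its host.

    # Convert a magnitude difference into a flux ratio (Pogson relation).
    # The value 10.0 restates the ~10 mag contrast quoted for HD 114174 B.
    def contrast_to_flux_ratio(delta_mag):
        """Flux ratio (companion / host) for a magnitude difference delta_mag."""
        return 10.0 ** (-0.4 * delta_mag)

    print(contrast_to_flux_ratio(10.0))  # ~1e-4, i.e. 10^4 times fainter than the host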
KEYWORDS: Calibration, Interference (communication), Shortwaves, Sensors, Signal detection, Cadmium sulfide, James Webb Space Telescope, Nickel, Signal generators, Infrared detectors
We explore a new method to generate superbias files for the NIRCam detectors. Using data from Cryo-Vacuum 2 (CV2) testing, we subtract 1/f noise from NIRCam integrations before averaging the data to produce superbias maps. Our analysis shows that for a given dataset, using this method we are able to produce superbias images with significantly lower noise levels than those produced using the more traditional approach to superbias generation. We also find that we can produce a superbias which minimizes the noise in a superbias-subtracted file by using only the first 10 readouts from each of 15-20 dark current integrations. Our testing reveals that this method is successful for data from both the shortwave and longwave detectors on NIRCam.
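As a rough illustration of the approach described above, the sketch below builds a superbias from the first readouts of a stack of dark integrations while suppressing row-correlated 1/f noise before averaging; the array layout, the function name, and the row-median estimator are assumptions made for illustration, not the authors' actual implementation.

    import numpy as np

    def make_superbias(dark_integrations, n_reads=10):
        """Hypothetical sketch (not the authors' code): build a superbias from
        the first n_reads readouts of each dark integration, with row-correlated
        1/f noise suppressed before averaging.

        dark_integrations: ndarray of shape (n_ints, n_total_reads, ny, nx) in DN.
        """
        reads = dark_integrations[:, :n_reads, :, :].astype(float)

        # First-pass static bias estimate: plain mean over integrations and readouts.
        static = reads.mean(axis=(0, 1))

        # 1/f noise appears as a time-varying, row-correlated offset. Estimate it
        # per readout as the median over each row of (readout - static) and remove
        # it, so the fixed bias pattern itself is preserved.
        one_over_f = np.median(reads - static, axis=-1, keepdims=True)
        cleaned = reads - one_over_f

        # Re-average the cleaned readouts to obtain the lower-noise superbias map.
        return cleaned.mean(axis=(0, 1))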
KEYWORDS: Data conversion, Infrared detectors, Infrared telescopes, James Webb Space Telescope, Sensors, Electrons, Calibration, Interference (communication), Signal detection, Near infrared
Conversion gain is a basic detector property that relates the raw counts in a pixel, in data numbers (DN), to the number of electrons detected. The standard method for determining the gain is the Photon Transfer Curve (PTC) method, which involves measuring the change in variance as a function of signal level. For non-linear IR detectors, this method depends strongly on the non-linearity correction and is therefore susceptible to systematic biases due to calibration issues. We have developed a new, robust, and fast method, the differential Photon Transfer Curve (dPTC) method, which is independent of non-linearity corrections but still delivers gain values of similar precision and higher accuracy.
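For context, the classical PTC gain estimate that the dPTC method improves on can be sketched as follows; the helper below uses the standard flat-field-pair recipe (gain ≈ mean signal / variance for shot-noise-limited data) and is not the dPTC implementation described in the paper.

    import numpy as np

    def ptc_gain(flat1, flat2, bias):
        """Classical PTC gain estimate (e-/DN) from a pair of flats taken at the
        same illumination level. Differencing the flats removes fixed-pattern
        noise; for shot-noise-limited data, gain ~= mean_signal / variance.
        Hypothetical helper, not the dPTC method from the paper; read noise
        is neglected here.
        """
        s1 = flat1.astype(float) - bias
        s2 = flat2.astype(float) - bias
        mean_signal = 0.5 * (s1.mean() + s2.mean())   # DN
        pair_variance = np.var(s1 - s2) / 2.0          # DN^2
        return mean_signal / pair_variance             # e-/DN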
KEYWORDS: Temperature metrology, Cameras, Imaging systems, Calibration, Electrons, Signal to noise ratio, Temperature sensors, Astronomy, Charge-coupled devices, Control systems
Dark current is caused by electrons that are thermally excited into the conduction band. These electrons are collected by the well of the CCD and add a false signal to the chip. We present an algorithm that automatically corrects for dark current. It uses a calibration protocol to characterize the image sensor at different temperatures. For a given exposure time, the dark current of every pixel is characteristic of a specific temperature. The dark current of every pixel can therefore be used as an indicator of the temperature. Hot pixels have the highest signal-to-noise ratio and are the best temperature sensors. We use the dark current of several hundred hot pixels to sense the chip temperature and predict the dark current of all pixels on the chip. Dark current computation is not a new concept, but our approach is unique. Advantages of our method include applicability to poorly temperature-controlled camera systems and the possibility of ex post facto dark current correction.
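A minimal sketch of the hot-pixel-thermometer idea, assuming a per-pixel calibration of dark signal versus temperature at a fixed exposure time; the data layout and function names are hypothetical, not the authors' implementation.

    import numpy as np

    def infer_temperature(hot_pixel_signal, hot_idx, calib_dark, calib_temps):
        """Estimate the chip temperature from the dark signal of selected hot pixels.

        hot_pixel_signal: measured dark signal (DN) of the hot pixels.
        hot_idx: indices of those hot pixels in the flattened calibration maps.
        calib_dark: array (n_temps, n_pixels) of calibrated dark signal per pixel
                    at each calibration temperature, for the same exposure time.
        calib_temps: calibration temperatures (K).
        Returns the calibration temperature whose hot-pixel darks best match the
        measurement (least squares over the hot pixels).
        """
        residuals = calib_dark[:, hot_idx] - hot_pixel_signal
        chi2 = np.sum(residuals ** 2, axis=1)
        return calib_temps[np.argmin(chi2)]

    def predict_dark_frame(temperature, calib_dark, calib_temps):
        """Linearly interpolate every pixel's calibrated dark signal to the
        inferred temperature (vectorized over pixels); subtract the result
        from the science frame to correct the dark current."""
        order = np.argsort(calib_temps)
        t, d = calib_temps[order], calib_dark[order]
        i = np.clip(np.searchsorted(t, temperature), 1, len(t) - 1)
        w = (temperature - t[i - 1]) / (t[i] - t[i - 1])
        return (1.0 - w) * d[i - 1] + w * d[i]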
A long-term program to quantify the intrinsic site seeing at McDonald Observatory, using two differential image motion monitors (DIMMs), has been initiated on Mt. Fowlkes, where the Hobby-Eberly Telescope (HET) is located. Raw DIMM data are corrected to the zenith and to a uniform 10 ms integration time. Nightly median seeing measurements (FWHM), along with the max/min range, are presented for 186 nights over the 13-month period between July 2001 and July 2002. A definite seasonal effect is present in the dataset, with the median seeing in the spring-summer-fall months (0.93±0.18 arcsec) being significantly better than in the winter months (1.24±0.33 arcsec). The measured seeing was better than 0.70 arcsec about 9% of the time. Since the DIMM units were operated at ground level, these data are not quite lower limits to the site seeing performance. Even so, the seeing of this West Texas continental site at 6,650 ft (2,027 m) elevation in the Davis Mountains is superior to what has been assumed in the past based on less direct seeing measurements.
Future plans are described for moving a DIMM telescope to a tower-mounted, semi-automated observatory to sample the site seeing at an elevation above the ground similar to that of the HET mirror.
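For reference, the zenith correction mentioned above follows from the standard Kolmogorov airmass scaling of the seeing FWHM, proportional to (sec z)^(3/5); the sketch below applies only that scaling (the 10 ms integration-time correction is instrument specific and not reproduced here).

    import numpy as np

    def seeing_at_zenith(fwhm_measured, zenith_angle_deg):
        """Correct a measured DIMM seeing FWHM (arcsec) to the zenith, assuming
        the standard Kolmogorov airmass scaling FWHM ~ (sec z)**(3/5)."""
        airmass = 1.0 / np.cos(np.radians(zenith_angle_deg))
        return fwhm_measured * airmass ** (-3.0 / 5.0)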
KEYWORDS: Astronomy, Databases, Telescopes, Image processing, Stars, Point spread functions, Space telescopes, Data processing, Data acquisition, Calibration
The era of large survey datasets has arrived, and the era of large survey telescope projects is upon us. Many of these new telescope projects will not only produce large datasets, they will produce datasets that require real-time astronomical analysis, including object detection, photometry, and classification. These datasets promise to open new horizons in the exploration of the time domain in astrophysical systems on large scales. But to fulfill this promise, the projects must design and develop data management systems on a much larger scale (many Terabytes per day continuously) than has previously been achieved in astronomy. Working together, NOAO and the University of Washington are developing prototype pipeline systems to explore the issues involved in real-time time-variability analysis. These efforts are not simply theoretical exercises, but rather are driven by NOAO Survey programs which are generating large data flows. Our survey projects provide a science-driven testbed of data management strategies needed for future initiatives such as the Large Synoptic Survey Telescope and other large-scale astronomical data production systems.
We present data for the dark current of a back-illuminated CCD over the temperature range 222 to 291 K. Using an Arrhenius law, we find that the prefactor and the apparent activation energy are related as described by the Meyer-Neldel rule. However, a more detailed analysis shows that the activation energy for the dark current changes within the temperature range investigated. This transition can be explained by the larger relative importance of the diffusion dark current at high temperatures and of the depletion dark current at low temperatures. The diffusion dark current, characterized by the band gap of silicon, is uniform for all pixels. At low temperatures, the depletion dark current, characterized by half the band gap, prevails, but it varies from pixel to pixel. Dark current spikes are pronounced at low temperatures and can be explained by large concentrations of deep-level impurities in those particular pixels. We show that fitting the data with the impurity concentration as the only variable can explain the dark current characteristics of all the pixels on the chip.
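A minimal sketch of the per-pixel Arrhenius analysis, assuming dark current of the form D(T) = D0 exp(-ΔE / k_B T); the function name and data layout are illustrative only.

    import numpy as np

    K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

    def fit_arrhenius(temps_K, dark_current):
        """Fit ln(D) = ln(D0) - dE/(k_B T) for one pixel; returns (D0, dE_eV)."""
        x = 1.0 / (K_B_EV * np.asarray(temps_K, dtype=float))
        slope, intercept = np.polyfit(x, np.log(dark_current), 1)
        return np.exp(intercept), -slope

    # Meyer-Neldel rule: across pixels, ln(D0) is expected to rise linearly with
    # the apparent activation energy dE, i.e. D0 = D00 * exp(dE / E_MN).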