Abundance variations of carbon and nitrogen in globular star clusters provide astronomers with a means of probing a cluster's evolutionary past. Moreover, these clusters are so ancient (~13 billion years) and so well preserved that they provide an ideal diagnostic for the overall chemical history of the Milky Way Galaxy.
Traditionally, spectroscopy has been the preferred method for such investigations. However, it is not without its drawbacks: spectra can normally only be obtained star by star, and both large telescopes and a great deal of observing time are required to carry out research in this manner. As globular clusters are known to contain up to a million stars, studying each star individually would take far too long to yield a truly representative sample of the cluster stars. We therefore opt instead for a spectrophotometric technique and a statistical approach to infer a cluster's composition variations. This has required the design and use of new custom narrow-band filters centered on the CH and CN molecular absorption bands or their adjacent continua. Two Galactic clusters (M71 & M92) with contrasting characteristics have been chosen for this study. To process these data, a header-driven (i.e. automated) astronomical data-processing pipeline was developed for use with a
family of CCD instruments known as the FOSCs. The advent of CCD detectors has allowed astronomers to generate large quantities of raw data on a nightly basis, but processing this volume of data is extremely time- and resource-intensive. In our case the majority of the cluster data were obtained with the BFOSC instrument on the 1.52-m Cassini Telescope at Loiano, Italy. However, as there are a number of FOSC instruments in operation around the world, our pipeline can easily be adapted to suit any of them. The pipeline has been tested on various types of data ranging from brown dwarf observations to globular cluster images, with each new dataset presenting new problems and bugs to solve. The pipeline performs tasks such as data reduction (including image de-fringing), image registration and photometry, with final products consisting of RGB colour images and colour-magnitude diagrams (CMDs).
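The "header-driven" idea above can be sketched as follows: each frame is routed to the appropriate reduction step purely from its header keywords, with no manual bookkeeping. This is a minimal illustration, not the pipeline's actual code; `IMAGETYP` is a standard FITS keyword, but the step names and keyword values here are hypothetical.

```python
def classify_frame(header):
    """Map a FITS-style header (here a plain dict of keywords) to a
    reduction step. The step names are illustrative placeholders."""
    imagetyp = header.get("IMAGETYP", "").strip().lower()
    steps = {
        "bias": "stack_into_master_bias",      # combine into master bias
        "flat": "stack_into_master_flat",      # combine into master flat
        "object": "reduce_science_frame",      # full science reduction
    }
    # Frames with missing or unrecognised keywords are set aside for
    # human inspection rather than silently mis-reduced.
    return steps.get(imagetyp, "quarantine_unknown_frame")
```

In practice the dispatch would also key on `INSTRUME` and filter keywords, which is what makes the same pipeline portable across the FOSC family.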
To take advantage of the recent upsurge in astrophysical applications of grid technologies, coupled with the increased temporal and spatial coverage afforded by dedicated all-sky surveys and on-line data archives, we have developed an automated image reduction and analysis pipeline for a number of different astronomical instruments. The primary science goal of the project is the study of long-term optical variability of brown dwarfs, although the pipeline can be tailored to suit many other astrophysical phenomena. It complements Querator, the custom search engine which accesses the astronomical image archives based at the ST-ECF/ESO centre in Garching, Germany. To enlarge our dataset we complement the reduction and analysis of WFI (Wide Field Imager, mounted on the 2.2-m MPG/ESO telescope at La Silla) archival images with the analysis of pre-reduced co-spatial HST/WFPC2 images and near-infrared images from the DENIS archive. The pipeline includes CCD-image reduction, registration, astrometry, photometry, and image-matching stages. We present sample results from all stages of the pipeline and describe how we overcome problems such as missing or incorrect image meta-data, interference fringing, and poor image calibration files. The pipeline was written using tasks contained in the IRAF environment, linked together with Unix shell scripts and Perl, and the image reduction and analysis are performed on a 40-processor SGI Origin 3800 based at NUI, Galway.
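The registration stage mentioned above aligns frames of the same field taken at different epochs. The pipeline itself uses IRAF tasks for this; as a self-contained sketch of the underlying idea, the integer pixel offset between two frames can be estimated by phase correlation (the function name and the restriction to whole-pixel shifts are simplifications for illustration):

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the integer (dy, dx) shift of image b relative to
    image a by phase correlation: the cross-power spectrum of two
    shifted copies of a scene is a pure phase ramp, whose inverse
    FFT is a sharp peak at the shift."""
    f = np.fft.fft2(b) * np.conj(np.fft.fft2(a))
    f /= np.abs(f) + 1e-12            # whiten to sharpen the peak
    corr = np.fft.ifft2(f).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap the circular shifts into the range [-N/2, N/2)
    h, w = a.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

Real pipelines refine this to sub-pixel accuracy and allow for rotation and scale, but the peak-finding principle is the same.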
As the astronomical community continues to produce deeper and higher-resolution data, it becomes increasingly important to provide scientists with tools that help mine the data and return only the scientifically interesting images. For uncalibrated archives this task is especially hard, since there is no way to know whether an interesting source is visible on an image without actually inspecting it. Here we show how instrument simulation can be used to lightly process the database-stored image descriptors of the ESO Wide Field Imager (WFI) archive and compute the corresponding limiting magnitudes. The end result is a more scientific description of the ESO/ST-ECF archive contents, allowing a more astronomer-friendly archive user interface and hence increasing the archive's usability in the context of a Virtual Observatory. This method was developed to improve the Querator search engine of the ESO/HST archive, in the context of the EC-funded ASTROVIRTEL project, but it also provides an independent tool that can be adapted to other archives.
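To make the idea concrete, a sky-limited point-source limiting magnitude can be estimated from exactly the kind of descriptors an archive database stores (zero point, exposure time, sky noise, seeing). This is a deliberately simplified sketch, not the actual Querator/WFI simulator, and the parameter names are our own:

```python
import math

def limiting_magnitude(zeropoint, exptime, sky_rms, fwhm_pix, n_sigma=5.0):
    """Rough n-sigma limiting magnitude of a sky-limited exposure,
    computed from database-stored image descriptors only.

    zeropoint : instrumental zero point (mag for 1 count/s)
    exptime   : exposure time in seconds
    sky_rms   : per-pixel sky noise in counts
    fwhm_pix  : seeing FWHM in pixels
    """
    # Effective photometric aperture: circle of radius ~ one FWHM
    n_pix = math.pi * fwhm_pix ** 2
    # Faintest detectable signal: n_sigma times the sky noise
    # summed over the aperture
    counts = n_sigma * sky_rms * math.sqrt(n_pix)
    # Convert the limiting count rate to a magnitude
    return zeropoint - 2.5 * math.log10(counts / exptime)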
There is a family of difficult image-processing scenarios which
involve seeking out and quantifying minute changes within a sequence
of near-identical images. Traditionally these have been dealt with by
carefully registering the images in terms of position, orientation
and intensity, and subtracting them from some template image. However, for critical measurements, this approach breaks down if the
point-spread-functions (PSFs) vary even slightly from image to
image. Subtraction of registered images whose PSFs are not matched
leads to considerable residual structure, which may be mistakenly
interpreted as real features rather than processing artefacts. In
astronomy, software known as ISIS has been developed to
fully PSF-match image sequences and to facilitate their analysis. We
show here the tremendous improvement in detection rates and
measurement accuracy which ISIS has afforded in our program for the
study of rare variable stars in dense, globular star clusters. We
discuss the genesis from this work of our new program to use ISIS to
search for extra-solar planets in transit across the face of stars in
such clusters. Finally we illustrate an application of ISIS in the
industrial imaging sector, showing how it can be used to detect minute faults in images of products.
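The core of ISIS-style analysis is fitting a convolution kernel that degrades the better-seeing reference image to match the PSF of each target image before subtracting. ISIS itself fits a spatially varying kernel on a polynomial basis; the sketch below is a much-simplified constant-kernel version (hypothetical function names, delta-function kernel basis) that shows the linear least-squares heart of the method:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def fit_matching_kernel(ref, img, k=3):
    """Fit a k x k kernel K minimising |ref (x) K - img|^2 by linear
    least squares. Each kernel pixel is one basis function, so the
    design matrix rows are the k x k neighbourhoods of ref.
    (Correlation convention; ISIS also fits a spatially varying K.)"""
    pad = k // 2
    windows = sliding_window_view(ref, (k, k))   # (H-k+1, W-k+1, k, k)
    A = windows.reshape(-1, k * k)               # design matrix
    b = img[pad:-pad, pad:-pad].ravel()          # valid region of target
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs.reshape(k, k)

def difference_image(ref, img, kern):
    """Convolve ref with the fitted kernel and subtract from img;
    only variable sources and noise should survive."""
    k = kern.shape[0]
    pad = k // 2
    windows = sliding_window_view(ref, (k, k))
    matched = np.einsum('ijkl,kl->ij', windows, kern)
    return img[pad:-pad, pad:-pad] - matched
```

When the two frames differ only by their PSFs, the residual image is essentially flat, which is why PSF-matched subtraction suppresses the spurious structure that plain registered subtraction leaves behind.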