The Spectral Energy Distribution (SED) Machine is an Integral Field Unit (IFU) spectrograph designed specifically to classify transients. It comprises two subsystems. The first is a lenslet-based IFU with a 26" × 26" field of view (FoV) and ∼0.75" spaxels that feeds a constant-resolution (R ∼ 100) triple prism; the dispersed light is then imaged onto an off-the-shelf CCD detector. The second subsystem, the Rainbow Camera (RC), is a 4-band seeing-limited imager with a 12.5' × 12.5' FoV around the IFU that will allow real-time spectrophotometric calibration with ∼5% accuracy. Data from both subsystems will be processed in real time using a dedicated reduction pipeline. The SED Machine will be mounted on the Palomar 60-inch robotic telescope (P60), covers a wavelength range of 370-920 nm at high throughput, and will classify transients from on-going and future surveys at a high rate. This will provide good statistics for common types of transients and a better ability to discover and study rare and exotic ones. We present the science cases, optical design, and data reduction strategy of the SED Machine. The SED Machine is currently being constructed at the California Institute of Technology and will be commissioned in the spring of 2013.
The Dark Energy Survey (DES) collaboration will study cosmic acceleration with a 5000 deg2 grizY survey in the
southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes, and a data access framework. We are developing the
DESDM system for operation in the high performance computing (HPC) environments at the National Center for
Supercomputing Applications (NCSA) and Fermilab. Operating the DESDM system in an HPC environment offers
both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive
tasks such as large scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and
massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the
public through a Virtual Observatory-compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms, and orchestration and web-portal nodes that are specific to DESDM. In Fall 2007, we tested
the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES
nights (3TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database.
We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera.
Comparisons with truth tables for the simulated data and internal cross-checks for the real data indicate that the astrometric and photometric data quality is excellent.
The Dark Energy Survey (DES; operations 2009-2015) will address the nature of dark energy using four independent and complementary techniques: (1) a galaxy cluster survey over 4000 deg2 in collaboration with the South Pole Telescope Sunyaev-Zel'dovich effect mapping experiment, (2) a cosmic shear measurement over 5000 deg2, (3) a galaxy angular clustering measurement within redshift shells to redshift 1.35, and (4) distance measurements to 1900 Type Ia supernovae. The DES will produce 200 TB of raw data in four bands. These data will be processed into science-ready images and catalogs and co-added into deeper, higher quality images and catalogs. In total, the DES dataset will exceed 1 PB, including a 100 TB catalog database that will serve as a key science analysis tool for the astronomy/cosmology community. The data rate, volume, and duration of the survey require a new type of data management (DM) system that (1) offers a high degree of automation and robustness and (2) leverages the existing high performance computing infrastructure to meet the project's DM targets. The DES DM system consists of (1) grid-enabled, flexible, and scalable middleware developed at NCSA for the broader scientific community, (2) astronomy modules that build upon community software, and (3) a DES archive to support automated processing and to serve DES catalogs and images to the collaboration and the public. In the recent DES Data Challenge 1 we deployed and tested the first version of the DES DM system, successfully reducing 700 GB of raw simulated images into 5 TB of reduced data products and cataloguing 50 million objects with calibrated astrometry and photometry.