13 December 2002 High-performance data processing using distributed computing on the SOLIS project
The SOLIS solar telescope collects data at a high rate, producing 500 GB of raw data each day. The SOLIS Data Handling System (DHS) has been designed to rapidly process this data down to 156 GB of reduced data. The DHS design uses pools of distributed reduction processes that are allocated to different observations as needed. A farm of 10 dual-CPU Linux boxes hosts the pools of reduction processes. Control is through CORBA, and data is stored on a Fibre Channel storage area network (SAN). Three other Linux boxes are responsible for pulling data from the instruments using SAN-based ring buffers. Control applications are Java-based, while the reduction processes are written in C++. This paper presents the overall design of the SOLIS DHS and provides details on the approach used to control the pooled reduction processes. The various strategies used to manage the high data rates are also covered.
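To illustrate the ring-buffer scheme the abstract describes for decoupling instrument acquisition from reduction, here is a minimal single-producer/single-consumer sketch in C++ (the language of the SOLIS reduction processes). All names, sizes, and the in-memory design are hypothetical; the actual DHS ring buffers reside on the shared Fibre Channel SAN so that acquisition and reduction hosts can exchange frames without copying data over the network.

```cpp
#include <array>
#include <cstddef>
#include <optional>

// Hypothetical fixed-capacity ring buffer: the acquisition side pushes
// incoming frames while a reduction process pops them independently.
// One slot is kept empty to distinguish "full" from "empty".
template <typename T, std::size_t N>
class RingBuffer {
public:
    // Producer side (instrument feed). Returns false when the buffer is
    // full, signaling that the consumer has fallen behind.
    bool push(const T& item) {
        std::size_t next = (head_ + 1) % N;
        if (next == tail_) return false;  // full
        slots_[head_] = item;
        head_ = next;
        return true;
    }

    // Consumer side (reduction process). Empty buffer yields nullopt.
    std::optional<T> pop() {
        if (tail_ == head_) return std::nullopt;  // empty
        T item = slots_[tail_];
        tail_ = (tail_ + 1) % N;
        return item;
    }

    // Number of items currently buffered.
    std::size_t size() const { return (head_ + N - tail_) % N; }

private:
    std::array<T, N> slots_{};
    std::size_t head_ = 0;  // next write position
    std::size_t tail_ = 0;  // next read position
};
```

With such a buffer, a burst from an instrument is absorbed in the SAN-resident slots while pooled reduction processes drain it at their own pace; a `false` return from `push` would indicate the reduction pool needs more capacity allocated to that observation.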
© (2002) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Stephen Wampler "High-performance data processing using distributed computing on the SOLIS project", Proc. SPIE 4848, Advanced Telescope and Instrumentation Control Software II, (13 December 2002);


