Traditional cosmic ray filtering algorithms used in X-ray imaging detectors aboard space telescopes perform event reconstruction based on the properties of activated pixels above a certain energy threshold, within 3×3 or 5×5 pixel sliding windows. This approach can reject up to 98% of the cosmic ray background. However, the remaining unrejected background constitutes a significant impediment to studies of low surface brightness objects, which are especially prevalent in the high-redshift universe. The main limitation of the traditional filtering algorithms is their ignorance of the long-range contextual information present in image frames. This becomes particularly problematic when analyzing signals created by secondary particles produced during interactions of cosmic rays with the body of the detector. Such signals may look identical to the energy depositions left by X-ray photons when one considers only the properties within the small sliding window. Additional information is present, however, in the spatial and energy correlations between signals in different parts of the same frame, which can be accessed by modern machine learning (ML) techniques. In this work, we continue the development of an ML-based pipeline for cosmic ray background mitigation. Our latest method consists of two stages: first, a frame classification neural network is used to create class activation maps (CAMs), localizing all events within the frame; second, after event reconstruction, a random forest classifier, using features obtained from the CAMs, is used to separate X-ray events from cosmic ray events. The method delivers a > 40% relative improvement over traditional filtering in background rejection in the standard 0.3–10 keV energy range, at the expense of only a small (< 2%) loss of X-ray signal. Our method also provides a convenient way to tune the cosmic ray rejection threshold to a user's specific scientific needs.
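The second stage of the pipeline described above can be sketched as a random forest operating on per-event features derived from the CAMs. The feature names, toy distributions, and probability threshold below are illustrative assumptions, not the authors' actual feature set:

```python
# Hypothetical sketch of the second pipeline stage: a random forest that
# separates reconstructed X-ray events from cosmic-ray events using
# CAM-derived features. All features and data here are synthetic toys.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 400

# Toy feature table, one row per reconstructed event. Assumed columns:
# mean CAM activation over the event footprint, peak CAM activation,
# and summed event energy in keV.
xray = np.column_stack([
    rng.normal(0.2, 0.1, n),     # low CAM response for genuine X-rays
    rng.normal(0.3, 0.1, n),
    rng.uniform(0.3, 10.0, n),   # energies in the 0.3-10 keV band
])
cosmic = np.column_stack([
    rng.normal(0.8, 0.1, n),     # high CAM response near cosmic-ray tracks
    rng.normal(0.9, 0.1, n),
    rng.uniform(0.3, 10.0, n),
])
X = np.vstack([xray, cosmic])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = X-ray, 1 = cosmic ray

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# The tunable rejection threshold mentioned in the abstract maps naturally
# onto a cut on the predicted cosmic-ray probability: stricter cuts reject
# more background at the cost of more lost X-ray signal.
p_cosmic = clf.predict_proba(X)[:, 1]
keep = p_cosmic < 0.5
```

Exposing the probability cut, rather than a hard class label, is what lets a user trade background rejection against signal loss for their science case.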
Traditional image segmentation methods employed with X-ray imaging detectors aboard X-ray space telescopes consist of two stages: first, a low energy threshold is applied; groups of activated pixels are then classified according to their shapes and identified as valid X-ray events or rejected as possibly induced by cosmic rays. This method is fast and removes up to 98% of the cosmic ray-induced background. However, these traditional methods fail to address two important problems: first, they struggle to recover the true energies of, and sometimes fail to detect entirely, low-energy photons (photon energies below 0.5 keV); second, they consider only the shape of the active pixel regions, ignoring the longer-range context within the image frames. The latter limits their sensitivity to a specific type of cosmic ray signal: "islands" created by secondary particles produced by cosmic rays hitting the body of the telescope, whose shapes are often indistinguishable from those of X-ray photon signals. Together, these limitations hinder investigations of faint, diffuse targets, such as the outskirts of galaxies and galaxy clusters, and of "low energy" targets such as individual stars, galaxies, and high-redshift systems. Both limitations can, however, be addressed with machine learning (ML) models. This work is part of our effort to develop fast and efficient background reduction methods for future astronomical X-ray missions using ML. We highlight several significant improvements in the classification and semantic segmentation stages of our background filtering pipeline. Our more realistic training and test data now incorporate the effects of readout noise and charge diffusion. In the presence of charge diffusion, our model obtains an 80% relative improvement in lost signal recovery compared to traditional background reduction techniques. We identify several directions for further development of the model.
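The traditional two-stage filtering described above can be sketched as thresholding followed by 3×3-window event extraction. This is a simplified illustration, not a mission pipeline: real instruments use detector-specific grade tables to classify island shapes, which are omitted here:

```python
# Minimal sketch of traditional event detection, assuming a simple
# local-maximum criterion in a 3x3 window (illustrative only; real
# pipelines apply mission-specific shape grading to each island).
import numpy as np

def find_events(frame, threshold):
    """Return (row, col, summed_energy) for pixels above `threshold`
    that are local maxima of their 3x3 neighbourhood."""
    events = []
    rows, cols = frame.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = frame[r - 1:r + 2, c - 1:c + 2]
            if frame[r, c] >= threshold and frame[r, c] == window.max():
                # Sum the charge in the 3x3 island to estimate photon energy.
                events.append((r, c, float(window.sum())))
    return events

# Toy frame: one isolated photon, with some charge shared into a
# neighbouring pixel (mimicking the charge diffusion discussed above).
frame = np.zeros((8, 8))
frame[3, 4] = 1.2   # central pixel signal, keV
frame[3, 5] = 0.3   # diffused charge in a neighbour
events = find_events(frame, threshold=0.5)
```

Note how a photon whose charge diffuses across several pixels can drop each individual pixel below the threshold, which is exactly the low-energy detection failure mode the abstract describes.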