Atmospheric fogs create degraded visual environments, making it difficult to recover optical information from our surroundings. We have developed a low size, weight, and power (low-SWaP) technique that characterizes these environments using an f-theta lens to capture the angular scattering profile of a pencil beam passed through a fog. These measurements are then compared to data taken in tandem by conventional characterization techniques (optical transmission, bulk scattering coefficient, etc.). We present this angular scattering measurement as a low-SWaP alternative to current degraded visual environment characterization techniques, providing real-time data for use with signal recovery algorithms.
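As an illustration of the f-theta geometry, the Python sketch below shows how a focal-plane image could be binned into an angular scattering profile using the f-theta mapping r = f·θ. The function and parameter names (angular_scattering_profile, pixel_pitch_mm, focal_length_mm) are hypothetical; this is a minimal sketch of the general principle, not the instrument's actual processing chain.

```python
import numpy as np

def angular_scattering_profile(image, pixel_pitch_mm, focal_length_mm,
                               center=None, n_bins=200):
    """Bin a focal-plane image into an angular scattering profile.

    An f-theta lens maps scattering angle theta (rad) to radial focal-plane
    position r = f * theta, so each pixel's radius gives its angle directly.
    """
    ny, nx = image.shape
    if center is None:
        center = ((nx - 1) / 2.0, (ny - 1) / 2.0)   # assume beam centered on array
    yy, xx = np.indices(image.shape)
    r_mm = np.hypot(xx - center[0], yy - center[1]) * pixel_pitch_mm
    theta = r_mm / focal_length_mm                  # f-theta mapping: theta = r / f
    bins = np.linspace(0.0, theta.max(), n_bins + 1)
    weighted, _ = np.histogram(theta, bins=bins, weights=image)
    n_pix, _ = np.histogram(theta, bins=bins)
    profile = weighted / np.maximum(n_pix, 1)       # mean intensity per angle bin
    return 0.5 * (bins[1:] + bins[:-1]), profile
```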
An event-based sensor (EBS) consists of a pixelated focal plane array in which each pixel is an independent asynchronous change detector. The analog asynchronous array is read by a synchronous digital readout and written to disk. As a result, EBS pixels consume minimal power and bandwidth unless the scene changes. Furthermore, the change detectors have a very large dynamic range (~120 dB) and rapid response time (~20 μs). A framing camera with comparable speed requires ~3 orders of magnitude more power and ~2 orders of magnitude higher bandwidth. Remote sensing deployed in the field requires low power, low bandwidth, and low-complexity algorithms. An EBS inherently allows for low power and low bandwidth, but it lacks mature image analysis algorithms. While analysis of conventional imagery draws from decades of image processing algorithms, EBS data is a fundamentally different format: a series of x, y, asynchronous time, and polarity (increase/decrease) values, as opposed to x, y, and intensity at a regularly sampled frame rate. Our team has worked to develop and refine image processing algorithms that use EBS data directly.
An event-based sensor (EBS) consists of a pixelated focal plane array in which each pixel is an independent asynchronous change detector. The analog asynchronous array is read by a synchronous digital readout and written to disk. As a result, EBS pixels consume minimal power and bandwidth unless the scene changes. Furthermore, the change detectors have a very large dynamic range (~120 dB) and rapid response time (~20 μs). A framing camera with comparable speed requires ~3 orders of magnitude more power and ~2 orders of magnitude higher bandwidth. These features make the EBS an appealing technology for proliferation detection applications. Remote sensing deployed in the field requires low power, low bandwidth, and low-complexity algorithms. An EBS inherently allows for low power and low bandwidth, but a drawback of event-based sensors is the lack of mature image analysis algorithms. While analysis of conventional imagery draws from decades of image processing algorithms, EBS data is a fundamentally different format: a series of x, y, asynchronous time, and polarity (increase/decrease) values, as opposed to x, y, and intensity at a regularly sampled frame rate. To leverage the advantages of EBS over conventional imagers, our team has worked to develop and refine image processing algorithms that use EBS data directly. We will discuss these efforts, including frequency and phase detection. We will also discuss field applications of these algorithms, such as degraded visual environments (e.g., fog) and defeating laser dazzling attempts.
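To make the event data format concrete, the Python sketch below uses a hypothetical structured-array layout of (x, y, timestamp, polarity) records and a hypothetical pixel_frequency helper to show one simple way a per-pixel modulation frequency could be estimated directly from event timestamps. It is an illustrative toy, not the frequency and phase detection algorithms described in the abstract.

```python
import numpy as np

# Hypothetical event records: x, y, timestamp (microseconds), polarity (+1/-1)
events = np.array([(10, 12, 100, 1), (10, 12, 600, -1), (10, 12, 1100, 1),
                   (10, 12, 1600, -1), (10, 12, 2100, 1)],
                  dtype=[('x', 'u2'), ('y', 'u2'), ('t', 'u8'), ('p', 'i1')])

def pixel_frequency(events, x, y):
    """Estimate the modulation frequency at one pixel from the median
    interval between consecutive positive-polarity events."""
    t = np.sort(events['t'][(events['x'] == x) & (events['y'] == y) &
                            (events['p'] > 0)])
    if t.size < 2:
        return 0.0
    period_us = np.median(np.diff(t))   # assume one positive event per cycle
    return 1e6 / period_us              # Hz

print(pixel_frequency(events, 10, 12))  # ~1000 Hz for this toy event stream
```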
Event-based sensors are a novel sensing technology that captures the dynamics of a scene via pixel-level change detection. This technology operates with high speed (>10 kHz), low latency (10 μs), low power consumption (<1 W), and high dynamic range (120 dB). Compared to conventional frame-based architectures, which report data for every pixel at a fixed frame rate, event-based sensor pixels report data only when a change in pixel intensity occurs. This affords the possibility of dramatically reducing the data reported in bandwidth-limited environments (e.g., remote sensing), and thus the data that must be processed, while still recovering significant events. Degraded visual environments, such as those generated by fog, often hinder situational awareness by decreasing optical resolution and transmission range via random scattering of light. To respond to this challenge, we present the deployment of an event-based sensor in a controlled, experimentally generated, well-characterized degraded visual environment (a fog analogue) to detect a modulated signal, and we compare the data collected by the event-based sensor with that collected by a traditional framing sensor.
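To give a feel for the potential bandwidth reduction, the back-of-the-envelope Python sketch below compares a frame-based readout with an event-based one. All numbers (sensor format, frame rate, active-pixel fraction, bits per event) are illustrative assumptions chosen for this sketch, not measured values from the experiment.

```python
# Rough bandwidth comparison with purely illustrative numbers.
width, height = 640, 480
frame_rate_hz = 10_000                 # framing camera matching EBS temporal resolution
bits_per_pixel = 10
frame_bits_per_s = width * height * frame_rate_hz * bits_per_pixel

active_fraction = 0.01                 # fraction of pixels seeing the modulated signal
events_per_active_pixel_hz = 2_000     # two events per cycle of a 1 kHz modulation
bits_per_event = 64                    # packed x, y, timestamp, polarity
event_bits_per_s = (width * height * active_fraction *
                    events_per_active_pixel_hz * bits_per_event)

print(f"framing: {frame_bits_per_s/1e9:.1f} Gbit/s, "
      f"event-based: {event_bits_per_s/1e6:.0f} Mbit/s, "
      f"ratio ~{frame_bits_per_s/event_bits_per_s:.0f}x")
```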
Atmospheric fog is a common degraded visual environment (DVE) that reduces sensing and imaging range and resolution in complex ways not fully encapsulated by traditional metrics. As such, better physical models are required to describe imaging systems in a fog environment. We have developed a tabletop fog chamber capable of creating repeatable fog-like environments for controlled experimentation with optical systems in this common DVE. We present measurements of transmission coefficients and droplet size distributions in the multiple-scattering regime using this chamber.
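As one example of a transmission-coefficient measurement, the Python sketch below applies the Beer-Lambert law to a single-beam transmission reading. The function name and the example power and path-length values are hypothetical and are not data from the chamber.

```python
import numpy as np

def extinction_coefficient(p_in_mw, p_out_mw, path_length_m):
    """Bulk extinction coefficient from a transmission measurement, via the
    Beer-Lambert law T = exp(-beta * L); valid for the directly transmitted
    (unscattered) beam only."""
    T = p_out_mw / p_in_mw
    return -np.log(T) / path_length_m   # units of 1/m

# Example with made-up numbers: 1.85 m path, 10 mW launched, 2 mW detected.
print(extinction_coefficient(10.0, 2.0, 1.85))   # ~0.87 1/m
```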
Degraded visual environments such as fog pose a major challenge to safety and security because light is scattered by suspended water droplets. We show that by interpreting the scattered light it is possible to detect, localize, and characterize objects normally hidden in fog. First, a computationally efficient light transport model is presented that accounts for the light reflected and blocked by an opaque object. Then, statistical detection at a specified false alarm rate is demonstrated using the Neyman-Pearson lemma. Finally, object localization and characterization are implemented using maximum likelihood estimation. These capabilities are being tested at the Sandia National Laboratories Fog Chamber Facility.
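For readers unfamiliar with Neyman-Pearson detection, the Python sketch below illustrates the idea on a toy pair of Gaussian hypotheses (it is not the paper's light transport model): for this simple case the likelihood-ratio test reduces to thresholding the measurement, with the threshold set by the specified false alarm rate. The function name and parameters are hypothetical.

```python
from scipy.stats import norm

def np_detector(measurement, mu0, mu1, sigma, p_fa):
    """Neyman-Pearson test between Gaussian hypotheses
    H0: N(mu0, sigma^2) vs H1: N(mu1, sigma^2), with mu1 > mu0.
    The likelihood-ratio test reduces to a threshold on the measurement,
    chosen so that P(detect | H0) equals the specified false alarm rate."""
    threshold = mu0 + sigma * norm.ppf(1.0 - p_fa)
    detect = measurement > threshold
    p_d = 1.0 - norm.cdf((threshold - mu1) / sigma)   # resulting detection probability
    return detect, threshold, p_d

# Toy usage: background mean 100 counts, object present raises it to 130.
print(np_detector(measurement=125.0, mu0=100.0, mu1=130.0, sigma=10.0, p_fa=1e-3))
```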
Dangerous materials present in factories and military combat locations can cause negative effects on the human body and can be life threatening. Consequently, a portable, easily maintained, and robust sensor is required to detect chemical warfare agents (CWAs), toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). We present a method to grow metal-organic frameworks (MOFs) on quartz crystal microbalances (QCMs) for sensitive, selective detection of CWAs. Our next step is to test the sensitivity and selectivity of the MOF to dimethyl methylphosphonate (DMMP) under varying environmental conditions.
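Although the abstract does not spell out the transduction mechanism, QCM gravimetric sensing is conventionally described by the Sauerbrey relation, which links the resonance-frequency shift to the adsorbed mass; it is reproduced below for context only.

```latex
\Delta f \;=\; -\,\frac{2 f_0^{2}}{A\,\sqrt{\rho_q \mu_q}}\,\Delta m
```

Here f_0 is the fundamental resonance frequency, A the active electrode area, ρ_q and μ_q the density and shear modulus of quartz, and Δm the mass adsorbed by the coating (here, analyte taken up by the MOF layer).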