Given today’s challenging Irregular Warfare environment, members of small infantry units must be able to function as highly sensitized perceivers throughout large operational areas. Improved Situation Awareness (SA) in rapidly changing fields of operation may also save the lives of law enforcement personnel and first responders. Critical competencies for these individuals include sociocultural sensemaking, the ability to assess a situation through the perception of salient environmental and behavioral cues, and intuitive sensemaking, which allows experts to act with the utmost agility. Intuitive sensemaking and intuitive decision making (IDM), which involve processing information at a subconscious level, have been cited as playing a critical role in saving lives and enabling mission success. This paper discusses the development of a virtual environment for modeling, analysis, and human-in-the-loop testing of perception, sensemaking, intuitive sensemaking, decision making (DM), and IDM performance, using state-of-the-art scene simulation and modeled imagery from multi-source systems, under the “Intuition and Implicit Learning” Basic Research Challenge (I2BRC) sponsored by the Office of Naval Research (ONR). We present results from our human systems engineering approach, including 1) development of requirements and test metrics for individual and integrated system components, 2) the system architecture design, 3) images of the prototype virtual environment testing system, and 4) a discussion of the system’s current and future testing capabilities. In particular, we examine an Enhanced Interaction Suite testbed to model, test, and analyze the impact of advances in sensor spatial and temporal resolution on a user’s intuitive sensemaking and decision making capabilities.
Modeling, Simulation and Training (MS&T) technologies have provided significant capabilities for
military training and mission rehearsal. However, most of the
state-of-the-art MS&T systems used today
are high-fidelity, stand-alone systems, routinely staffed by a team of support and instructional personnel.
As the military becomes more reliant on these technologies to support ever-changing concepts of
operations, it is asking for numerous technological advancements, including 1) automated instructional
features to reduce the number of personnel required for exercises, 2) increased capability for adaptation of
human computer interfaces to support individual differences and embedded performance support in
operational settings, and 3) a continuum of low to high fidelity system components to provide embedded,
deployable and transportable solutions. A multi-disciplinary team of researchers at the University of
Central Florida's (UCF) Institute for Simulation and Training (IST) Applied Cognition and Training in
Immersive Virtual Environments (ACTIVE) Lab, led by Dr. Denise Nicholson, is performing research and
development to address these emerging requirements as part of on-going projects for Navy, Marine Corps
and Army customers. In this paper we will discuss some of the challenges that confront researchers in this
area and how the ACTIVE lab hopes to respond to these challenges.