Annealing, in metallurgy and materials science, is a heat treatment wherein the microstructure of a material is
altered, causing changes in its properties such as strength and hardness. We define concept annealing as a lexical,
syntactic, and semantic expansion capability (the removal of defects and the internal stresses that cause term- and
phrase-based search failure) coupled with a directed contraction capability (semantically related terms, queries, and
concepts nucleate and grow to replace those originally deformed by internal stresses). These two capabilities are
tied together in a control loop mediated by the information retrieval precision and recall metrics coupled with
intuition provided by the operator. The specific representations developed have been targeted at facilitating highly
efficient and effective semantic indexing and searching. This new generation of Find capability enables additional
processing (i.e. all-source tracking, relationship extraction, and total system resource management) at rates,
precisions, and accuracies previously considered infeasible. In a recent experiment, an order-of-magnitude reduction in
time to actionable intelligence and a nearly three-orders-of-magnitude reduction in false alarm rate were achieved.
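The expansion/contraction control loop described above can be sketched in miniature. The following is an illustrative assumption, not the authors' implementation: a query is expanded with related terms, then contracted by pruning any term whose removal preserves recall while improving precision, with the `related_terms`, `search`, and `relevant` inputs standing in for the operator- and corpus-supplied knowledge.

```python
# Hypothetical sketch of the concept-annealing control loop: expand a
# query with semantically related terms, then contract by pruning terms,
# guided by precision/recall on a labeled sample. All names here
# (related_terms, search, relevant) are illustrative assumptions.

def precision_recall(retrieved, relevant):
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    p = len(hits) / len(retrieved) if retrieved else 0.0
    r = len(hits) / len(relevant) if relevant else 0.0
    return p, r

def anneal(query_terms, related_terms, search, relevant, min_precision=0.5):
    # Expansion: add semantically related terms (removing the "defects"
    # that cause term- and phrase-based search failure).
    terms = list(query_terms)
    for t in query_terms:
        terms.extend(related_terms.get(t, []))
    # Contraction: drop any term whose removal keeps recall intact while
    # precision holds or improves (related concepts replace deformed ones).
    for t in list(terms):
        base_p, base_r = precision_recall(search(terms), relevant)
        trial = [x for x in terms if x != t]
        p, r = precision_recall(search(trial), relevant)
        if r >= base_r and p >= max(base_p, min_precision):
            terms = trial
    return terms
```

In a real system the `search` callback would hit the semantic index, and the operator's intuition would gate which contractions are accepted.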
Three technologies form the heart of any network-centric command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) system: distributed processing, reconfigurable networking, and distributed resource management. Distributed processing, enabled by automated federation, mobile code, intelligent process allocation, dynamic multiprocessing groups, checkpointing, and other capabilities, creates a virtual peer-to-peer computing network across the force. Reconfigurable networking, consisting of content-based information exchange, dynamic ad-hoc routing, information operations (perception management), and other component technologies, forms the interconnect fabric for fault-tolerant interprocessor and node communication. Distributed resource management, which provides the means for distributed cooperative sensor management, foe sensor utilization, opportunistic collection, symbiotic inductive/deductive reasoning, and other applications, provides the canonical algorithms for network-centric enterprises and warfare.
This paper introduces these three core technologies and briefly discusses a sampling of their component technologies and their individual contributions to network-centric enterprises and warfare. Based on the implied requirements, two new algorithms are defined and characterized which provide critical building blocks for network centricity: distributed asynchronous auctioning and predictive dynamic source routing. The first provides a reliable, efficient, effective approach for near-optimal assignment problems; the algorithm has been demonstrated to be a viable implementation for ad-hoc command and control, object/sensor pairing, and weapon/target assignment. The second is founded on traditional dynamic source routing (from mobile ad-hoc networking), but leverages the results of ad-hoc command and control (from the contributed auctioning algorithm) into significant increases in connection reliability through forward prediction. Emphasis is placed on the advantages gained from the closed-loop interaction of the multiple technologies in the network-centric application environment.
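The distributed asynchronous auctioning described above is not reproduced here, but its near-optimal assignment core can be illustrated with a classic sequential auction sketch: bidders (e.g., weapons or sensors) bid object prices up by their margin over the second-best alternative, and an outbid owner simply rejoins the unassigned pool, which is what makes the scheme amenable to asynchronous, distributed execution. The `value` matrix and `eps` parameter are assumptions for illustration.

```python
# Sketch of an auction algorithm for the assignment problem (in the
# spirit of distributed asynchronous auctioning; a centralized,
# sequential rendering for clarity). value[i][j] is bidder i's payoff
# for object j; eps > 0 guarantees termination.

def auction_assign(value, eps=0.01):
    n = len(value)                  # square assignment for simplicity
    price = [0.0] * n               # current object prices
    owner = [None] * n              # owner[j] = bidder holding object j
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        # net payoff of each object at current prices
        net = [value[i][j] - price[j] for j in range(n)]
        best = max(range(n), key=net.__getitem__)
        second = max(net[j] for j in range(n) if j != best) if n > 1 else 0.0
        # bid the best object's price up by the margin over second-best
        price[best] += net[best] - second + eps
        if owner[best] is not None:
            unassigned.append(owner[best])   # previous owner is outbid
        owner[best] = i
    return owner    # owner[j] = bidder assigned to object j
```

With integer payoffs and `eps` below 1/n, this class of auction yields an optimal assignment; in the distributed setting, each bidder runs its bidding step locally and only price updates cross the network.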
The vision for the Joint Tactical Radio System (JTRS) is to develop a family of affordable, high-capacity tactical radios to provide both line-of-sight and beyond-line-of-sight Command, Control, Communications, Computers and Intelligence (C4I) capabilities to the warfighters. This family of software-defined radios will be capable of transmitting voice, video, and data; the architecture will be common, open, and used in a wide range of implementations. This paper addresses several operational and implementation concepts which fit within these vision and capability statements (quoted from the program office) but require thinking outside the JTRS box.
This paper presents the Joint Communication Simulator (JCS) system design as a case study of both the conceptual and implementation applicability of High Level Architecture (HLA) in this difficult context. Specific technical topics to be covered include an overview of JCS requirements, an overview of the modeling concept and system architecture in terms of the HLA, a definition of the subset of Run Time Infrastructure (RTI) functionality and HLA interface specification applied, and an overview of the RTI subset implementation. In addition, it addresses the political questions of HLA compliance, the openness of RTI designs and implementation, and the issue of RTI certification.
Consider the downsizing of our forces, the increasing complexity of our tactical platforms, and the ever-widening array of communication options, and the conclusion is inevitable: the need for automated support to reduce communication-related workload is critical to continued task force effectiveness. In a previous era, communication management expertise resided solely in the form of human experts. These experts flew with the pilots, providing the most effective means of communication in real time; they have since been removed from a great number of platforms due to force downsizing and real estate value in the cockpit. This burden has typically been shifted to the pilot, providing another set of tasks in an environment which is already far too taxing. An Expert Communication Link Manager (ECLM) is required -- a trusted, reliable assistant which can determine optimal link, channel, and waveform data for the communication requirements at hand and translate those requirements transparently into communication device control. Technologies are at hand which make ECLM possible; the mixture of these elements in the correct proportions can provide a capable, deployable, and cost-effective ECLM in the near term. This paper describes specific applied ECLM research work in progress funded by the USAF under a four-year effort. Operational objectives, technical objectives, a reference design, and technical excursions within the broad ECLM scope will be discussed in detail. Results of prototypes built to date in the area of communication inference from speech understanding, dynamic adaptive routing, and packet switching networks in the tactical environment will be presented.
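The core ECLM decision, selecting an optimal link/channel/waveform for the requirements at hand, can be sketched as a constrained ranking problem. The field names, candidate structure, and tie-breaking rule below are illustrative assumptions, not the funded reference design.

```python
# Illustrative sketch of ECLM-style link selection: filter candidate
# (link, channel, waveform) options against hard requirements, then
# rank the feasible set. Field names and ranking criteria are assumed.

def rank_links(candidates, need_bps, max_latency_s):
    # hard constraints: required throughput and latency bound
    feasible = [c for c in candidates
                if c["bps"] >= need_bps and c["latency_s"] <= max_latency_s]
    # soft preference: most reliable first, then least loaded
    return sorted(feasible,
                  key=lambda c: (-c["reliability"], c["loading"]))
```

A deployed ECLM would additionally translate the winning option into device control commands transparently to the pilot, and re-rank as link state reports arrive.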
Proc. SPIE 3083, Enabling Technology for Simulation Science
KEYWORDS: Computer simulations, Distributed interactive simulations, Data modeling, Chemical elements, Taxonomy, Standards development, Monte Carlo methods, Computer programming, Sensors, Defense and security
The construction, execution, and analysis of application-oriented simulations is difficult; the integration, coordinated execution, and after action review of heterogeneous distributed simulations can be overwhelming. Economy, risk mitigation, and just plain common sense compel us to utilize legacy simulations, but discrepancies in controllability, fidelity, implementation paradigm, algorithms, representations, time management, construction, etc. tend to negate any potential gain. While several generations of interoperability approaches and associated standards have emerged and matured, even they have been limited in their ability to accommodate disparate classes of simulations. Within the permitted scope of this paper, a taxonomy for the most common interoperability issues (portcullises) for distributed simulation is developed. Part of this identification process will consist of establishing contexts and/or prerequisites for the issues, e.g. under what conditions are the issues actually issues at all. As a result, the prioritization will become application dependent. Methods for resolving the issues (battering rams), couched in the form of case studies, are subsequently presented to close the circle. Sources will include industry and government state-of-the-practice, academic state-of-the-art, and our own broad experience. Specific topics to be discussed include application philosophy, the integration of live entities, investigative versus analytical simulation, implications of human-in-the-loop, mixed and/or variable fidelity, heterogeneous time management schemes, current and emerging distributed simulation standards, simulation/exercise management, and control and data distribution. Discussion will focus heavily on examples and experience.
KEYWORDS: Sensors, Computer architecture, Commercial off the shelf technology, Data processing, Control systems, Computing systems, Data fusion, Data acquisition, Standards development, Situational awareness sensors
Increased use of joint task force concepts is expanding the battlespace and placing higher demands on interoperability. But simultaneous downsizing of forces is increasing the workload on warfighters; while there is a demand for increased decision aiding there has not been a corresponding increase in computational resources. Force wide situation management, the proactive command and control (C2) of the battlespace enabled by broad situation awareness and a deep understanding of mission context, is not likely given today's computational capability, system architecture, algorithmic, and datalink limitations. Next generation C2, e.g. decentralized, `rolling' etc., could be significantly enhanced by distributed situation management processing techniques. Presented herein is a sampling of core technologies, software architectures, cognitive processing algorithms, and datalink requirements which could enable next generation C2. Dynamic, adaptive process distribution concepts are discussed which address platform and tactical application computational capability limitations. Software and datalink architectures are then presented which facilitate situation management process distribution. Finally, required evolution of current algorithms and algorithms potentially enabled within these concepts are introduced.
Within the context of our sensor fusion systems, we define an entity's vulnerability as the certainty with which other entities have the capability to detect and/or strike the entity; vulnerability assessment (VA) is the inference of vulnerability certainties. This investigation considers two issues: the feasibility of a fuzzy VA algorithm and the interface of a fuzzy VA algorithm into an existing sensor fusion system, including human-machine interface aspects. Relative kinematics, sensor/weapon technical capabilities, sensor/weapon system state, contextual electronic signatures, physics, terrain, atmospherics, and doctrinal bias are certainly all viable inputs to a VA algorithm. These data are traditionally characterized by a mix of continuous, discrete, and/or symbolic values with associated error bounds in various mathematical forms. Hence, the algorithmic infusion of a fuzzy VA into this systemic environment implies resolving the uncertainty information content of these representations and integrating them into a coherent fuzzy reasoning context. The information overload facing the tactical operator has necessitated the reduction of many data to simple prioritized alerts. While there is a reasonable understanding of the visual representations and implications of thresholding probabilistic data, the presentation and thresholding of fuzzy data is not well understood; some of the more critical implications on the human-machine interface are presented herein.
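The vulnerability definition above, certainty of detection and/or strike, lends itself to a compact fuzzy-inference illustration. The membership breakpoints and input fields below are assumptions for the sketch, not the investigated algorithm: per-threat detect and strike memberships are combined with max (fuzzy OR, matching "and/or"), and the entity's vulnerability is the max over threats.

```python
# Minimal fuzzy-inference sketch of vulnerability assessment: map
# relative-kinematic inputs to fuzzy memberships, then aggregate with
# max (fuzzy OR) per the "detect and/or strike" definition. The 0.8
# range breakpoints are illustrative assumptions.

def ramp_down(x, full, zero):
    # membership 1.0 below `full`, falling linearly to 0.0 at `zero`
    if x <= full:
        return 1.0
    if x >= zero:
        return 0.0
    return (zero - x) / (zero - full)

def vulnerability(threats):
    certainties = []
    for t in threats:
        detect = ramp_down(t["range_km"], 0.8 * t["sensor_km"], t["sensor_km"])
        strike = ramp_down(t["range_km"], 0.8 * t["weapon_km"], t["weapon_km"])
        certainties.append(max(detect, strike))   # detect and/or strike
    return max(certainties) if certainties else 0.0
```

A fielded algorithm would fold in the additional inputs listed above (signatures, terrain, atmospherics, doctrine), each mapped through its own membership function before aggregation.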
Command and control within the ATC environment remains primarily voice-based. Hence, automatic real time, speaker independent, continuous speech recognition (CSR) has many obvious applications and implied benefits to the ATC community: automated target tagging, aircraft compliance monitoring, controller training, automatic alarm disabling, display management, and many others. However, while current state-of-the-art CSR systems provide upwards of 98% word accuracy in laboratory environments, recent low-intrusion experiments in the ATCT environments demonstrated less than 70% word accuracy in spite of significant investments in recognizer tuning. Acoustic channel irregularities and controller/pilot grammar vagaries impact current CSR algorithms at their weakest points. It will be shown herein, however, that real time context- and environment-sensitive gisting can provide key command phrase recognition rates of greater than 95% using the same low-intrusion approach. The combination of real time inexact syntactic pattern recognition techniques and a tight integration of CSR, gisting, and ATC database accessor system components is the key to these high phrase recognition rates. A system concept for real time gisting in the ATC context is presented herein. After establishing an application context, discussion presents a minimal CSR technology context then focuses on the gisting mechanism, desirable interfaces into the ATCT database environment, and data and control flow within the prototype system. Results of recent tests for a subset of the functionality are presented together with suggestions for further research.
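One way to picture the inexact syntactic pattern recognition underlying such gisting is word-level edit-distance matching of noisy recognizer output against key command-phrase templates. This is a generic sketch under assumed templates and error threshold, not the prototype's mechanism.

```python
# Sketch of inexact syntactic pattern matching for gisting: score noisy
# recognizer output against key command-phrase templates with word-level
# edit distance and accept the closest template within an error budget.
# Templates and max_errors are illustrative assumptions.

def word_edit_distance(a, b):
    # classic dynamic-programming Levenshtein distance over word lists
    d = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
         for i in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d[i][j] = min(d[i - 1][j] + 1,          # word deleted
                          d[i][j - 1] + 1,          # word inserted
                          d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))
    return d[len(a)][len(b)]

def gist(words, templates, max_errors=1):
    scored = [(word_edit_distance(words, t.split()), t) for t in templates]
    dist, best = min(scored)
    return best if dist <= max_errors else None
```

This tolerates dropped or misrecognized words, which is exactly where word-accuracy-oriented CSR fails; context sensitivity would further prune the template set before matching.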
The primary thrusts of the intelligent multisource, multisensor integration (IMMSI) effort are to formalize an approach to hypothesis-driven distributed sensor management, validate that approach, identify candidates for decision support, and investigate implementations of appropriate cognitive processing modules. Using the existing manual voice communication-based cooperative process as a model, a coherent suite of human-machine interfaces, data communication protocols, and decision aids are being developed with the goal of real-time global optimal sensor allocation within the mission context. The Knowledgeable Observer And Linked Advice System (KOALAS) architecture provides a framework for constructing the operator-inductive/machine- deductive IMMSI system. The machine continuously updates a model of the environment from both local and remote sensor data. The operator interacts with the system by evaluating the perceived model and tuning it through the introduction of hypotheses. These hypotheses, also shared among platforms, provide cues for sensor management. The evolving sensor allocation provides new data for the model and a closed-loop intelligent control system is created. The cooperative agent paradigm provides a cognitive model for the IMMSI distributed sensor management process. In a typical cooperative task the common goal is achieved by the agents performing discrete transactions on a shared system state vector. Within the tactical environment, however, centralization of data is neither desirable nor possible; hence, coherency of a distributed track, hypothesis, and global sensor allocation database is also an issue.
To promote user community acceptance and incorporation, an intuitive operational model and GUI-based tool were developed to aid knowledge acquisition and automatically generate the required approximate grammars. Temporal step sequences are developed by operational analysts and assigned the attributes of necessity and confirming strength. Observable detectability and confusability attributes aggregate sensor suite capabilities and are specified by the sensor system engineers. The tool guides the user in combination and correspondence of the two knowledge sources, producing a set of usable approximate grammars together with a list of potential conflicts.
Although the object oriented programming paradigm is an intuitive embodiment of the static attributes of an application, temporal behavior of object interaction is typically buried in the distributed control structure of the implementation. TOM requires a platform-independent operating system support library which permits the arbitrary scheduling of object message passing. Applications include systems which reason through time; the arbitration of distributed, real time competing and cooperating reasoning systems; and the rapid construction of simulators for reasoning system validation. Performance and applicability of the package are currently being evaluated via several tactical command and control development systems. TOM permits the arbitrary allocation of objects between processing platforms, i.e., object allocation need not be known at design time. Message passing is extended through host LANs when necessary to reach remote objects. Three prototype temporal behaviors are provided: single, cyclic, and frequency limited. Scheduled message services are qualified with a user-assigned priority which is used to arbitrate host computing resources. Discussion highlights the seamless integration of temporal activity into the object oriented paradigm and demonstrates the benefits of the package through several diverse example applications.
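The scheduled message-passing idea, delivery times plus user-assigned priorities, with a cyclic behavior that re-enqueues itself, can be sketched with a simple priority queue. This is an illustrative stand-in for the TOM library, not its actual API.

```python
import heapq

# Illustrative sketch (not the TOM package itself) of scheduled object
# message passing: messages carry a delivery time and a user-assigned
# priority; a "cyclic" temporal behavior re-enqueues itself each period.

class Scheduler:
    def __init__(self):
        self.now, self.queue, self.seq = 0.0, [], 0

    def send(self, time, priority, obj, message, period=None):
        # priority breaks ties among messages due at the same time;
        # seq keeps heap ordering stable and avoids comparing payloads
        heapq.heappush(self.queue, (time, priority, self.seq,
                                    obj, message, period))
        self.seq += 1

    def run(self, until):
        delivered = []
        while self.queue and self.queue[0][0] <= until:
            time, prio, _, obj, msg, period = heapq.heappop(self.queue)
            self.now = time
            delivered.append((time, obj, msg))
            if period is not None:                     # cyclic behavior
                self.send(time + period, prio, obj, msg, period)
        return delivered
```

The "single" behavior is a plain `send`; a frequency-limited behavior would additionally drop or defer messages arriving faster than the configured rate.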