As part of the survivability engineering process, it is necessary to accurately model and visualize vehicle signatures in the multi- or hyperspectral bands of interest. The signature at a given wavelength is a function of the surface optical properties, the reflection of background radiance and, in the thermal region, the emission of thermal radiation. Currently, it is difficult to obtain and use background models whose fidelity matches that of the vehicle models; in addition, such models introduce a further layer of uncertainty into the estimate of the vehicle's signature. Therefore, to meet exacting rendering requirements, we have developed RenderView, which incorporates the full bidirectional reflectance distribution function (BRDF). Instead of a modeled background, RenderView uses a measured, calibrated background panoramic image to provide high-fidelity background interaction. Uncertainty in the background signature is thereby reduced to the measurement error, which is considerably smaller than the uncertainty inherent in a modeled background. RenderView supports a number of different BRDF descriptions, including the Sandford-Robertson model, and provides complete conservation of energy with off-axis sampling. A description of RenderView will be presented along with a methodology developed for collecting background panoramics. Examples of the RenderView output and the background panoramics will be presented, along with our approach to handling the solar irradiance problem.
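To make the background-interaction step concrete, the sketch below numerically integrates a calibrated background panorama against a BRDF over the incident hemisphere. A Lambertian BRDF stands in for the richer models mentioned above (such as Sandford-Robertson, whose parameters are not given in this abstract); the array layout, function names, and midpoint quadrature are illustrative assumptions, not RenderView's implementation.

```python
import numpy as np

def lambertian_brdf(rho_d):
    """Constant BRDF rho_d / pi; a stand-in for richer models such as
    Sandford-Robertson."""
    return rho_d / np.pi

def reflected_radiance(panorama, rho_d):
    """Integrate a calibrated background panorama against a Lambertian BRDF.

    panorama: 2-D array of incident radiance, rows sampling the zenith
    angle theta over [0, pi/2] and columns sampling azimuth over [0, 2*pi].
    Returns the reflected radiance toward the viewer.
    """
    n_theta, n_phi = panorama.shape
    theta_step = (np.pi / 2) / n_theta
    phi_step = 2 * np.pi / n_phi
    # Midpoint samples of the zenith angle.
    theta = (np.arange(n_theta) + 0.5) * theta_step
    # Projected solid-angle weights: cos(theta) * sin(theta) dtheta dphi.
    weights = np.cos(theta) * np.sin(theta) * theta_step * phi_step
    return lambertian_brdf(rho_d) * np.sum(panorama * weights[:, None])

# Energy-conservation check: a uniform unit-radiance background with
# rho_d = 1 should reflect approximately unit radiance.
uniform_sky = np.ones((64, 128))
print(reflected_radiance(uniform_sky, 1.0))  # prints a value close to 1.0
```

The final check illustrates the conservation-of-energy property noted above: the quadrature weights integrate to pi over the hemisphere, so a unit-albedo surface under a uniform sky neither gains nor loses energy.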
The current state of the art in synthetic, radiometrically accurate scene generation for visual signatures remains immature. Creating composite images, in which photo-realistic synthetic targets are placed into images of real scenes, is more difficult still. A potential solution to this problem is to use measured background data to drive the target rendering process. This approach has the advantage of producing synthetic images with sufficient fidelity to serve as inputs to visual laboratory experiments and performance codes. Since scene luminance can change rapidly, especially under partly cloudy conditions, all measurements must be obtained nearly simultaneously. This paper will explore the requirements for a visual predictive code and how a background measurement process can meet those requirements. A prototype measurement system will be described along with results from measurements.
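The compositing step described above can be sketched minimally as follows: a synthetic target rendered under a reference illumination level is pasted into a measured background frame, rescaled by the scene luminance measured at the moment the background was captured. The simple linear rescaling, the function names, and the mask-based paste are all illustrative assumptions, not the prototype system's actual processing chain.

```python
import numpy as np

def composite_target(background, target, mask, ref_luminance, measured_luminance):
    """Paste a synthetic target into a measured background frame.

    Because scene luminance can change rapidly (e.g., under passing clouds),
    the target, rendered at ref_luminance, is rescaled to the luminance
    measured nearly simultaneously with the background capture.
    mask: boolean array marking the target's pixels.
    """
    scale = measured_luminance / ref_luminance
    out = background.copy()
    out[mask] = target[mask] * scale  # linear rescale (an assumption)
    return out

# Toy usage: a 4x4 background, a brighter target patch, and a scene
# measured at twice the reference luminance.
bg = np.full((4, 4), 5.0)
tgt = np.full((4, 4), 10.0)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
result = composite_target(bg, tgt, mask, ref_luminance=100.0, measured_luminance=200.0)
print(result)
```

This makes explicit why near-simultaneous measurement matters: the scale factor is only meaningful if the luminance reading corresponds to the same instant as the background frame.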