1. INTRODUCTION

Hyperspectral imaging is an actively pursued research topic with a growing range of applications. It is already used successfully in many fields, e.g. in quality assurance, medical and forensic investigations, and remote sensing. A great advantage of spectral measurement with spatial resolution is that these techniques are non-contact. One of the goals of the Qualimess project was to compare different hyperspectral imaging techniques by their properties in order to find the most suitable measurement principle for a given investigation task. The developed declarative algorithm model connects all these aspects in a common signal flow. Because of the complexity involved, the vision is to be able to exchange specific function blocks depending on the system or measurement task, without introducing additional uncertainty or losing functionality and accuracy. Complexity also increases with the demands on the spectral range and resolution. Because of technical and physical limitations, a multi-sensor system is needed to capture this information. All of this work comes together in the vision of capturing the complete range of spectral signatures of an object, designing the system, algorithms and components on that basis, and transferring the system into industrial applications and environments [1,2].

2. SIGNAL FLOW MODELS

2.1 General information flow model for spectral imaging systems

For an overview of the overall system, the general signal or information flow model is given in Figure 1; it represents the principle underlying the recording and processing of hyperspectral data. In direct interaction with the sample object, the illumination or radiation source acts as the energy supplier and the sensor as the primary converter, recording the intensity information in one or two dimensions depending on the characteristics of the imaging technique. In the case of whiskbroom and push-broom imaging, this information is exclusively one-dimensional, i.e.
only spectrally resolved. The spatial information has to be registered using an additional measuring system that records the metric position. The same applies to the spectral resolution in staring imaging, where the spectral channels must be decoded incrementally, usually in the form of discrete filters. More detailed information on the imaging techniques can be found in the relevant literature [3,4]. It should also be noted that, for illustrative reasons, only reflection and fluorescence spectroscopy are schematized in Figure 1; the measuring principle can also be extended to transmission spectroscopy [5]. The next instance in the signal flow is the processing and control system. Its task is to process the measurement data and to control or regulate the primary converter as well as additional hardware (e.g. a positioning unit for relative movement). This function block can be implemented both on a PC basis and as a system on chip (e.g. an FPGA); another work package in Qualimess deals exclusively with its implementation [6,7]. Using the example of a task from the field of quality assurance, the process or process control interacts directly with the processing and control system. Furthermore, the raw or pre-processed measured values can be stored and visualized directly.

2.2 Signal flow model for a multiple imaging system

Within the Qualimess project, different hardware for hyperspectral image acquisition was developed. The challenge over the whole development process is to abstract the general signal flow model, following a top-down principle, into an extended signal flow model that can handle aberrations and inhomogeneities of the imaging systems as well as multiple systems in a multichannel flow. To meet this requirement, the model definition was extended to a multiple recording system.
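The idea of exchangeable function blocks that are linked exclusively via inputs and outputs can be illustrated with a small sketch. All names here (`Pipeline`, `dark_current`, `normalize`) are illustrative assumptions, not the project's actual implementation:

```python
# Minimal sketch of a declarative function-block pipeline: blocks communicate
# only via inputs/outputs and can be exchanged without touching the rest of
# the signal flow. Names are hypothetical, not the Qualimess code.

class Pipeline:
    """Chains function blocks that pass data from output to input."""

    def __init__(self, *blocks):
        self.blocks = list(blocks)

    def replace(self, index, block):
        # Exchange a single function block, leaving the others untouched
        self.blocks[index] = block

    def run(self, data):
        for block in self.blocks:
            data = block(data)
        return data

# Example blocks: simple per-sample corrections on a list of intensities
def dark_current(data, offset=10):
    return [max(v - offset, 0) for v in data]

def normalize(data, white=200):
    return [v / white for v in data]

pipeline = Pipeline(dark_current, normalize)
print(pipeline.run([60, 110, 210]))  # [0.25, 0.5, 1.0]
```

Because each block only sees its input and produces an output, `replace()` can swap in a system-specific correction without changing the surrounding flow.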
For some acquisition systems it is necessary to merge spectral data sets collected with different sensor systems, so that the spatial resolutions match and the wavelength range can be extended across multiple systems. This is of interest, for example, for the correction of hyperspectral data acquired with a push-broom acquisition system. Due to physical laws, different spectral ranges require different sensor systems, which may differ in properties such as spatial resolution. In order to image a measurement object correctly over an extended spectral range, these data must be combined into a uniform spectral cube. This synthesis model represents an independent function block in the declarative programming model. The acronym ENVI stands for "the Environment for Visualizing Images" and denotes a software application for processing and analyzing remote sensing data [8]. ENVI uses a simple flat binary structure with an additional metadata header file in ASCII format [9]. This header file contains all the relevant information from and about the imaging systems, such as the discrete spectral bands, the coding format (for example BIP, BIL or BSQ) and the bit depth. It is therefore an appropriate exchange and storage format for the investigations done in Qualimess, especially for a multiple sensor system.

3. PREPROCESSING MODELS

3.1 Correction and calibration in push-broom imaging

In the signal strings upstream of the fusion block, a wide variety of additional correction methods can be implemented. This also includes correction models that were developed in other work packages during the project. The model was mostly tested and developed on the basis of the push-broom acquisition systems mentioned above, because of the high calibration effort and the large number of parameters of these systems.
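To illustrate how such an ASCII header carries the metadata, the following sketch parses a few typical ENVI-style header fields. It is deliberately simplified and not a complete ENVI reader:

```python
# Simplified parser for an ENVI-style ASCII header ("key = value" pairs,
# brace-delimited lists). A sketch only; real headers have more fields
# and edge cases than are handled here.
import re

def parse_envi_header(text):
    # Fold brace-delimited multi-line values into single logical lines
    text = re.sub(r"\{[^}]*\}", lambda m: m.group(0).replace("\n", " "), text)
    header = {}
    for line in text.splitlines():
        if "=" not in line:
            continue
        key, value = line.split("=", 1)
        key, value = key.strip().lower(), value.strip()
        if value.startswith("{") and value.endswith("}"):
            # Brace lists (e.g. wavelengths) become Python lists
            value = [v.strip() for v in value[1:-1].split(",") if v.strip()]
        header[key] = value
    return header

sample = """ENVI
samples = 640
lines   = 480
bands   = 3
interleave = bil
wavelength = {
  450.0, 550.0, 650.0 }
"""
hdr = parse_envi_header(sample)
print(hdr["interleave"], hdr["bands"], hdr["wavelength"])
```

The interleave key (BIP, BIL or BSQ) tells the reader how the flat binary file is ordered, and the wavelength list maps each band index to a physical spectral channel, which is exactly the information a multi-sensor fusion block needs.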
In relation to the declarative model, this means that each system can be assigned its own correction block, which within the cybernetic structure of the overall algorithm must be linked exclusively via inputs and outputs. Figure 3 shows a section of the designed and successfully tested in-process algorithm structure for a push-broom system. The calibration data set shown is initialized by a separate preprocess calibration procedure; detailed information for the example of a push-broom imaging system is given in [7]. In addition, several fusion strategies are inherent to the fusion block and can be selected by an external parameter. Last but not least, the block can be adapted to the corresponding measurement scenario, so it is possible to recalibrate the measurement setup for the specific conditions of the measurement task in order to get the most out of the system's capabilities, particularly with regard to spatial resolution.

3.2 Merging of spectral cubes

A large part of the previous studies for the development of the declarative algorithm model deals with the merging of spectral data cubes captured by different imaging systems with significant differences in spatial and spectral resolution. Detailed information about the theory and the studies can be found in [10,11]. The fusion techniques are implemented as a processing block for the declarative algorithm model. Based on the metadata, the algorithm chooses the most suitable technique, which differs due to boundary conditions such as the measurement system, the measurement environment and the application. For a multiple imaging system, the algorithm supports two different merging principles: interpolation to the lowest resolution (ip2LR) and interpolation to the highest resolution. Obviously, the applied principle depends on the measurement task.
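The ip2LR principle can be sketched as follows: the cube with the higher spatial resolution is resampled by nearest neighbor down to the lower resolution before the spectral bands are concatenated. This is a pure-Python toy example; the real fusion block is considerably more elaborate:

```python
# Toy sketch of merging two spectral cubes by interpolation to the lowest
# spatial resolution (ip2LR): downsample the finer cube with nearest
# neighbor, then stack the bands. Cubes are nested lists [band][row][col].

def nearest_resample(band, rows, cols):
    """Nearest-neighbor resampling of one band to rows x cols."""
    src_rows, src_cols = len(band), len(band[0])
    return [[band[r * src_rows // rows][c * src_cols // cols]
             for c in range(cols)]
            for r in range(rows)]

def merge_ip2lr(cube_lo, cube_hi):
    # Target spatial size is that of the lower-resolution cube
    rows, cols = len(cube_lo[0]), len(cube_lo[0][0])
    resampled = [nearest_resample(b, rows, cols) for b in cube_hi]
    return cube_lo + resampled  # concatenate along the spectral axis

# One band at 2x2 (e.g. a VIS sensor) and one at 4x4 (e.g. a NIR sensor)
vis = [[[1, 2], [3, 4]]]
nir = [[[5] * 4 for _ in range(4)]]
cube = merge_ip2lr(vis, nir)
print(len(cube), len(cube[0]), len(cube[0][0]))  # 2 2 2
```

Interpolation to the highest resolution would run the same scheme in the opposite direction, upsampling the coarser cube instead; which direction is appropriate depends on whether spatial detail or data volume matters more for the measurement task.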
4. ALGORITHMIC ASPECTS

4.1 Sparsity of correction blocks

One of the central premises during development is the efficiency and sparsity of the algorithms. Hyperspectral data cubes are very demanding on memory and other computing resources. Therefore, all computing steps are optimized, especially the allocation of physical dynamic memory. Furthermore, the in-process calibration steps are highly efficient; for example, the process shown in Figure 3 is based on discrete wavelet transforms for representing signals with a minimum of computing resources.

4.2 Sparsity of the fusion block

The fusion block in particular received high attention during code optimization, in order to make the algorithms as efficient as possible and to avoid unnecessary memory utilization in the case of necessary permutations. The interpolation algorithms are implemented as resource-efficiently as possible. For example, fusion tasks in which a higher-resolution data set needs to be reduced in spatial resolution are mostly implemented with the nearest-neighbor method, which saves about 50% of the computing time compared to bicubic interpolation. With regard to system-on-chip-based practical solutions, this approach is a major advantage.

5. LABORATORY TOOLS AND PRACTICE

5.1 Analytic tools

As the first software demonstrator, a measuring program was designed that allows the representation and selective intensity measurement of individual layers of a fused and preprocessed spectral cube. A corresponding screenshot is shown in Figure 4. This tool can view all slices of the spectral cube. The primary application is to view the raw sensor data in order to verify the sharpness and quality of the measurement data as well as aberrations. Additionally, some basic preprocessing methods such as digital gain, normalization and denoising are implemented. Currently, most of the enhanced analytic and processing applications (e.g.
PCA-based analytics) are done in separate software such as "EVINCE" or "Scyven" for reasons of efficiency and research capacities.

5.2 Acquisition software tools

Depending on the process, the acquisition procedure is more or less complex; push-broom imaging has a very high complexity. For a multiple sensor system it is necessary to position the sample exactly and to move it in a metric grid whose step size corresponds to one pixel at the scale of the image. The developed acquisition software toolbox (shown in Figure 5) contains all the described functionality (function blocks), separated by measurement principle. It is able to merge the corrected spectral data captured by a multiple sensor system and to export the data set as a spectral cube together with the corresponding metadata. The screenshot in Figure 5 shows the graphical user interface for a push-broom imaging process. It has the special ability to connect the necessary hardware, to reference and calibrate the system, and to calculate the required movement parameters for the positioning system depending on the sample properties. For the filter wheel camera, Figure 6 shows the interface for this measurement process. For the evaluation and interpretation of the acquired measurement data, form elements such as points, lines, arcs or circles can be determined, and the gray value gradient with the corresponding gray value histogram can be analyzed in the search beam of a defined area of interest (AOI). For this special problem, the following project year will see work on the realization of the white balance, an automated calibration in the x and y directions, and the improvement of the resolution in the z direction.

6. CONCLUSION AND OUTLOOK

The developed algorithm model allows the manageability of all systems developed within the Qualimess project, including their calibration and error correction. It has also been evaluated in the context of several practical measurement tasks with different systems.
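The step size of the metric positioning grid, one pixel at image scale, can be estimated from the optical scale of the setup. A minimal sketch under a simple field-of-view model; the parameter names are hypothetical, and the real toolbox derives these values from its calibration procedure:

```python
# Sketch: compute the stage step size corresponding to one image pixel for
# a push-broom scan. field_of_view_mm and sensor_pixels are assumed inputs,
# not the toolbox's actual interface.

def stage_step_mm(field_of_view_mm, sensor_pixels):
    """Metric grid step that corresponds to one pixel in object space."""
    return field_of_view_mm / sensor_pixels

def scan_positions(start_mm, length_mm, step_mm):
    # Stage positions needed to cover the sample line by line
    n = int(length_mm / step_mm)
    return [start_mm + i * step_mm for i in range(n + 1)]

step = stage_step_mm(field_of_view_mm=64.0, sensor_pixels=640)  # 0.1 mm/px
positions = scan_positions(0.0, 1.0, step)
print(step, len(positions))
```

Moving the sample in exactly these increments ensures that consecutive scan lines are neither gapped nor oversampled with respect to the sensor's pixel grid.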
The goal of optimizing the memory requirements during processing as well as during final storage has been completely fulfilled. However, there are still shortcomings and opportunities for further work. Firstly, function blocks such as the system-specific correction of aberrations and the calibration procedure should be improved in efficiency and performance. In the concrete case of push-broom imaging, the quality of the images can be improved by integrating deconvolution models or pansharpening. Another promising opportunity is the direct implementation of selected and evaluated processing and analysis techniques in the analysis toolbox. The possibility of customizing PCA parameters for subsequent implementation in industrial processes, computed e.g. on SoCs such as FPGAs, is very useful in industrial cases. For example, the training of the feature vector can be done on a PC, while the integration into the real-time process can exploit advantages such as the low energy consumption of FPGAs as well as aspects of miniaturization. Finally, a more powerful direct implementation in imaging systems using the direct output of the measurement data can be realized.

ACKNOWLEDGEMENTS

The presented work is the result of research within the project ID2M (03IPT709X), situated at the Technische Universität Ilmenau, Germany, as part of the research support program InnoProfile, funded by the Federal Ministry of Education and Research (BMBF), Germany.

REFERENCES

[1] Illmann, R., Dittrich, P.-G., Hänsel, M., Rosenberger, M. and Notni, G., "Education & Training in Multi- and Hyperspectral Measurement Engineering and Quality Assurance," J. Phys. Conf. Ser., 22007 (2018).
[2] Illmann, R., "Wide range UV irradiation system for imaging reflection spectroscopy," SPIE Commercial + Scientific Sensing and Imaging (2018).
[3] Grahn, H. F. and Geladi, P., [Techniques and Applications of Hyperspectral Image Analysis] (2007). https://doi.org/10.1002/9780470010884
[4] Manickavasagan, A. and Jayasuriya, H., [Imaging with Electromagnetic Spectrum: Applications in Food and Agriculture] (2014).
[5] Park, B. and Lu, R., [Hyperspectral Imaging Technology in Food and Agriculture] (2015). https://doi.org/10.1007/978-1-4939-2836-1
[6] Schellhorn, M. and Notni, G., "Optimization of a Principal Component Analysis Implementation on Field-Programmable Gate Arrays (FPGA) for Analysis of Spectral Images," Int. Conf. Digital Image Computing: Techniques and Applications (DICTA 2018) (2019).
[7] Schellhorn, M., Fütterer, R., Notni, G. and Rosenberger, M., "Programmable system on chip implementation of a principal component analysis for preprocessing of multispectral image data acquired with filter wheel cameras," (2018). https://doi.org/10.1117/12.2304714
[8] Chang, N.-B. and Bai, K., [Multisensor Data Fusion and Machine Learning for Environmental Remote Sensing] (2018). https://doi.org/10.1201/9781315154602
[9] Canty, M. J., [Image Analysis, Classification and Change Detection in Remote Sensing: With Algorithms for ENVI/IDL and Python] (2014). https://doi.org/10.1201/b17074
[10] Illmann, R., Rosenberger, M. and Notni, G., "Optimized algorithm for processing hyperspectral push-broom data from multiple sources," Proc. SPIE (2019).
[11] Illmann, R., Rosenberger, M. and Notni, G., "Strategies for Merging Hyperspectral Data of Different Spectral and Spatial Resolution," (2019).