Since the beginning of the 21st century, science and technology have advanced rapidly, and macro- and micro-scale sensors such as radar, infrared, photoelectric, satellite, TV camera, electron microscopy, and CT imaging have been applied ever more widely. As a result, the amount, scale, and complexity of spatial data have grown far beyond the human ability to interpret them. Because end users cannot analyze all of the data in detail and extract the spatial knowledge of interest, a phenomenon of "spatial data explosion but knowledge scarcity" arises. To improve the utilization efficiency of spatial data, it is therefore necessary to study complex temporal data models and extraction technology for multi-spatial databases, and to use spatial data mining and knowledge discovery to automatically or semi-automatically mine previously unknown but potentially useful spatial patterns from them. In view of this situation, this paper applies a clustering algorithm to verify and analyze the complex temporal data model and extraction technology of multi-spatial databases. By comparing different algorithms on four aspects, precision P, recall R, comprehensive performance F, and content extraction speed, the relationship between the clustering algorithm and the extraction technology is examined. The experimental results show that, with all other conditions being equal, the precision P, recall R, and comprehensive performance F of the k-means clustering algorithm are consistently above 97%, higher than those of the ParEx and vu algorithms. In terms of extraction time, the k-means clustering algorithm requires 0.35 s, much lower than the 0.49 s of the ParEx algorithm and the 0.61 s of the vu algorithm. These results demonstrate that the k-means clustering algorithm can promote the development of complex temporal data models and extraction technology for multi-spatial databases, indicating a positive relationship between the two.
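For illustration only, the sketch below shows a basic k-means routine together with the precision P, recall R, and comprehensive performance F (harmonic mean of P and R) used to compare extraction results. It is a minimal assumption-based example, not the paper's actual pipeline; the names kmeans, extracted, and relevant are hypothetical.

import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    # Plain k-means: assign each point to its nearest centroid, then update centroids.
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Euclidean distance from every point to every centroid, shape (N, k).
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

def precision_recall_f(extracted, relevant):
    # P = |extracted ∩ relevant| / |extracted|, R = |extracted ∩ relevant| / |relevant|,
    # F = 2PR / (P + R).
    extracted, relevant = set(extracted), set(relevant)
    tp = len(extracted & relevant)
    p = tp / len(extracted) if extracted else 0.0
    r = tp / len(relevant) if relevant else 0.0
    f = 2 * p * r / (p + r) if (p + r) else 0.0
    return p, r, f

if __name__ == "__main__":
    # Two synthetic clusters, then a toy extraction comparison (hypothetical data).
    X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
    labels, _ = kmeans(X, k=2)
    print(precision_recall_f(extracted=[1, 2, 3, 4], relevant=[2, 3, 4, 5]))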