We propose an approach to reconstruct a three-dimensional model from a stream of RGB-D images. Compared with existing methods, our strategy performs well in challenging cases involving quick camera movements and missing depth values. The key idea underlying our approach is to combine registration with patch segmentation based on RGB information. We use these segments and a patch-significance correspondence algorithm to transform a global constraint into a number of local constraints during camera motion. Furthermore, we propose a method that improves the low precision of geometric registration by aligning corresponding patches instead of entire point clouds. Building on this consideration of spatial relations, we also propose a fusion strategy that extracts the correct transformations even from poor RGB-D sequences. Tests on RGB-D benchmark sequences and comparisons with the KinectFusion system show that the proposed approach substantially increases the accuracy of the reconstructed models.
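A minimal sketch of the patch-alignment idea, assuming corresponding patches between two frames are already given as pairs of point arrays. The function names and the minimum-residual selection rule are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # fix an improper rotation (reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def fuse_patch_transforms(patch_pairs):
    """Estimate one transform per corresponding patch, then keep the one
    with the lowest mean residual as the frame-to-frame transform."""
    best, best_err = None, np.inf
    for src, dst in patch_pairs:      # each pair: (N_i x 3, N_i x 3) arrays
        R, t = rigid_transform(src, dst)
        err = np.linalg.norm((src @ R.T + t) - dst, axis=1).mean()
        if err < best_err:
            best, best_err = (R, t), err
    return best
```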
Biologists often use gene chips to obtain massive amounts of experimental data in the biological and chemical sciences. Faced with such large volumes of data, researchers often need to find a few interesting data points or simple regularities. This paper presents a set of methods to visualize and analyze gene expression signatures of people who smoke. We use the latest research data from the National Center for Biotechnology Information, comprising more than 400,000 expression values. With these data, we apply the parallel coordinates method to visualize the differences in gene expression between smokers and non-smokers, distinguishing non-smokers, former smokers, and current smokers with different colors. This makes it easy to identify which genes are most important during lung cancer angiogenesis in smokers. Alternatively, we use a hierarchical model to visualize the inner relations among different genes. The location of each node indicates its expression moment, and its distance to the root indicates the order of expression. A ring layout arranges all the nodes, and related nodes are connected with colored lines. Combined with the parallel coordinates method, the visualization clearly shows the important genes and some of their inner relations, which is useful for the examination and prevention of lung cancer.
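A minimal sketch of the parallel-coordinates view described above, assuming a table with one row per subject, columns for a few genes of interest, and a status column distinguishing non-smokers, former smokers, and current smokers. The file name, gene column names, and colors are illustrative placeholders.

```python
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

df = pd.read_csv("gene_expression.csv")          # hypothetical input table
genes = ["GENE_A", "GENE_B", "GENE_C"]           # placeholder gene columns

parallel_coordinates(
    df[genes + ["status"]],
    class_column="status",
    color=["#2ca02c", "#ff7f0e", "#d62728"],     # one color per smoking status
    alpha=0.4,
)
plt.ylabel("normalized expression level")
plt.title("Gene expression across smoking status")
plt.show()
```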
Research on the generation of natural phenomena has many applications in movie special effects, battlefield simulation, virtual reality, and similar fields. Based on video synthesis techniques, a new approach is proposed for the synthesis of natural phenomena, including flowing water and fire flames. From fire and water-flow videos, seamless videos of arbitrary length are generated. Then, the interaction between wind and flame is achieved through the flame skeleton. The flow is further synthesized by extending the video textures with an edge-resampling method. Finally, the synthesized natural phenomena can be integrated into a virtual scene.
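A minimal sketch of the video-texture idea behind generating seamless clips of arbitrary length: find pairs of frames whose appearance is similar, so playback can jump from frame i back to frame j without a visible seam. Frames are assumed to be equally sized grayscale arrays; the minimum loop length is an illustrative parameter.

```python
import numpy as np

def frame_distances(frames):
    """Pairwise L2 distance between frames (small = good transition point)."""
    flat = np.stack([f.astype(np.float64).ravel() for f in frames])
    sq = (flat ** 2).sum(1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2 * flat @ flat.T, 0.0)
    return np.sqrt(d2)

def best_loop(frames, min_len=10):
    """Pick the jump (i -> j) with i - j >= min_len that minimizes the seam."""
    D = frame_distances(frames)
    best, best_cost = None, np.inf
    for i in range(len(frames)):
        for j in range(i - min_len + 1):
            if D[i, j] < best_cost:          # jumping from i to j looks smooth
                best, best_cost = (i, j), D[i, j]
    return best
```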
Based on the infrared radiation characteristics of outer-space targets and their environment, and considering techniques related to infrared star magnitude, atmospheric attenuation, and coordinate transformation, this paper builds a theoretical model of the infrared radiation from orbiting satellites and the background. Furthermore, using a database of satellite orbits and catalogue data, dynamic scene image synthesis of satellites and background is implemented. Finally, we can walk through the virtual scenes from different viewpoints and analyze the characteristics of these dynamic simulated images. The results are significant for the exploration, identification, and tracking of space targets.
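A minimal sketch of the kind of radiometric bookkeeping such a model involves: in-band blackbody radiance of a target surface, scaled by a single atmospheric transmittance factor. The temperature, band limits, and transmittance value are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
K = 1.381e-23   # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of a blackbody, W / (m^2 sr m)."""
    return (2 * H * C**2 / wavelength_m**5 /
            (np.exp(H * C / (wavelength_m * K * temp_k)) - 1.0))

def in_band_radiance(temp_k, band=(3e-6, 5e-6), transmittance=0.7, n=500):
    """Integrate spectral radiance over a band and apply attenuation."""
    wl = np.linspace(band[0], band[1], n)
    return transmittance * np.trapz(planck_radiance(wl, temp_k), wl)

print(in_band_radiance(300.0))   # e.g. a ~300 K satellite surface in 3-5 um
```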
Research on the generation of multi-spectral scene images of the ocean environment is helpful for flight testing, mission route planning, target recognition, military applications, etc. First, based on the spectrum equation of ocean wave movement, we use the local evolution of cellular automata to build displacement rules for ocean waves over space and time. After calculating the multi-spectral radiation of the ocean waves, we generate dynamic multi-spectral ocean waves. Then we establish geometric models of the littoral environment, including terrain, coast, islands, etc. Further, we calculate the radiance of the ocean waves and the ocean environment in different spectral bands, including visible and infrared, and quickly generate the multi-spectral ocean scene. Finally, after applying some rendering techniques, we generate different realistic multi-spectral littoral environment scenes under different conditions in near real time.
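A minimal sketch of a local, cellular-automaton-style update rule for a dynamic height field, in the spirit of evolving an ocean surface cell by cell from its neighbors. The wave-speed and damping constants are illustrative; the paper's actual displacement rules are derived from an ocean-wave spectrum.

```python
import numpy as np

def step(height, prev, c=0.3, damping=0.99):
    """One local update: each cell reacts only to its four neighbors."""
    lap = (np.roll(height, 1, 0) + np.roll(height, -1, 0) +
           np.roll(height, 1, 1) + np.roll(height, -1, 1) - 4 * height)
    new = damping * (2 * height - prev + c * lap)
    return new, height

# e.g. drop an initial disturbance and evolve the grid through time
h = np.zeros((128, 128))
h[64, 64] = 1.0
prev = h.copy()
for _ in range(200):
    h, prev = step(h, prev)
```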
In recent years, there has been a growing need for accurate, high-fidelity scene simulations in the visible, infrared, microwave, and other wavelengths. Based on a rigorous material classification and incorporating material attribute information, we generate wavelength-independent texture maps for multi-spectral scene simulation. We calculate the sensor radiance value of every pixel and convert it into color or gray level. If a single pixel in the texture contains more than one material, we mix the materials based on their radiation attributes. Exploiting area consistency and coherence across scan lines, an extended seed filling algorithm is applied to areas with the same or similar materials. These steps are performed repeatedly until a satisfactory classification and mixture is found and the texture maps in a given wave band are obtained. In this way we generate infrared textures from visible maps, and scene textures for different times of day and different environmental conditions can also be obtained. Finally, we give some examples of multi-spectral scene simulation, which compare satisfactorily with the measured images.
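A minimal sketch of a scanline seed fill over a material-label image, used to group connected pixels of the same material into one region before assigning a band radiance. The similarity test here is exact label equality, an illustrative simplification of the "same or similar materials" criterion.

```python
import numpy as np

def seed_fill(labels, seed):
    """Return a boolean mask of the connected region containing `seed`."""
    h, w = labels.shape
    target = labels[seed]
    mask = np.zeros((h, w), dtype=bool)
    stack = [seed]
    while stack:
        y, x = stack.pop()
        if mask[y, x] or labels[y, x] != target:
            continue
        # expand left and right along the current scan line
        x0 = x
        while x0 > 0 and labels[y, x0 - 1] == target and not mask[y, x0 - 1]:
            x0 -= 1
        x1 = x
        while x1 < w - 1 and labels[y, x1 + 1] == target and not mask[y, x1 + 1]:
            x1 += 1
        mask[y, x0:x1 + 1] = True
        # queue matching pixels in the rows above and below the filled run
        for ny in (y - 1, y + 1):
            if 0 <= ny < h:
                stack.extend((ny, nx) for nx in range(x0, x1 + 1)
                             if labels[ny, nx] == target and not mask[ny, nx])
    return mask
```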