Illumination invariant 3D change detection
7 March 2014
We present a 3D change detection framework designed to support various applications under changing environmental conditions. Previous efforts have focused on image filtering techniques that manipulate the intensity values of the image to approximate a more controlled illumination. Since most applications require detecting changes in a scene irrespective of the time of day and the prevailing lighting conditions, image filtering algorithms fail to suppress illumination differences enough for Background Model (BM) subtraction to be effective. Our approach eliminates the illumination challenge from the change detection problem entirely. The algorithm builds on our previous work, in which we demonstrated the ability to reconstruct a surrounding environment at near real-time processing speeds. That algorithm, Dense Point-Cloud Representation (DPR), produces a 3D reconstruction of a scene using only a single moving camera. To remove any effect of illumination change, we convert each point-cloud model into a 3D binary voxel grid: a '1' is assigned to voxels containing points from the model, and a '0' to voxels with no points. We detect the changes between the two environments by volumetrically subtracting the registered 3D binary voxel models. This process is computationally efficient because binary models admit fast logic-based operations. We evaluate the 3D change detection framework on the same scene using aerial imagery captured at different times.
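The voxelization and logic-based subtraction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper names (`voxelize`, `detect_changes`), the grid parameters, and the use of NumPy are assumptions, and the registration step that would align the two models beforehand is omitted.

```python
import numpy as np

def voxelize(points, origin, voxel_size, grid_shape):
    """Convert an N x 3 point cloud into a binary occupancy grid.

    A voxel is set to 1 (True) if it contains at least one point,
    and 0 (False) otherwise.
    """
    # Map each point to its integer voxel index
    idx = np.floor((points - origin) / voxel_size).astype(int)
    # Discard points that fall outside the grid bounds
    in_bounds = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    grid = np.zeros(grid_shape, dtype=bool)
    grid[tuple(idx[in_bounds].T)] = True
    return grid

def detect_changes(grid_a, grid_b):
    """Volumetric subtraction of two registered binary voxel grids.

    A voxel marks a change if it is occupied in exactly one of the
    two models, i.e. a logical XOR of the occupancy grids.
    """
    return np.logical_xor(grid_a, grid_b)

# Two toy "models" of the same scene: one point has moved.
model_a = np.array([[0.1, 0.1, 0.1], [1.2, 0.2, 0.3]])
model_b = np.array([[0.1, 0.1, 0.1], [2.5, 0.5, 0.5]])
grid_a = voxelize(model_a, origin=np.zeros(3), voxel_size=1.0, grid_shape=(4, 4, 4))
grid_b = voxelize(model_b, origin=np.zeros(3), voxel_size=1.0, grid_shape=(4, 4, 4))
changes = detect_changes(grid_a, grid_b)
print(changes.sum())  # → 2 (the vacated voxel and the newly occupied one)
```

Because the grids are boolean arrays, the subtraction is a single element-wise XOR, which is what makes the comparison step cheap relative to operating on raw point clouds.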