KEYWORDS: Motion models, Particle filters, Particles, Detection and tracking algorithms, Video, Video surveillance, Data fusion, Motion estimation, Data modeling, Cameras
In this paper, a new approach is presented for tracking objects accurately and steadily when the target encounters occlusion in video sequences. First, the Canny algorithm is used to extract the edges of the object, and the edge pixels are classified as foreground or background in each frame using background subtraction. Next, a set of cues, including a motion model, an elliptical shape model, a spatial-color mixture-of-Gaussians appearance model, and an edge orientation histogram model, is fused in a principled manner; each cue is modeled by a data likelihood function. A particle filter algorithm is then used for tracking, and the particles are resampled based on the fusion of the cues. Results from simulations and experiments with real video sequences show the effectiveness of our approach for tracking people under occlusion conditions.
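The abstract does not give implementation details, but the core idea of resampling particles on a product of per-cue data likelihoods can be sketched as follows. This is a minimal toy illustration, not the authors' method: the state, noise levels, and the two placeholder Gaussian cues (standing in for the paper's shape, appearance, and edge-orientation cues) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def cue_likelihoods(particles, observation):
    """Each cue yields a per-particle likelihood; two placeholder Gaussian
    cues stand in for the paper's shape/appearance/edge cues."""
    d = np.linalg.norm(particles[:, :2] - observation, axis=1)
    shape_cue = np.exp(-0.5 * (d / 2.0) ** 2)   # stand-in for the shape cue
    edge_cue = np.exp(-0.5 * (d / 4.0) ** 2)    # stand-in for the edge cue
    return shape_cue * edge_cue                  # fused as a product

def track(observations, n_particles=500):
    # State: [x, y, vx, vy]; constant-velocity motion model with noise.
    particles = np.zeros((n_particles, 4))
    particles[:, :2] = observations[0] + rng.normal(0, 1, (n_particles, 2))
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the motion model.
        particles[:, :2] += particles[:, 2:]
        particles += rng.normal(0, 0.5, particles.shape)
        # Update: weight particles by the fused cue likelihood.
        w = cue_likelihoods(particles, z)
        w /= w.sum()
        estimates.append(w @ particles[:, :2])
        # Resample according to the fused weights.
        idx = rng.choice(n_particles, n_particles, p=w)
        particles = particles[idx]
    return np.array(estimates)

# Toy target moving diagonally, observed with noise.
truth = np.stack([np.arange(30, dtype=float)] * 2, axis=1)
obs = truth + rng.normal(0, 1.0, truth.shape)
est = track(obs)
print(np.abs(est - truth).mean())  # mean tracking error stays small
```

In the paper the fused likelihood would additionally depend on the occlusion handling and the foreground/background edge classification; here the product of cue likelihoods is the only part retained.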