Authors:
Suil Son, Young-Woon Cha and Suk I. Yoo
Affiliation:
Seoul National University, Republic of Korea
Keyword(s):
Foreground detection, Foreground model, Background subtraction, Background model, Object detection.
Related Ontology Subjects/Areas/Topics:
Applications; Classification; Computer Vision, Visualization and Computer Graphics; Density Estimation; Geometry and Modeling; Image and Video Analysis; Image Understanding; Image-Based Modeling; Pattern Recognition; Software Engineering; Theory and Methods; Video Analysis
Abstract:
To overcome the false detections caused by dynamic textures in background subtraction, a new modelling approach is suggested. While traditional background subtraction approaches detect foreground objects indirectly, by modelling the background, the approach described here models the foreground directly. The foreground model gives the probability of each pixel position as a weighted sum of the intensity differences at that position between each previous image and the new image. Combining the weighting and the summing of intensity differences produces several desirable effects: a position in the new image with consistently large differences receives a high foreground probability; a position with consistently small differences receives a low probability; and a position with small differences for most previous images but large differences for a few of them, due to dynamic textures or noise, receives a medium probability. The final distribution of foreground positions is computed by kernel density estimation incorporating the neighbouring pixel differences, and foreground objects are then identified from the probability values of this distribution. The performance of the suggested approach is illustrated on two classes of problems and compared to conventional approaches.
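The abstract's pipeline can be sketched as follows. This is only a minimal illustration of the idea, not the paper's implementation: the exponential-decay weighting, the Gaussian neighbourhood kernel, and the detection threshold are all assumptions chosen for the sketch.

```python
import numpy as np

def foreground_probability(history, new_frame, decay=0.9):
    """Weighted sum of per-pixel intensity differences between each
    previous frame and the new frame.  Exponentially decaying weights
    (newest frame heaviest) are an assumption; the paper does not fix
    a particular weighting scheme here."""
    n = len(history)
    weights = decay ** np.arange(n - 1, -1, -1)   # oldest -> smallest weight
    weights /= weights.sum()
    diffs = np.abs(np.stack(history).astype(float) - new_frame.astype(float))
    # Contract the frame axis: result is an (H, W) foreground-probability map.
    return np.tensordot(weights, diffs, axes=1)

def smooth_with_neighbors(prob_map, sigma=1.0, radius=2):
    """Gaussian smoothing over neighbouring pixels, as a simple stand-in
    for the paper's kernel density estimation step (separable convolution)."""
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    padded = np.pad(prob_map, radius, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, rows)

# Toy demo: a static noisy background, then a new frame with a bright object.
rng = np.random.default_rng(0)
history = [50 + rng.normal(0, 2, (20, 20)) for _ in range(10)]
new = 50 + rng.normal(0, 2, (20, 20))
new[5:12, 5:12] = 200                      # foreground object
p = foreground_probability(history, new)
fg = smooth_with_neighbors(p) > 50         # threshold is purely illustrative
```

In this toy run, positions inside the bright square accumulate consistently large weighted differences and exceed the threshold, while the noisy-but-static background stays well below it, matching the consistency argument in the abstract.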