Authors:
Juan Carlos Ramirez
and
Darius Burschka
Affiliation:
Technische Universitaet Muenchen, Germany
Keyword(s):
3D Mapping, 3D Blobs, Octree, Blobtree, Data Fusion, RANSAC, Visual Motion Estimation.
Related Ontology Subjects/Areas/Topics:
Applications; Computer Vision, Visualization and Computer Graphics; Geometry and Modeling; Image-Based Modeling; Motion, Tracking and Stereo Vision; Optical Flow and Motion Analyses; Pattern Recognition; Robotics; Software Engineering
Abstract:
This paper describes an approach to consistently model and characterize potential object candidates present
in non-static scenes. With a stereo camera rig we collect and collate range data from different views around
a scene. Three principal procedures support our method: i) segmentation of the captured range images
into 3D clusters or blobs, which yields a first coarse impression of the spatial structure of the scene; ii)
maintenance of the map and of its reliability, achieved by fusing the captured and mapped data, to which we
assign a degree of existence (confidence value); iii) visual motion estimation of potential object candidates,
combining texture and 3D spatial information, which allows us not only to update the state of the actors and
perceive their changes in a scene, but also to maintain and refine their individual 3D structures over time. The
visual motion estimation is validated by a dual-layered 3D-mapping framework in which we store the geometric
and abstract properties of the mapped entities or blobs and determine which entities have moved, so that the
map can be updated to the current scene state.
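The degree-of-existence fusion mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a simple dictionary-based voxel map (the paper uses an octree/blobtree), with hypothetical `gain` and `decay` parameters that raise a cell's confidence when a new range image confirms it and lower it when the cell is not re-observed.

```python
# Sketch of confidence-based map fusion (illustrative, not the paper's code):
# each occupied voxel carries a "degree of existence" in [0, 1] that is
# reinforced by confirming observations and decayed by missed ones.

class ConfidenceMap:
    def __init__(self, gain=0.2, decay=0.1):
        self.cells = {}      # (i, j, k) voxel index -> confidence in [0, 1]
        self.gain = gain     # confidence added per confirming observation
        self.decay = decay   # confidence removed per missed observation

    def integrate(self, observed):
        """Fuse one range image; 'observed' is a set of occupied voxel indices."""
        for v in observed:
            c = self.cells.get(v, 0.0)
            self.cells[v] = min(1.0, c + self.gain)
        for v in list(self.cells):
            if v not in observed:
                self.cells[v] = max(0.0, self.cells[v] - self.decay)
                if self.cells[v] == 0.0:
                    del self.cells[v]  # forget entities that have vanished

m = ConfidenceMap()
m.integrate({(0, 0, 0), (1, 0, 0)})  # both voxels seen in the first view
m.integrate({(0, 0, 0)})             # only one is re-observed in the second
# the twice-confirmed voxel now holds a higher confidence than the decayed one
```

In the paper this idea is applied per blob in the dual-layered map, so that moved or vanished object candidates lose confidence while stable structure is retained.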