Authors: Cesar Castro 1; Samuel Bueno 1 and Alessandro Victorino 2
Affiliations: 1 Computer Vision and Robotics Division, CenPRA, Brazil; 2 INRIA-Sophia Antipolis, ICARE project, France
Keyword(s):
SLAM, localization, mapping, Kalman filtering, UAVs.
Related Ontology Subjects/Areas/Topics: Informatics in Control, Automation and Robotics; Mobile Robots and Autonomous Systems; Robot Design, Development and Control; Robotics and Automation; Vision, Recognition and Reconstruction
Abstract:
This article presents the authors' ongoing work towards a six degrees of freedom simultaneous localization and mapping (SLAM) solution for the Project AURORA autonomous robotic airship. While the vehicle executes its mission in an unknown environment, where neither predefined maps nor satellite-based positioning is available, the airship must rely solely on its own onboard sensors to capture information about its surroundings and itself, localizing itself while building a map of the environment it navigates. To achieve this goal, the airship's sensory input is provided by an inertial measurement unit (IMU), whereas a single onboard camera detects features of interest in the environment that serve as landmarks. The data from both sensors are then fused in an architecture based on an extended Kalman filter, which acts as an estimator of both the robot pose and the map. The proposed methodology is validated in a simulation environment composed of virtual sensors and the AURORA project's aerial platform simulator, which is based on a realistic dynamic model. The results are hereby reported.
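The abstract describes fusing IMU-driven motion prediction with camera landmark observations through an extended Kalman filter over a joint pose-and-map state. The following is a minimal, hypothetical sketch of that predict/update cycle on a planar toy problem with a single landmark; the state layout, models, and noise values are illustrative assumptions, not the AURORA filter itself, whose state is a full 6-DoF airship pose plus a growing landmark map.

```python
import numpy as np

def ekf_predict(x, P, u, F, Q):
    """Propagate state and covariance through a linearized motion model
    (the role played by the IMU-driven prediction step)."""
    x = F @ x + u
    P = F @ P @ F.T + Q
    return x, P

def ekf_update(x, P, z, H, R):
    """Fuse a measurement z (here, a camera-like relative landmark
    observation) with measurement Jacobian H and noise covariance R."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Joint state: [robot_x, robot_y, landmark_x, landmark_y]
x = np.zeros(4)
P = np.eye(4)

# Prediction: the landmark is static, so only the robot block moves.
F = np.eye(4)
u = np.array([1.0, 0.0, 0.0, 0.0])     # odometry-style displacement (assumed)
Q = np.diag([0.1, 0.1, 0.0, 0.0])      # process noise on the robot only
x, P = ekf_predict(x, P, u, F, Q)

# Update: the camera observes the landmark position relative to the robot.
H = np.array([[-1.0, 0.0, 1.0, 0.0],
              [0.0, -1.0, 0.0, 1.0]])
z = np.array([4.1, 2.0])               # simulated observation (assumed)
R = 0.05 * np.eye(2)
x, P = ekf_update(x, P, z, H, R)
```

Because pose and landmark live in one joint state with a shared covariance, each camera update corrects both the robot estimate and the map, which is the essential coupling that distinguishes SLAM from separate localization and mapping.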