MOD SLAM: Mixed Method for a More Robust SLAM without Loop Closing

Thomas Belos, Pascal Monasse, Eva Dokladalova

2022

Abstract

In recent years, the state of the art in monocular SLAM has seen remarkable advances in reducing errors and improving robustness, while delivering this quality of results in real time on small CPUs. However, most algorithms have a high failure rate out of the box, and systematic errors such as drift remain significant even for the best algorithms. Drift can be handled by a global measure such as loop closure, but this penalizes online data processing. We propose a mixed SLAM based on ORB-SLAM2 and DSO: MOD SLAM. It is a fusion of photometric and feature-based methods, without being a simple copy of both. We propose a decision system that predicts, at each frame, which optimization will produce the minimum drift, so that only one is selected, saving computational time and resources. We also propose a new map implementation that can actively work with DSO and ORB points at the same time. Our experimental results show that this method increases overall robustness and reduces drift without compromising computational resources. Contrary to the best state-of-the-art algorithms, MOD SLAM can handle 100% of KITTI, TUM, and random phone videos without any configuration change.

Paper Citation


in Harvard Style

Belos T., Monasse P. and Dokladalova E. (2022). MOD SLAM: Mixed Method for a More Robust SLAM without Loop Closing. In Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, ISBN 978-989-758-555-5, pages 691-701. DOI: 10.5220/0010833600003124


in Bibtex Style

@conference{visapp22,
author={Thomas Belos and Pascal Monasse and Eva Dokladalova},
title={MOD SLAM: Mixed Method for a More Robust SLAM without Loop Closing},
booktitle={Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP},
year={2022},
pages={691-701},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010833600003124},
isbn={978-989-758-555-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP
TI - MOD SLAM: Mixed Method for a More Robust SLAM without Loop Closing
SN - 978-989-758-555-5
AU - Belos T.
AU - Monasse P.
AU - Dokladalova E.
PY - 2022
SP - 691
EP - 701
DO - 10.5220/0010833600003124
ER -