OMNIDIRECTIONAL CAMERA MOTION ESTIMATION

Akihiko Torii, Tomáš Pajdla

2008

Abstract

We present an automatic technique for computing relative camera motion and simultaneous omnidirectional image matching. Our technique works for small as well as large motions and tolerates multiple moving objects and very large occlusions in the scene. We combine three principles and obtain a practical algorithm which improves the state of the art. First, we show that the correct motion is found much sooner if the tentative matches are sampled after ordering them by the similarity of their descriptors. Second, we show that the correct camera motion can be found more reliably by soft voting for the direction of the motion than by selecting the motion supported by the largest set of matches. Finally, we show that it is useful to filter out the epipolar geometries which are not generated by points reconstructed in front of the cameras. We demonstrate the performance of the technique in an experiment with 189 image pairs acquired in a city and in a park. All camera motions were recovered with an error of the motion direction smaller than 8°, which is 4% of the 183° field of view, w.r.t. the ground truth.
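
To make the soft-voting principle concrete, the Python sketch below accumulates Gaussian-weighted votes for candidate translation directions on a discretized unit sphere and returns the direction with the largest accumulated weight. The spherical grid parameterization, its 1° resolution, and the synthetic votes in the usage example are illustrative assumptions rather than details taken from the paper; only the 4° kernel width follows the parameter list of Algorithm 1 below.

# Minimal sketch of soft voting for the camera translation direction.
# Assumptions (not from the paper): a grid over azimuth/elevation with 1 degree
# resolution, and votes given as unit translation directions already recovered
# from essential matrices.
import numpy as np

def soft_vote_direction(directions, sigma_deg=4.0, grid_deg=1.0):
    """Accumulate Gaussian-weighted votes for unit translation directions
    and return the direction at the accumulator maximum."""
    directions = np.asarray(directions, dtype=float)
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)

    # Accumulator grid over azimuth [-180, 180) x elevation [-90, 90].
    az = np.deg2rad(np.arange(-180.0, 180.0, grid_deg))
    el = np.deg2rad(np.arange(-90.0, 90.0 + grid_deg, grid_deg))
    A, E = np.meshgrid(az, el)
    grid = np.stack([np.cos(E) * np.cos(A),
                     np.cos(E) * np.sin(A),
                     np.sin(E)], axis=-1)          # unit vectors, shape (nE, nA, 3)

    acc = np.zeros(grid.shape[:2])
    for d in directions:
        # Angular distance of every grid cell to this vote, then a Gaussian weight.
        ang = np.degrees(np.arccos(np.clip(grid @ d, -1.0, 1.0)))
        acc += np.exp(-0.5 * (ang / sigma_deg) ** 2)

    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return grid[i, j]                              # the winning direction

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_t = np.array([1.0, 0.2, 0.1]); true_t /= np.linalg.norm(true_t)
    votes = [true_t + 0.05 * rng.standard_normal(3) for _ in range(50)]
    votes += [rng.standard_normal(3) for _ in range(10)]   # outlier votes
    print(soft_vote_direction(votes))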

References

  1. Brown, M. and Lowe, D. G. (2003). Recognising panoramas. In ICCV '03, Washington, DC, USA.
  2. Chum, O. and Matas, J. (2005). Matching with PROSAC - progressive sample consensus. In CVPR '05, volume 1, pages 220-226, Los Alamitos, USA.
  3. Cornelis, N., Cornelis, K., and Van Gool, L. (2006). Fast compact city modeling for navigation previsualization. In CVPR '06, pages 1339-1344, Washington, DC, USA.
  4. Davison, A. J. and Molton, N. D. (2007). MonoSLAM: Real-time single camera SLAM. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(6):1052-1067.
  5. Fischler, M. A. and Bolles, R. C. (1981). Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM, 24(6):381-395.
  6. Hartley, R. I. and Zisserman, A. (2004). Multiple View Geometry in Computer Vision. Cambridge University Press, ISBN: 0521540518, second edition.
  7. Li, H. and Hartley, R. (2005). A non-iterative method for correcting lens distortion from nine point correspondences. In OMNIVIS '05.
  8. Williams, B., Klein, G., and Reid, I. (2007). Real-time SLAM relocalisation. In ICCV '07, Rio de Janeiro, Brazil.

Algorithm 1: Camera motion estimation by ordered sampling from tentative matches with geometrical constraints.

Input: Image pair I1, I2.
Parameters:
  θ := 0.3°   ... the tolerance for establishing matches
  σ := 4°     ... the standard deviation of the Gaussian kernel for soft voting
  NV := 50    ... the number of soft votes
  NS := 500   ... the maximum number of random samples
  η := 0.95   ... the termination probability of the standard RANSAC (Hartley and Zisserman, 2004, p. 119)
Output: Essential matrix E*.

1. Detect features in I1 and I2 and compute their descriptors.
2. Construct the list M = [m_j], j = 1, ..., N, of tentative matches with mutually closest descriptors. Order the list ascendingly by the distance of the descriptors. N is the length of the list.
3. Find a camera motion consistent with a large number of tentative matches:
    1:  Set D to zero.   // Initialize the accumulator of camera translation directions.
    2:  for i := 1, ..., NV do
    3:    t := 0   // The counter of samples.
          n := 5   // Initial segment length.
    4:    while t ≤ NS do
    5:      if t = ⌈200000 · n^5 / N^5⌉ (Chum and Matas, 2005) then
    6:        n := n + 1   // The maximum number of samples for the current initial segment is reached; increase the initial segment length.
    7:      end if
    8:      t := t + 1   // New sample.
    9:      Select the 5 tentative matches M5 of the t-th sample by taking 4 tentative matches from [m_j], j = 1, ..., n-1, at random and adding the 5th match m_n.
   10:      E_t := the essential matrix obtained by solving the 5-point minimal problem for M5 (Nistér, 2004; Stewénius, 2005).
   11:      if M5 can be reconstructed in front of both cameras (Hartley and Zisserman, 2004, p. 260) then
   12:        S_t := the number of matches consistent with E_t, i.e. the number of all matches m = [u1, u2] for which max(d(u1, E_t u2), d(u2, E_t^T u1)) < θ.
   13:      else
   14:        S_t := 0
   15:      end if
   16:      N_R := log(1 − η) / log(1 − (S_t / N)^5)
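
As a rough illustration of the sampling loop in Algorithm 1, the Python sketch below implements the ordered 5-tuple selection with PROSAC-style growth of the initial segment, the cheirality filter, and the adaptive RANSAC termination. The functions solve_5pt, residual_deg, and in_front_of_both are hypothetical hooks standing in for a 5-point solver, an angular epipolar residual, and the front-of-camera test; they are not defined in the excerpt above and would have to be supplied by an external geometry library. The soft-voting accumulation over NV trials is omitted here.

import math
import random

def growth_limit(t_N, n, N):
    # Number of samples after which the initial segment is enlarged
    # (ordered sampling in the spirit of Chum and Matas, 2005).
    return math.ceil(t_N * (n / N) ** 5)

def estimate_motion(matches, solve_5pt, residual_deg, in_front_of_both,
                    theta=0.3, eta=0.95, max_samples=500, t_N=200000):
    """matches: list of (u1, u2) ray correspondences sorted by descriptor distance.

    Hypothetical hooks (not provided by the paper excerpt):
      solve_5pt(sample)        -> iterable of candidate essential matrices (numpy arrays)
      residual_deg(u, E, v)    -> angular residual of a correspondence in degrees
      in_front_of_both(E, pts) -> True if pts reconstruct in front of both cameras
    """
    N = len(matches)
    assert N >= 5, "the 5-point minimal problem needs at least 5 matches"
    best_E, best_support = None, -1
    t, n = 0, 5
    n_required = float(max_samples)          # adaptive RANSAC termination bound
    while t < min(n_required, max_samples):
        if t == growth_limit(t_N, n, N) and n < N:
            n += 1                           # enlarge the initial segment
        t += 1
        # 4 matches drawn at random from the top n-1, plus the n-th match itself.
        sample = random.sample(matches[:n - 1], 4) + [matches[n - 1]]
        for E in solve_5pt(sample):          # the minimal problem can yield several solutions
            if not in_front_of_both(E, sample):
                continue                     # reject geometries violating the cheirality constraint
            support = sum(1 for u1, u2 in matches
                          if max(residual_deg(u1, E, u2),
                                 residual_deg(u2, E.T, u1)) < theta)
            if support > best_support:
                best_E, best_support = E, support
                w = support / N              # inlier fraction
                if w >= 1.0:
                    n_required = t           # every match is an inlier; stop sampling
                elif w > 0.0:
                    # Standard RANSAC termination (Hartley and Zisserman, 2004, p. 119).
                    n_required = math.log(1 - eta) / math.log(1 - w ** 5)
    return best_E, best_support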


Paper Citation


in Harvard Style

Torii A. and Pajdla T. (2008). OMNIDIRECTIONAL CAMERA MOTION ESTIMATION. In Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2008) ISBN 978-989-8111-21-0, pages 577-584. DOI: 10.5220/0001084505770584


in BibTeX Style

@conference{visapp08,
author={Akihiko Torii and Tomáš Pajdla},
title={OMNIDIRECTIONAL CAMERA MOTION ESTIMATION},
booktitle={Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2008)},
year={2008},
pages={577-584},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001084505770584},
isbn={978-989-8111-21-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2008)
TI - OMNIDIRECTIONAL CAMERA MOTION ESTIMATION
SN - 978-989-8111-21-0
AU - Torii A.
AU - Pajdla T.
PY - 2008
SP - 577
EP - 584
DO - 10.5220/0001084505770584