Selective Use of Appropriate Image Pairs for Shape from Multiple Motions based on Gradient Method

Norio Tagawa, Syouta Tsukada

2016

Abstract

For gradient-based shape from motion, relative motions with various directions at each 3-D point on a target object are generally effective for accurate shape recovery. On the other hand, a proper motion size exists for each 3-D point, depending on its local intensity pattern and depth: too large a motion causes a large depth-recovery error through aliasing, while too small a motion is inappropriate from the viewpoint of SNR. The application of random camera rotations imitating the involuntary fixational eye movements of the human eyeball has been proposed, which can generate multiple image pairs. In this study, to achieve accurate shape recovery, we improve the gradient method based on multiple image pairs by selecting the appropriate image pairs to use. Its effectiveness is verified through experiments using an actual camera system that we developed.
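The selection principle described above — discard pairs whose inter-frame motion is too small (noise-dominated) or too large (violating the small-displacement assumption of the gradient method) — can be sketched as a simple band-pass filter on per-pair motion magnitude. This is an illustrative sketch only; the function name and the threshold values are assumptions, not taken from the paper.

```python
import numpy as np

def select_image_pairs(flow_magnitudes, lower=0.5, upper=2.0):
    """Return indices of image pairs whose mean optical-flow magnitude
    (in pixels) lies inside [lower, upper].

    Too small a motion is dominated by image noise (poor SNR); too
    large a motion breaks the linearization underlying the gradient
    method (an aliasing-like error).  The thresholds here are
    hypothetical placeholders.
    """
    flow_magnitudes = np.asarray(flow_magnitudes, dtype=float)
    mask = (flow_magnitudes >= lower) & (flow_magnitudes <= upper)
    return np.flatnonzero(mask)

# Example: five candidate pairs with different mean motion sizes (pixels).
mags = [0.1, 0.8, 1.5, 3.2, 2.0]
print(select_image_pairs(mags))  # keeps pairs 1, 2 and 4
```

In practice such a criterion would be evaluated per image region or per 3-D point rather than per pair globally, since the appropriate motion size varies with local depth and texture, as the abstract notes.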

References

  1. Azevedo, T. (2006). Development of a computer platform for object 3D reconstruction using computer vision techniques. In Proc. Int. Conf. Computer Vision Theory and Applications, pages 383-388.
  2. Brox, T. and Malik, J. (2011). Large displacement optical flow: descriptor matching in variational motion estimation. IEEE Trans. Pattern Anal. Machine Intell., 33(3):500-513.
  3. Bruhn, A. and Weickert, J. (2005). Lucas/Kanade meets Horn/Schunck: combining local and global optic flow methods. Int. J. Comput. Vision, 61(3):211-231.
  4. Dempster, A. P., Laird, N. M., and Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm. J. Roy. Statist. Soc. B, 39:1-38.
  5. Lazaros, N., Sirakoulis, G. C., and Gasteratos, A. (2008). Review of stereo vision algorithms: from software to hardware. Int. J. Optomechatronics, 5(4):435-462.
  6. Ochs, P. and Brox, T. (2012). Higher order motion models and spectral clustering. In Proc. CVPR 2012, pages 614-621.
  7. Samaras, D., Metaxas, D., Fua, P., and Leclerc, Y. G. (2000). Variable albedo surface reconstruction from stereo and shape from shading. In Proc. Int. Conf. CVPR, volume 1, pages 480-487.
  8. Seitz, S. M. and Dyer, C. M. (1997). Photorealistic scene reconstruction by voxel coloring. In Proc. Int. Conf. CVPR, pages 1067-1073.
  9. Tagawa, N. (2010). Depth perception model based on fixational eye movements using Bayesian statistical inference. In Proc. Int. Conf. Pattern Recognition, pages 1662-1665.
  10. Tagawa, N. and Koizumi, S. (2015). Selective use of optimal image resolution for depth from multiple motions based on gradient scheme. In Proc. Int. Workshop on Image Mining: Theory and Applications, pages 92-99.
  11. Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Machine Intell., 22(11):1330-1334.


Paper Citation


in Harvard Style

Tagawa N. and Tsukada S. (2016). Selective Use of Appropriate Image Pairs for Shape from Multiple Motions based on Gradient Method. In Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016) ISBN 978-989-758-175-5, pages 552-561. DOI: 10.5220/0005778805520561


in Bibtex Style

@conference{visapp16,
author={Norio Tagawa and Syouta Tsukada},
title={Selective Use of Appropriate Image Pairs for Shape from Multiple Motions based on Gradient Method},
booktitle={Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016)},
year={2016},
pages={552-561},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005778805520561},
isbn={978-989-758-175-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016)
TI - Selective Use of Appropriate Image Pairs for Shape from Multiple Motions based on Gradient Method
SN - 978-989-758-175-5
AU - Tagawa N.
AU - Tsukada S.
PY - 2016
SP - 552
EP - 561
DO - 10.5220/0005778805520561