Taxonomy of 3D Sensors - A Survey of State-of-the-Art Consumer 3D-Reconstruction Sensors and their Field of Applications

Julius Schöning, Gunther Heidemann

Abstract

Sensors used for 3D reconstruction determine both the quality of the results and the nature of the reconstruction algorithms. The spectrum of such sensors ranges from expensive to low-cost, from highly specialized to off-the-shelf, and from stereo to mono sensors. The list of available sensors has been growing steadily and is becoming difficult to manage, even in the consumer sector. We provide a survey of existing consumer 3D sensors and a taxonomy for their assessment. This taxonomy covers recent developments, application domains, and functional criteria. The focus of this survey is on low-cost 3D sensors at an accessible price; prototypes developed in academia are also very interesting, but the price of such sensors cannot easily be estimated. We aim to provide an unbiased basis for deciding on a specific 3D sensor. In addition to assessing existing technologies, we provide a list of preferable features for 3D-reconstruction sensors. We close with a discussion of common problems in available sensor systems and of common fields of application, as well as areas that could benefit from such sensors.



Paper Citation


in Harvard Style

Schöning J. and Heidemann G. (2016). Taxonomy of 3D Sensors - A Survey of State-of-the-Art Consumer 3D-Reconstruction Sensors and their Field of Applications. In Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016) ISBN 978-989-758-175-5, pages 192-197. DOI: 10.5220/0005784801920197


in Bibtex Style

@conference{visapp16,
author={Julius Schöning and Gunther Heidemann},
title={Taxonomy of 3D Sensors - A Survey of State-of-the-Art Consumer 3D-Reconstruction Sensors and their Field of Applications},
booktitle={Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016)},
year={2016},
pages={192-197},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005784801920197},
isbn={978-989-758-175-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2016)
TI - Taxonomy of 3D Sensors - A Survey of State-of-the-Art Consumer 3D-Reconstruction Sensors and their Field of Applications
SN - 978-989-758-175-5
AU - Schöning J.
AU - Heidemann G.
PY - 2016
SP - 192
EP - 197
DO - 10.5220/0005784801920197