Real-time Cargo Volume Recognition using Internet-connected 3D Scanners

Felix Föcker, Adrian Neubauer, Andreas Metzger, Gerd Gröner, Klaus Pohl

2015

Abstract

Transport and logistics faces fluctuations in cargo volume that, statistically, can only be captured with a large error. Observing and managing such dynamic volume fluctuations more effectively promises many benefits, such as reducing unused transport capacity and ensuring timely delivery of cargo. This paper introduces an approach that combines user-friendly mobile devices with internet-connected sensors to deliver up-to-date, timely, and precise information about parcel volumes inside containers. In particular, we present (1) RCM, a mobile app for uniquely identifying containers, and (2) SNAP, a novel approach that employs internet-connected, low-cost, off-the-shelf 3D scanners to capture and analyze actual cargo volumes. We evaluated the accuracy of SNAP in controlled experiments, which indicate that cargo volume can be measured with high accuracy. We further evaluated RCM together with SNAP by means of a survey study with domain experts, revealing its high potential for practical use.
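The abstract does not detail how SNAP derives a volume from 3D scanner data. Purely as an illustration of the general idea (and not the paper's actual algorithm), the sketch below estimates cargo volume from a top-down depth map, assuming a scanner mounted above the container looking straight down; the function name and parameters are hypothetical.

```python
# Illustrative sketch only (not the SNAP algorithm from the paper):
# estimate cargo volume inside a container from a top-down depth map,
# assuming the 3D scanner looks straight down at the container floor.

def cargo_volume(depth_map, container_height_m, cell_area_m2):
    """Sum the cargo height under every depth-map cell.

    depth_map          -- 2D list of distances (m) from scanner to surface
    container_height_m -- distance (m) from scanner to the empty floor
    cell_area_m2       -- ground area (m^2) covered by one depth-map cell
    """
    volume = 0.0
    for row in depth_map:
        for d in row:
            # Cargo height at this cell; clamp negative values caused
            # by sensor noise that places the surface below the floor.
            h = max(container_height_m - d, 0.0)
            volume += h * cell_area_m2
    return volume

# Example: 2x2 grid, scanner 2.0 m above the floor, each cell 0.25 m^2.
# One empty cell (depth 2.0 m) and three cells with 1.0 m of cargo.
print(cargo_volume([[2.0, 1.0], [1.0, 1.0]], 2.0, 0.25))  # -> 0.75
```

A real pipeline along these lines would first register the point cloud to the container (e.g. via a reconstruction method such as KinectFusion, cited below) before integrating heights.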

References

  1. Bondarev, E. et al., 2013. On photo-realistic 3D reconstruction of large-scale and arbitrary-shaped environments. In 2013 IEEE Consumer Communications and Networking Conference (CCNC). pp. 621-624.
  2. Delaunay, B., 1934. Sur la sphère vide. Bulletin of the Academy of Sciences of the USSR, (6), pp. 793-800.
  3. Dutta, T., 2012. Evaluation of the Kinect™ sensor for 3-D kinematic measurement in the workplace. Applied Ergonomics, 43(4), pp. 645-649.
  4. Freedman, B. et al., 2012. Depth mapping using projected patterns, Google Patents. Available at: https://www.google.com/patents/US8150142.
  5. Izadi, S., Kim, D., et al., 2011. KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology. UIST '11. New York, NY, USA: ACM, pp. 559-568. Available at: http://doi.acm.org/10.1145/2047196.2047270 (Accessed June 11, 2014).
  6. Izadi, S., Newcombe, R.A., et al., 2011. KinectFusion: real-time dynamic 3D surface reconstruction and interaction. In ACM SIGGRAPH 2011 Talks. SIGGRAPH '11. New York, NY, USA: ACM, pp. 23:1-23:1. Available at: http://doi.acm.org/10.1145/2037826.2037857 (Accessed October 10, 2013).
  7. Kainz, B. et al., 2012. OmniKinect: real-time dense volumetric data acquisition and applications. In Proceedings of the 18th ACM Symposium on Virtual Reality Software and Technology. VRST '12. New York, NY, USA: ACM, pp. 25-32. Available at: http://doi.acm.org/10.1145/2407336.2407342 (Accessed October 10, 2013).
  8. Khoshelham, K., 2011. Accuracy Analysis of Kinect Depth Data. ISPRS Journal of Photogrammetry and Remote Sensing, (XXXVIII-5/W12), p. 6.
  9. Kückelhaus, M. et al., 2013. Low-cost sensor technology - A DHL perspective on implications and use cases for the logistics industry, M. Wegner, ed. Available at: http://www.dhl.com/content/dam/downloads/g0/about_us/innovation/CSI_Studie_Low_Sensor.PDF.
  10. Litomisky, K. & Bhanu, B., 2013. Removing Moving Objects from Point Cloud Scenes. In X. Jiang et al., eds. Advances in Depth Image Analysis and Applications. Lecture Notes in Computer Science. Springer Berlin Heidelberg, pp. 50-58. Available at: http://link.springer.com/chapter/10.1007/978-3-642-40303-3_6 (Accessed June 25, 2014).
  11. Metzger, A. et al., 2014. Comparing and Combining Predictive Business Process Monitoring Techniques. IEEE Transactions on Systems, Man, and Cybernetics: Systems, Early Access Online.
  12. Microsoft, 2014. Kinect for Windows. Available at: https://www.microsoft.com/en-us/kinectforwindows/.
  13. Microsoft Developer Network, 2014. Kinect Fusion. Available at: http://msdn.microsoft.com/en-us/library/dn188670.aspx.
  14. Microsoft Developer Network, 2013. Kinect SDK with Kinect Fusion. Available at: http://blogs.msdn.com/b/kinectforwindows/archive/2013/03/18/the-latest-kinect-for-windows-sdk-is-here.aspx.
  15. Newcombe, R.A. et al., 2011. KinectFusion: Real-time dense surface mapping and tracking. In 2011 10th IEEE International Symposium on Mixed and Augmented Reality (ISMAR). pp. 127-136.
  16. Prechelt, L., 2001. Kontrollierte Experimente in der Softwaretechnik: Potenzial und Methodik [Controlled Experiments in Software Engineering: Potential and Methodology], Springer-Verlag GmbH.
  17. Rasmussen, C. et al., 2013. Towards functional labeling of utility vehicle point clouds for humanoid driving. In 2013 IEEE International Conference on Technologies for Practical Robot Applications (TePRA). pp. 1-6.
  18. Roth, H. & Vona, M., 2012. Moving Volume KinectFusion. In Proceedings of the British Machine Vision Conference. British Machine Vision Association, pp. 112.1-112.11. Available at: http://www.bmva.org/bmvc/2012/BMVC/paper112/index.html (Accessed June 23, 2014).
  19. Smisek, J., Jancosek, M. & Pajdla, T., 2011. 3D with Kinect. In 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops). pp. 1154-1160.
  20. Whelan, T. et al., 2012. Kintinuous: Spatially Extended KinectFusion. Available at: http://dspace.mit.edu/handle/1721.1/71756 (Accessed June 23, 2014).
  21. Li, X. et al., 2012. Using Kinect for monitoring warehouse order picking operations. In Proceedings of the Australasian Conference on Robotics and Automation. Victoria University of Wellington, New Zealand.
  22. Xu, D. et al., 2012. Kinect-Based Easy 3D Object Reconstruction. In W. Lin et al., eds. Advances in Multimedia Information Processing - PCM 2012. Lecture Notes in Computer Science. Springer Berlin Heidelberg, pp. 476-483. Available at: http://link.springer.com/chapter/10.1007/978-3-642-34778-8_44 (Accessed October 10, 2013).
  23. Zeng, M. et al., 2012. A Memory-Efficient KinectFusion Using Octree. In S.-M. Hu & R. R. Martin, eds. Computational Visual Media. Lecture Notes in Computer Science. Springer Berlin Heidelberg, pp. 234-241. Available at: http://link.springer.com/chapter/10.1007/978-3-642-34263-9_30 (Accessed October 10, 2013).
  24. Zhou, Q.-Y. & Koltun, V., 2013. Dense scene reconstruction with points of interest. ACM Trans. Graph., 32(4), pp. 112:1-112:8.


Paper Citation


in Harvard Style

Föcker F., Neubauer A., Metzger A., Gröner G. and Pohl K. (2015). Real-time Cargo Volume Recognition using Internet-connected 3D Scanners. In Proceedings of the 10th International Conference on Evaluation of Novel Approaches to Software Engineering - Volume 1: ENASE, ISBN 978-989-758-100-7, pages 323-330. DOI: 10.5220/0005377203230330


in Bibtex Style

@conference{enase15,
author={Felix Föcker and Adrian Neubauer and Andreas Metzger and Gerd Gröner and Klaus Pohl},
title={Real-time Cargo Volume Recognition using Internet-connected 3D Scanners},
booktitle={Proceedings of the 10th International Conference on Evaluation of Novel Approaches to Software Engineering - Volume 1: ENASE},
year={2015},
pages={323-330},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005377203230330},
isbn={978-989-758-100-7},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 10th International Conference on Evaluation of Novel Approaches to Software Engineering - Volume 1: ENASE
TI - Real-time Cargo Volume Recognition using Internet-connected 3D Scanners
SN - 978-989-758-100-7
AU - Föcker F.
AU - Neubauer A.
AU - Metzger A.
AU - Gröner G.
AU - Pohl K.
PY - 2015
SP - 323
EP - 330
DO - 10.5220/0005377203230330