Experimental Studies in Software Inspection Process - A Systematic Mapping

Elis Montoro Hernandes, Anderson Belgamo, Sandra Fabbri

2013

Abstract

Background: Interest in producing experimental knowledge about verification and validation (V&V) techniques has increased over the years. This kind of knowledge can be useful both for researchers who develop studies in that area and for industry, which can make decisions about V&V activities on the basis of experimental results. Aim: This paper aims to map the empirical studies conducted in the software inspection process area. Method: Each step of the Systematic Mapping (SM) process was performed with the support of the StArt tool, and papers from major databases, journals, conferences, and workshops were covered. Results: Seventy-nine papers were accepted in this mapping and helped identify the inspection processes, techniques, and tools commonly referenced in those papers, as well as the artifacts usually inspected and the research groups and universities frequently involved in these studies. Conclusion: Different inspection processes have been investigated through experimental studies, and Fagan's process is the most investigated of them. To evaluate these different processes, requirements documents and source code were the most frequently used artifacts. Besides, different tools and techniques have been used to support these processes. Some important lessons were learned, which are in accordance with the explanations of other authors.



Paper Citation


in Harvard Style

Montoro Hernandes E., Belgamo A. and Fabbri S. (2013). Experimental Studies in Software Inspection Process - A Systematic Mapping. In Proceedings of the 15th International Conference on Enterprise Information Systems - Volume 1: ICEIS, ISBN 978-989-8565-59-4, pages 66-76. DOI: 10.5220/0004454000660076


in Bibtex Style

@conference{iceis13,
author={Elis Montoro Hernandes and Anderson Belgamo and Sandra Fabbri},
title={Experimental Studies in Software Inspection Process - A Systematic Mapping},
booktitle={Proceedings of the 15th International Conference on Enterprise Information Systems - Volume 1: ICEIS},
year={2013},
pages={66-76},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004454000660076},
isbn={978-989-8565-59-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 15th International Conference on Enterprise Information Systems - Volume 1: ICEIS
TI - Experimental Studies in Software Inspection Process - A Systematic Mapping
SN - 978-989-8565-59-4
AU - Montoro Hernandes E.
AU - Belgamo A.
AU - Fabbri S.
PY - 2013
SP - 66
EP - 76
DO - 10.5220/0004454000660076