An Acceptance Empirical Assessment of Open Source Test Tools

Natasha M. Costa Valentim, Adriana Lopes, Edson César, Tayana Conte, Auri Marcelo Rizzo Vincenzi, José Carlos Maldonado

2017

Abstract

Software testing is one of the verification and validation activities of the software development process. Test automation is relevant because applying tests manually is laborious and error-prone. The choice of test tools should be based on criteria and on evidence of their usefulness and ease of use. This paper presents an empirical acceptance assessment of open source testing tools. Practitioners and graduate students evaluated five tools often used in industry. The results describe how these tools are perceived in terms of ease of use and usefulness, and can support software practitioners in choosing testing tools for their projects.

References

  1. Babar, M.A., Winkler, D. and Biffl, S. (2007). “Evaluating the Usefulness and Ease of Use of a Groupware Tool for the Software Architecture Evaluation Process”. In International Symposium on Empirical Software Engineering and Measurement, pp. 430-439.
  2. Boehm, B., Basili, V. R. (2001). “Software Defect Reduction Top 10 List”. Computer, IEEE Computer Society Press, v. 34, pp. 135 - 137.
  3. Calefato, F., Lanubile, F., Minervini, P. (2010). “Can Real-Time Machine Translation Overcome Language Barriers in Distributed Requirements Engineering?” In IEEE International Conference on Global Software Engineering, pp. 257-264.
  4. Carmines, E. G., Zeller, R. A. (1979). “Reliability and Validity Assessment”. In SAGE Pub., 72 pages.
  5. Davis, F. (1989). “Perceived usefulness, perceived ease of use, and user acceptance of information technology”. In MIS Quarterly, v. 13, n. 3, pp. 319 - 339.
  6. Debbarma, M. K., Debbarma, S., Debbarma, N., Chakma, K., Jamatia, A. (2013). “A Review and Analysis of Software Complexity Metrics in Structural Testing”. In International Journal of Computer and Communication Engineering, v. 2 (2), pp. 129-133.
  7. Delahaye, M., Bousquet, L. (2015). “Selecting a software engineering tool: lessons learnt from mutation analysis”. In Software: Practice and Experience, v. 45, n. 7, pp. 875 - 891.
  8. Fernandez, A., Abrahão, S., Insfran, E., Matera, M. (2012). “Further analysis on the validation of a usability inspection method for model-driven web development”. In Proceedings of the International Symposium on Empirical Software Engineering and Measurement (ESEM), Lund, Sweden, pp. 153-156.
  9. Feldt, R., Torkar, R., Ahmad, E., Raza, B. (2010). “Challenges with Software Verification and Validation Activities in the Space Industry”. In International Conference on Software Testing, Verification and Validation (ICST), pp. 225-234.
  10. Field, A. (2013). “Discovering Statistics Using SPSS”. In Sage Publications (CA), Edition 4, 915 pages.
  11. IEEE Computer Society, SWEBOK, “A Guide to the Software Engineering Body of Knowledge,” 2004.
  12. Johns, R (2005). “One Size Doesn't Fit All: Selecting Response Scales For Attitude Items”. In Journal of Elections, Public Opinion, and Parties, v. 15 (2), pp. 237-264.
  13. King, W. R., He, J. (2006). “A meta-analysis of the technology acceptance model”. In Information and Management. v.43 (6), pp. 740-755.
  14. Laitenberger, O., Dreyer, H. M. (1998). “Evaluating the usefulness and the ease of use of a Web-based inspection data collection tool”. In International Software Metrics Symposium, pp. 122-132.
  15. Lanubile, F., Mallardo, T., Calefato, F. (2003). “Tool support for Geographically Dispersed Inspection Teams”. In Software Process Improvement and Practice, v. 8, pp. 217-231.
  16. Monier, M., El-Mahdy, M. M. (2015). “Evaluation of automated web testing tools”. In International Journal of Computer Applications Technology and Research, v. 4 (5), pp. 405 - 408.
  17. Naik, K., Tripathy, P. (2008). “Software Testing and Quality Assurance: Theory and Practice”. In Wiley, 1st Edition, 648 pages.
  18. Roper, M. (1994). “Software Testing”. McGraw-Hill, 149 pages.
  19. Sharma, M., Angmo, R. (2014). “Web based Automation Testing and Tools”. In International Journal of Computer Science and Information Technologies, v. 5 (1), pp. 908 - 912.
  20. Steinmacher, I., Conte, T. U., Treude, C., Gerosa, M. A. (2016). “Overcoming Open Source Project Entry Barriers with a Portal for Newcomers”. In International Conference on Software Engineering, Austin, pp. 1-12.
  21. Tahbildar, H., Borbora, P., Khataniar, G. P. (2013). “Teaching Automated Test Data Generation Tools for C, C++, and Java Programs”. In International Journal of Computer Science & Information Technology (IJCSIT), v. 5 (1), pp. 181-195.
  22. Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., Wesslén, A. (2000). “Experimentation in software engineering: an introduction”. In Kluwer Academic Publishers, 236 pages.
  23. Zhu, H., Hall, P. A. V., May, J. H. R. (1997). “Software unit test coverage and adequacy”. In ACM Computing Surveys, v. 29 (4), pp. 366-427.
Paper Citation


in Harvard Style

Valentim N., Lopes A., César E., Conte T., Vincenzi A. and Maldonado J. (2017). An Acceptance Empirical Assessment of Open Source Test Tools. In Proceedings of the 19th International Conference on Enterprise Information Systems - Volume 2: ICEIS, ISBN 978-989-758-248-6, pages 379-386. DOI: 10.5220/0006319603790386


in Bibtex Style

@conference{iceis17,
author={Natasha M. Costa Valentim and Adriana Lopes and Edson César and Tayana Conte and Auri Marcelo Rizzo Vincenzi and José Carlos Maldonado},
title={An Acceptance Empirical Assessment of Open Source Test Tools},
booktitle={Proceedings of the 19th International Conference on Enterprise Information Systems - Volume 2: ICEIS},
year={2017},
pages={379-386},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006319603790386},
isbn={978-989-758-248-6},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 19th International Conference on Enterprise Information Systems - Volume 2: ICEIS
TI - An Acceptance Empirical Assessment of Open Source Test Tools
SN - 978-989-758-248-6
AU - Valentim N.
AU - Lopes A.
AU - César E.
AU - Conte T.
AU - Vincenzi A.
AU - Maldonado J.
PY - 2017
SP - 379
EP - 386
DO - 10.5220/0006319603790386