An Approach for Automated Scenario-based Testing of Distributed and Heterogeneous Systems

Bruno Lima, João Pascoal Faria

2015

Abstract

The growing dependence of our society on increasingly complex software systems makes software testing ever more important and challenging. In many domains, such as healthcare and transportation, several independent systems, forming a heterogeneous and distributed system of systems, are involved in the provisioning of end-to-end services to users. However, existing testing techniques, namely in the model-based testing field, provide little tool support for properly testing such systems. Hence, in this paper, we propose an approach and a toolset architecture for automating the testing of end-to-end services in distributed and heterogeneous systems. The tester interacts with a visual modeling frontend to describe key behavioral scenarios, invoke test generation and execution, and visualize test results and coverage information back in the model. The visual modeling notation is converted to a formal notation amenable to runtime interpretation in the backend. A distributed test monitoring and control infrastructure is responsible for interacting with the components of the system under test, acting as test driver, monitor, and stub. At the core of the toolset, a test execution engine coordinates test execution and checks the conformance of the observed execution trace with the expectations derived from the visual model. A real-world example from the Ambient Assisted Living domain is presented to illustrate the approach.
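To make the conformance-checking step concrete, the sketch below shows one minimal way a test execution engine might compare an observed execution trace against the event order expected by a scenario. This is an illustrative simplification under assumed names (Event, check_conformance, and the Ambient Assisted Living component names are hypothetical), not the paper's actual implementation, which interprets a formal model at runtime.

# Minimal illustrative sketch (not the paper's toolset): checking whether the
# events expected by a scenario occur, in order, within an observed trace.
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Event:
    source: str   # component sending the message
    target: str   # component receiving the message
    message: str  # message or operation name

def check_conformance(expected: List[Event], observed: List[Event]) -> bool:
    """Return True if the expected events appear in the observed trace in the
    expected order, possibly interleaved with unrelated events."""
    i = 0
    for event in observed:
        if i < len(expected) and event == expected[i]:
            i += 1
    return i == len(expected)

# Hypothetical Ambient Assisted Living scenario: a fall sensor alerts a
# monitoring service, which then notifies a caregiver application.
scenario = [
    Event("FallSensor", "MonitoringService", "fallDetected"),
    Event("MonitoringService", "CaregiverApp", "notifyCaregiver"),
]
trace = [
    Event("FallSensor", "MonitoringService", "fallDetected"),
    Event("MonitoringService", "Logger", "logEvent"),  # unrelated event, ignored
    Event("MonitoringService", "CaregiverApp", "notifyCaregiver"),
]
assert check_conformance(scenario, trace)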



Paper Citation


in Harvard Style

Lima B. and Faria J. (2015). An Approach for Automated Scenario-based Testing of Distributed and Heterogeneous Systems. In Proceedings of the 10th International Conference on Software Engineering and Applications - Volume 1: ICSOFT-EA, (ICSOFT 2015). ISBN 978-989-758-114-4, pages 241-250. DOI: 10.5220/0005558602410250


in Bibtex Style

@conference{icsoft-ea15,
author={Bruno Lima and João Pascoal Faria},
title={An Approach for Automated Scenario-based Testing of Distributed and Heterogeneous Systems},
booktitle={Proceedings of the 10th International Conference on Software Engineering and Applications - Volume 1: ICSOFT-EA, (ICSOFT 2015)},
year={2015},
pages={241-250},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005558602410250},
isbn={978-989-758-114-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 10th International Conference on Software Engineering and Applications - Volume 1: ICSOFT-EA, (ICSOFT 2015)
TI - An Approach for Automated Scenario-based Testing of Distributed and Heterogeneous Systems
SN - 978-989-758-114-4
AU - Lima B.
AU - Faria J.
PY - 2015
SP - 241
EP - 250
DO - 10.5220/0005558602410250