Testing Distributed and Heterogeneous Systems: State of the Practice

Bruno Lima, João Pascoal Faria


In a growing number of domains, such as healthcare and transportation, several independent systems, forming a heterogeneous and distributed system of systems, are involved in providing end-to-end services to users. Testing such systems, which run over interconnected mobile and cloud-based platforms, is particularly important and challenging, and current tools offer little support for it. To assess the current state of the practice in testing distributed and heterogeneous systems (DHS) and to identify opportunities and priorities for research and innovation initiatives, we conducted an exploratory survey, answered by 147 software testing professionals attending industry-oriented software testing conferences, and present its main results in this paper. The survey allowed us to assess the relevance of DHS in software testing practice, the most important features to be tested in DHS, the current status of test automation and tool sourcing for testing DHS, and the most desired features in test automation solutions for DHS. We expect the results presented here to be of interest to researchers, tool vendors, and service providers in this field.



Paper Citation

in Harvard Style

Lima B. and Faria J. (2016). Testing Distributed and Heterogeneous Systems: State of the Practice. In Proceedings of the 11th International Joint Conference on Software Technologies - Volume 1: ICSOFT-EA, (ICSOFT 2016) ISBN 978-989-758-194-6, pages 69-78. DOI: 10.5220/0005989100690078

in Bibtex Style

@conference{icsoft16,
author={Bruno Lima and João Pascoal Faria},
title={Testing Distributed and Heterogeneous Systems: State of the Practice},
booktitle={Proceedings of the 11th International Joint Conference on Software Technologies - Volume 1: ICSOFT-EA, (ICSOFT 2016)},
year={2016},
pages={69-78},
isbn={978-989-758-194-6},
doi={10.5220/0005989100690078},
}

in EndNote Style

TY - CONF
JO - Proceedings of the 11th International Joint Conference on Software Technologies - Volume 1: ICSOFT-EA, (ICSOFT 2016)
TI - Testing Distributed and Heterogeneous Systems: State of the Practice
SN - 978-989-758-194-6
AU - Lima B.
AU - Faria J.
PY - 2016
SP - 69
EP - 78
DO - 10.5220/0005989100690078
ER -