Enriching Model Execution with Feedback to Support Testing of Semantic Conformance between Models and Requirements - Design and Evaluation of Feedback Automation Architecture

Gayane Sedrakyan, Monique Snoeck

Abstract

Model-Driven Development (MDD) has traditionally been used to support model transformations and code generation. While plenty of techniques and tools are available to support modeling and transformations, tool support for checking model quality in terms of semantic conformance with the domain requirements is largely absent. In this work we present a model verification and validation approach based on model-driven feedback generation in a model-to-code transformation. With a single click, the transformation produces compiled code that also serves as a rapid prototyping instrument for simulating a model (the terms prototyping and simulation are thus used interchangeably in this paper). The proposed method of incorporating feedback into the generated prototype links event execution in the generated code to its causes in the model used as input for the generation. The goal of the feedback is twofold: (1) to assist a modeler in validating the semantic conformance of a model with respect to the domain to be engineered; (2) to support the learning of less experienced modelers (such as students or junior analysts early in their careers) by allowing them to detect modeling errors that result from misinterpreted use of modeling language constructs. Within this work we focus on conceptual and platform-independent models (PIM) that make use of two prominent UML diagram types: a class diagram (for modeling the structure of a system) and multiple interacting statecharts (for modeling a system's dynamic behavior). The tool has been used in the context of teaching a requirements analysis and modeling course at KU Leuven. The proposed feedback generation technique has been repeatedly validated by means of usability evaluations and demonstrates a high level of self-reported utility of the feedback.
Additionally, the findings of our experimental studies show a significant positive impact of the feedback-enabled rapid prototyping method on the semantic validation capabilities of novices. Despite our focus on specific diagramming techniques, the principles of the approach presented in this work can be used to support educational feedback automation for a broader spectrum of diagram types in the context of MDD and simulation.



Paper Citation


in Harvard Style

Sedrakyan G. and Snoeck M. (2016). Enriching Model Execution with Feedback to Support Testing of Semantic Conformance between Models and Requirements - Design and Evaluation of Feedback Automation Architecture. In Proceedings of the International Workshop on domAin specific Model-based AppRoaches to vErificaTion and validaTiOn - Volume 1: AMARETTO, (MODELSWARD 2016) ISBN 978-989-758-166-3, pages 14-22. DOI: 10.5220/0005841800140022


in Bibtex Style

@conference{amaretto16,
author={Gayane Sedrakyan and Monique Snoeck},
title={Enriching Model Execution with Feedback to Support Testing of Semantic Conformance between Models and Requirements - Design and Evaluation of Feedback Automation Architecture},
booktitle={Proceedings of the International Workshop on domAin specific Model-based AppRoaches to vErificaTion and validaTiOn - Volume 1: AMARETTO, (MODELSWARD 2016)},
year={2016},
pages={14-22},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005841800140022},
isbn={978-989-758-166-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Workshop on domAin specific Model-based AppRoaches to vErificaTion and validaTiOn - Volume 1: AMARETTO, (MODELSWARD 2016)
TI - Enriching Model Execution with Feedback to Support Testing of Semantic Conformance between Models and Requirements - Design and Evaluation of Feedback Automation Architecture
SN - 978-989-758-166-3
AU - Sedrakyan G.
AU - Snoeck M.
PY - 2016
SP - 14
EP - 22
DO - 10.5220/0005841800140022