An Ontological Model for Assessment Analytics

Azer Nouira, Lilia Cheniti-Belcadhi, Rafik Braham

2017

Abstract

Today, there is growing interest in data and analytics in learning environments, resulting in substantial research on models, methods, tools, technologies and analytics. This research area is referred to as learning analytics. Metadata has become an important element of e-learning systems, and many learning analytics models are currently being developed; they use metadata to tag learning materials, learning resources and learning activities. In this paper, we first give a detailed review of the existing learning analytics models in the literature. We observe in particular that there is a lack of models dedicated to modeling and analyzing assessment data. Our objective in this paper is therefore to propose an assessment analytics model inspired by the Experience API data model. Hence, an assessment analytics ontology model is developed that supports the analysis of assessment data by tracking the learner's assessment activities, assessment results and assessment context.
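The paper's ontology itself is not reproduced on this page, but the Experience API (xAPI) data model it builds on records each tracked event as a statement with actor, verb, object, result and context properties. The sketch below is a minimal, illustrative Python example of how one assessment event of the kind the abstract describes (an activity, its result and its context) could be serialized; the verb and activity-type IRIs follow the ADL vocabulary, and all concrete names, IDs and values are hypothetical.

```python
import json

# Illustrative xAPI-style statement for a single assessment event.
# All identifiers and values below are hypothetical examples.
statement = {
    "actor": {  # the learner being tracked
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.org",
    },
    "verb": {  # the assessment activity performed
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-US": "answered"},
    },
    "object": {  # the assessment item
        "objectType": "Activity",
        "id": "http://example.org/assessments/quiz-1/question-3",
        "definition": {
            "type": "http://adlnet.gov/expapi/activities/question",
            "name": {"en-US": "Question 3"},
        },
    },
    "result": {  # the assessment result
        "score": {"scaled": 0.8},
        "success": True,
        "completion": True,
    },
    "context": {  # the assessment context (parent assessment)
        "contextActivities": {
            "parent": [{"id": "http://example.org/assessments/quiz-1"}]
        }
    },
    "timestamp": "2017-04-25T10:30:00Z",
}

print(json.dumps(statement, indent=2))
```

Statements of this shape are what the proposed ontology would map onto classes and properties for assessment activities, results and context.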



Paper Citation


in Harvard Style

Nouira A., Cheniti-Belcadhi L. and Braham R. (2017). An Ontological Model for Assessment Analytics. In Proceedings of the 13th International Conference on Web Information Systems and Technologies - Volume 1: WEBIST, ISBN 978-989-758-246-2, pages 243-251. DOI: 10.5220/0006284302430251


in Bibtex Style

@conference{webist17,
author={Azer Nouira and Lilia Cheniti-Belcadhi and Rafik Braham},
title={An Ontological Model for Assessment Analytics},
booktitle={Proceedings of the 13th International Conference on Web Information Systems and Technologies - Volume 1: WEBIST},
year={2017},
pages={243-251},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006284302430251},
isbn={978-989-758-246-2},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 13th International Conference on Web Information Systems and Technologies - Volume 1: WEBIST
TI - An Ontological Model for Assessment Analytics
SN - 978-989-758-246-2
AU - Nouira A.
AU - Cheniti-Belcadhi L.
AU - Braham R.
PY - 2017
SP - 243
EP - 251
DO - 10.5220/0006284302430251
ER -