Stability and Sensitivity of Learning Analytics based Prediction Models

Dirk Tempelaar, Bart Rienties, Bas Giesbers

2015

Abstract

Learning analytics seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. In this follow-up study of previous research (Tempelaar, Rienties, and Giesbers, 2015), we focus on the issues of stability and sensitivity of Learning Analytics (LA) based prediction models. Do prediction models stay intact when the instructional context is repeated in a new cohort of students, and do prediction models indeed change when relevant aspects of the instructional context are adapted? This empirical contribution provides an application of Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs. We compare two cohorts of a large introductory quantitative methods module, with 1005 students in the ’13/’14 cohort and 1006 students in the ’14/’15 cohort. Both modules were based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials, and had a similar instructional design, except for an intervention into the design of quizzes administered in the module. Focusing on predictive power, we provide evidence of both stability and sensitivity of regression-type prediction models.

References

  1. Agudo-Peregrina, Á. F., Iglesias-Pradas, S., Conde-González, M. Á., and Hernández-García, Á. (2014). Can we predict success from log data in VLEs? Classification of interactions for learning analytics and their relation with performance in VLE-supported F2F and online learning. Computers in Human Behavior, 31, 542-550. doi: 10.1016/j.chb.2013.05.031.
  2. Arbaugh, J. B. (2014). System, scholar, or students? Which most influences online MBA course effectiveness? Journal of Computer Assisted Learning. doi: 10.1111/jcal.12048.
  3. Baker, R. (2010). Data mining for education. International encyclopedia of education, 7, 112-118.
  4. Bienkowski, M., Feng, M., and Means, B. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. US Department of Education, Office of Educational Technology, 1-57.
  5. Boud, D., and Falchikov, N. (2006). Aligning assessment with long term learning. Assessment & Evaluation in Higher Education, 31(4), 399-413. doi: 10.1080/02602930600679050.
  6. Buckingham Shum, S., and Deakin Crick, R. (2012). Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics. Paper presented at the 2nd International Conference on Learning Analytics & Knowledge, Vancouver, British Columbia.
  7. Buckingham Shum, S., and Ferguson, R. (2012). Social Learning Analytics. Journal of Educational Technology & Society, 15(3). doi: 10.1145/2330601.2330616.
  8. Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6), 683-695. doi: 10.1080/13562517.2013.827653.
  9. Greller, W., and Drachsler, H. (2012). Translating Learning into Numbers: A Generic Framework for Learning Analytics. Journal of Educational Technology & Society, 15(3).
  10. Hommes, J., Rienties, B., de Grave, W., Bos, G., Schuwirth, L., and Scherpbier, A. (2012). Visualising the invisible: a network approach to reveal the informal social side of student learning. Advances in Health Sciences Education, 17(5), 743-757. doi: 10.1007/s10459-012-9349-0.
  11. Järvelä, S., Hurme, T., and Järvenoja, H. (2011). Self-regulation and motivation in computer-supported collaborative learning environments. In S. Ludvigsen, A. Lund, I. Rasmussen and R. Säljö (Eds.), Learning across sites: new tools, infrastructure and practices (pp. 330-345). New York, NY: Routledge.
  12. Lajoie, S. P., and Azevedo, R. (2006). Teaching and learning in technology-rich environments. In P. Alexander and P. Winne (Eds.), Handbook of educational psychology (2 ed., pp. 803-821). Mahwah, NJ: Erlbaum.
  13. Lehmann, T., Hähnlein, I., and Ifenthaler, D. (2014). Cognitive, metacognitive and motivational perspectives on preflection in self-regulated online learning. Computers in Human Behavior, 32, 313-323. doi: 10.1016/j.chb.2013.07.051.
  14. Macfadyen, L. P., and Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588-599. doi: 10.1016/j.compedu.2009.09.008.
  15. Marks, R. B., Sibley, S. D., and Arbaugh, J. B. (2005). A Structural Equation Model of Predictors for Effective Online Learning. Journal of Management Education, 29(4), 531-563. doi: 10.1177/1052562904271199.
  16. Martin, A. J. (2007). Examining a multidimensional model of student motivation and engagement using a construct validation approach. British Journal of Educational Psychology, 77(2), 413-440. doi: 10.1348/000709906X118036.
  17. Narciss, S. (2008). Feedback strategies for interactive learning tasks. In J. M. Spector, M. D. Merrill, J. J. G. van Merrienboer and M. P. Driscoll (Eds.), Handbook of Research on Educational Communications and Technology (3 ed., pp. 125-144). Mahwah, NJ: Lawrence Erlbaum Associates.
  18. Narciss, S., and Huth, K. (2006). Fostering achievement and motivation with bug-related tutoring feedback in a computer-based training for written subtraction. Learning and Instruction, 16(4), 310-322. doi: 10.1016/j.learninstruc.2006.07.003.
  19. Nistor, N., Baltes, B., Dascalu, M., Mihaila, D., Smeaton, G., and Trausan-Matu, S. (2014). Participation in virtual academic communities of practice under the influence of technology acceptance and community factors. A learning analytics application. Computers in Human Behavior, 34, 339-344. doi: 10.1016/j.chb.2013.10.051.
  20. Oblinger, D. G. (2012). Let's Talk... Analytics. EDUCAUSE Review, 47(4), 10-13.
  21. Pekrun, R., Goetz, T., Frenzel, A. C., Barchfeld, P., and Perry, R. P. (2011). Measuring emotions in students' learning and performance: The Achievement Emotions Questionnaire (AEQ). Contemporary Educational Psychology, 36(1), 36-48. doi: 10.1016/j.cedpsych.2010.10.002.
  22. Richardson, J. T. E. (2012). The attainment of White and ethnic minority students in distance education. Assessment & Evaluation in Higher Education, 37(4), 393-408. doi: 10.1080/02602938.2010.534767.
  23. Rienties, B., and Alden Rivers, B. (2014). Measuring and Understanding Learner Emotions: Evidence and Prospects. Learning Analytics Review 1, Learning Analytics Community Exchange (LACE). http://www.laceproject.eu/learning-analytics-review/measuring-and-understanding-learner-emotions/
  24. Rienties, B., Cross, S., and Zdrahal, Z. (2015). "Implementing a Learning Analytics Intervention and Evaluation Framework: what works?" In B. Motidyang and R. Butson (Eds.): Big data and learning analytics in higher education. Springer, Berlin.
  25. Rienties, B., Tempelaar, D. T., Giesbers, B., Segers, M., and Gijselaers, W. H. (2012). A dynamic analysis of social interaction in Computer Mediated Communication; a preference for autonomous learning. Interactive Learning Environments, 22(5), 631-648. doi: 10.1080/10494820.2012.707127.
  26. Rienties, B., Tempelaar, D. T., Van den Bossche, P., Gijselaers, W. H., and Segers, M. (2009). The role of academic motivation in Computer-Supported Collaborative Learning. Computers in Human Behavior, 25(6), 1195-1206. doi: 10.1016/j.chb.2009.05.012.
  27. Schmidt, H. G., Van Der Molen, H. T., Te Winkel, W. W. R., and Wijnen, W. H. F. W. (2009). Constructivist, Problem-Based Learning does work: A meta-analysis of curricular comparisons involving a single medical school. Educational Psychologist, 44(4), 227-249. doi: 10.1080/00461520903213592.
  28. Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380-1400. doi: 10.1177/0002764213498851.
  29. Siemens, G., Dawson, S., and Lynch, G. (2013). Improving the quality and productivity of the higher education sector: Policy and strategy for systems-level deployment of learning analytics. Society for Learning Analytics Research (SoLAR).
  30. Stiles, R. J. (2012). Understanding and Managing the Risks of Analytics in Higher Education: A Guide: Educause.
  31. Tempelaar, D. T., Heck, A., Cuypers, H., van der Kooij, H., and van de Vrie, E. (2013). Formative Assessment and Learning Analytics. In D. Suthers and K. Verbert (Eds.), Proceedings of the 3rd International Conference on Learning Analytics and Knowledge, 205-209. New York: ACM. doi: 10.1145/2460296.2460337.
  32. Tempelaar, D. T., Kuperus, B., Cuypers, H., Van der Kooij, H., Van de Vrie, E., and Heck, A. (2012). “The Role of Digital, Formative Testing in e-Learning for Mathematics: A Case Study in the Netherlands”. In: “Mathematical e-learning” [online dossier]. Universities and Knowledge Society Journal (RUSC), 9(1). UoC.
  33. Tempelaar, D. T., Niculescu, A., Rienties, B., Giesbers, B., and Gijselaers, W. H. (2012). How achievement emotions impact students' decisions for online learning, and what precedes those emotions. Internet and Higher Education, 15(3), 161-169. doi: 10.1016/j.iheduc.2011.10.003.
  34. Tempelaar, D. T., Rienties, B., and Giesbers, B. (2009). Who profits most from blended learning? Industry and Higher Education, 23(4), 285-292.
  35. Tempelaar, D. T., Rienties, B., and Giesbers, B. (2014). Computer Assisted, Formative Assessment and Dispositional Learning Analytics in Learning Mathematics and Statistics. In M. Kalz and E. Ras (Eds.), Computer Assisted Assessment. Research into E-Assessment, pp. 67-78. Berlin, Springer: Communications in Computer and Information Science, Volume 439. doi: 10.1007/978-3-319-08657-6_7.
  36. Tempelaar, D. T., Rienties, B, and Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning analytics in a data-rich context. Computers in Human Behavior, 47, 157-167. doi: 10.1016/j.chb.2014.05.038.
  37. Thakur, G., Olama, M. M., McNair, W., Sukumar, S. R., and Studham, S. (2014). Towards Adaptive Educational Assessments: Predicting Student Performance using Temporal Stability and Data Analytics in Learning Management Systems. In: Proceedings of the 20th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, New York City, NY.
  38. Tobarra, L., Robles-Gómez, A., Ros, S., Hernández, R., and Caminero, A. C. (2014). Analyzing the students' behavior and relevant topics in virtual learning communities. Computers in Human Behavior, 31, 659-669. doi: 10.1016/j.chb.2013.10.001.
  39. Verbert, K., Manouselis, N., Drachsler, H., and Duval, E. (2012). Dataset-Driven Research to Support Learning and Knowledge Analytics. Journal of Educational Technology & Society, 15(3), 133-148.
  40. Vermunt, J. D. (1996). Metacognitive, cognitive and affective aspects of learning styles and strategies: A phenomenographic analysis. Higher Education, 31, 25-50. doi: 10.1007/BF00129106.
  41. Whitelock, D., Richardson, J., Field, D., Van Labeke, N., and Pulman, S. (2014). Designing and Testing Visual Representations of Draft Essays for Higher Education Students. Paper presented at the LAK 2014, Indianapolis.
  42. Wolff, A., Zdrahal, Z., Nikolov, A., and Pantucek, M. (2013). Improving retention: predicting at-risk students by analysing clicking behaviour in a virtual learning environment. In D. Suthers and K. Verbert (Eds.), Proceedings of the 3rd International Conference on Learning Analytics and Knowledge, 145-149. New York: ACM. doi: 10.1145/2460296.2460324.


Paper Citation


in Harvard Style

Tempelaar D., Rienties B. and Giesbers B. (2015). Stability and Sensitivity of Learning Analytics based Prediction Models. In Proceedings of the 7th International Conference on Computer Supported Education - Volume 1: CSEDU, ISBN 978-989-758-107-6, pages 156-166. DOI: 10.5220/0005497001560166


in Bibtex Style

@conference{csedu15,
author={Dirk Tempelaar and Bart Rienties and Bas Giesbers},
title={Stability and Sensitivity of Learning Analytics based Prediction Models},
booktitle={Proceedings of the 7th International Conference on Computer Supported Education - Volume 1: CSEDU},
year={2015},
pages={156-166},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005497001560166},
isbn={978-989-758-107-6},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 7th International Conference on Computer Supported Education - Volume 1: CSEDU
TI - Stability and Sensitivity of Learning Analytics based Prediction Models
SN - 978-989-758-107-6
AU - Tempelaar D.
AU - Rienties B.
AU - Giesbers B.
PY - 2015
SP - 156
EP - 166
DO - 10.5220/0005497001560166