An Analysis of Factors Affecting Automatic Assessment based on Teacher-mediated Peer Evaluation - The Case of OpenAnswer

Maria De Marsico, Andrea Sterbini, Marco Temperini

2016

Abstract

In this paper we experimentally investigate the influence of several factors on the final performance of an automatic grade prediction system based on teacher-mediated peer assessment. Experiments are carried out with OpenAnswer, a system designed for peer assessment of open-ended questions. It exploits a Bayesian Network to model the students' learning state and the propagation of information injected into the system by peer grades and by a (partial) grading from the teacher. The relevant variables are characterized by a probability distribution (PD) over their discrete values. We aim at analysing the influence of the initial set-up of the PD of these variables on the ability of the system to predict a reliable grade for answers not yet graded by the teacher. We investigate here the influence of the initial choice of the PD for the student's knowledge (K), especially when no information is available about the class proficiency in the examined skills, and of the PD of the correctness of a student's answer conditioned on her knowledge, P(C|K). The latter is expressed through different Conditional Probability Tables (CPTs), tested in turn, to identify the one that yields the best final results. Moreover, we test different strategies to map the final PD for the correctness (C) of an answer, namely the grade that will be returned to the student, onto a single discrete value.
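As a concrete illustration of the model the abstract describes, the following minimal Python sketch (not the authors' implementation; the three knowledge levels, four grade levels, and all numeric values are illustrative assumptions) shows how a prior PD over K and a CPT P(C|K) combine into a predicted PD over C, how a teacher's grade updates the belief over K via Bayes' rule, and two candidate strategies for mapping the final PD of C onto a single discrete value (mode vs. rounded expectation).

```python
import numpy as np

# Hypothetical discretization: 3 knowledge levels, 4 grade levels (illustrative only).
K_STATES = ["low", "medium", "high"]
C_STATES = [0, 1, 2, 3]  # discrete grade values returned to the student

# Prior PD over the student's knowledge K (e.g., uniform when nothing is
# known about the class proficiency on the examined skills).
p_k = np.array([1 / 3, 1 / 3, 1 / 3])

# One candidate CPT P(C | K): rows index K states, columns index C states.
# Each row sums to 1; higher knowledge shifts mass toward higher grades.
cpt_c_given_k = np.array([
    [0.50, 0.30, 0.15, 0.05],  # K = low
    [0.15, 0.35, 0.35, 0.15],  # K = medium
    [0.05, 0.15, 0.30, 0.50],  # K = high
])

def marginal_c(p_k, cpt):
    """Predicted PD over an answer's correctness: P(C) = sum_k P(C|k) P(k)."""
    return p_k @ cpt

def update_k_given_c(p_k, cpt, c_observed):
    """Bayes' rule: posterior over K once the teacher grades an answer as c,
    P(K | C=c) proportional to P(C=c | K) P(K)."""
    unnorm = cpt[:, c_observed] * p_k
    return unnorm / unnorm.sum()

# --- Two strategies to map the final PD of C onto a single discrete grade ---
def grade_mode(p_c):
    """Return the maximum-probability grade of the distribution."""
    return int(np.argmax(p_c))

def grade_expectation(p_c, states=C_STATES):
    """Return the expected value of the distribution, rounded to a grade."""
    return int(round(float(np.dot(p_c, states))))

if __name__ == "__main__":
    p_c = marginal_c(p_k, cpt_c_given_k)
    print("prior P(C):", np.round(p_c, 3))
    # The teacher grades one of this student's answers as 3 (top grade);
    # the evidence propagates back into the belief over K ...
    p_k_post = update_k_given_c(p_k, cpt_c_given_k, c_observed=3)
    # ... and forward into the prediction for a not-yet-graded answer.
    p_c_post = marginal_c(p_k_post, cpt_c_given_k)
    print("posterior P(C):", np.round(p_c_post, 3))
    print("mode grade:", grade_mode(p_c_post))
    print("expected grade:", grade_expectation(p_c_post))
```

Note that in OpenAnswer the network also propagates peer grades, whose reliability is itself tied to each assessor's knowledge; this single-student fragment only illustrates the CPT choice and the PD-to-grade mapping strategies whose initial set-up the paper analyses.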



Paper Citation


in Harvard Style

De Marsico M., Sterbini A. and Temperini M. (2016). An Analysis of Factors Affecting Automatic Assessment based on Teacher-mediated Peer Evaluation - The Case of OpenAnswer. In Proceedings of the 8th International Conference on Computer Supported Education - Volume 2: CSEDU, ISBN 978-989-758-179-3, pages 49-61. DOI: 10.5220/0005863200490061


in Bibtex Style

@conference{csedu16,
author={Maria De Marsico and Andrea Sterbini and Marco Temperini},
title={An Analysis of Factors Affecting Automatic Assessment based on Teacher-mediated Peer Evaluation - The Case of OpenAnswer},
booktitle={Proceedings of the 8th International Conference on Computer Supported Education - Volume 2: CSEDU},
year={2016},
pages={49-61},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005863200490061},
isbn={978-989-758-179-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 8th International Conference on Computer Supported Education - Volume 2: CSEDU
TI - An Analysis of Factors Affecting Automatic Assessment based on Teacher-mediated Peer Evaluation - The Case of OpenAnswer
SN - 978-989-758-179-3
AU - De Marsico M.
AU - Sterbini A.
AU - Temperini M.
PY - 2016
SP - 49
EP - 61
DO - 10.5220/0005863200490061
ER -