Author:
Nane Kratzke
Affiliation:
Lübeck University of Applied Sciences, Mönkhofer Weg 239, 23562 Lübeck, Germany
Keyword(s):
Automated, Programming, Assignment, Assessment, Education, MOOC, Code injection, Moodle, VPL.
Related Ontology Subjects/Areas/Topics:
Artificial Intelligence and Decision Support Systems; Blended Learning; Cloud-Based Learning and Assessment; Computer-Supported Education; e-Learning; e-Learning Hardware and Software; e-Learning Platforms; Enterprise Information Systems; Information Technologies Supporting Learning; Intelligent Tutoring Systems; Learning Analytics; Learning/Teaching Methodologies and Assessment; Simulation and Modeling; Simulation Tools and Platforms; Ubiquitous Learning; Virtual Labs and Virtual Classrooms
Abstract:
This case study reports on two first-semester programming courses with more than 190 students. Both courses made use of automated assessments. We observed how students trick these systems by analysing the version history of suspect submissions. Across more than 3,300 submissions, we identified four astonishingly simple tricks (overfitting, evasion) and cheat patterns (redirection, injection) that students used to deceive automated programming assignment assessment systems (APAAS). Although not its main focus, this study also discusses and proposes corresponding countermeasures where appropriate. Nevertheless, the primary intent of this paper is to raise problem awareness and to identify and systematise observable problem patterns in a more formal way. The identified immaturity of existing APAAS solutions might have implications for courses that rely heavily on automation, such as MOOCs. We therefore conclude that APAAS solutions should be viewed much more from a security point of view (code injection). Moreover, we identify the need to evolve existing unit testing frameworks into more evaluation-oriented teaching solutions that provide better trick and cheat detection capabilities as well as differentiated grading support.
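As a hedged illustration (not taken from the paper itself), the "overfitting" pattern named in the abstract can be sketched as a submission that hardcodes the answers to the visible test cases instead of implementing the required logic; the function names and test values below are invented for this example:

```python
# Honest implementation the assignment asks for.
def add_honest(a, b):
    return a + b

# "Overfitted" submission: returns answers copied from the visible tests,
# so it passes the grader's known inputs without implementing addition.
def add_overfitted(a, b):
    known_answers = {(1, 2): 3, (10, 5): 15}  # lifted from visible test cases
    return known_answers.get((a, b), 0)

# Both functions pass the visible tests ...
for (a, b), expected in [((1, 2), 3), ((10, 5), 15)]:
    assert add_honest(a, b) == expected
    assert add_overfitted(a, b) == expected

# ... but a hidden or randomised test case exposes the trick.
assert add_honest(7, 8) == 15
assert add_overfitted(7, 8) != 15
```

This is one reason the abstract argues for evaluation-oriented testing with hidden or randomised inputs rather than a fixed, visible test set.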