Authors: Min-Chul Yang; Min-Jeong Kim; Hyoung-Gyu Lee and Hae-Chang Rim
Affiliation: Korea University, Republic of Korea
Keyword(s): CBA, Computer-Based Assessment, AES, Automated Essay Scoring, Writing, Fluency, Machine Learning.
Related Ontology Subjects/Areas/Topics: Assessment Software Tools; Computer-Aided Assessment; Computer-Supported Education; Learning/Teaching Methodologies and Assessment; Metrics and Performance Measurement
Abstract:
Automated essay scoring (AES) systems provide computer-based writing assessments comparable to those of expert raters. However, while existing systems detect grammatical errors relatively well, they are inadequate for assessing the writing fluency of non-English-speaking students. Writing fluency is an important criterion in essay scoring, because most non-English-speaking students have great difficulty expressing their thoughts in English. In this paper, we propose an automated essay scoring system that focuses on assessing writing fluency by considering quantitative factors such as vocabulary, sentence perplexity, diversity of sentence structures, and grammatical relations. Experimental results show that the proposed method improves automated essay scoring performance.
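Two of the quantitative factors named above, vocabulary and perplexity, can be illustrated with a minimal sketch. The feature definitions here (type-token ratio for vocabulary diversity, unigram perplexity under an add-one-smoothed language model) and all function names are assumptions for illustration, not the paper's exact formulation:

```python
import math


def fluency_features(essay_tokens, lm_counts, lm_total, vocab_size):
    """Illustrative fluency features (assumed definitions, not the paper's):
    - vocabulary diversity as the type-token ratio of the essay
    - unigram perplexity under an add-one-smoothed language model
      trained on a reference corpus (lm_counts / lm_total / vocab_size)."""
    # Vocabulary diversity: distinct word types over total tokens.
    ttr = len(set(essay_tokens)) / len(essay_tokens)

    # Unigram perplexity with Laplace (add-one) smoothing:
    # lower perplexity means the essay's word choices look more
    # like fluent text from the reference corpus.
    log_prob = 0.0
    for tok in essay_tokens:
        p = (lm_counts.get(tok, 0) + 1) / (lm_total + vocab_size)
        log_prob += math.log(p)
    perplexity = math.exp(-log_prob / len(essay_tokens))

    return ttr, perplexity
```

Such features would be combined with structural ones (parse-tree diversity, grammatical relations) as inputs to a machine-learned scoring model.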