IMPROVING COURSE SUCCESS PREDICTION USING ABET COURSE OUTCOMES AND GRADES

Muzaffer Ege Alper, Zehra Çataltepe

2012

Abstract

Modeling and prediction of student success is a critical task in education. In this paper, we employ machine learning methods to predict the course grade performance of Computer Engineering students. As features, in addition to conventional course grades, we use fine-grained student performance measurements corresponding to the different goals (ABET outcomes) of a course. We observe that, compared to using only previous course grades, the addition of outcome grades can significantly improve prediction results. The trained model also enables interpretation of how different courses affect performance on a specific future course. We believe that even more detailed and systematically produced course outcome measurements can be beneficial in modeling students' university performance.
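The idea in the abstract can be illustrated with a minimal sketch: train one classifier on prior course grades alone, and another on prior grades plus per-outcome grades, then compare test accuracy. This is not the paper's pipeline (the authors use Weka and real transcripts); the data below is synthetic and all names are hypothetical, constructed only so that one ABET outcome carries signal the aggregate grade does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: n students, 3 prior course grades (0-4 scale) plus
# 4 ABET outcome grades for one of those courses.
n = 400
prior = rng.normal(2.5, 0.8, size=(n, 3))
outcomes = rng.normal(2.5, 0.8, size=(n, 4))

# Success in the target course depends partly on one specific outcome,
# information the aggregate course grade alone does not carry.
logit = 0.8 * prior[:, 0] + 1.5 * outcomes[:, 2] - 5.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

def fit_logreg(X, y, lr=0.1, steps=2000):
    """Plain batch gradient descent on the logistic loss."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def accuracy(X, y, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return float(((Xb @ w > 0) == y).mean())

tr, te = slice(0, 300), slice(300, n)          # simple train/test split
both = np.hstack([prior, outcomes])

w_grades = fit_logreg(prior[tr], y[tr])
w_both = fit_logreg(both[tr], y[tr])

acc_grades = accuracy(prior[te], y[te], w_grades)
acc_both = accuracy(both[te], y[te], w_both)
print(acc_grades, acc_both)
```

Under these synthetic assumptions, the model that sees outcome grades should score at least as well as the grades-only model, mirroring the paper's qualitative finding that outcome-level features add predictive information beyond the course grade.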



Paper Citation


in Harvard Style

Alper M. and Çataltepe Z. (2012). IMPROVING COURSE SUCCESS PREDICTION USING ABET COURSE OUTCOMES AND GRADES. In Proceedings of the 4th International Conference on Computer Supported Education - Volume 2: CSEDU, ISBN 978-989-8565-07-5, pages 222-229. DOI: 10.5220/0003922602220229


in Bibtex Style

@conference{csedu12,
author={Muzaffer Ege Alper and Zehra Çataltepe},
title={IMPROVING COURSE SUCCESS PREDICTION USING ABET COURSE OUTCOMES AND GRADES},
booktitle={Proceedings of the 4th International Conference on Computer Supported Education - Volume 2: CSEDU},
year={2012},
pages={222-229},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003922602220229},
isbn={978-989-8565-07-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 4th International Conference on Computer Supported Education - Volume 2: CSEDU
TI - IMPROVING COURSE SUCCESS PREDICTION USING ABET COURSE OUTCOMES AND GRADES
SN - 978-989-8565-07-5
AU - Alper M.
AU - Çataltepe Z.
PY - 2012
SP - 222
EP - 229
DO - 10.5220/0003922602220229
ER -