Formulating Module Assessment for Improved Academic Performance Predictability in Higher Education

M. Alsuwaiket, A. H. Blasi, R. A. Al-Msie'deen

Abstract


The choice of an effective student assessment method is an issue of interest in higher education. Various studies [1] have shown that students tend to get higher marks when assessed through coursework-based methods, in which modules are assessed either entirely through coursework or through a mixture of coursework and examinations, than when assessed by examination alone. Many educational data mining (EDM) studies pre-process data through conventional data mining steps, including data preparation, but they use transcript data as they stand, without considering the relative weighting of examination and coursework results, which could affect prediction accuracy. This paper proposes a different data preparation process, investigating more than 230,000 student records in order to prepare students’ marks based on the assessment methods of the enrolled modules. The data were processed through several stages to extract a categorical factor through which students’ module marks are refined during data preparation. The results of this work show that students’ final marks should not be isolated from the assessment methods of the enrolled modules; rather, these methods must be investigated thoroughly and taken into account during EDM’s data pre-processing phases. More generally, it is concluded that educational data should not be prepared in the same way as other data types, due to differences in data sources, applications, and the types of errors they contain. Therefore, an attribute, the coursework assessment ratio (CAR), is proposed to take the different modules’ assessment methods into account while preparing student transcript data. The effect of CAR on the prediction process using the random forest classification technique was investigated, and it is shown that including CAR as an attribute increases the accuracy of predicting students’ second-year averages from their first-year results.
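The CAR attribute described above can be illustrated with a minimal sketch. The field names, example weightings, and the exact form of the ratio are illustrative assumptions, not the authors’ precise formulation:

```python
# Illustrative sketch (assumed formulation): CAR as the share of a module's
# total assessment weight that is carried by coursework. A CAR of 1.0 means
# the module is assessed entirely by coursework, 0.0 entirely by examination.

def coursework_assessment_ratio(coursework_weight, exam_weight):
    """Return the fraction of total assessment weight assigned to coursework."""
    total = coursework_weight + exam_weight
    if total == 0:
        raise ValueError("module has no assessment weight")
    return coursework_weight / total

# Hypothetical modules: (name, raw mark, coursework weight %, exam weight %)
modules = [
    ("Programming I", 78, 60, 40),    # mixed assessment
    ("Mathematics I", 65, 0, 100),    # examination only
    ("Databases",     72, 100, 0),    # coursework only
]

# Attach CAR to each record; in the paper's setting such a feature would be
# added to the transcript data before training the random forest model.
for name, mark, cw, ex in modules:
    car = coursework_assessment_ratio(cw, ex)
    print(f"{name}: mark={mark}, CAR={car:.2f}")
```

In this sketch CAR is simply appended as an extra feature alongside each module mark, so a classifier such as a random forest can weigh coursework-heavy and examination-heavy marks differently when predicting later performance.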


Keywords


EDM; data mining; higher education; machine learning; module assessment


References


[1] C. Romero, S. Ventura, M. Pechenizkiy, R. S. J. d. Baker, Handbook of Educational Data Mining, Chapman & Hall/CRC, 2010

[2] R. S. J. d. Baker, K. Yacef, “The state of educational data mining in 2009: A review and future visions”, Journal of Educational Data Mining, Vol. 1, No. 1, pp. 3-16, 2009

[3] C. Romero, S. Ventura, “Educational data mining: A survey from 1995 to 2005”, Expert Systems with Applications, Vol. 33, No. 1, pp. 135-146, 2007

[4] J. T. E. Richardson, “Coursework versus examinations in end-of-module assessment: A literature review”, Assessment & Evaluation in Higher Education, Vol. 40, No. 3, 2015

[5] P. Bridges, A. Cooper, P. Evanson, C. Haines, D. Jenkins, D. Scurry, H. Woolf, M. Yorke, “Coursework marks high, examination marks low: Discuss”, Assessment & Evaluation in Higher Education, Vol. 27, No. 1, 2002

[6] J. Heywood, Assessment in Higher Education: Student Learning, Teaching, Programmes and Institutions, Jessica Kingsley, 2000

[7] G. Gibbs, C. Simpson, “Conditions under which assessment supports students’ learning”, Learning and Teaching in Higher Education, Vol. 1, pp. 3-31, University of Gloucestershire, 2005

[8] R. Dearing, Higher Education in the Learning Society: Main Report, National Committee of Inquiry into Higher Education, 1997

[9] G. Gibbs, “Using assessment strategically to change the way students learn”, in: Assessment Matters in Higher Education, Vol. 4, pp. 41-53, 1999

[10] P. E. Morris, C. Fritz, “Conscientiousness and procrastination predict academic coursework marks rather than examination performance”, Learning and Individual Differences, Vol. 39, pp. 193-198, 2015

[11] J. M. Hellerstein, Quantitative Data Cleaning for Large Databases, United Nations Economic Commission for Europe, 2008

[12] R. Asif, S. Hina, S. I. Haque, “Predicting student academic performance using data mining methods”, International Journal of Computer Science and Network Security, Vol. 17, No. 5, pp. 187-191, 2017

[13] R. Kohavi, F. Provost, “Glossary of terms”, Machine Learning, Vol. 30, No. 2, pp. 271-274, 1998

[14] A. P. Bradley, “The use of the area under the ROC curve in the evaluation of machine learning algorithms”, Pattern Recognition, Vol. 30, No. 7, pp. 1145-1159, 1997




eISSN: 1792-8036     pISSN: 2241-4487