Keywords

Grading, Interrater reliability, Online education, Scoring rubric, Students' online discussions


Authors

  1. Lunney, Margaret, RN, PhD
  2. Sammarco, Angela, RN, PhD

Abstract

Required discussions of course readings motivate students to learn course content and can be used to validate students' comprehension of that content and related processes. A tool for grading students' weekly online discussions of course work was tested to evaluate the interrater reliability of grading by two faculty members. The purpose of this article is to describe the psychometric testing of the interrater reliability of this grading method. Using the grading tool, both faculty members independently rated five students' online discussion postings over a 5-week period, and these ratings provided the data for the study. Data were analyzed using Spearman ρ and Kendall τ-b statistics. The findings revealed that the overall correlations between rater scores were satisfactory, indicating that an acceptable level of interrater reliability was achieved with the grading tool. Reliable tools for evaluating students' online discussions contribute to the knowledge needed for implementing online courses.
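
For readers unfamiliar with the statistics named in the abstract, the following is a minimal sketch, not the authors' analysis, of how Spearman ρ and Kendall τ-b could be computed for two raters' scores of the same postings using SciPy; the rater names and score values are hypothetical placeholders, not study data.

```python
# Minimal sketch (assumed setup, not the study's actual analysis):
# correlating two faculty raters' rubric scores with SciPy.
from scipy.stats import spearmanr, kendalltau

# Hypothetical rubric scores assigned by each rater to the same set of
# weekly discussion postings.
rater_a = [4, 3, 5, 4, 2, 5, 3, 4, 4, 5]
rater_b = [4, 3, 4, 4, 2, 5, 3, 5, 4, 5]

rho, rho_p = spearmanr(rater_a, rater_b)
tau, tau_p = kendalltau(rater_a, rater_b, variant="b")  # tau-b adjusts for tied ranks

print(f"Spearman rho = {rho:.2f} (p = {rho_p:.3f})")
print(f"Kendall tau-b = {tau:.2f} (p = {tau_p:.3f})")
```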