Some Statistical Aspects of Measuring Agreement Based on a Modified Kappa
Keywords: measuring agreement, Cohen’s kappa, modified kappa, asymptotic mean, asymptotic variance

Abstract
This paper addresses statistical inference for assessing agreement or disagreement between two raters who classify subjects on a two-level nominal scale. The study derives the approximate asymptotic variance of the modified kappa statistic. The proposed variance estimate is then compared with the estimated large-sample variance of Cohen’s kappa across all proportions of subjects expected to receive a rating of 1 from each rater. When the modified kappa is greater than or equal to 0.5 (or less than or equal to –0.5), the results demonstrate that the sample estimate of the modified kappa is more efficient than the estimate of Cohen’s kappa for every probability of being classified as category 1 by both raters.
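The abstract does not define the modified kappa or reproduce the paper’s derivations. As a rough illustration of the kind of comparison described, the sketch below assumes the two-category Brennan–Prediger (PABAK-style) form κ_m = 2p_o − 1, which fixes chance agreement at 1/2, and contrasts its point estimate and a simple approximate variance with Cohen’s kappa on a 2×2 table. The variance formulas used are textbook large-sample approximations, not the results of this paper.

```python
# Hypothetical sketch (not the paper's derivation): compare Cohen's kappa and an
# assumed modified kappa on a 2x2 agreement table [[n11, n12], [n21, n22]],
# where rows index rater A's ratings (1, 2) and columns index rater B's.

def cohens_kappa(table):
    """Cohen's kappa with a simple large-sample variance approximation
    (Cohen, 1960) for a 2x2 table of counts."""
    n = sum(sum(row) for row in table)
    p_o = (table[0][0] + table[1][1]) / n            # observed agreement
    row1 = (table[0][0] + table[0][1]) / n           # rater A marginal for category 1
    col1 = (table[0][0] + table[1][0]) / n           # rater B marginal for category 1
    p_e = row1 * col1 + (1 - row1) * (1 - col1)      # chance agreement
    kappa = (p_o - p_e) / (1 - p_e)
    var = p_o * (1 - p_o) / (n * (1 - p_e) ** 2)     # approximate sampling variance
    return kappa, var

def modified_kappa(table):
    """Assumed modified kappa (PABAK-style): chance agreement fixed at 1/2,
    so kappa_m = 2*p_o - 1 and, treating p_o as binomial,
    var(kappa_m) = 4*p_o*(1 - p_o)/n."""
    n = sum(sum(row) for row in table)
    p_o = (table[0][0] + table[1][1]) / n
    return 2 * p_o - 1, 4 * p_o * (1 - p_o) / n

if __name__ == "__main__":
    counts = [[40, 5], [10, 45]]                     # example 2x2 table of counts
    print("Cohen's kappa (est, approx var):", cohens_kappa(counts))
    print("Modified kappa (est, approx var):", modified_kappa(counts))
```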
License
ISSN 2452-316X (online), 2468-1458 (print). Copyright © 2022. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/); production and hosting by Kasetsart University Research and Development Institute on behalf of Kasetsart University.