"Beyond Kappa: Estimating Inter-Rater Agreement with Nominal Classifications " by Nol Bendermacher and Pierre Souren
  •  
  •  
 

Abstract

Cohen’s Kappa and a number of related measures can all be criticized for their definition of correction for chance agreement. A measure is introduced that derives the corrected proportion of agreement directly from the data, thereby overcoming objections to Kappa and its related measures.
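For context, Cohen's Kappa corrects observed agreement p_o by an expected chance agreement p_e computed from the product of the raters' marginal distributions; it is precisely this definition of p_e that the article criticizes. A minimal sketch of the standard computation (the function name and example table are illustrative, not from the article):

```python
def cohens_kappa(table):
    """Cohen's kappa for two raters over nominal categories.

    table[i][j]: count of items rater A placed in category i
    and rater B placed in category j.
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    # Observed proportion of agreement: diagonal of the confusion table.
    p_o = sum(table[i][i] for i in range(k)) / n
    # Marginal category proportions for each rater.
    row = [sum(table[i][j] for j in range(k)) / n for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    # Chance agreement under independence of the two raters --
    # the correction term the article argues against.
    p_e = sum(row[i] * col[i] for i in range(k))
    return (p_o - p_e) / (1 - p_e)

# Example: 50 items, two categories.
# p_o = 0.7, p_e = 0.5, so kappa = 0.4.
print(cohens_kappa([[20, 5], [10, 15]]))
```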

DOI

10.22237/jmasm/1241136540

