Abstract
Cohen’s kappa, an index of inter-rater agreement, behaves paradoxically in 2×2 tables. An alternative index, λA, is derived from the restricted quasi-independence model for 2×2 tables. Simulation studies are used to demonstrate that λA has superior performance compared with Scott’s pi. Moreover, λA does not show paradoxical behavior in 2×2 tables.
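As a minimal illustrative sketch (not taken from the article, which does not give λA's derivation here), the following Python snippet computes Cohen's kappa and Scott's pi for a hypothetical 2×2 agreement table with skewed marginals, showing the well-known paradox of high observed agreement paired with a near-zero or negative chance-corrected index. The function name `agreement_indices` and the example counts are assumptions made for illustration only.

```python
def agreement_indices(table):
    """table: 2x2 list of counts; rows = rater A's ratings, columns = rater B's."""
    n = sum(sum(row) for row in table)
    p = [[c / n for c in row] for row in table]            # cell proportions
    po = p[0][0] + p[1][1]                                  # observed agreement
    row = [sum(r) for r in p]                               # rater A marginal proportions
    col = [sum(c) for c in zip(*p)]                         # rater B marginal proportions
    pe_kappa = sum(row[k] * col[k] for k in range(2))       # chance agreement for kappa
    pe_pi = sum(((row[k] + col[k]) / 2) ** 2 for k in range(2))  # chance agreement for pi
    kappa = (po - pe_kappa) / (1 - pe_kappa)
    pi = (po - pe_pi) / (1 - pe_pi)
    return po, kappa, pi

# Hypothetical skewed table: 85% observed agreement, yet kappa and pi are negative.
po, kappa, pi = agreement_indices([[85, 10], [5, 0]])
print(f"observed agreement = {po:.2f}, kappa = {kappa:.3f}, pi = {pi:.3f}")
```

With these counts the observed agreement is 0.85, but kappa is about -0.07 and Scott's pi about -0.08, which is the paradoxical behavior the abstract refers to.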
DOI
10.22237/jmasm/1162354500