Cohen’s Kappa and a number of related measures can all be criticized for how they define the correction for chance agreement. A measure is introduced that derives the corrected proportion of agreement directly from the data, thereby overcoming objections to Kappa and its related measures.
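For context, the chance correction criticized here is the one in Cohen’s original definition, where expected agreement is computed from the two raters’ marginal distributions: kappa = (p_o − p_e)/(1 − p_e). A minimal sketch (not the paper’s proposed measure, just the standard Kappa it critiques):

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from a k x k rater-by-rater contingency table."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_o = np.trace(table) / n  # observed proportion of agreement
    # chance agreement from the product of the raters' marginals
    p_e = (table.sum(axis=0) * table.sum(axis=1)).sum() / n**2
    return (p_o - p_e) / (1 - p_e)

# Example: two raters classifying 50 items into 2 categories
tab = [[20, 5],
       [10, 15]]
print(cohens_kappa(tab))  # → 0.4
```

Because p_e is built from the marginals, Kappa depends on each rater’s base rates, which is the root of the objections the paper addresses.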
Bendermacher, Nol and Souren, Pierre. "Beyond Kappa: Estimating Inter-Rater Agreement with Nominal Classifications," Journal of Modern Applied Statistical Methods: Vol. 8, Iss. 1, Article 10.
Available at: http://digitalcommons.wayne.edu/jmasm/vol8/iss1/10