vcd (version 0.1-3.5)

Kappa: Cohen's Kappa and weighted Kappa

Description

Computes two agreement rates: Cohen's kappa and weighted kappa, and confidence bands.

Usage

Kappa(x, weights = c("Equal-Spacing", "Fleiss-Cohen"), conf.level = 0.95)

Arguments

x
a confusion matrix.
weights
either one of the character strings "Equal-Spacing" or "Fleiss-Cohen", or a user-specified matrix with the same dimensions as x.
conf.level
level for the confidence intervals.
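
A minimal sketch of a call following the Usage above, with hypothetical ratings from two raters: the confusion matrix is built with table() and the confidence level is changed from its default.

# Hypothetical ratings from two raters on a three-point scale
rater1 <- factor(c(1, 2, 2, 3, 1, 3, 2, 1, 3, 2), levels = 1:3)
rater2 <- factor(c(1, 2, 3, 3, 1, 2, 2, 1, 3, 3), levels = 1:3)
tab <- table(rater1, rater2)   # square confusion matrix
library(vcd)
Kappa(tab, weights = "Fleiss-Cohen", conf.level = 0.90)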

Value

An object of class kappa with three components:

  • Kappa: kappa statistic, along with its approximate standard error (ASE) and confidence bounds at level conf.level.
  • Kappa.Weighted: the same for the weighted kappa.
  • Weights: the weight matrix used.
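
A minimal sketch of inspecting the returned object, using the component names listed above (they may be named differently in other versions of vcd) and the SexualFun data from the Examples below.

library(vcd)
data(SexualFun)
k <- Kappa(SexualFun)
k$Kappa            # unweighted kappa, with ASE and confidence bounds
k$Kappa.Weighted   # weighted kappa
k$Weights          # weight matrix that was used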

Details

Cohen's kappa is the diagonal sum of the (possibly weighted) relative frequencies, corrected for expected values and standardized by its maximum value; in the unweighted case this is $\kappa = (p_o - p_e) / (1 - p_e)$, where $p_o$ is the observed and $p_e$ the expected proportion of agreement. The equal-spacing weights are defined by $1 - |i - j| / (r - 1)$, where $r$ is the number of columns/rows, and the Fleiss-Cohen weights by $1 - |i - j|^2 / (r - 1)^2$. The latter attach greater importance to near disagreements.
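
A minimal sketch of the two weighting schemes, built directly from the formulas above for a hypothetical 4 x 4 table; such a matrix can also be passed as the weights argument in place of the named options.

r <- 4
d <- abs(outer(seq_len(r), seq_len(r), "-"))   # |i - j| for every cell
equal_spacing <- 1 - d / (r - 1)
fleiss_cohen  <- 1 - d^2 / (r - 1)^2
library(vcd)
data(SexualFun)                     # SexualFun is a 4 x 4 confusion matrix
Kappa(SexualFun, weights = fleiss_cohen)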

References

Cohen, Jacob (1960): A coefficient of agreement for nominal scales. Educational and Psychological Measurement.

Everitt, B. S. (1968): Moments of the statistics kappa and weighted kappa. The British Journal of Mathematical and Statistical Psychology.

See Also

agreementplot

Examples

data(SexualFun)
Kappa(SexualFun)