Understanding Cohen’s Kappa

Wenxuan Zhang
Nov 21, 2023

Cohen’s Kappa (often used in its quadratic weighted form, the quadratic weighted kappa) is a metric that measures the agreement between two raters.

The key idea of this metric is to compare the probability that the two classifiers actually agree with the probability that they agree purely by chance.

The formula for this metric is:

K = (P_0 - P_e) / (1 - P_e)

where P_0 is the observed agreement (the fraction of items on which the two classifiers give the same label) and P_e is the agreement expected by chance.
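As a concrete illustration, here is a minimal sketch (not from the article; the function name cohens_kappa and the toy labels are my own) that computes the unweighted kappa directly from two lists of ratings:

```python
# Minimal sketch: unweighted Cohen's Kappa from two lists of categorical
# ratings, using only the standard library.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # P_0: observed agreement, the fraction of items both raters label the same.
    p_0 = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # P_e: chance agreement, the probability both raters pick the same label
    # if each labels independently according to their own label frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum((freq_a[label] / n) * (freq_b[label] / n) for label in freq_a)
    return (p_0 - p_e) / (1 - p_e)

# Example: two raters labelling five items.
print(cohens_kappa(["cat", "dog", "cat", "cat", "dog"],
                   ["cat", "dog", "dog", "cat", "dog"]))  # ≈ 0.62
```

For the quadratic weighted variant mentioned above, scikit-learn's cohen_kappa_score(y1, y2, weights="quadratic") computes it directly.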

You may wonder why this metric is defined this way rather than as K = P_0*P_e or some other form. Here is the reason:

  1. When two classifiers perfectly match each other: K = 1
  2. When the probability of the two classifiers agreeing is no better than chance, i.e. P_0 = P_e: K = 0 (both properties are checked in the short example below).
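To see these two properties concretely, here is a quick check (again illustrative, reusing the cohens_kappa sketch above):

```python
# Property 1: identical ratings give P_0 = 1, so K = 1.
identical = ["a", "b", "a", "b", "a"]
print(cohens_kappa(identical, identical))   # 1.0

# Property 2: when observed agreement equals chance agreement, K = 0.
# Each rater uses each label half the time, and they agree on exactly
# half of the items, which matches the chance rate P_e = 0.5.
rater_a = ["a", "a", "b", "b"]
rater_b = ["a", "b", "a", "b"]
print(cohens_kappa(rater_a, rater_b))       # 0.0
```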
