πŸ“Š Kappa Value

The Kappa value (Cohen’s Kappa) is a statistical measure of interrater agreement for categorical data, adjusted for the amount of agreement that could occur by chance.

It quantifies how consistently two or more observers classify the same items beyond random coincidence.

  • Ranges from –1 to 1:
    • 1 = Perfect agreement
    • 0 = Agreement equivalent to chance
    • < 0 = Worse than chance (systematic disagreement)
  • More robust than simple percent agreement because it accounts for expected agreement due to randomness.
Kappa (ΞΊ) Value     Strength of Agreement
-----------------   --------------------------
< 0                 Poor (less than chance)
0.01 – 0.20         Slight agreement
0.21 – 0.40         Fair agreement
0.41 – 0.60         Moderate agreement
0.61 – 0.80         Substantial agreement
0.81 – 1.00         Almost perfect agreement

If two neuroradiologists independently evaluate MRI scans for the presence of a DVA and reach the same conclusion 76% of the time, but much of that agreement would be expected by chance alone, the Kappa might be only 0.51, indicating moderate agreement.
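
As a quick check of those numbers: assuming the chance-expected agreement in that example is about 0.51 (a value not stated above, chosen for illustration), the formula reproduces a Kappa of roughly 0.51:

```python
# Hypothetical illustration of the MRI example above.
p_o = 0.76  # observed agreement between the two readers
p_e = 0.51  # assumed chance-expected agreement (depends on each reader's marginal rates)
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 2))  # 0.51 -> "moderate agreement" in the table above
```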

  • Use Kappa to assess reliability of diagnostic tools or observer consistency.
  • Always report Kappa with confidence intervals and p-values for context.
  • For ordinal scales, consider using weighted Kappa.
  • kappa_value.txt
  • Last modified: 2025/06/14 15:23
  • by administrador