πŸ“Š Kappa Value

The Kappa value (Cohen’s Kappa) is a statistical measure of interrater agreement for categorical data, adjusted for the amount of agreement that could occur by chance.

It quantifies how consistently two raters classify the same items, beyond what would be expected from chance alone.
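
The underlying formula is ΞΊ = (p_o βˆ’ p_e) / (1 βˆ’ p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from each rater's marginal frequencies. The following is a minimal, self-contained Python sketch of that computation; the rating lists and the helper name `cohen_kappa` are hypothetical illustrations, not a reference implementation (libraries such as scikit-learn provide `cohen_kappa_score` for real use).

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items in the same order."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)

    # Observed agreement: fraction of items both raters label identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: sum over categories of the product of marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))

    return (p_o - p_e) / (1 - p_e)

# Hypothetical present/absent calls on 10 scans by two raters:
a = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "yes"]
b = ["yes", "no",  "no", "no", "yes", "no", "yes", "yes", "no", "yes"]
print(round(cohen_kappa(a, b), 2))  # 0.6: raw agreement 0.8, chance agreement 0.5
```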

🧠 Key Characteristics

- Ranges from βˆ’1 to +1: 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate less agreement than chance.
- Corrects the observed agreement for the agreement expected by chance alone.
- Applies to two raters classifying the same items into categorical (nominal) classes.

πŸ“ˆ Interpretation Guide

| Kappa (ΞΊ) Value | Strength of Agreement |
| --- | --- |
| < 0 | Poor (less than chance) |
| 0.01 – 0.20 | Slight agreement |
| 0.21 – 0.40 | Fair agreement |
| 0.41 – 0.60 | Moderate agreement |
| 0.61 – 0.80 | Substantial agreement |
| 0.81 – 1.00 | Almost perfect agreement |

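If these thresholds need to be applied programmatically, a small helper mirroring the table above could look like the sketch below; the function name `agreement_strength` is made up for illustration.

```python
def agreement_strength(kappa):
    """Map a kappa value to the strength-of-agreement label from the table above."""
    if kappa < 0:
        return "Poor (less than chance)"
    if kappa <= 0.20:
        return "Slight agreement"
    if kappa <= 0.40:
        return "Fair agreement"
    if kappa <= 0.60:
        return "Moderate agreement"
    if kappa <= 0.80:
        return "Substantial agreement"
    return "Almost perfect agreement"

print(agreement_strength(0.51))  # Moderate agreement
```
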
πŸ“Œ Example

If two neuroradiologists independently evaluate MRI scans for the presence of DVA and reach the same conclusion 76% of the time, but much of that agreement would be expected by chance, the kappa might be only 0.51, indicating moderate agreement.
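
Working backwards from these illustrative numbers, inverting ΞΊ = (p_o βˆ’ p_e) / (1 βˆ’ p_e) gives p_e = (p_o βˆ’ ΞΊ) / (1 βˆ’ ΞΊ) β‰ˆ 0.51, i.e. roughly half of the raw agreement would be expected by chance alone. A quick check:

```python
p_o = 0.76     # observed agreement between the two readers
kappa = 0.51   # reported Cohen's kappa

# Invert kappa = (p_o - p_e) / (1 - p_e) to recover the implied chance agreement.
p_e = (p_o - kappa) / (1 - kappa)
print(round(p_e, 2))  # ~0.51: about half the raw agreement is expected by chance
```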

βœ… Best Practice