The Kappa value (Cohen's Kappa) is a statistical measure of inter-rater agreement for categorical data, adjusted for the amount of agreement that could occur by chance.
It quantifies how consistently two observers classify the same items beyond random coincidence.
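For two raters, kappa is defined in terms of the observed proportion of agreement, $p_o$, and the proportion of agreement expected by chance, $p_e$, estimated from each rater's marginal frequencies:

$$\kappa = \frac{p_o - p_e}{1 - p_e}$$

A value of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.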
| Kappa (κ) Value | Strength of Agreement |
|-----------------|------------------------|
| < 0 | Poor (less than chance) |
| 0.01 – 0.20 | Slight agreement |
| 0.21 – 0.40 | Fair agreement |
| 0.41 – 0.60 | Moderate agreement |
| 0.61 – 0.80 | Substantial agreement |
| 0.81 – 1.00 | Almost perfect agreement |
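The scale above can be applied mechanically. A minimal sketch (the function name `interpret_kappa` is hypothetical; the bin edges come from the table):

```python
def interpret_kappa(kappa: float) -> str:
    """Map a kappa value to the strength-of-agreement label in the table above."""
    if kappa < 0:
        return "Poor (less than chance)"
    if kappa <= 0.20:
        return "Slight agreement"
    if kappa <= 0.40:
        return "Fair agreement"
    if kappa <= 0.60:
        return "Moderate agreement"
    if kappa <= 0.80:
        return "Substantial agreement"
    return "Almost perfect agreement"

print(interpret_kappa(0.51))  # Moderate agreement
```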
If two neuroradiologists independently evaluate MRI scans for the presence of a DVA and reach the same conclusion 76% of the time, but much of that agreement is expected by chance, the Kappa might be only 0.51, indicating moderate agreement. (With 76% observed agreement, a kappa of 0.51 implies a chance-expected agreement of roughly 51%: κ = (0.76 - 0.51)/(1 - 0.51) ≈ 0.51.)
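A minimal sketch of how such a comparison might be computed, assuming scikit-learn is available; the ratings below are hypothetical and constructed only so that raw agreement comes out near 76%, not data from any actual study:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical DVA calls (1 = present, 0 = absent) by two readers on 25 scans.
rater_a = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 1]
rater_b = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1]

# Raw (observed) agreement: fraction of scans where the readers give the same call.
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Chance-corrected agreement.
kappa = cohen_kappa_score(rater_a, rater_b)

print(f"Observed agreement: {observed:.2f}")  # 0.76
print(f"Cohen's kappa:      {kappa:.2f}")     # ~0.48, i.e. moderate agreement
```

Even though the readers agree on 19 of 25 scans, kappa lands in the moderate band because many of those agreements are already expected from the two readers' marginal rates alone.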