Cohen's kappa coefficient (κ) is a statistic that measures inter-rater agreement for qualitative (categorical) items. It is generally considered more robust than a simple percent-agreement calculation, since κ accounts for the agreement that would be expected to occur by chance.
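The chance correction can be written as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from each rater's marginal label frequencies. A minimal sketch of this computation (the function name and sample labels are illustrative, not from the source):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same n items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # p_o: proportion of items on which the raters agree
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # p_e: chance agreement from each rater's marginal label frequencies
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[label] * counts_b[label] for label in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Example: raters agree on 3 of 4 items (p_o = 0.75, p_e = 0.5)
print(cohens_kappa(["yes", "yes", "no", "no"],
                   ["yes", "no", "no", "no"]))  # 0.5
```

Note that κ = 1 indicates perfect agreement, κ = 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.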
- cohen_s_kappa_coefficient.txt
- Last modified: 2025/05/13 02:18
- by 127.0.0.1