Cohen's kappa coefficient (κ) is a statistic that measures inter-rater agreement for qualitative (categorical) items. It is generally considered a more robust measure than a simple percent-agreement calculation, since κ takes into account the possibility of the agreement occurring by chance.
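As a quick illustration (not part of the original page), the sketch below computes κ from its standard definition, κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement between two raters and p_e is the agreement expected by chance from each rater's marginal label frequencies. The function name and the example labels are hypothetical.

<code python>
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length sequences of category labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items on which the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: for each category, the product of the raters'
    # marginal probabilities of choosing it, summed over categories.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters classify ten items as "yes" or "no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes", "no"]
print(cohens_kappa(a, b))  # 0.4: 70% raw agreement, 50% expected by chance
</code>

In this example the raters agree on 7 of 10 items (p_o = 0.7), but their marginal label frequencies alone would produce 50% agreement by chance (p_e = 0.5), so κ = (0.7 − 0.5) / (1 − 0.5) = 0.4, a noticeably weaker result than the raw 70% figure suggests.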