In statistics, inter-rater reliability (also called by various similar names, such as inter-rater agreement, inter-rater concordance, inter-observer reliability, and so on) is the degree of agreement among raters. It is a score of how much homogeneity, or consensus, there is in the ratings given by various judges.

inter-rater.txt · Last modified: 2024/06/07 02:53 by 127.0.0.1
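One common statistic for quantifying such agreement between two raters is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. The sketch below is illustrative only (the function name and the example ratings are invented for demonstration, not taken from this page):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement from each rater's label frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the two raters' marginal frequencies,
    # summed over all labels.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings from two judges on eight items.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(cohens_kappa(a, b))  # prints 0.5
```

Here the raters agree on 6 of 8 items (p_o = 0.75), but since both used "yes" and "no" half the time each, chance alone predicts 0.5 agreement; kappa reports only the agreement beyond that chance level.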