In statistics, inter-rater reliability (also called inter-rater agreement, inter-rater concordance, or inter-observer reliability) is the degree of agreement among raters. It is a score of how much homogeneity, or consensus, there is in the ratings given by various judges.
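One common way to quantify this agreement for two raters is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. The sketch below is a minimal, self-contained illustration (the rating data is invented for the example):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items given identical labels.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings: two judges labelling six items "yes" or "no".
a = ["yes", "yes", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # prints 0.333
```

Here the raters agree on 4 of 6 items (observed agreement 0.667), but since both use each label half the time, 0.5 agreement is expected by chance, leaving a kappa of 0.333. Kappa of 1 means perfect agreement; 0 means agreement no better than chance.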
- Last modified: 2024/06/07 02:53