Inter-rater reliability/Related Articles

Parent topics
Subtopics
Bot-suggested topics
Auto-populated based on Special:WhatLinksHere/Inter-rater reliability. Needs checking by a human.
- Fleiss' kappa: Statistical measure for assessing the reliability of agreement between a fixed number of raters who assign categorical ratings to, or classify, a number of items.
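As a rough illustration of the measure named above, the sketch below computes Fleiss' kappa from an items-by-categories count matrix in which each row records how many raters placed that item in each category. The function name, the NumPy-based implementation, and the toy ratings are illustrative assumptions, not taken from the article.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for an N x k matrix of counts.

    counts[i, j] = number of raters who assigned item i to category j;
    every row is assumed to sum to the same number of raters n.
    """
    counts = np.asarray(counts, dtype=float)
    N, k = counts.shape
    n = counts.sum(axis=1)[0]                         # raters per item (constant by assumption)

    p_j = counts.sum(axis=0) / (N * n)                # overall proportion of assignments per category
    P_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))  # observed agreement per item
    P_bar = P_i.mean()                                # mean observed agreement
    P_e = np.square(p_j).sum()                        # agreement expected by chance

    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: 4 items, 3 raters, 2 categories
ratings = [[3, 0],
           [2, 1],
           [1, 2],
           [3, 0]]
print(fleiss_kappa(ratings))  # about 0.11, i.e. slight agreement beyond chance
```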