Cohen's Kappa Calculator
Measure the reliability of agreement between two raters on the same set of categorical items
Enter the same number of classifications for both raters. Categories can be letters, numbers, or words.
About Cohen's Kappa
Cohen's Kappa measures inter-rater agreement for categorical items while correcting for the agreement expected by chance.
Values range from -1 to 1: 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.
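Concretely, writing p_o for the observed proportion of agreement and p_e for the proportion expected by chance (notation chosen here for illustration), the standard definition is:

\kappa = \frac{p_o - p_e}{1 - p_e}, \qquad p_e = \sum_{k} p_{k,1}\, p_{k,2}

where p_{k,i} is the proportion of items that rater i assigned to category k. For example, if two raters agree on 90% of items (p_o = 0.9) and chance agreement is 33% (p_e = 0.33), then kappa = (0.9 - 0.33) / (1 - 0.33) ≈ 0.85.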
It is commonly used in psychology, medicine, and content analysis.
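For readers who want to check the calculator's output, here is a minimal Python sketch of how kappa can be computed from two label sequences. The function name cohens_kappa and the handling of the degenerate p_e = 1 case are illustrative choices, not this page's actual implementation:

from collections import Counter

def cohens_kappa(rater1, rater2):
    """Compute Cohen's kappa for two equal-length sequences of labels."""
    if len(rater1) != len(rater2) or not rater1:
        raise ValueError("Both raters must supply the same, nonzero number of labels")
    n = len(rater1)
    # Observed agreement: fraction of items where both raters chose the same category.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: sum over categories of the product of each rater's
    # marginal proportions.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(rater1) | set(rater2))
    if p_e == 1:  # both raters used one identical category for every item
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Example: two raters labeling the same 10 items with categories A, B, C.
print(cohens_kappa(list("AABBCCABCA"), list("AABBCCABCB")))

Run on this ten-item example, the function prints roughly 0.85, matching the worked formula above.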