Inter-rater Reliability Calculator
Measure agreement between two raters using Cohen's Kappa, the intraclass correlation coefficient (ICC), and percent agreement
About Inter-rater Reliability
Cohen's Kappa: For categorical data. Corrects observed agreement for the agreement expected by chance.
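As an illustration, here is a minimal Python sketch of percent agreement and Cohen's Kappa for two raters, computed directly from two equal-length lists of categorical labels. The function and variable names are illustrative, not part of the calculator itself.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of items on which the two raters gave the same label."""
    matches = sum(a == b for a, b in zip(r1, r2))
    return matches / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's Kappa: (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement from each rater's marginal label frequencies."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    freq1, freq2 = Counter(r1), Counter(r2)
    p_e = sum((freq1[c] / n) * (freq2[c] / n) for c in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

# Example: two raters coding 10 items as "yes"/"no"
rater_a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
rater_b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(percent_agreement(rater_a, rater_b))  # 0.8
print(cohens_kappa(rater_a, rater_b))       # ~0.58
```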
ICC (intraclass correlation coefficient): For continuous or ordinal data. Different ICC forms measure either consistency or absolute agreement between raters.
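For continuous ratings, a rough sketch of a two-way ICC computed by hand with NumPy is shown below. It follows the standard Shrout & Fleiss mean-square decomposition, returning ICC(3,1) for consistency and ICC(2,1) for absolute agreement; the data matrix and names are assumptions for illustration, not the calculator's implementation.

```python
import numpy as np

def icc_two_way(ratings):
    """Two-way ICC from an (n_subjects x k_raters) matrix of ratings.

    Returns (ICC(3,1) consistency, ICC(2,1) absolute agreement)."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-rater means

    # Sums of squares from the two-way ANOVA decomposition
    ss_rows = k * np.sum((row_means - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between raters
    ss_total = np.sum((x - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    icc_consistency = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)
    icc_agreement = (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )
    return icc_consistency, icc_agreement

# Example: 5 subjects scored by 2 raters on a continuous scale
scores = [[8, 9], [4, 5], [6, 6], [9, 10], [3, 4]]
print(icc_two_way(scores))
```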
Kappa Interpretation (Landis & Koch):
- < 0.00: Poor
- 0.00-0.20: Slight
- 0.21-0.40: Fair
- 0.41-0.60: Moderate
- 0.61-0.80: Substantial
- 0.81-1.00: Almost Perfect
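A small helper that maps a Kappa value onto the Landis & Koch labels above; this is a sketch, and the function name is an assumption.

```python
def interpret_kappa(kappa):
    """Map a Kappa value to the Landis & Koch (1977) qualitative label."""
    if kappa < 0:
        return "Poor"
    bands = [(0.20, "Slight"), (0.40, "Fair"), (0.60, "Moderate"),
             (0.80, "Substantial"), (1.00, "Almost Perfect")]
    for upper, label in bands:
        if kappa <= upper:
            return label
    return "Almost Perfect"  # guard for values just above 1 due to rounding

print(interpret_kappa(0.58))  # Moderate
```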