Cohen's Kappa Calculator

Measure inter-rater reliability and agreement

What is Cohen's Kappa?

Cohen's Kappa (κ) is a statistic that measures inter-rater agreement for categorical items. It accounts for the possibility of agreement occurring by chance, providing a more robust measure than simple percent agreement.

Formula

κ = (Pₒ − Pₑ) / (1 − Pₑ)

where Pₒ is the observed agreement and Pₑ is the agreement expected by chance.
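
To make the formula concrete, here is a minimal Python sketch (not the CrossTabs.com implementation) that computes κ from a square contingency table of rater counts:

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from a square contingency table.

    table[i][j] = number of items rater A placed in category i
    and rater B placed in category j.
    """
    table = np.asarray(table, dtype=float)
    n = table.sum()                  # total number of rated items
    p_o = np.trace(table) / n        # observed agreement (diagonal cells)
    p_a = table.sum(axis=1) / n      # rater A's category proportions
    p_b = table.sum(axis=0) / n      # rater B's category proportions
    p_e = np.dot(p_a, p_b)           # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

# The 2x2 X-ray table worked through later on this page gives kappa ≈ 0.571.
print(cohens_kappa([[70, 5], [10, 15]]))
```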

Interpreting Kappa

| Kappa | Agreement Level |
|---|---|
| < 0.00 | Less than chance |
| 0.00 – 0.20 | Slight |
| 0.21 – 0.40 | Fair |
| 0.41 – 0.60 | Moderate |
| 0.61 – 0.80 | Substantial |
| 0.81 – 1.00 | Almost perfect |

Guidelines from Landis & Koch (1977)
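
If you need this mapping in code, a small helper based on the thresholds above might look like this sketch:

```python
def interpret_kappa(kappa):
    """Map a kappa value to its Landis & Koch (1977) agreement label."""
    if kappa < 0.00:
        return "Less than chance"
    elif kappa <= 0.20:
        return "Slight"
    elif kappa <= 0.40:
        return "Fair"
    elif kappa <= 0.60:
        return "Moderate"
    elif kappa <= 0.80:
        return "Substantial"
    else:
        return "Almost perfect"

print(interpret_kappa(0.571))  # "Moderate"
```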

When to Use Kappa

Weighted Kappa

For ordinal data, where some disagreements are more serious than others, use weighted kappa: it gives partial credit to near-miss ratings instead of treating every disagreement equally.
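
The page does not show the weighted formula itself; one common form is κ_w = 1 − (Σ wᵢⱼ xᵢⱼ) / (Σ wᵢⱼ mᵢⱼ), where wᵢⱼ are disagreement weights, xᵢⱼ the observed cell counts, and mᵢⱼ the counts expected by chance. As one illustration (with made-up ratings, not data from this page), scikit-learn's cohen_kappa_score supports linear and quadratic weighting:

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative ordinal ratings from two raters:
# 1 = Normal, 2 = Suspicious, 3 = Abnormal
rater_a = [1, 1, 2, 2, 3, 3, 1, 2, 3, 2]
rater_b = [1, 2, 2, 3, 3, 3, 1, 2, 2, 2]

# Quadratic weights penalize a Normal-vs-Abnormal disagreement more
# heavily than a Normal-vs-Suspicious one.
kappa_w = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(round(kappa_w, 3))
```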

Example: Medical Diagnosis

Two doctors independently classify 100 X-rays as Normal, Suspicious, or Abnormal. Cohen's Kappa measures how much better their agreement is compared to what would be expected by chance.

Try the Calculator

CrossTabs.com calculates Cohen's Kappa for you, directly from your crosstab data.

Step-by-Step Example

Two doctors classify 100 X-rays as "Normal" or "Abnormal":

| | Doctor B: Normal | Doctor B: Abnormal | Total |
|---|---|---|---|
| Doctor A: Normal | 70 | 5 | 75 |
| Doctor A: Abnormal | 10 | 15 | 25 |
| Total | 80 | 20 | 100 |

Step 1: Observed agreement (Pₒ) = (70 + 15) / 100 = 0.85

Step 2: Expected agreement (Pₑ) = (75×80 + 25×20) / 100² = (6000 + 500) / 10000 = 0.65

Step 3: Cohen's kappa = (Pₒ − Pₑ) / (1 − Pₑ) = (0.85 − 0.65) / (1 − 0.65) = 0.20 / 0.35 = 0.571

Interpretation: κ = 0.57 indicates moderate agreement beyond chance.
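
As a quick check on the arithmetic above, the same κ can be reproduced by expanding the table back into 100 paired ratings and handing them to scikit-learn (an outside library, used here only for verification):

```python
from sklearn.metrics import cohen_kappa_score

# Rebuild the 100 X-ray ratings from the contingency table above.
doctor_a = ["Normal"] * 70 + ["Normal"] * 5 + ["Abnormal"] * 10 + ["Abnormal"] * 15
doctor_b = ["Normal"] * 70 + ["Abnormal"] * 5 + ["Normal"] * 10 + ["Abnormal"] * 15

kappa = cohen_kappa_score(doctor_a, doctor_b)
print(round(kappa, 3))  # 0.571, matching the hand calculation
```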

Common Mistakes to Avoid