Attribute Agreement Analysis

Measure Phase · Green Belt

Kappa-based agreement analysis for attribute (pass/fail, categorical) measurement systems. Evaluates within-appraiser consistency, between-appraiser agreement, and agreement vs. a known reference standard.
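Concretely, kappa corrects raw percent agreement for the agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e). Packages differ in which variant they report (Cohen's kappa for a pair of rating columns, Fleiss' kappa for several appraisers at once), so the following is only a minimal sketch of the pairwise case; the function and data are illustrative, not this tool's internals.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: kappa = (p_o - p_e) / (1 - p_e).

    p_o is the observed share of matching ratings; p_e is the agreement
    expected by chance from each rater's marginal category frequencies.
    """
    n = len(ratings_a)
    assert n == len(ratings_b) and n > 0
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a | freq_b)
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

# Illustrative pass/fail data: one disagreement out of 10 parts.
reference = ["P", "P", "F", "P", "F", "P", "P", "F", "P", "P"]
trial_1   = ["P", "P", "F", "P", "F", "P", "F", "F", "P", "P"]
print(round(cohens_kappa(trial_1, reference), 3))  # 0.783
```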

Study Setup

10 parts entered; each of 3 appraisers (Alice, Bob, Carol) rates every part in 2 trials, giving 60 ratings in total.

Overall Kappa: 0.859, Good (0.8–0.9), with 96.7% overall agreement vs the reference standard.

Overall Kappa        0.859      Good (0.8–0.9)
vs Reference         96.7%      Overall agreement with the standard
Between Appraisers   90.0%      9/10 parts in full agreement
Study Size           10 parts   3 appraisers × 2 trials
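On one plausible reading of these figures (the tool's exact counting rules are not shown): 96.7% vs reference corresponds to 29 of 30 appraiser-by-part assessments matching the standard, i.e. the mean of the per-appraiser 100.0%, 90.0%, and 100.0% figures below, and 90.0% between appraisers is simply the 9 of 10 parts on which all three appraisers agreed across both trials.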

Agreement by Appraiser
[Bar chart: per-appraiser agreement for Alice, Bob, and Carol on a 0–100% scale, with a 90% target line.]
Appraiser Detail
Appraiser   Within %   vs Ref %   Kappa   Interpretation
Alice        90.0%     100.0%     0.780   Acceptable (0.7–0.8)
Bob          90.0%      90.0%     0.798   Acceptable (0.7–0.8)
Carol       100.0%     100.0%     1.000   Excellent (≥0.9)

Within % is agreement between an appraiser's own trials on the same parts; vs Ref % is agreement with the known reference standard.
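The Within % and vs Ref % columns have simple counting definitions, though packages differ in the exact rules (for example, whether vs-reference agreement requires every trial to match the standard or scores each rating separately). A hedged sketch of one common convention; names and data are illustrative:

```python
def within_appraiser_pct(trials):
    """Within %: share of parts where an appraiser's own trials agree.

    `trials` holds one (trial_1, trial_2) rating pair per part.
    """
    return 100.0 * sum(t1 == t2 for t1, t2 in trials) / len(trials)

def vs_reference_pct(trials, reference):
    """vs Ref %: share of parts where both trials match the standard.

    Some tools instead score each individual rating against the standard,
    which yields a higher figure when an appraiser's trials disagree.
    """
    hits = sum(t1 == ref and t2 == ref
               for (t1, t2), ref in zip(trials, reference))
    return 100.0 * hits / len(trials)

# Illustrative 10-part study: the appraiser contradicts themselves once (part 7).
reference = ["P", "P", "F", "P", "F", "P", "P", "F", "P", "P"]
appraiser = [("P", "P"), ("P", "P"), ("F", "F"), ("P", "P"), ("F", "F"),
             ("P", "P"), ("P", "F"), ("F", "F"), ("P", "P"), ("P", "P")]
print(within_appraiser_pct(appraiser))         # 90.0
print(vs_reference_pct(appraiser, reference))  # 90.0
```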

Kappa Interpretation Guide

≥ 0.9     Excellent
0.8–0.9   Good
0.7–0.8   Acceptable
0.6–0.7   Marginal
< 0.6     Poor — investigate
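This guide translates directly into a threshold function; a small sketch (boundary values follow the "≥" convention the bands show, so exactly 0.9 reads as Excellent):

```python
def interpret_kappa(kappa):
    """Map a kappa value onto the guide's bands (boundaries round up)."""
    if kappa >= 0.9:
        return "Excellent"
    if kappa >= 0.8:
        return "Good"
    if kappa >= 0.7:
        return "Acceptable"
    if kappa >= 0.6:
        return "Marginal"
    return "Poor - investigate"

print(interpret_kappa(0.859))  # Good (the overall result above)
print(interpret_kappa(1.000))  # Excellent (Carol)
```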

MBB Coach

A built-in Master Black Belt coach helps you choose the right tool and interpret your results. Guidance only: analysis data stays in your browser and is never sent to the AI.