Attribute Agreement Analysis
Measure Phase · GB

Kappa-based agreement analysis for attribute (pass/fail, categorical) measurement systems. It evaluates within-appraiser consistency, between-appraiser agreement, and agreement against a known reference standard.
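All three levels of agreement can be computed directly from the raw ratings. As a minimal sketch (the data layout and ratings below are hypothetical, not this tool's internals), between-appraiser agreement counts a part only when every appraiser gives the same rating on every trial:

```python
def between_appraiser_agreement(ratings):
    # ratings: {appraiser: {part: [rating per trial]}} -- an assumed layout.
    parts = next(iter(ratings.values())).keys()
    agreed = 0
    for part in parts:
        # Collect every rating any appraiser gave this part, across trials.
        values = {r for appraiser in ratings for r in ratings[appraiser][part]}
        agreed += len(values) == 1  # unanimous across appraisers and trials
    return agreed / len(parts)

# Hypothetical 3-part example: all agree on parts 1 and 2, split on part 3.
ratings = {
    "Alice": {1: ["P", "P"], 2: ["F", "F"], 3: ["P", "P"]},
    "Bob":   {1: ["P", "P"], 2: ["F", "F"], 3: ["P", "F"]},
    "Carol": {1: ["P", "P"], 2: ["F", "F"], 3: ["P", "P"]},
}
print(round(between_appraiser_agreement(ratings), 3))  # → 0.667
```

This strict all-trials, all-appraisers rule is what makes a single inconsistent trial disqualify a whole part.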
Study Setup
10 parts entered; each of the 3 appraisers rated every part in two trials (Trial 1 and Trial 2).
Overall Kappa: 0.859 — Good (0.8–0.9)
10 parts · 3 appraisers · 2 trials · 96.7% overall agreement vs reference
| Metric | Value | Detail |
|---|---|---|
| Overall Kappa | 0.859 | Good (0.8–0.9) |
| Agreement vs Reference | 96.7% | Overall agreement |
| Between Appraisers | 90.0% | 9/10 parts |
| Study Size | 10 parts | 3 appraisers × 2 trials |
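The Kappa figures follow Cohen's formula, κ = (p_o − p_e) / (1 − p_e), which compares observed agreement p_o against the chance agreement p_e implied by each rater's marginal rates. A self-contained sketch against the reference standard (the pass/fail ratings here are invented for illustration, not the study's data):

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same items."""
    n = len(ratings_a)
    assert n == len(ratings_b) and n > 0
    # Observed agreement: fraction of items rated identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal category rates.
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
    if p_e == 1.0:
        return 1.0  # degenerate case: both raters use a single category
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings: 9/10 raw agreement vs the reference standard.
ref = ["P", "P", "F", "P", "F", "P", "P", "F", "P", "P"]
bob = ["P", "P", "F", "P", "F", "P", "F", "F", "P", "P"]
print(round(cohen_kappa(ref, bob), 3))  # → 0.783
```

Note how 90% raw agreement shrinks to κ ≈ 0.78 once chance agreement is removed; that is why kappa, not raw percentage, drives the interpretation bands.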
Agreement by Appraiser (chart)
Appraiser Detail
| Appraiser | Within % | vs Ref % | Kappa | Interpretation |
|---|---|---|---|---|
| Alice | 90.0% | 100.0% | 0.780 | Acceptable (0.7–0.8) |
| Bob | 90.0% | 90.0% | 0.798 | Acceptable (0.7–0.8) |
| Carol | 100.0% | 100.0% | 1.000 | Excellent (≥0.9) |
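The Within % column is the simplest of the three statistics: the share of parts an appraiser rates identically on both trials. A sketch with hypothetical trial data (the names and values are illustrative, not the study's raw ratings):

```python
def within_appraiser_agreement(trial1, trial2):
    """Share of parts rated identically across an appraiser's two trials."""
    assert len(trial1) == len(trial2) and trial1
    matches = sum(a == b for a, b in zip(trial1, trial2))
    return matches / len(trial1)

# Hypothetical: the appraiser contradicts herself on one part out of 10.
t1 = ["P", "P", "F", "P", "F", "P", "P", "F", "P", "P"]
t2 = ["P", "P", "F", "P", "F", "P", "F", "F", "P", "P"]
print(within_appraiser_agreement(t1, t2))  # → 0.9
```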
Kappa Interpretation Guide

| Kappa | Interpretation |
|---|---|
| ≥ 0.9 | Excellent |
| 0.8–0.9 | Good |
| 0.7–0.8 | Acceptable |
| 0.6–0.7 | Marginal |
| < 0.6 | Poor — investigate |
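The bands reduce to a threshold lookup. A small hypothetical helper mirroring the guide above (boundary values are assigned to the higher band, an assumption the guide leaves open):

```python
def interpret_kappa(k):
    """Map a kappa value to the qualitative band in the guide."""
    if k >= 0.9:
        return "Excellent"
    if k >= 0.8:
        return "Good"
    if k >= 0.7:
        return "Acceptable"
    if k >= 0.6:
        return "Marginal"
    return "Poor - investigate"

print(interpret_kappa(0.859))  # → Good
```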