Gauge R&R (Measurement System Analysis)

Last updated 2026.02.13
Gauge R&R, Measurement System Analysis, MSA, ANOVA, Quality Control, Data Quality, Repeatability, Reproducibility

Definition

Gauge R&R (Gage Repeatability and Reproducibility) is a statistical analysis technique used to evaluate the reliability of measurement systems. It employs an ANOVA (Analysis of Variance) based random effects model to determine whether observed variation comes from actual product variation or from the measurement system itself. Repeatability refers to the variation when the same operator measures the same part repeatedly with the same instrument, while Reproducibility refers to the variation between different operators measuring the same parts.
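The two components can be illustrated with a small simulation. This is a minimal sketch, not a full ANOVA study: it generates a synthetic crossed dataset (the effect sizes and seed are hypothetical) and estimates repeatability as pooled within-cell variation and reproducibility as the spread of operator averages.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic crossed study: 3 operators x 10 parts x 3 repeats
# (effect sizes below are illustrative assumptions, not real data)
n_ops, n_parts, n_reps = 3, 10, 3
part_effect = rng.normal(0.0, 2.0, n_parts)             # true part-to-part variation
op_effect = rng.normal(0.0, 0.5, n_ops)                 # operator bias (reproducibility)
noise = rng.normal(0.0, 0.3, (n_ops, n_parts, n_reps))  # repeat error (repeatability)

data = part_effect[None, :, None] + op_effect[:, None, None] + noise

# Repeatability: pooled variation of repeated readings within each
# operator-part cell (part and operator effects cancel inside a cell)
repeatability = np.sqrt(data.var(axis=2, ddof=1).mean())

# Reproducibility: spread of the per-operator averages (part effects
# are identical for every operator, so only operator bias remains)
reproducibility = data.mean(axis=(1, 2)).std(ddof=1)

print(f"repeatability  ~ {repeatability:.2f}")   # simulated repeat error was 0.3
print(f"reproducibility ~ {reproducibility:.2f}")
```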

Application in Manufacturing

Quality Control Decision-Making

  • Measurement Equipment Validation: Determine suitability of new measuring instruments using %GRR values (commonly <10% acceptable, 10–30% conditionally acceptable, >30% unacceptable)
  • Operator Training: A large reproducibility (AV) component indicates significant operator-to-operator variation, calling for standardized training and work instructions
  • Process Capability Analysis: Large measurement error inflates observed variation, so computed capability indices (Cp, Cpk) understate the true process capability
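The %GRR acceptance bands above can be captured in a small helper. This is a sketch assuming the common AIAG-style thresholds; the function name and signature are illustrative.

```python
def classify_grr(grr_std: float, total_std: float) -> str:
    """Classify a measurement system by %GRR (gauge std dev as a
    fraction of total observed std dev).

    Thresholds follow the widely used guideline:
    <10% acceptable, 10-30% conditionally acceptable, >30% unacceptable.
    """
    pct_grr = 100.0 * grr_std / total_std
    if pct_grr < 10.0:
        return "acceptable"
    if pct_grr <= 30.0:
        return "conditionally acceptable"
    return "unacceptable"

print(classify_grr(0.5, 10.0))   # 5%  -> acceptable
print(classify_grr(2.0, 10.0))   # 20% -> conditionally acceptable
print(classify_grr(4.0, 10.0))   # 40% -> unacceptable
```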

AI Application Perspective

  • Data Quality Verification: Pre-validate measurement data reliability with Gauge R&R before AI model training
  • Automated Measurement Systems: Performance benchmarking for AI-based systems like vision inspection and automated gauges
  • Digital Twin: Simulation incorporating measurement uncertainty in virtual process models

Key Points

The typical experimental design involves 3 operators measuring 10 parts, 3 times each, in a fully crossed layout. The ANOVA method provides a more accurate analysis than the Average and Range (X̄-R) method because it accounts for operator-part interaction, decomposing measurement variation into Part Variation (PV), Equipment Variation (EV, repeatability), and Appraiser Variation (AV, reproducibility). It serves as an essential pre-validation tool for ensuring input data reliability in AI quality prediction models.
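The ANOVA decomposition for this crossed design can be sketched as follows. This is a minimal numpy-only implementation of the standard two-factor crossed random-effects mean squares, with negative variance estimates clipped to zero as is common practice; the function name is illustrative.

```python
import numpy as np

def anova_grr(data: np.ndarray):
    """ANOVA variance components for a crossed Gauge R&R study.

    data: array of shape (operators, parts, repeats).
    Returns (EV, AV, PV, GRR) as standard deviations, where
    GRR combines repeatability and reproducibility.
    """
    o, p, r = data.shape
    grand = data.mean()
    op_means = data.mean(axis=(1, 2))      # per-operator averages
    part_means = data.mean(axis=(0, 2))    # per-part averages
    cell_means = data.mean(axis=2)         # operator x part cell averages

    # Mean squares for the two-factor crossed random-effects model
    ms_o = p * r * ((op_means - grand) ** 2).sum() / (o - 1)
    ms_p = o * r * ((part_means - grand) ** 2).sum() / (p - 1)
    ms_op = (r * ((cell_means - op_means[:, None]
                   - part_means[None, :] + grand) ** 2).sum()
             / ((o - 1) * (p - 1)))
    ms_e = ((data - cell_means[..., None]) ** 2).sum() / (o * p * (r - 1))

    # Variance components (method-of-moments; clip negatives to zero)
    var_rep = ms_e                                   # repeatability
    var_int = max((ms_op - ms_e) / r, 0.0)           # operator-part interaction
    var_oper = max((ms_o - ms_op) / (p * r), 0.0)    # operator
    var_part = max((ms_p - ms_op) / (o * r), 0.0)    # part-to-part

    ev = np.sqrt(var_rep)                  # Equipment Variation
    av = np.sqrt(var_oper + var_int)       # Appraiser Variation
    pv = np.sqrt(var_part)                 # Part Variation
    grr = np.sqrt(ev ** 2 + av ** 2)       # combined Gauge R&R
    return ev, av, pv, grr
```

From these components, %GRR can be computed as `100 * grr / sqrt(grr**2 + pv**2)` and compared against the acceptance thresholds discussed above.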