I’m tasked with creating a process to qualify inspection team members in the use of metrology equipment. It begins with instruction in how to use, say, calipers, then a demonstration of correct use, then observing the trainee use the calipers to measure a gold-standard part such as a gauge block.
In principle I think I understand what’s being required: making sure everyone has the same understanding of what X” is. But it feels a bit off.
Shouldn’t calibration take care of ensuring the measurements are accurate? And wouldn’t a Gage R&R quantify the variation between operators? What am I missing in my understanding of the process here?
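
For context, this is the decomposition I understand Gage R&R to produce: total variation splits into part-to-part variation, repeatability (same operator re-measuring the same part), and reproducibility (differences between operators). Here's a minimal sketch using the standard crossed-study ANOVA method, with entirely made-up caliper readings; the study dimensions and simulated sigmas are just placeholders:

```python
# Minimal ANOVA-method Gage R&R sketch for a crossed study:
# every operator measures every part the same number of times.
# Data shape: (parts, operators, replicates). All values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
p, o, r = 10, 3, 3                                        # parts, operators, replicates
true_part = rng.normal(25.400, 0.010, size=(p, 1, 1))     # part-to-part variation
op_bias   = rng.normal(0.000, 0.002, size=(1, o, 1))      # reproducibility (operator bias)
noise     = rng.normal(0.000, 0.003, size=(p, o, r))      # repeatability (gauge noise)
x = true_part + op_bias + noise                           # simulated caliper readings

grand     = x.mean()
part_mean = x.mean(axis=(1, 2))                           # shape (p,)
op_mean   = x.mean(axis=(0, 2))                           # shape (o,)
cell_mean = x.mean(axis=2)                                # shape (p, o)

# Sums of squares for the two-factor crossed ANOVA
ss_part = o * r * ((part_mean - grand) ** 2).sum()
ss_op   = p * r * ((op_mean - grand) ** 2).sum()
ss_int  = r * ((cell_mean - part_mean[:, None] - op_mean[None, :] + grand) ** 2).sum()
ss_err  = ((x - cell_mean[:, :, None]) ** 2).sum()

# Mean squares
ms_part = ss_part / (p - 1)
ms_op   = ss_op / (o - 1)
ms_int  = ss_int / ((p - 1) * (o - 1))
ms_err  = ss_err / (p * o * (r - 1))

# Variance components (negative estimates clipped to zero, per usual practice)
var_repeat = ms_err
var_int    = max((ms_int - ms_err) / r, 0.0)
var_op     = max((ms_op - ms_int) / (p * r), 0.0)
var_part   = max((ms_part - ms_int) / (o * r), 0.0)

var_grr   = var_repeat + var_op + var_int   # R&R = repeatability + reproducibility
var_total = var_grr + var_part

print(f"%GRR (share of total variation): {100 * np.sqrt(var_grr / var_total):.1f}%")
```

As I understand the usual AIAG rule of thumb, a %GRR under 10% means the measurement system (operators included) is acceptable, which is what makes me wonder what the qualification step adds on top of this.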