Gage Block Temp Coefficients

Started by Hawaii596, 01-24-2008 -- 10:05:54


Hawaii596

As an Ex-Navy PMEL tech (with RF/microwave background and an affinity for precision DC/LF and temp work), one of my weak areas is dimensional. I've calibrated just about everything (really).  But I have to get out the books about gage block stuff.

I work in a general type lab with "typical" temp control.  We do quite a bit of low-end calipers/micrometers, and some higher accuracy digimatic Mitutoyo, Heidenhain, and other digital indicators and drop gages.  Some have tolerances of +/-50 uinches.  I took over the lab a couple of years ago, and until then they hadn't applied temp correction factors for anything.  I created an Excel spreadsheet where you enter gage block length, cal'd error in uin., and temp.  Then in a protected field I entered the industry value for our steel block coefficient.  We tried it out on a digimatic (don't remember all the parameters) on a 75 degree day in the lab.  It made the difference between reading at 90% of tolerance and reading right on the money.
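For anyone curious, here's roughly what the spreadsheet is doing, written out as a short Python sketch instead of Excel.  The block size, certified deviation, and lab temp below are made-up example numbers, and the coefficient is the published steel value, not any particular set's:

ALPHA_STEEL = 11.5e-6   # published coefficient for steel blocks, per Deg C
REF_TEMP_C = 20.0       # 68 Deg F reference temperature

def block_length_at_temp(nominal_in, cert_dev_uin, lab_temp_f):
    # Certified deviation from the cal report is referenced to 20 Deg C (68 Deg F)
    lab_temp_c = (lab_temp_f - 32.0) * 5.0 / 9.0
    length_at_20c_in = nominal_in + cert_dev_uin * 1e-6
    # Thermal growth of the block between 20 Deg C and lab temperature
    growth_in = length_at_20c_in * ALPHA_STEEL * (lab_temp_c - REF_TEMP_C)
    return length_at_20c_in + growth_in

# Example with made-up numbers: 1 in block, +8 uin certified deviation, 75 Deg F lab
actual_in = block_length_at_temp(1.0, 8.0, 75.0)
print(f"{(actual_in - 1.0) * 1e6:.1f} uin longer than nominal at lab temp")

On a 1 in block that works out to about 45 uin of thermal growth at 75 Deg F, which is why the correction mattered against a +/-50 uin tolerance.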

Here's the question.  A highly experienced and revered old metrology engineer friend explained to me recently that although there is a published coefficient for steel blocks (11.5 uin/in/Deg C), each set actually has its own characteristic coefficients (material impurity differences, etc.), and there is also hysteresis associated with gage block growth/shrinkage.  Is this something that most dimensionally trained PMEL people know?  Or is this not well known?  Also, if it is well known, is there anything published about it?  Any thoughts from the dimensionally trained community?
"I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind."
Lord Kelvin (1824-1907)
from lecture to the Institute of Civil Engineers, 3 May 1883

ck454ss

Yes, there is a difference between the expansion coefficients of different gage block sets.  BUT here is the kicker: the resulting difference is so small you won't notice it in most system setups.  I love engineers.  Your standard coefficients of expansion will do for cal, but when measuring gage blocks it is critical to check them as close to 68 Deg as possible.  Doing so "theoretically" eliminates any bias due to differing expansion coefficients and/or impurity errors, since all measurements should be referenced to 68 Deg.  I would be very wary of calibrating mechanical equipment in an uncontrolled lab (i.e., anything looser than 68 +/-1 Deg) whenever possible.  I understand it's not possible in some cases, but your uncertainties should be higher in an uncontrolled setting versus a controlled setting due to your temperature drift.  As with most mechanical measurements, temperature has the greatest effect on your uncertainties.
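To put rough numbers on why staying near 68 Deg washes that out: the bias from not knowing a set's true coefficient is roughly length x (actual coefficient - assumed coefficient) x (temp - 20 Deg C), so it shrinks toward zero at the reference temperature.  A quick Python sketch, where the +/-0.5 uin/in/Deg C set-to-set spread and the 4 in block are just assumed figures for illustration, not a spec:

COEFF_SPREAD = 0.5e-6        # assumed set-to-set coefficient spread, per Deg C (illustration only)
BLOCK_LEN_IN = 4.0           # hypothetical 4 in block

for delta_t_c in (0.5, 2.0, 4.0):
    # Worst-case error from using the published coefficient instead of the set's true one
    worst_case_uin = BLOCK_LEN_IN * COEFF_SPREAD * delta_t_c * 1e6
    print(f"{delta_t_c} Deg C from reference -> up to {worst_case_uin:.1f} uin")

Half a degree off reference it's about 1 uin on a 4 in block; it only starts to bite when the lab temperature wanders several degrees from 68.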

Ex-Navy PMEL myself.  Was stationed at Subase Pearl