
Calibrating Micrometers

James Alford 12/11/2020 17:34:54

I have bought some second hand 2", 3" and 4" micrometers. Is there any way to check that they "zero" correctly without using a block of known size?

Regards,

James.

Samsaranda 12/11/2020 17:41:05

You need an accurate test piece, preferably the size of the minimum reading of the micrometer, in order to zero the scale. I don’t know of any other method by which you can achieve this; the test piece can be a micrometer test standard or slip gauges.
Dave W

old mart 12/11/2020 17:43:36

Difficult without the length bars they came with when new. To check the calibration of a micrometer properly, a box of slip gauges is used to measure odd sizes, so that leadscrew wear or damage is checked as well as the zero, which is what the length bar is for.

Michael Gilligan 12/11/2020 17:43:48

Starrett make some excellent ‘rods’ for that very purpose, available at surprisingly modest prices.

... I believe the Mitutoyo equivalent is ‘reassuringly expensive’ [but haven’t checked recently]

MichaelG.

.

Edit: I beg Mitutoyo’s pardon ... This is not at all unreasonable:

https://www.hroberts-di.com/all-metrology-c49/setting-standards-c131/mitutoyo-167-141-setting-standard-1-p2142/s2142?cid=GBP

Edited By Michael Gilligan on 12/11/2020 17:53:25

Howard Lewis 12/11/2020 17:58:23

You can buy checking pieces for Micrometers.

I think Cromwell Tools is where I got some.

If the mic does not zero after three clicks of the ratchet, the "C" part of the spanner supplied with the mic can be used to rotate the thimble until the two zero lines coincide.

Ideally, the mic and the standard should be allowed to soak to the same temperature for 24 hours. A warm mic set with a cold standard, or vice versa, are not the routes to the accuracy sought.

A secondhand mic may have defects, such as worn threads or even a strained frame (throwing the anvils out of parallelism ) or damage from trying to measure rotating work and consequently being thrown across the shop!

If the mics are modern and have carbide anvils, they can easily be chipped.

Howard

Neil Lickfold 12/11/2020 18:08:22

Inspecting micrometers is a specialist's job, and requires things like an optical flat and monochromatic light, apart from length stacks that check the micrometer at different positions. There are now newer methods of inspecting the spindle accuracy and the anvil runout at the same time, while it is rotated. But zero is still done with length bars.

Andrew Tinsley 12/11/2020 18:43:16

Just as Michael said, Starrett do the standard rods for zeroing micrometers. I purchased four about 18 months ago and could not believe how cheap they were!

Andrew.

Howard Lewis 12/11/2020 18:49:46

As Neil Lickfold says, to check a measuring instrument properly requires a temperature and humidity controlled room with highly specialised equipment. Not to mention skill.

We have no control over the accuracy of the standards for each mic. What tolerances apply to the length standards?

Possibly not as stringent as a set of slips held in a Calibration Room. And even such a standard is traceable back to NPL standards or an international standard.

And do we question the accuracy with which the temperature and humidity of the Calibration Room is controlled?

So then we question the accuracy of the standards.

As hobbyists, lacking such extreme accuracy of environmental controls, we have to do the best that we can with what is available to us.

But since we are not in industrial clean-room or calibration-room conditions, we have to live with what we have.

So chasing hundredths of a thou is not really practicable.

Hence the advice to soak the instrument and standard for 24 hours, and to avoid holding either for too long. (Thermal insulation pads do help to reduce body temperature influencing the dimensions.)

To exert a constant torque, and therefore, we hope, force on the anvils of the mic, we rely on the ratchet behaving consistently when we take our measurements or check against the standard.

Unless the force exerted is constant, the frame of the mic will deflect by a varying amount to reduce accuracy, just as temperature departures from the usual standard of 20 degrees Celsius will detract from absolute accuracy.

Let us not confuse ourselves with delusions of accuracy. If a 20 mm diameter piston produces an acceptable fit in a supposedly 20 mm cylinder, the parts are fit for purpose. This statement assumes that we are not manufacturing a sub-atmospheric pump to produce a pressure of a couple of mm Hg.

A plea to be realistic in our expectations.

Howard

Michael Gilligan 12/11/2020 19:01:23

Here’s a drawing of the Starrett 1”


https://www.starrett.com/dms/download.aspx?b=starrett3d&p=234A-1&i=1

I do hope they are not serious about that fractional tolerance!

MichaelG.

Edited By Michael Gilligan on 12/11/2020 19:05:11

Stuart Bridger 12/11/2020 19:01:29

Excellent post from Howard, absolutely spot on.
When I was an apprentice at BAe Weybridge, I did a tour of the mechanical standards room, but got no practical experience there. I did, however, do a three-month stint in the Electrical Standards Department. That was fully temperature and humidity controlled, with a "room within a room" where the high-precision work was carried out. Every meter on site was calibrated every six months, and there were hundreds of Avos to get through. That was the routine stuff; the higher-precision instruments needed much more specialised treatment. One rule I learned was that the kit you use to calibrate something must be at least ten times more accurate than the item under test. All of it, of course, was traceable directly back to national standards.
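That 10:1 rule can be sketched numerically. This is only an illustration; the tolerance and gauge-block figures below are invented for the example, not taken from any standard or catalogue.

```python
# Illustrative sketch of the 10:1 calibration rule: the reference used to
# calibrate an instrument should be roughly ten times more accurate than
# the instrument under test. All figures are made up for illustration.

def uncertainty_ratio(instrument_tolerance, reference_uncertainty):
    """Ratio of instrument tolerance to reference uncertainty (the 'TUR')."""
    return instrument_tolerance / reference_uncertainty

# A micrometer reading to 0.0001" checked against gauge blocks assumed
# good to 0.000008":
tur = uncertainty_ratio(0.0001, 0.000008)
print(f"TUR = {tur:.1f}:1 -> {'adequate' if tur >= 10 else 'reference too coarse'}")
```

With these invented numbers the ratio comfortably exceeds 10:1, so the blocks would be adequate for the check; checking the same mic against another mic of the same grade (ratio 1:1) would tell you almost nothing.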

Andrew Johnston 12/11/2020 19:30:13

I'm mystified as to how one can check zero with a length standard, unless it's of zero thickness?

To set zero I simply close the micrometer anvils in the normal manner and tweak the thimble to read zero. I then use length standards, gauge blocks or screw-together length bars (depending on what I have and the size of the micrometer) to check the full-scale reading. I might also check some intermediate values as a sanity check on linearity.

Of course it's not up to toolroom standards, but it's plenty good enough for what my work needs.

Andrew

Zan 12/11/2020 19:44:35

Andrew. He said......“ I have bought some second hand 2", 3" and 4" micrometers. Is there any way to check that they "zero" correctly without using a block of known size?”

The only way to zero a 2” micrometer is with a 1” standard; you are referring to a 1” micrometer.

 

Edited By Zan on 12/11/2020 19:45:27

peak4 12/11/2020 20:34:43

I guess some of the answer will depend upon whether you're making stuff purely for yourself, or for someone else to use at the far end. If your "customer" needed calibrated standards for the parts, you wouldn't need to be trying this yourself.
I discussed this with a friend of mine, who unlike me, is a proper engineer. His previous occupations included engineering and managerial roles in making micrometers, other precision measuring gear, surface plates etc. as well as being a consultant for NAMAS.

The general conclusion was that for my home use, consistency within the workshop is more important than absolute accuracy, so long as we are "close enough", for want of a better expression.
By the latter I mean, that it's no use making a hole or shaft that doesn't fit a bought in bearing for example.
Essentially, at home I'm making something which will fit something else I already have, or will then subsequently make.

I had a variety of second hand mics, of dubious origins, from 0-6" with no standard length bars; I did eventually buy a 1-2" with a 1" round disk standard in the box (I'd no idea how accurate that was).
I'm very much aware that my only methods of checking things for consistency could lead to a cumulative error, but had little choice. Using my rather dodgy methods, I found several needed a little adjustment.

My 0-1" was set to zero as normal, and checked wide open with the 1" standard from the 1-2" mic. All seemed to be OK.

The same standard set the 1" end of the 1-2" mic. Ball bearings are likely to be made to close tolerance, so a 1/2" ball really should be close to 0.500" for a mid-range wear/sanity check.

Find something that's got a good surface finish and is very close to 1" & measure with the smaller mic; record the average of several readings. (maybe use the foot of one of your best squares, or even an unworn part of a lathe bed.)
Add the 1" standard next to it, and measure the total with the wide end of the 1-2", which should tally with the sum of the two readings.

Having adjusted/"proved" the 1-2" mic at both ends, find something that's almost exactly 2" long/diameter, and measure it with the wide end of the 1-2", and then the short end of the 2-3" and make sure there is consistency between the two instruments.

Carry on and work through your external mics, doing sanity checks with internal mics as you work through everything. Your newly made/measured home 2" standard doesn't have to be exactly 2", of course, provided both the 1-2" and 2-3" mics measure it as the same 1.994".

Just keep on thinking about cumulative errors and ways to mitigate them.
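To get a feel for how those cumulative errors grow over a chain of transfers, here is a rough sketch; the per-transfer uncertainties are invented figures for illustration, not measurements from any real mic.

```python
import math

# Hypothetical per-transfer uncertainties (inches) for a chain where each
# mic is checked against something measured by the one below it.
step_errors = [0.0002, 0.0002, 0.0003, 0.0003]

# Worst case: every transfer errs in the same direction, so errors add.
worst_case = sum(abs(e) for e in step_errors)

# More likely: independent errors partially cancel, adding in quadrature.
likely = math.sqrt(sum(e * e for e in step_errors))

print(f"worst case after {len(step_errors)} transfers: {worst_case:.4f} in")
print(f"root-sum-square estimate: {likely:.4f} in")
```

The point is simply that the worst-case figure grows linearly with the number of transfers, while the likely figure grows more slowly, which is why fewer transfer steps (or a direct check against gauge blocks) is always preferable.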

Since you are the only one using them, you will have set everything to your own "feel" as well.

Last year I picked up a set of gauge blocks in good condition; most of the ones I've used still wring together nicely.
Their first job was to check my guesstimated range of mic settings, done several years previously; I barely needed to tweak anything at all, certainly less than half a thou at the very most.

On the other hand, I recently picked up several larger mics up to I think 10".
Checking these out with my newly acquired gauge blocks, all but one needed tweaking by up to several thou.
I've no idea if it was due to wear, or being dropped, and certainly don't have things like optical flats to check for lack of parallelism of the anvils. They are though now usable for anything I'm ever likely to need.

Bill

Vic 12/11/2020 20:35:33

I believe you need a Micrometer Calibration Gauge Block Set to accurately check them. Unlike ordinary sets they contain specific sizes for the job. Checking with a single size is apparently pointless, but I’m no metrology expert.

Gauge Block Set

James Alford 12/11/2020 21:53:38

Thank you for all of the replies, which will be really useful. It had never occurred to me to think about how to check these micrometers when I bought them. I have a 1" micrometer which I have set by closing the anvils and tweaking the barrel.

Anything that I measure is purely for my own use, not for anyone else, so extreme accuracy is not needed. I bought them so that I could measure things like the journals on my crankshaft and other similar car parts.

I shall look for some gauges or, as suggested, large ball bearings.

Regards,

James.

Bill Davies 2 12/11/2020 22:35:32

Not far from Stuart's employment, I did an HNC unit on metrology at Brookland's College. My employer had multiple inspection departments and a temperature control standards room, which I worked in for a while.

For our purposes, it's worth remembering that the micrometer, length standards and workpieces are likely to be steel, so we can largely ignore temperature control, as all will expand or contract to the same extent. Allow a few hours for everything to settle, and handle everything as little as possible, to avoid differentially heating items with your hands. Use cloths as insulators.
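As a rough back-of-envelope check on the size of that handling effect (the expansion coefficient below is a typical textbook value for carbon steel, assumed here rather than quoted from any datasheet):

```python
ALPHA_STEEL = 11.5e-6  # per deg C, typical for carbon steel (assumed value)

def expansion(length_in, delta_t_c, alpha=ALPHA_STEEL):
    """Change in length (inches) for a temperature change in deg C."""
    return length_in * alpha * delta_t_c

# A 4" standard warmed 3 deg C by handling grows by roughly a tenth of a
# thou - already comparable to the finest graduation on many mics:
growth_thou = expansion(4.0, 3.0) * 1000
print(f"{growth_thou:.3f} thou")
```

Which is exactly why the advice above is to let everything soak to the same temperature and to handle the pieces through a cloth: when mic and standard are at the same temperature, the two expansions cancel.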

Length standards that leave the thimble in different positions from the zero will check for periodic error, but I would generally say that simply checking at a known length will suffice for us. In the workshops I worked in, there was always a series of discs, in one-inch steps, to check micrometers and calipers against. The most likely problem is a dropped measuring instrument, which brings the jaws closer together and gives an identical error at all positions. The easiest solution is to buy a new instrument.

I would caution against using ball bearings, as they have a theoretical point contact, and unless using a light feel, the ratchet will tend to cause the size to read slightly under - especially smaller balls. Cylindrical shapes are similarly a potential problem, but less so, due to a theoretical line contact.

Bill

Hopper 12/11/2020 23:14:50
Posted by James Alford on 12/11/2020 17:34:54:

I have bought some second hand 2", 3" and 4" micrometers. Is there any way to check that they "zero" correctly without using a block of known size?

Regards,

 

James.

Yes there is. But you also need a 0 to 1" mike.

Set the 0-1" mike to read zero when the anvils are closed. Then use that mike to turn a piece of bar to read exactly 1.000" diameter. (Use emery paper to achieve final size.)

Then use that bar to set your 1-2" mike to read exactly 1.000". Then use that mike to turn a piece of bar to read exactly 2.000" diameter, and use that to set the 2-3" mike.

And so forth and so on.

You can double-check your mikes by measuring the outside race diameter of ball bearings. They are made to pretty tight tolerances, which are available online on the manufacturers' websites.

Not toolroom metrology standard for sure, as those of a sensitive nature are sure to point out. But good enough for most home workshop use in a pinch. The only time it is likely to be critical is fitting together two parts such as boring a 1.500" cylinder to fit a 1.499" piston. But if both dimensions are measured with the same mike, you will get the desired clearance anyway.
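The point about both dimensions being measured with the same mike can be shown with a toy example; the zero error below is invented purely for illustration.

```python
ZERO_ERROR = 0.0005  # hypothetical fixed offset: this mic reads 0.0005" high

def mic_reading(true_size):
    """What the mis-zeroed mic displays for a given true size."""
    return true_size + ZERO_ERROR

bore_true = 1.5000
bore_reading = mic_reading(bore_true)      # the mic shows 1.5005"
target_reading = bore_reading - 0.0010     # turn the piston until the SAME mic reads this
piston_true = target_reading - ZERO_ERROR  # the size actually produced

clearance = bore_true - piston_true
print(f"actual clearance: {clearance:.4f} in")  # the fixed zero error cancels out
```

The half-thou offset appears in both readings and drops out of the difference, so the fit comes out as intended even though neither absolute measurement was right. The cancellation only works for a fixed offset; leadscrew wear that varies along the range would not cancel this way.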

The main reason industry sets mikes in air-con metrology rooms etc is to ensure consistency of parts size in mass production and so that parts made by different machinists will all fit together. So not necessarily relevant in a one-man home workshop where things are usually individually hand-fitted together.

 

 

Edited By Hopper on 12/11/2020 23:18:21

IanT 13/11/2020 00:07:59

Someone may have made this point already, James, but I've only quickly skimmed the previous posts.

I've now used slip gauges to check my larger mics (good enough for my purposes) but had "uncalibrated" mics (that I had just cleaned up) for a good while before I did that.

If you are simply using your mics to make comparative measurements (and not absolute ones) then you don't need to know if they are zeroed or not. I should state that I'm not after lab accuracy either, just "good enough".

For instance, if you are checking for wear in two parallel flat surfaces (e.g. a Myford bed) then the mic will measure the difference between the two for you, and be accurate. What you cannot do is measure what the actual width is with any confidence, just the difference in width. But for many things that is fine and still useful (till you get the setting gauges, anyway). I was surprised at how inexpensive the gauges were too, btw; it's been a while, but a 1" one was about £9 I think.

Regards,

IanT

Edited By IanT on 13/11/2020 00:08:36

duncan webster 13/11/2020 00:35:43

Big ball races are made to very tight tolerance on OD, better than I can work to. If you haven't got any in your 'come in handy' box, try your local garage for old wheel bearings etc.

Actually for most model engineering applications you measure the bit you've just made/bought and machine the next bit to fit, so calibration to external standards isn't that important. (Waits for the howls of protest)

Michael Gilligan 13/11/2020 07:34:06
Posted by duncan webster on 13/11/2020 00:35:43:

.

(Waits for the howls of protest)

.

No protest from here, Duncan; you make a valid point.

Regarding the broader discussion, however:

It is worth noting that ... despite its title this thread started with a simple question about setting the zero reference for some larger micrometers ... not about fully calibrating them.

MichaelG
