
Calibration of measurement equipment

andreasmaaan

Major Contributor
Forum Donor
Joined
Jun 19, 2018
Messages
4,845
Likes
4,936
#1
I’m posting here to try to find out more about how the development of measurement equipment (e.g. audio analysers, measurement mics, etc.) tries to address the inherent conundrum that any measurement device must be calibrated to another measurement device which in turn must be calibrated to another measurement device, ad infinitum...

I’m sure there are practical ways to attempt to deal with this problem, but I’m curious as to whether anyone knows in any detail precisely how the baseline is obtained for something like the APX555 or any other field-leading device.

Thx,
Andreas
 

Wombat

Major Contributor
Joined
Nov 5, 2017
Messages
6,025
Likes
4,882
Location
Australia
#3
The stated accuracy requirement limits that chain.
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
1,678
Likes
3,635
Location
Suffolk UK
#4
Calibration labs ultimately go back to fundamental quantities, and everything else is derived from those. As Wombat mentioned, it then becomes a question of accuracy: the more steps removed one is from a fundamental quantity, the greater the uncertainty.
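To illustrate how uncertainty grows down such a chain, here is a minimal sketch (the relative uncertainties are invented for illustration, not real lab figures) that combines each step's uncertainty in quadrature, assuming the errors at each step are independent:

```python
import math

# Illustrative relative uncertainties (as fractions) at each step of a
# hypothetical traceability chain, from national standard down to bench meter.
chain = {
    "national standard": 1e-6,
    "cal-lab reference": 1e-5,
    "working standard": 1e-4,
    "bench multimeter": 1e-3,
}

# Assuming independent errors, the combined uncertainty at the bottom of the
# chain is the root-sum-of-squares of all the steps above it.
combined = math.sqrt(sum(u**2 for u in chain.values()))
print(f"combined relative uncertainty: {combined:.3e}")
```

Note that the total is dominated by the weakest (bottom) link, which is why each step only needs a reference modestly better than the instrument it checks.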

Quis custodiet ipsos custodes?

I have five meters that indicate AC volts, and just this week I put them all in parallel and measured the output of one of my signal generators. I used 50 Hz as the test frequency, as three of the meters were multimeters designed for mains frequencies rather than audio. All five readings were different, but within 2-3% of each other, so good enough for the purpose; I have no idea which, if any, are correct. I have one main meter that I use most and tend to go by that, on the basis that it's well within 1 dB of any of the others, which is good enough for hobby purposes, if not necessarily for paid work, which I don't do.
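The cross-check described above can be sketched as follows (the readings are hypothetical numbers for illustration; without a traceable reference there is no way to say which meter is "correct"):

```python
# Hypothetical AC-volt readings from five meters measuring the same
# 50 Hz signal in parallel (values invented for illustration).
readings = [2.000, 1.985, 2.031, 1.992, 2.048]

mean = sum(readings) / len(readings)

# Report each meter's deviation from the mean, in percent. Agreement
# among the meters bounds their spread but proves nothing about which
# one, if any, reads the true value.
for i, v in enumerate(readings, 1):
    print(f"meter {i}: {v:.3f} V  ({100 * (v - mean) / mean:+.2f}% from mean)")
```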

The greatest deviation was from a very old AVO 8 on AC volts, although on DC volts it agreed very closely with the other two multimeters, one analogue, one digital.

When I worked for a Test and Measurement equipment manufacturer, calibration of our instruments was a very useful income stream, as professional users like the BBC used to get their kit calibrated regularly. We calibrated against a standard chemical cell that was itself calibrated by the National Physical Laboratory or one of their agents.

Fascinating subject this.


S
 
OP
andreasmaaan

Major Contributor
Forum Donor
Joined
Jun 19, 2018
Messages
4,845
Likes
4,936
Thread Starter #5
Calibration labs ultimately go back to fundamental quantities, and everything else is derived from those.
What would the fundamental quantities be in respect of something like an Audio Precision Analyser?
 

March Audio

Major Contributor
Manufacturer
Joined
Mar 1, 2016
Messages
5,904
Likes
7,595
Location
Albany Western Australia
#6
I’m posting here to try to find out more about how the development of measurement equipment (e.g. audio analysers, measurement mics, etc.) tries to address the inherent conundrum that any measurement device must be calibrated to another measurement device which in turn must be calibrated to another measurement device, ad infinitum...

I’m sure there are practical ways to attempt to deal with this problem, but I’m curious as to whether anyone knows in any detail precisely how the baseline is obtained for something like the APX555 or any other field-leading device.

Thx,
Andreas

This is my background: test and measurement with Rolls-Royce Aero Engines. RR has so many measuring instruments and sensors in the company (possibly tens of thousands) that they have their own in-house calibration labs in Bristol and Derby.

In its simplest terms, any instrument is periodically checked against a more accurate reference instrument (see about 25 minutes into the video below; the reference might typically be 10x more accurate, i.e. lower uncertainty). The reference is periodically checked against a further, more accurate instrument, and this chain extends up to a national standard maintained by the country's metrology authority. The period between checks is determined by MTBF: it starts small and is increased over time.

All instruments in RR are meticulously calibrated, with records maintained of where and when each has been used. If any instrument subsequently fails a calibration check, a process is invoked in which the data it collected is checked to ascertain whether any significant error or problem could affect product quality. Don't want your engine going tits up at 30,000 ft as a result :)
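The "10x more accurate reference" rule of thumb is often expressed as a test accuracy ratio (TAR): the unit under test's tolerance divided by the reference's uncertainty. A minimal sketch, with invented numbers (in mV):

```python
def passes_tar(unit_tolerance: float, reference_uncertainty: float,
               required_ratio: float = 10.0) -> bool:
    """Check whether a reference is tight enough to calibrate a unit.

    unit_tolerance: allowed error of the unit under test.
    reference_uncertainty: uncertainty of the reference (same units).
    """
    return unit_tolerance / reference_uncertainty >= required_ratio

# A bench meter specified to +/-10 mV, checked against a +/-1 mV reference:
print(passes_tar(10, 1))   # 10:1 ratio -> acceptable
# The same meter against a +/-3 mV reference:
print(passes_tar(10, 3))   # ~3.3:1 -> not acceptable at a 10:1 requirement
```

In practice many labs accept a 4:1 ratio; the required ratio is a policy choice, not a law of nature.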

Below is a link to a video about Keysight's (formerly Agilent, Hewlett-Packard) Australian mobile cal lab.


So with an AP unit, a reference signal generator would be used to input signals of known level and accuracy, and the values the AP reads would be checked against them. There will be a manufacturer-stated tolerance limit. In reality, nothing we do here is that critical, and modern instruments such as the AP are generally very stable. Amir's AP would have been factory tested, and he may have a calibration report.
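A spot check like the one described reduces to comparing each reading against the applied level within the stated tolerance. A sketch with invented levels and an invented tolerance figure, not Audio Precision's actual specification:

```python
# Hypothetical spot check of an analyser's level readings against a
# reference generator (values and tolerance invented for illustration).
tolerance_db = 0.05  # assumed manufacturer-stated level accuracy, +/- dB

# Pairs of (applied level in dBV, analyser reading in dBV).
checks = [(0.0, 0.01), (-20.0, -20.02), (-60.0, -60.04)]

for applied, read in checks:
    # A point passes if the reading is within the stated tolerance band.
    ok = abs(read - applied) <= tolerance_db
    print(f"{applied:+6.1f} dBV: read {read:+7.2f}, {'PASS' if ok else 'FAIL'}")
```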

Agilent Melbourne Cal Lab

 

OP
andreasmaaan

Major Contributor
Forum Donor
Joined
Jun 19, 2018
Messages
4,845
Likes
4,936
Thread Starter #8
This is my background: test and measurement with Rolls-Royce Aero Engines. RR has so many measuring instruments and sensors in the company (possibly tens of thousands) that they have their own in-house calibration labs in Bristol and Derby. In its simplest terms, any instrument is periodically checked against a more accurate reference instrument (see about 25 minutes into the video below). The reference is periodically checked against a further, more accurate instrument, and this chain extends up to a national standard maintained by the country's metrology authority. The period between checks is determined by MTBF: it starts small and is increased over time. All instruments in RR are meticulously calibrated, with records maintained of where and when each has been used. If any instrument subsequently fails a calibration check, a process is invoked in which the data it collected is checked to ascertain whether any significant error or problem could affect product quality. Don't want your engine going tits up at 30,000 ft as a result :)

Below is a link to a video about Keysight's (formerly Agilent, Hewlett-Packard) Australian mobile cal lab.


So with an AP unit, a reference signal generator would be used to input signals of known level and accuracy, and the values the AP reads would be checked against them. There will be a manufacturer-stated tolerance limit. In reality, nothing we do here is that critical, and modern instruments such as the AP are generally very stable. Amir's AP would have been factory tested, and he may have a calibration report.
Thanks for that. Will watch the video over the weekend.

Given this is the case, I'm wondering then how, or against what, the reference signal generator itself is calibrated?
 
OP
andreasmaaan

Major Contributor
Forum Donor
Joined
Jun 19, 2018
Messages
4,845
Likes
4,936
Thread Starter #11
Thanks for the input everyone.

I'm still not sure I understand what the ultimate references are here.

For example, if I want to know whether my lump of ice cream weighs a kilo, I can weigh it with scales calibrated against "Big K" (or scales that have been calibrated with reference to a device calibrated... against Big K).

In audio, I guess the ultimate references are clocks when it comes to the time/frequency domains. Is this right? If so, which clock(s), and on what basis are they considered reference level?

When it comes to sound pressure and electrical amplitude I have even less idea...
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
28,548
Likes
75,597
Location
Seattle Area
#13
In audio, I guess the ultimate references are clocks when it comes to the time/frequency domains. Is this right? If so, which clock(s), and on what basis are they considered reference level?
If you read the calibration report that SIY posted, you see all the measurements that are calibrated/tested. It includes such things as levels, frequencies, impedances (input and output), etc.

Every device is calibrated against one of higher accuracy than itself. And at the top of the chain, there are exotic dedicated sources that are not general purpose.
 
OP
andreasmaaan

Major Contributor
Forum Donor
Joined
Jun 19, 2018
Messages
4,845
Likes
4,936
Thread Starter #14
If you read the calibration report that SIY posted, you see all the measurements that are calibrated/tested. It includes such things as levels, frequencies, impedances (input and output), etc.

Every device is calibrated against one of higher accuracy than itself. And at the top of the chain, there are exotic dedicated sources that are not general purpose.
I don't doubt this :)

I guess I'm not so much interested in knowing that it's done correctly (which I'm sure it is) as in the theoretical basis behind it. In other words, from what are the ultimate references derived, and on what rationales are they based?
 

SIY

Major Contributor
Technical Expert
Joined
Apr 6, 2018
Messages
4,892
Likes
9,834
Location
Phoenix, AZ
#15
If you poke around on www.nist.gov you'll find more information than you could ever want to know. I think that what you're asking about is primary standards, and they have a great deal of information related to that.

There are also ISO documents on certifying lab standards to primary standards. I've taken a couple of labs through ISO 10012 and ISO 17025, and the documents related to those are very clear and worth reading if you're curious about translating things like atomic spectral lines into audio-frequency measurement.
 

maverickronin

Major Contributor
Joined
Jul 19, 2018
Messages
1,316
Likes
1,259
Location
Midwest, USA
#16
In audio, I guess the ultimate references are clocks when it comes to the time/frequency domains. Is this right? If so, which clock(s), and on what basis are they considered reference level?
To get around that problem, the second has been redefined in terms of the vibration of a caesium atom. Similarly, distance is now defined via the speed of light, and people are working on better definitions of the kilogram as well.

The theoretical basis is that, by using a definition based on precise natural phenomena or physical constants, anyone can follow the definition to produce a standard to calibrate against.
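For frequency, this chain is unusually direct: the SI second is defined by the caesium-133 hyperfine transition at exactly 9,192,631,770 Hz, and an audio generator's frequency accuracy is just its timebase's fractional error relative to a clock compared back to that standard. A minimal sketch (the ppm figure is an invented example, not any particular instrument's spec):

```python
# The SI second is defined via the caesium-133 hyperfine transition:
CAESIUM_HZ = 9_192_631_770  # Hz, exact by definition

def tone_error_hz(nominal_hz: float, timebase_ppm: float) -> float:
    """Worst-case frequency error of a generated tone, given the
    fractional error (in parts per million) of the instrument's
    timebase, which is ultimately compared back to a caesium-derived
    frequency standard."""
    return nominal_hz * timebase_ppm * 1e-6

# A 1 kHz test tone from a generator with an assumed +/-2 ppm timebase:
print(tone_error_hz(1000.0, 2.0))  # about 0.002 Hz worst case
```

This is why frequency is usually the best-known quantity on an audio analyser's calibration report: the reference is a defined constant rather than an artefact.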
 
OP
andreasmaaan

Major Contributor
Forum Donor
Joined
Jun 19, 2018
Messages
4,845
Likes
4,936
Thread Starter #17
To get around that problem, the second has been redefined in terms of the vibration of a caesium atom. Similarly, distance is now defined via the speed of light, and people are working on better definitions of the kilogram as well.

The theoretical basis is that, by using a definition based on precise natural phenomena or physical constants, anyone can follow the definition to produce a standard to calibrate against.
Nice, thanks :)

If you poke around on www.nist.gov you'll find more information than you could ever want to know. I think that what you're asking about is primary standards, and they have a great deal of information related to that.

There are also ISO documents on certifying lab standards to primary standards. I've taken a couple of labs through ISO 10012 and ISO 17025, and the documents related to those are very clear and worth reading if you're curious about translating things like atomic spectral lines into audio-frequency measurement.
And also really helpful :)
 


Speedskater

Addicted to Fun and Learning
Joined
Mar 5, 2016
Messages
639
Likes
422
Location
Cleveland, Ohio USA
#19
Because few of us are writing legally binding documents, we really don't need traceable calibration. Most modern test instruments will remain in or near calibration for a very long time. At work I spent a lot of time sending all the test equipment to an independent cal lab, and seldom did they ever adjust anything.

But calibration microphones are a different story.
 

Blumlein 88

Major Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
9,874
Likes
13,360
#20
I don't doubt this :)

I guess I'm not so much interested in knowing that it's done correctly (which I'm sure it is) as in the theoretical basis behind it. In other words, from what are the ultimate references derived, and on what rationales are they based?
So I'm thinking you're looking for something like this: knowing that water freezes at 32°F and boils at 212°F, you could do some calibration of a thermometer by checking its readings in an ice slurry and a slowly boiling pot of water. Is this correct?

So you want to know the physical, theoretical basis for SPL or amps or volts, something like that? Not so much how calibration is done, or that it's correct, but the idea it is based upon for the different things we measure in audio.
 