
How to measure EIN (Equivalent Input Noise) of an interface's Mic preamp

I don't get it; what does the max input level have to do with this?
What I don't see here is a way to find the relationship between dBu and dBFS.
I would expect you'd have to measure that somehow, by applying a known dBu value and measuring the resulting dBFS?
What am I missing here?
This simplified method relies on the supplier's specs.
The dBFS-to-dBu relation, in that case, is derived from the max input level (a value one usually finds in the specs), which applies at a given gain, plus the (maximum) gain at which we want to evaluate EIN.

In the second method only, we will measure this relation between dBu and dBFS.
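To make the simplified method concrete, here is a minimal Python sketch with made-up example numbers (max input level, gain range, measured noise floor); it assumes the spec'd max input level applies at minimum gain, which is how interface specs usually state it.

```python
# Minimal sketch of the spec-based method (all values hypothetical).
# Assumption: the spec'd max input level applies at minimum gain.

max_input_dbu    = 16.0    # max input level from the specs (dBu), at min gain
gain_range_db    = 56.0    # additional gain applied for the EIN measurement
noise_floor_dbfs = -101.0  # measured RMS noise floor, input terminated (dBFS)

# At this gain setting, 0 dBFS corresponds to (max_input_dbu - gain_range_db) dBu
dbfs_to_dbu_offset = max_input_dbu - gain_range_db
ein_dbu = noise_floor_dbfs + dbfs_to_dbu_offset
print(f"EIN = {ein_dbu:.1f} dBu")   # -> EIN = -141.0 dBu
```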
 
Calibrated on a serious LCR meter, wouldn't the use of a high-wattage non-inductive resistor do the trick to get away from the heating considerations?
There is no current flowing through the resistor, so I don't think resistor wattage would make any difference.
The temperature shift is just due to thermal conduction from air and from the interface to the plug/resistor.
 
There is no current flowing through the resistor, so I don't think resistor wattage would make any difference.
The temperature shift is just due to thermal conduction from air and from the interface to the plug/resistor.
I've used the resistor at the end of a half meter of balanced cable. It doesn't seem to cause any issues. My results seem reasonable. This gets the resistor away from the interface and puts it at room temperature. I suppose I could try it with and without the cable to see if it makes any difference.
 
In a sense, you're measuring a mixture of the current noise and the voltage noise. Why not measure each separately (most usefully as input-referred noise density), then for any given source, you can determine the total input-referred noise?
 
In a sense, you're measuring a mixture of the current noise and the voltage noise. Why not measure each separately (most usefully as input-referred noise density), then for any given source, you can determine the total input-referred noise?
How do you do that?
 
The voltage noise is obtained by doing the input shorted measurement and dividing the output noise by gain to get input-referred noise. You have to back into the current noise by measuring with the input open, getting the input referred open circuit voltage noise (dividing by gain), and dividing that noise by the input impedance (hopefully resistive!). The noise spectra can be converted to noise densities for ease of calculation with arbitrary source impedances.
 
The voltage noise is obtained by doing the input shorted measurement and dividing the output noise by gain to get input-referred noise. You have to back into the current noise by measuring with the input open, getting the input referred open circuit voltage noise (dividing by gain), and dividing that noise by the input impedance (hopefully resistive!). The noise spectra can be converted to noise densities for ease of calculation with arbitrary source impedances.
I should have also said that the open-circuit measurement needs to have the voltage noise subtracted out in a root-difference-of-squares calculation to isolate the current noise. Mea culpa.

Here's a nice article outlining the calculations. The series resistor for the current noise, in this case, is the input impedance.
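For what it's worth, here is a rough Python sketch of that arithmetic with hypothetical readings. One addition to the recipe as stated: the open-input reading also contains the Johnson noise of the input impedance itself, so the sketch subtracts that term as well before dividing by Z_in (assumed purely resistive).

```python
import math

K_B = 1.380649e-23   # Boltzmann constant (J/K)

# Hypothetical input-referred RMS readings over a 20 kHz bandwidth
# (output noise already divided by the preamp gain):
en_short_v = 0.28e-6   # input shorted -> voltage noise e_n alone (V RMS)
en_open_v  = 1.11e-6   # input open    -> e_n + Z_in Johnson noise + i_n * Z_in
z_in_ohm   = 3000.0    # input impedance, assumed purely resistive
bw_hz      = 20000.0
temp_k     = 295.0

# The open-input reading also contains the Johnson noise of Z_in itself, so
# remove it together with e_n (root-difference-of-squares), then divide by Z_in:
johnson_zin_v = math.sqrt(4 * K_B * temp_k * z_in_ohm * bw_hz)
in_times_zin  = math.sqrt(max(en_open_v**2 - en_short_v**2 - johnson_zin_v**2, 0.0))
i_n_a = in_times_zin / z_in_ohm

print(f"e_n ~ {en_short_v / math.sqrt(bw_hz) * 1e9:.1f} nV/rtHz, "
      f"i_n ~ {i_n_a / math.sqrt(bw_hz) * 1e12:.2f} pA/rtHz")
```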
 
The voltage noise is obtained by doing the input shorted measurement and dividing the output noise by gain to get input-referred noise.
That's our EIN "Short".

You have to back into the current noise by measuring with the input open, getting the input referred open circuit voltage noise (dividing by gain), and dividing that noise by the input impedance (hopefully resistive!).
Open circuit noise I can get easily.
Input impedance, I'd need to measure.

The noise spectra can be converted to noise densities for ease of calculation with arbitrary source impedances.
Interesting.
 
That's our EIN "Short".
Exactly. For low source impedances, it's usually the dominant term. This is especially true for FET input circuits. Once current noise is determined, then the total input-referred noise is just the RMS sum of the source Johnson noise, the current noise times source impedance, and input voltage noise.
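As a sanity check, a small Python sketch of that RMS sum, using assumed noise densities (e_n = 1.5 nV/rtHz, i_n = 1 pA/rtHz) and a 150 ohm source over a 20 kHz bandwidth:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant (J/K)

def total_input_noise_dbu(en_density, in_density, r_source,
                          bw_hz=20000.0, temp_k=295.0):
    """RMS sum of source Johnson noise, i_n * R_source and e_n, returned in dBu."""
    e_johnson = math.sqrt(4 * K_B * temp_k * r_source * bw_hz)
    e_n  = en_density * math.sqrt(bw_hz)
    e_in = in_density * math.sqrt(bw_hz) * r_source
    total_v = math.sqrt(e_johnson**2 + e_n**2 + e_in**2)
    return 20 * math.log10(total_v / 0.7746)   # 0 dBu = 0.7746 V RMS

# Hypothetical preamp: e_n = 1.5 nV/rtHz, i_n = 1 pA/rtHz, 150 ohm source
print(f"{total_input_noise_dbu(1.5e-9, 1e-12, 150.0):.1f} dBu")   # ~ -128.0 dBu
```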
 
I sometimes use FS/V when comparing the sensitivity of ADCs.
 

Attachments: 01_focusrite_2i2_2nd_01.jpg · 10_ff_2i2_min_db.png · 11_ff_2i2_half_db.png · 12_ff_2i2_max_db.png
I've used the resistor at the end of a half meter of balanced cable. It doesn't seem to cause any issues.
It's better, indeed.
Then you just compensate for room temperature and resistance, which is pretty straightforward, to get normalized results.
 
If you are all OK with this method, I'll start a thread asking everyone to publish his own measurements following it.
Then we could build a more exhaustive EIN database.
 
I added some content about the accuracy of a DMM, or of a combination of a DMM and an interface.
 
That doesn't impact the methods you outlined. A good way to check the line-level accuracy of your DMM is against a DAC Amir has measured. I'm sure they vary a bit, but likely not too much. For instance, I have a Topping D10B which he measured at 4.237 volts out.

Comparing three multimeters I have on hand:

An old Radio Shack model with the RS-232 port measured 0.4% low. It read consistently (within the number of digits you could resolve) down to a surprising -60 dB from that level. I did this first at 60 Hz and then in steps up to 1 kHz, where it was still consistent. It drooped a bit from 2 kHz upward.

A recent Klein MM700 (less than $100 new) was 0.5% high. It was also consistent up to 1 kHz and drooped more rapidly at 2 kHz and above. It was not particularly good at lower levels, losing accuracy once you went below -20 dB.

An ancient Simpson 260 analog meter was 1.4% low. At lower levels it was as consistent as you could read down to -20 dB, but you cannot really read it below that. It was consistent at higher frequencies up to 15 kHz (I didn't measure higher).
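For reference, those percentage deviations expressed in dB (just 20·log10 of the ratio):

```python
import math

# Level deviations quoted above, converted to dB relative to the reference level
for name, pct in [("Radio Shack", -0.4), ("Klein MM700", +0.5), ("Simpson 260", -1.4)]:
    db = 20 * math.log10(1 + pct / 100)
    print(f"{name}: {pct:+.1f}% -> {db:+.3f} dB")
```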
 
Exactly. For low source impedances, it's usually the dominant term. This is especially true for FET input circuits. Once current noise is determined, then the total input-referred noise is just the RMS sum of the source Johnson noise, the current noise times source impedance, and input voltage noise.
And it doesn't make any difference what type of resistor you use in this case. All of them (carbon film, metal film, thick-film) have the same thermal noise.
The "quality" of a resistor comes into play once current is flowing in the resistor -> excess current noise. See e.g.

To me it's a pain to see tiny 0402 thick-film SMD resistors in the reference voltage divider for the DAC. The excess current noise has a 1/f character and thus filtering isn't easy.
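To put that in numbers, a quick Python sketch of the Johnson noise density for a few resistor values; the resistor technology does not enter into it at all:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant (J/K)

def johnson_density_nv(r_ohm, temp_k=295.0):
    """Thermal noise density of a resistor in nV/rtHz; only R and T matter,
    not whether it is carbon film, metal film or thick film."""
    return math.sqrt(4 * K_B * temp_k * r_ohm) * 1e9

for r in (150.0, 1e3, 10e3):
    print(f"{r:>7.0f} ohm: {johnson_density_nv(r):.2f} nV/rtHz")

# Excess (1/f) current noise is a separate, technology-dependent term that only
# appears once DC current flows through the resistor; it is not modelled here.
```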
 
If you are all OK with this method, I'll start a thread asking everyone to publish his own measurements following it.
Then we could build a more exhaustive EIN database.
What all information will you include in the database? EIN obviously.

The largest collection of EIN info I know of on the internet is Julian Krause's. I don't know his method of obtaining it, but it will be useful for comparing his results on the same gear in a few cases. He does not just list the specs from the manufacturer; he actually tests it. He uses a Spectral dScope M1 for his measurements. I'm not familiar with the workings of the dScope; maybe it has an automated test suite for such things. Julian posts here occasionally, so I suppose we could ask him about his procedure.
 
It's better, indeed.
Then you just compensate for room temperature and resistance, which is pretty straightforward, to get normalized results.
Temperature is not that critical for the final result. It's the absolute temperature and it appears under the root.
This means an error of less than 1.7%, or 0.14 dB, for 310 K vs 300 K.
(attached formula: thermal noise e_n = sqrt(4 · k_B · T · R · Δf), with T under the square root)
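Worked out in Python for the 300 K vs 310 K case:

```python
import math

t1, t2 = 300.0, 310.0
ratio = math.sqrt(t2 / t1)                    # noise voltage scales with sqrt(T)
print(f"{(ratio - 1) * 100:.1f} %")           # -> 1.7 %
print(f"{20 * math.log10(ratio):.2f} dB")     # -> 0.14 dB
```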
 
Temperature is not that critical for the final result. It's the absolute temperature and it appears under the root.
This means an error of less than 1.7%, or 0.14 dB, for 310 K vs 300 K.
Well, if you want accurate, reproducible results within 0.1 dB or so, you need to correct for it.

And it still gives a 0.1 to 0.2 dB shift over time.
But that's mainly due to the interface heating up, so the trick of using a 0.5 m cable works.
 
What all information will you include in the database? EIN obviously.
It would be very useful to a) derive the input voltage noise density from the EIN, and b) from that the noise factor / noise figure, since that is the only quantity that is actually human-readable.
If one device has -140 dBu EIN and another has -150 dBu, what does that tell us? One is better than the other, but how good or bad they are in absolute terms cannot be deciphered.
Isn't the amount of S/N degradation a device introduces the thing we really want to know? Noise figure is exactly that amount.
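A small Python sketch of that conversion, assuming a 150 ohm source and a flat 20 kHz bandwidth. The EIN values are hypothetical; note that an unweighted EIN cannot fall below the source's own thermal noise, which is roughly -131 dBu under these conditions.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant (J/K)

def noise_figure_db(ein_dbu, r_source=150.0, bw_hz=20000.0, temp_k=295.0):
    """Noise figure = EIN minus the thermal noise of the source resistance (in dB)."""
    source_v   = math.sqrt(4 * K_B * temp_k * r_source * bw_hz)
    source_dbu = 20 * math.log10(source_v / 0.7746)   # ~ -130.9 dBu for 150 ohm
    return ein_dbu - source_dbu

for ein in (-126.0, -129.0):   # hypothetical EIN values (dBu, unweighted, 150 ohm)
    print(f"EIN {ein} dBu -> NF {noise_figure_db(ein):.1f} dB")
```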
 