
How to measure EIN (Equivalent Input Noise) of an interface's Mic preamp

Rja4000 (Major Contributor, Forum Donor; Liège, Belgium; joined May 31, 2019)
For audio interfaces with Mic preamp, low level / high gain noise is a key performance parameter.
I therefore always measure EIN when reviewing an interface.

But what is EIN, and how do we measure it ?
I thought it was worth adding a proposed methodology that everybody could use.
Of course, this is subject to discussion and improvement.




Equivalent Input Noise

Your Mic preamp outputs some noise, for a given gain.

Now imagine for a minute a perfect preamp, adding no noise at all by itself, with the exact same gain.
The Equivalent Input Noise is the noise voltage that would give the same output noise level as your real mic preamp
if it was provided by a generator at the input of this ideal, noise-free preamplifier and amplified by the same amount of gain.

Why is this useful for measuring mic preamp noise ?

If you measure SNR on a mic preamp, the value will vary greatly with the preamp gain.
Example: at 60dB gain, your SNR will be 20dB lower than at 40dB gain, if the preamp self-noise is the same,
simply because you amplify the noise 20dB more.
Therefore, linking SNR to the actual preamp noise is not easy.

In real life, in practice, we'd like to know what a given source SPL, at a given distance, with a given mic,
will give us in terms of noise at the output of the preamp.
We want to compare this noise for different amplifiers.
And we'd also like to compare the noise for different gains.

That's what the EIN gives us.
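To make that concrete, here is a small Python sketch comparing a hypothetical mic signal to a preamp's EIN (the mic sensitivity and SPL figures are made up for illustration):

```python
import math

def dbu_to_vrms(dbu: float) -> float:
    # 0 dBu = 0.7746 V rms (1 mW into 600 ohm)
    return math.sqrt(0.6) * 10 ** (dbu / 20)

# Hypothetical figures: a dynamic mic with 1.85 mV/Pa sensitivity picking up
# a quiet 40 dB SPL source, feeding a preamp with -127 dBu EIN
mic_sensitivity_v_per_pa = 1.85e-3
pressure_pa = 20e-6 * 10 ** (40.0 / 20)        # 20 uPa is 0 dB SPL
signal_v = mic_sensitivity_v_per_pa * pressure_pa

ein_v = dbu_to_vrms(-127.0)                    # preamp noise, referred to input
snr_db = 20 * math.log10(signal_v / ein_v)
print(f"signal at input: {signal_v * 1e6:.2f} uV, EIN: {ein_v * 1e9:.0f} nV")
print(f"input-referred SNR: {snr_db:.1f} dB")
```

Because EIN is referred to the input, this SNR estimate holds at any gain setting, which is exactly what makes preamps directly comparable.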


How is EIN measured ?


To measure EIN, you'll need a test "load".
So you'll first need to:


Prepare test loads
  • Take 2 XLR-3 male plugs.
  • Solder a 150 ohm metal film resistor between pins 2 and 3 of one plug and mark it "150".
  • Solder a short piece of copper wire between pins 2 and 3 of the other plug and mark it "Short".
You'll also need software to measure the noise.
In this example, I'll use REW, but any good FFT measurement software should work.



1. The quick, simplified, less accurate method

First read the interface manual or specs

a. What's the maximum input level of the interface without pad or trim, and for what gain ?
Let's take an RME UCX II as an example: RME says max input level is 18dBu at 0dB gain for Mic inputs.

[Screenshot: RME UCX II Mic input specs]


b. What's max gain ?
Almost always, EIN is maximum at max gain.
So what's max gain ?
For the same RME UCX II, it's 75dB, as we may read above.


Make sure Phantom is OFF for the mic input you want to measure
Then measure

c. Insert the "150" XLR plug in the Mic input

[Photo: "150" test plug in the Mic input]


Set the gain to max.
Use REW or any FFT software, set the RTA to 20Hz-20kHz BW, "rectangle" window, 32 averages, and measure the noise (in dBFS).

[Screenshots: REW RTA settings and noise measurement]


Notes:
  • There is no signal, so SNR won't give you anything. Don't even look at the Noise level value. Look at the total dBFS rms value only.
    So for REW, use the top-right value.
  • It's been suggested to use a 0.5m mic cable to connect the plug to the Mic input, to avoid the impact of the interface's own temperature fluctuations.
    This does indeed provide more stable results.

d. Compute EIN

[EIN (dBu)] =
[Max input level (dBu)] - ( [Measured Gain (dB)] - [Gain for Max Level (dB)] ) + [Measured Noise Level (dBFS)]
= 18 - (75 - 0) + (-70)
= -127 dBu
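The quick method is a one-line computation; a minimal Python version, using the UCX II numbers from the text:

```python
def quick_ein_dbu(max_input_dbu, gain_for_max_db, measured_gain_db, noise_dbfs):
    """Refer the measured noise floor back to the preamp input."""
    return max_input_dbu - (measured_gain_db - gain_for_max_db) + noise_dbfs

# UCX II numbers from the text: 18 dBu max input at 0 dB gain,
# noise measured at 75 dB gain is -70 dBFS
print(quick_ein_dbu(18, 0, 75, -70))  # -127
```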



Simple, isn't it ?


Notes

  • Here, I measured un-weighted.
    If you measure your noise with A-weighting, your EIN will be A-weighted.
    In the example above, REW gives you both: the A-weighted noise level is -72 dBFS (A), so [EIN (A) (dBu)] = -129 dBu (A)
    (which, by the way, is 1 dB better than the specs)

  • This is an approximation, so don't get excited about the figures after the dot.


OK, now what if you want a more accurate value ?
That becomes a bit more complicated.



2. The accurate method

For this, you'll also need an (ideally calibrated) True RMS Digital Multimeter (DMM)

Precautions

EIN is usually measured with an input load connected that resembles a microphone.
Of course, we won't use an actual microphone, as it would pick up ambient noise.
We'll use a resistor instead.
Because each resistor generates its own thermal noise - a function of the resistance and of the temperature - we have to standardize the resistor value.
Usually, a 150 ohm resistor is used.


Also, the noise level is a function of the bandwidth you measure it for.
Usually, we'd consider 20Hz-20kHz

Some vendors use 10Hz-30kHz or other bandwidths.
If the noise profile is flat and the measurement is un-weighted, it's easy to estimate the BW impact.
If weighting is applied, it's a bit more tricky.

Finally, the temperature of the resistor should remain around a standardized value.
Usually 20°C.
This part is a bit tricky, but we'll see later how we can reduce the impact of temperature fluctuations.

For the curious who don't already know, here is the formula for the thermal noise of a pure resistance:

Vn = √( 4 × k × T × R × Δf )

with, in our case: k = Boltzmann's constant (1.38×10⁻²³ J/K), T = 293.15 K (20°C), R = 150 ohm, Δf = 19,980 Hz (20Hz to 20kHz)
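A quick Python check of that formula, with the values above:

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def thermal_noise_vrms(r_ohm: float, temp_c: float, bw_hz: float) -> float:
    """Johnson-Nyquist noise voltage: sqrt(4 * k * T * R * BW)."""
    return math.sqrt(4 * K_BOLTZMANN * (temp_c + 273.15) * r_ohm * bw_hz)

def vrms_to_dbu(v: float) -> float:
    return 20 * math.log10(v / math.sqrt(0.6))

vn = thermal_noise_vrms(150, 20, 20_000 - 20)
print(f"{vn * 1e9:.0f} nV rms = {vrms_to_dbu(vn):.2f} dBu")
```

This reproduces the -130.92 dBu / 220 nV figure used later in the thread.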



a. Measure Max input level at max gain

This step is critical, since this will be your only reference to a voltage in the measurement.

If you want an accurate measurement, you should not rely on the manufacturer's specs for max input level or exact gain.
You'd want to measure the actual max input level for the gain you're considering.
In other words, you want to accurately measure [Max input level (dBu)] - [Gain (dB)] in the above equation.

The good news is that you need to measure only one value.
The bad news is that this value is quite low.
In my RME UCX II case, at 75dB gain, the max input level is around 1mV.
And you want, say, 0.05dB accuracy. That's an error of less than 0.6%.
Not easy if you don't own a proper, accurate, calibrated True RMS microvoltmeter.
Good ones are quite expensive, so I don't.

Here is how I proceed, with a normal True RMS calibrated Multimeter.

This is based on the fact that modern interfaces are quite linear vs level.
(You know, the linearity measurement Amir includes in each DAC review.)
But we can't really rely on a DAC's level for accurate voltage measurements, since the actual voltage it delivers will depend on its output impedance and on the input impedance of the mic preamp we connect it to.

So I use a line-level interface input instead.
That will most likely be much more accurate than your standard DMM at these levels.
So we'll use it as a millivoltmeter, to measure the level with enough accuracy.
But we'll need to calibrate it with the DMM first.


I set the mic input I want to measure to minimum gain (Phantom OFF)
I use one of my interface's balanced outputs. (Could be any decent DAC, actually)
With REW, I send a 1kHz sine wave through it.
I plug this output in parallel to one of the interface's line inputs (that I'll use as a millivoltmeter) and to my mic input, using an XLR Y cable.

[Photo: output connected via an XLR Y cable to both the line input and the Mic input]


At that stage, the mic input is still at minimum gain.

I set my output level just below the full scale of the lowest range of my True RMS Digital Multimeter, because that's where the DMM is most accurate.
In my case, that's 500mV, so I send, say, 499mV rms to my interface and measure the exact value in dBFS on my interface's line input.
I note this value and the DMM Vrms value.

Then I lower the output level to around 10 dB below the expected max level of the mic input at max gain (see the quick method to estimate it).
In our example, I target a level of around -67dBu, or 0.35mV.

Note: To improve this low level signal's SNR, you may want to reduce the level by using a passive balanced pad after the interface output.
I often use Shure A15AS for that. SNR is then increased by several dBs.
Look at the linearity of the loopback for a 1kHz signal over a full 20kHz bandwidth below.
The 20kHz BW significantly increases the impact of noise on the result.
(Here, I intentionally zoomed the Y scale to +0.05 / -0.05dB to really assess the accuracy we will get. Amir's plot is usually +/-5dB.)

[Plot: loopback linearity at 1kHz, 20kHz BW, Y scale +/-0.05dB]


We'd like less than +/- 0.01dB linearity error over 80dB for our measurement.
Without the passive pad, we are borderline.
By adding the passive pad, we'll basically shift the X axis 0dBFS by 25 dB to the right, and only use down to -55dBFS.
Linearity error is then well below 0.01dB.



I then maximize the gain of the mic input and look at the level (in dBFS, from 20Hz to 20kHz) on both the Mic input and the line input.
(If your mic input has, say, more than 0.1% THD at that level, lower the level a few dBs.)


Given those measurements
[Calibration DMM level (V rms)]
[Calibration Line input level (dBFS)]
[Measure Line input Level (dBFS)]
[Measure Mic level (dBFS)]

My max input level is then

[Max Mic input level (dBu)] = 20 * LOG10([Calibration DMM level (V rms)]) - 10 * LOG10(0.001 * 600)
- [Calibration Line input level (dBFS)]
+ [Measure Line input level (dBFS)]
- [Measure Mic level (dBFS)]

This should give you an accurate value.
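The formula above can be wrapped in a small Python helper; the DMM and dBFS readings below are hypothetical, just to illustrate the arithmetic:

```python
import math

def max_mic_input_dbu(cal_dmm_vrms, cal_line_dbfs, meas_line_dbfs, meas_mic_dbfs):
    """Max mic input level (dBu) from the calibration chain described above."""
    cal_dbu = 20 * math.log10(cal_dmm_vrms) - 10 * math.log10(0.001 * 600)
    return cal_dbu - cal_line_dbfs + meas_line_dbfs - meas_mic_dbfs

# Hypothetical readings: 0.499 V on the DMM shows as -9.00 dBFS on the line
# input; the low-level tone then reads -73.00 dBFS (line) and -10.00 dBFS (mic)
print(f"{max_mic_input_dbu(0.499, -9.00, -73.00, -10.00):.2f} dBu")
```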


b. Measure the noise with the "150" plug in the mic input

[Photo: "150" plug in the Mic input]


This gives you [Noise (dBFS)]
You may then compute a more accurate EIN
[EIN (dBu)] = [Max Mic input level (dBu)] + [Noise (dBFS)]

Well, OK, but this value fluctuates with time.
Why ?
The temperature of the resistor.
The resistor's noise is far from negligible.
If the interface heats up, the resistor's temperature will also increase.
For a good preamp, the resistor's noise may actually be higher than the preamp's own noise.

But I can't measure it. So what ?


c. Measure the noise with the "Short" plug in the mic input

[Photo: "Short" plug in the Mic input]


If we measure noise with the shorted plug, our "resistor"'s thermal noise becomes very small.
Negligible compared to the preamp's own noise.

With the above method, we'd get
[EIN Short (dBu)] = [Max Mic input level (dBu)] + [Noise Short (dBFS)]

Then we may add back the theoretical thermal noise of a 150 ohm resistor to get a normalized EIN at 150 ohm / 20°C, without the impact of temperature.
The thermal noise of the resistor is -130.92 dBu, or 220nV rms:
[EIN (dBu)] = 10 * LOG10( 10^([EIN Short (dBu)]/10) + 10^(-130.92/10) )

By the way, the preamp noise in V rms is then
[EIN Short (V rms)] = 10^( ([EIN Short (dBu)] + 10*LOG10(0.6)) / 20 )

(Of course, if you do step c., you don't need step b.)


But isn't that cheating ? A bit, sure.
But now we have a stable and accurate measurement that may be compared to any EIN 150 value.
It's much more reproducible.
And it still matches exactly the value we would get if we were able to measure EIN directly while keeping the temperature of the resistor at exactly 20°C,
with a resistor value of exactly 150 ohm.
And if you want to compare to an EIN with another resistor value (like 200 ohm), you may compute that as well.
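The normalization above can be sketched in Python (the -128.8 dBu short-input value is an approximate figure consistent with the UCX II numbers in this post):

```python
import math

R150_THERMAL_DBU = -130.92  # 150 ohm at 20°C, 20Hz-20kHz

def ein_from_short(ein_short_dbu: float, r_noise_dbu: float = R150_THERMAL_DBU) -> float:
    """Power-sum the preamp's own noise (short plug) with a resistor's
    theoretical thermal noise to get a normalized EIN for that resistor."""
    return 10 * math.log10(10 ** (ein_short_dbu / 10) + 10 ** (r_noise_dbu / 10))

def dbu_to_vrms(dbu: float) -> float:
    return 10 ** ((dbu + 10 * math.log10(0.6)) / 20)

# Short-input EIN of roughly -128.8 dBu
ein_150 = ein_from_short(-128.8)
print(f"EIN (150 ohm / 20°C): {ein_150:.2f} dBu")
print(f"preamp noise: {dbu_to_vrms(-128.8) * 1e9:.0f} nV rms")
```

Swapping in the thermal noise of a 200 ohm resistor (about -129.67 dBu) gives the EIN 200 value directly.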


Here is a practical example, with the UCX II

[Screenshots: UCX II EIN measurement and computation]


Note that, here-above, I also measured EIN 150 ohm directly (plot not shown).
I recorded the ambient temperature during the measurement and the actual resistor value.
After computing a compensation factor from those values, to normalize to standard 150 ohm / 20°C conditions, the corrected measurement is almost identical to the value deduced from the Short measurement: -127 dBu vs -126.70 dBu.

RME UCX II's internal noise at max gain is 282 nV rms.


As a check, I also measured the Short with Virtins Multi Instrument 3.9.11.1

[Screenshot: Virtins Multi Instrument noise measurement]



If I summarize the required steps:
  1. Calibrate the line input you'll use to measure the low level signal, using your multimeter and a signal close to the upper limit of its lowest range
  2. Set the gain on the Mic input for the gain you want to measure (usually, we start with Max gain)
  3. With the line input plugged in parallel with the Mic input you want to measure, calibrate the Mic input's maximum level, using a signal 6-12dB below the expected max input level
  4. Plug the "Short" XLR plug and measure noise
  5. Compute EIN for various resistor values

You may want to repeat from step 2 for a few other gains.
I like to measure at the gains where the max input level is 10mV, then 100mV, for comparison between interfaces.
(EIN gets worse as gain is lowered.)

If you measure more gains, you may get a comparison like this
(In the plot below, reference (0) gain is the gain that gives a maximum input level of 100mV for full scale)

[Plot: EIN vs gain comparison for several interfaces]


Comments welcome...
 
Accuracy estimate for 0dBFS level at max gain

DMM:
  • Brymen BM869S (around 200€)
  • Keithley DMM6500 (around 1800€)
Interfaces
  • RME UCX II
  • RME ADI-2/4 Pro SE in Mono mode
    In both cases, I measured with the Shure A15AS pad to lower the level for the Mic-level measurement

[Table: accuracy estimates for the DMMs and interfaces listed above]


Notes:


To the accuracy error of the UCX II or ADI-2/4 at Mic level, we should add the accuracy error of the DMM used to calibrate at Line level.
Still, the accuracy remains much better with the interfaces:
  • as low as 0.02dB if using the DMM6500 for calibration and the ADI-2/4 (what I'm using for my measurements) for the accurate measurement
  • around 0.16dB if using the BM869s for calibration and the UCX II for the accurate measurement
The accuracy error at mic level for the DMMs is probably underestimated: the accuracy figures are usually given for a minimum of 5% of the range.

It's very obvious that a direct measurement with a (pretty good) average-price True RMS DMM is not accurate enough.
 
Good. That's exactly how we terminate the input, with XLR resistance slugs. The most telling resistance is common (zero ohm) as it eliminates any source contribution. We recently did a custom preamp for NASA to measure laser diode noise used for the upcoming LISA mission. These high-power laser diodes have a characteristic resistance (impedance) of 2.5 ohms, so we tested with a 2.5R XLR slug. Quietest small-signal amplifier we've ever achieved, at -141dBuEIN, broadband (20Hz - 22kHz), unweighted at 2.5R, or around -145dBuEIN at 0R.

The NASA Goddard guys told us they will encapsulate the circuit and immerse it in liquid nitrogen, which (in theory) would achieve somewhere around -158dBuEIN. I think we settled on a gain of around 76-77dB.
 
I find EIN quite useless and non-intuitive, as you don't know where the brickwall is, so cannot easily judge a number.
Noise figure is much better, as it shows us how many dB's worse the actual circuit is compared to the noiseless one, with the same terminating resistor (value, temp) and gain and bandwidth.

You know -- or measure -- the resistor (metal film), temperature, gain.
You measure output noise (preferably as noise density). With cross-correlation it's now relatively easy to get a reliable result even for very low gains and standard audio interfaces.
Subtract the theoretical value, done.
 
You know -- or measure -- the resistor (metal film), temperature, gain.
You measure output noise (preferably as noise density).
The whole question is: how do you relate the noise density in a digital interface to a voltage ?
That's mostly what my explanation is about.
 
The whole question is: how do you relate the noise density in a digital interface to a voltage ?
Sorry, I don't fully understand the question... but you simply measure it, with REW. Or do you mean how to calibrate the ADC so you can tell REW the 0dBFS voltage?
 
how to calibrate the ADC so you can tell REW the 0dBFS voltage?
Exactly.
How to calibrate the ADC so you can tell REW the 0dBFS voltage...
for a 0 dBFS around 1mV
Without a True RMS calibrated millivoltmeter or an AP.
 
The voltmeter requirements aren't as strict as you might think - IMO a prosumer TrueRMS multimeter should be able to pull this off quite adequately. First of all a 1% deviation is only 0.1 dB, second you don't necessarily have to measure at 1 mV - you can also measure the source at some saner level of maybe 1-2 Vrms while unplugged and then apply a known amount of digital attenuation in the (software) signal generator to generate your ~1 mV reference before plugging things back into the mic input (if you have a super linear DAC, you might as well take advantage of it, right?). Just don't forget about the voltage divider between output and input impedance and how it's going to affect levels at the input (150R into 3k already makes for a 0.4 dB drop).
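The impedance-divider correction mentioned at the end is a one-liner; a quick Python check using the 150 ohm source into a 3k mic input from this post:

```python
import math

def divider_loss_db(source_z_ohm: float, input_z_ohm: float) -> float:
    """Level drop caused by the output/input impedance voltage divider."""
    return 20 * math.log10(input_z_ohm / (input_z_ohm + source_z_ohm))

# 150 ohm source into a 3k mic input
print(f"{divider_loss_db(150, 3000):.2f} dB")  # about -0.42 dB
```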

In theory you can measure (well, estimate) EIN entirely without a voltage reference if you have a range of resistors kept at ambient temperature (clamp in pliers or similar if in doubt). This assumes you're in a frequency range (and source impedance range) that permits neglecting the largely 1/f contributions of input current noise, making input voltage noise the dominant noise source. The basic idea is that input voltage noise can be treated like thermal noise from a resistor Rx, so if you have a range of external resistors Rn=R1...RN (e.g. 0, 22, 47, 100, 150, 220, 330 ohms), the noise level in dB must follow 20log(√(4kT(Rn||Rin + Rx))) + C = 10log(4kT(Rn||Rin + Rx)) + C.
You then have to find constant C and the value of Rx that makes the calculated curve match your measured data points. Rx should be about where the curve rises 3 dB above shorted input level, as you would expect when (Rn||Rin + Rx) = 2Rx. With Rx, T and a given bandwidth you can then easily calculate thermal noise.

It's generally a job for a spreadsheet and regression and all that jazz (the more different values in a sensible range the more I'd trust it), though for something like 2 different values (like 0R and 150R) you can just about do the math by hand - basically 2 equations with 2 unknowns. Should make for a good sanity check if nothing else.

It goes without saying that results from this method are going to be best when input voltage noise really is largely flat, e.g. BJT input - I'd have my reservations about anything MOSFET input for obvious reasons.
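The two-value (0R and 150R) variant mentioned above can indeed be solved by hand; a small Python sketch under the stated flat-voltage-noise assumption (the dB readings are hypothetical):

```python
import math

def solve_rx(noise_short_db: float, noise_r_db: float, r_eff_ohm: float) -> float:
    """Two-point version of the fit: a shorted input measures C + 10*log10(4kT*Rx),
    an effective source resistance R' (= Rn || Rin) measures
    C + 10*log10(4kT*(R' + Rx)).  Subtracting eliminates C and 4kT, leaving
    Rx = R' / (10^(delta/10) - 1)."""
    delta_db = noise_r_db - noise_short_db
    return r_eff_ohm / (10 ** (delta_db / 10) - 1)

# Hypothetical readings: -129.0 dB shorted, -126.7 dB with 150 ohm
# (150 || 3k input impedance gives an effective 142.9 ohm source)
rx = solve_rx(-129.0, -126.7, 150 * 3000 / 3150)
print(f"equivalent noise resistance Rx: {rx:.0f} ohm")
```

With Rx, the temperature and the bandwidth, the thermal noise (and hence EIN) follows from the usual sqrt(4kTRB) formula.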
 
(if you have a super linear DAC, you might as well take advantage of it, right?)
Just don't forget about the voltage divider between output and input impedance and how it's going to affect levels at the input (150R into 3k already makes for a 0.4 dB drop).
That's exactly the issue.
Yes, the DAC is linear.
But we need to take the input impedance into account.
And, maybe, it will vary with gain.
(Although most interfaces will use chip-based mic preamps, for which it is unlikely)

Otherwise, we agree: my 200€ True RMS DMM is good enough to calibrate at 0.5V. Then I rely on the ADC linearity.
 
In theory you can measure (well, estimate) EIN entirely without a voltage reference if you have a range of resistors kept at ambient temperature
Well. Is that more practical or accurate ?

The problem is: as soon as you plug a resistor into the input, its temperature will shift to follow the device's temperature.
So you never know exactly what temperature your resistor is at.
 
If calibrated on a serious LCR meter, wouldn't using a high-wattage non-inductive resistor do the trick, to get away from the heating considerations?
 
The voltmeter requirements aren't as strict as you might think - IMO a prosumer TrueRMS multimeter should be able to pull this off quite adequately. First of all a 1% deviation is only 0.1 dB, second you don't necessarily have to measure at 1 mV - you can also measure the source at some saner level of maybe 1-2 Vrms while unplugged and then apply a known amount of digital attenuation in the (software) signal generator to generate your ~1 mV reference before plugging things back into the mic input (if you have a super linear DAC, you might as well take advantage of it, right?).

You can't use the digital attenuation and expect to be anywhere near accurate near 1mV. You've got the broadband noise floor of the buffer and IV stage, which is a constant, plus the D/A's self-noise.

Attenuation after all the active devices is needed.
 
You can't use the digital attenuation and expect to be anywhere near accurate near 1mV. You've got the broadband noise floor of the buffer and IV stage, which is a constant, plus the D/A's self-noise.

Attenuation after all the active devices is needed.
That's sort of what I found to be a better way: make a little attenuator with metal film or foil resistors, feed it a higher signal and reduce it. OTOH, I found the digital attenuation method worked almost as well despite my misgivings about it. The difference wasn't much more than a quibble. I use the Topping D10B as the source, as it has a rather low noise level: SNR is 120 dB, so the noise is 60 dB below a -60 dBFS signal.
 
Exactly.
How to calibrate the ADC so you can tell REW the 0dBFS voltage...
for a 0 dBFS around 1mV
Without a True RMS calibrated millivoltmeter or an AP.
You calibrate the ADC with a sine wave at low frequency with a good multimeter, preferably a True RMS type, near 0dBFS. You don't need uV precision here.
150Ohm with 40dB of noiseless gain gives 22uV (20...20k, 25°C), -93dBV.
An actual DUT will be several dB's worse.
Any decent ADC with a noise floor of -100...-110dBFS can measure this, in its most sensitive range (2Vrms or so). With 2-channel cross-correlation averaging we can gain another 10dB at least, so gains as low as 30dB should not pose a problem to measure. But at some point we will need a well-spec'd LNA.

I'm happy to have a standalone noise meter (Meguro MN-445B) but I've used my RME for noise measurements of MicPres in the 50...60dB gain range without problems.
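The 2-channel cross-correlation trick can be demonstrated on synthetic data; a minimal NumPy sketch (all levels hypothetical) recovering a common noise source sitting 20 dB below each channel's own floor:

```python
import numpy as np

rng = np.random.default_rng(0)
n_avg, n_fft = 200, 4096

# Common "DUT" noise seen by both ADC channels, plus independent channel
# self-noise 20 dB stronger than the DUT noise (hypothetical levels)
dut_rms, adc_rms = 1e-6, 10e-6

cross_acc = np.zeros(n_fft // 2 + 1)
for _ in range(n_avg):
    dut = rng.normal(0, dut_rms, n_fft)
    ch1 = dut + rng.normal(0, adc_rms, n_fft)
    ch2 = dut + rng.normal(0, adc_rms, n_fft)
    s1, s2 = np.fft.rfft(ch1), np.fft.rfft(ch2)
    cross_acc += np.real(s1 * np.conj(s2))  # uncorrelated parts average toward 0
cross_acc /= n_avg

# Total power from the averaged cross-spectrum (Parseval, one-sided FFT)
est_rms = np.sqrt(max(cross_acc.sum(), 0.0) * 2 / n_fft ** 2)
print(f"estimated DUT noise: {est_rms * 1e6:.2f} uV (true: {dut_rms * 1e6:.2f} uV)")
```

More averages tighten the estimate, since the uncorrelated self-noise of the two channels only decays with the square root of the number of averages.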
 
You calibrate the ADC with a sine wave at low frequency with a good multimeter, preferably a True RMS type, near 0dBFS. You don't need uV precision here.
150Ohm with 40dB of noiseless gain gives 22uV (20...20k, 25°C), -93dBV.
An actual DUT will be several dB's worse.
Any decent ADC with a noise floor of -100...-110dBFS can measure this, in its most sensitive range (2Vrms or so). With 2-channel cross-correlation averaging we can gain another 10dB at least, so gains as low as 30dB should not pose a problem to measure. But at some point we will need a well-spec'd LNA.

I'm happy to have a standalone noise meter (Meguro MN-445B) but I've used my RME for noise measurements of MicPres in the 50...60dB gain range without problems.
Well, you use a line level ADC to measure the level.
That's exactly what I do.

If you measure an analog mic preamp, you may just measure its output and its gain.
If you measure a digital mic preamp/interface, you measure its 0dBFS level and the (digitally converted) noise level.

So, in short, I think we say the same thing.
 
If you measure a digital mic preamp/interface, you measure its 0dBFS level and the (digitally converted) noise level.
Ah OK, that's the thing I missed, testing mic pres of integrated interfaces with no access to the analog signal after gain. That requires different strategy.
 
e. [EIN (dBu)]= [Max input level (dBu)] - [Gain (dB)] + [Noise (dBFS)]
If we have, say,
[Max input level (dBu)] = 18 dBu
[Gain (dB)] = 75 dB
[Noise (dBFS)] = -71 dBFS
then
[EIN (dBu)] = 18 - 75 - 71 = -128 dBu
I don't get it, what does the max input level have to do with this?
What I miss in this is a way to find the relationship between dBu and dBFS.
I would expect you'd have to measure that somehow, by applying a known dBu value and measuring the dBFS?
What am I missing here?
 