I have a pair of Revox M3500 dynamic microphones that are specified as 600 ohm.

Are there mics with 50k output impedance? Most common is 50 ohm to 200 ohm. I've seen a few at 300. I think I recall one at 600 ohms.
Indeed.

My apologies to Rja4000, maybe I've hijacked your thread for a different purpose. I didn't intend that. Maybe this belongs in the thread where Amir asked what to test for. That turned into the Tower of Babel of mic pre testing. As a result, Amir hasn't indicated any additional testing of interfaces on the preamp side.
Then you'll have to measure it, I'm afraid.

OK, I did the E-MU per your guide but can't find the gain in the specs:
View attachment 389544
Shorted
View attachment 389545
150 Ohm
View attachment 389546
Specs
E-MU announces a +60 dB on its sister 0404 ;-)

OK.
View attachment 389555
Min gain at 0 dB is -6 dB RMS
View attachment 389556
Max gain at -70 dB is -16.6 dB RMS
View attachment 389557
Min gain at -70 dB is -76 dB RMS
So, close to 60 dB?
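If I read those numbers right, the gain range falls out of simple subtraction; a quick sanity check using the readings above:

```python
# Readings from the loopback test above (dBFS RMS meter values)
max_gain_at_m70 = -16.6   # max gain, -70 dB generator level
min_gain_at_m70 = -76.0   # min gain, -70 dB generator level

# Same input level, so the difference is the preamp's gain range
gain_range = max_gain_at_m70 - min_gain_at_m70
print(f"gain range ~ {gain_range:.1f} dB")  # -> gain range ~ 59.4 dB
```

So yes, within a fraction of a dB of the +60 dB the 0404 claims.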
Thinking about what I would do to get a full picture of how good a digital mic pre is wrt noise...

So I'm still puzzling over the smallest number of tests that would tell us what we want to know about microphone preamps, practically speaking. Here is my current thinking.
A Shure SM7B is about the least sensitive microphone you'll run across; most modern ribbons are more sensitive. It puts out 1.12 mV at 94 dB SPL at 1 kHz. Shure suggests you need a minimum of 60 dB of gain to make it work.
The more sensitive LDC microphones are 28 or 29 mV at 94 dB SPL at 1 kHz. Let's call that roughly a 30-to-1 ratio, which is also roughly 30 dB vs. an SM7B. So if we know how things work at max gain and at 30 dB below max gain, we have most microphones covered.
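As a quick check on that ratio (a sketch; the sensitivity figures are the ones quoted above):

```python
import math

sm7b_mv = 1.12  # SM7B output, mV at 94 dB SPL / 1 kHz
ldc_mv = 28.0   # hot LDC output, mV at 94 dB SPL / 1 kHz

delta_db = 20 * math.log10(ldc_mv / sm7b_mv)
print(f"sensitivity gap: {delta_db:.1f} dB")  # -> sensitivity gap: 28.0 dB
```

Strictly ~28 dB, so "roughly 30 dB" is a fair round number.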
Test #1
We need to test for max input at minimum gain. You can do that with a wide range of voltages; I suggest 0.775 V as that is 0 dBu. You can read the max input level in dBu straight off the dBFS graph. That, plus the gain range, is enough for this part.
Test #2
EIN with a 150 ohm resistor at max gain and at 30 dB below max gain.
Test #3
EIN at max gain with a short.
(Do we need noise level at minimum gain with a short?)
Would anything be missing to tell us most of what we want to know for practical purposes?
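To make the arithmetic behind these three tests concrete, here is a minimal sketch; the dBFS readings and the 60 dB gain figure are hypothetical example values, not measurements:

```python
import math

def dbu(volts_rms):
    """RMS voltage to dBu (0 dBu = 0.775 V RMS)."""
    return 20 * math.log10(volts_rms / 0.775)

# Test #1: feed 0.775 V (0 dBu) at min gain and read the dBFS meter.
# Hypothetical reading: the 0 dBu tone shows -8.9 dBFS, so the 0 dBFS
# input level at min gain is +8.9 dBu.
tone_dbfs = -8.9
zero_dbfs_min_gain = dbu(0.775) - tone_dbfs

# Tests #2/#3: EIN is the noise floor referred back to the input:
#   EIN (dBu) = noise reading (dBFS) + 0 dBFS input level (dBu) at that gain.
# Hypothetical: 60 dB of gain lowers the 0 dBFS level to -51.1 dBu, and the
# 150 ohm noise floor reads -71 dBFS at max gain.
zero_dbfs_max_gain = zero_dbfs_min_gain - 60.0
noise_dbfs = -71.0
ein = noise_dbfs + zero_dbfs_max_gain
print(f"max input {zero_dbfs_min_gain:+.1f} dBu, EIN {ein:.1f} dBu")
# -> max input +8.9 dBu, EIN -122.1 dBu
```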
I'm hoping to make this simple enough that Amir will do it when he tests audio interfaces. Generally speaking, the majority of audio interfaces use the same microphone circuit, padded down for line level. Amir already tests those for distortion, frequency response, and dynamic range.
So what do you guys think? Will these three tests do the job?
@Rja4000 @AnalogSteph @nanook @KSTR @restorer-john @SIY
Let's see if I got it.

Then you'll have to measure it, I'm afraid.
What's the 0 dBFS level at max gain on the mic input?
How to do that is what I'm trying to describe in my initial post.
Fully agree.

As an addendum, some interfaces might implement soft clipping directly in front of the ADC and never reach 0 dBFS, so for these one would need to back off the sensitivity baseline to a point where a certain distortion, like 1%, is reached.
That one is interesting and, in my method above, I'm completely ignoring it.

Measure the source impedance correction factor, that is, sensitivity vs. source impedance at a fixed gain setting, say three source impedances: 15R, 150R, and 1500R.
At 1 kHz, AC voltage accuracy for your DMM is given as +/-(2.0% of reading + 3 digits).

Let's see if I got it.
View attachment 389579
-3 dBFS is 0.953 V
View attachment 389580
-24 dBFS is 0.078 V
and near max gain, just before it gets really messy, it looks like this:
View attachment 389582
so it gets to 0.56 dBFS
Did I get it right so far?
Input impedance can be as low as 1 kOhm, and that would reduce sensitivity by 1.2 dB with a 150R source, so that certainly is a factor to consider when high precision is our goal.

Is that a fact in reality?
How much impact could we expect?
When using the Shure A15AS attenuator, the output impedance is close enough to 150 ohm at -15 dB, so I don't need to modify anything.

Input impedance can be as low as 1 kOhm, and that would reduce sensitivity by 1.2 dB with a 150R source, so that certainly is a factor to consider when high precision is our goal.
In practice, we would use a source (e.g. a DAC) with a known output impedance, ideally modified (upped) to give our 150R nominal, and a known no-load voltage, and use that value rather than the voltage measured at the XLR connector.
If the output and input impedances are exactly known (and hopefully not gain-dependent), we could of course simply calculate the correction from the value measured at the connector.
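For reference, the voltage divider math behind that 1.2 dB figure (a sketch; the 3 kOhm case is just an extra illustrative value):

```python
import math

def loading_loss_db(z_source, z_in):
    """Level drop from loading: the input sees Zin / (Zin + Zsource) of the source EMF."""
    return 20 * math.log10(z_in / (z_in + z_source))

print(f"{loading_loss_db(150, 1000):.2f} dB")  # -> -1.21 dB (1k input, as quoted)
print(f"{loading_loss_db(150, 3000):.2f} dB")  # -> -0.42 dB (a more typical 3k input)
```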
Calibrated on the 6 V range (same as above; autoranging had put it there) with the line I/O and got about the same reading (0.953-0.951 V) for the corresponding -3 dB (the XLR vs. TRS difference).

At 1 kHz, AC voltage accuracy for your DMM is given as +/-(2.0% of reading + 3 digits).
So in your case, that's 0.078 +/- 0.005 V.
So your value at -24 dBFS is +/- 0.58 dB.
The only way to get an accurate measurement with your DMM at such a voltage is to use your interface's line input, calibrate it with your multimeter as close to 6 V as possible (where your DMM accuracy will be around 0.2 dB), and use that to measure low-level voltages.
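Spelling that uncertainty out (a sketch; I'm assuming the meter's last displayed digit at this reading is worth 1 mV, which is a guess about its range):

```python
import math

reading = 0.078                    # V, the -24 dBFS measurement
digit = 0.001                      # assumed value of one display digit (1 mV)
err = 0.02 * reading + 3 * digit   # +/-(2.0% of reading + 3 digits) = 0.00456 V
worst_db = 20 * math.log10((reading - err) / reading)
print(f"+/-{err*1000:.1f} mV, worst case {worst_db:.2f} dB")
```

Rounding the error up to +/-0.005 V gives 20*log10(0.073/0.078) = -0.58 dB, the figure quoted above.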
It's the setup I used in post #110, where the 0.953 V reading corresponded to -3 dBFS, while for the second reading I used -6 dBFS to cover the difference between the inputs.

Is this -3 dB with the same setup you used for post #101? If so, the -4.1 dBFS rms @ 0 dB from the min gain measurement gives me a min gain 0 dBFS level of 0.953 V (= +1.79 dBu) + 3 dB + 4.1 dB = +8.89 dBu. That closely agrees with the +8.7 dBu official spec.
So your current EIN estimates for the direct measurement would be -122.3 dBu(A) / -121.7 dBu unweighted (20-20k). The estimate based on the noise level delta between short and 150 ohms remains unchanged at -120±0.5 dBu unweighted (20-20k).
That means we still have a ca. 1.7±0.5 dB discrepancy unaccounted for between the two methods. Given the basic tools used and the limited number of measurements, that's not bad at all.
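For anyone checking the dBu arithmetic above, a quick verification of the figures quoted in this exchange:

```python
import math

def dbu(v):
    """RMS voltage to dBu (0 dBu = 0.775 V RMS)."""
    return 20 * math.log10(v / 0.775)

v = 0.953                       # V measured at -3 dBFS, min gain
zero_dbfs = dbu(v) + 3.0 + 4.1  # add the -3 dBFS offset and the -4.1 dBFS rms figure
print(f"{dbu(v):+.2f} dBu -> 0 dBFS at min gain = {zero_dbfs:+.2f} dBu")
# -> +1.80 dBu -> 0 dBFS at min gain = +8.90 dBu
```

Which lands at ~+8.9 dBu, in line with the +8.89 dBu worked out above against the +8.7 dBu spec.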