-128.7 dBu(A) EIN is perfectly competent but not actually any better than a number of sub-$200 audio interfaces. (Spec seems to be -133 dBu shorted @ 30 kHz, which should have resulted in -129.4 dBu @ 20 kHz unweighted, or -131.6 dBu(A).) Might be time to double-check the generator setup / output noise level.
That was the whole goal of this thread: how to generate a proper test signal, which basically means one with low enough noise at a very low level.
I'm pretty confident my signal is "good enough" down to 1 mV, which is low enough to measure up to 80 dB of gain.
But for EIN, there is, of course, no generator.
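As a sanity check on the expected figure quoted above, the -129.4 dBu number follows from scaling the shorted-input spec to a 20 kHz bandwidth and power-summing it with the 150 Ω source resistor's thermal noise. A quick sketch (assuming ~20 °C):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 293.15           # assumed temperature (~20 C), K
DBU_REF = 0.7746     # 0 dBu reference voltage, V

def dbu(v):
    return 20 * math.log10(v / DBU_REF)

# Quoted spec: -133 dBu shorted input, 30 kHz bandwidth.
# Scale the amp's own noise power down to a 20 kHz bandwidth:
amp_20k = -133 + 10 * math.log10(20_000 / 30_000)    # ~ -134.8 dBu

# Thermal noise of the 150 ohm source resistor over 20 kHz: sqrt(4kTRB)
v_therm = math.sqrt(4 * K_B * T * 150 * 20_000)
therm_20k = dbu(v_therm)                             # ~ -130.9 dBu

# Uncorrelated noise sources add as powers:
total = 10 * math.log10(10**(amp_20k / 10) + 10**(therm_20k / 10))
print(round(total, 1))   # -129.4, matching the quoted figure
```

So the quoted -129.4 dBu is dominated by the 150 Ω resistor itself, not the amp.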
For the noise measurement, the setup is:
A 150 Ω 1% metal-film resistor is soldered between pins 2 and 3 of an XLR connector.
This connector is plugged into the HV-3C input.
The HV-3C output is plugged into a Y-type XLR cable, which feeds both inputs of the RME ADI-2 Pro FS R.
The RME is set to the +4 dBu input range and M/S mode. The M channel (sum/average) signal is then used in Virtins Multi Instrument 3.9.
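The point of the M/S averaging: the preamp's noise arrives identical on both ADC channels and survives the average intact, while each converter's own uncorrelated noise drops by ~3 dB in power. A toy simulation (arbitrary, made-up noise levels):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# The DUT's output noise appears identically on both ADC channels;
# each channel then adds its own independent converter noise.
dut = rng.normal(0, 1.0, n)
ch_a = dut + rng.normal(0, 0.5, n)
ch_b = dut + rng.normal(0, 0.5, n)

m = 0.5 * (ch_a + ch_b)   # the M (sum/average) channel

def rms(x):
    return np.sqrt(np.mean(x**2))

# Converter-noise power is halved in M; the DUT noise is unchanged.
print(rms(ch_a))  # ~ sqrt(1 + 0.25)  ~ 1.118
print(rms(m))     # ~ sqrt(1 + 0.125) ~ 1.061
```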
I use an FFT with a rectangular window, 65k size, at a 48 kHz sample rate, and I measure the RMS level over the 20 Hz–20 kHz band with A-weighting. I read the value after averaging over 15 frames.
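For reference, here is roughly what that measurement computes. This is only a sketch, not Multi Instrument's actual code: the IEC 61672 A-weighting curve applied to one-sided FFT bin powers, summed over the band.

```python
import numpy as np

def a_weight_db(f):
    """IEC 61672 A-weighting in dB (normalized to 0 dB at 1 kHz)."""
    f2 = np.asarray(f, dtype=float) ** 2
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * np.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20 * np.log10(ra) + 2.00

fs, nfft = 48_000, 65_536
rng = np.random.default_rng(1)
x = rng.normal(0, 0.01, nfft)       # stand-in for the recorded noise

# Rectangular window; scale the one-sided spectrum so bin powers sum
# to the total signal power (Parseval).
spec = np.fft.rfft(x) / nfft
power = 2 * np.abs(spec) ** 2
power[0] /= 2                        # DC bin is not doubled
power[-1] /= 2                       # Nyquist bin (nfft even) likewise
freqs = np.fft.rfftfreq(nfft, 1 / fs)

band = (freqs >= 20) & (freqs <= 20_000)
w = 10 ** (a_weight_db(freqs[band]) / 10)
rms_a = np.sqrt(np.sum(power[band] * w))
print(20 * np.log10(rms_a))          # dBFS if x is in full-scale units
```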
If I plug the 150 Ω resistor XLR into the Y cable, and then directly into the RME, I read -124.7 dBFS, which is of course way lower than any reading through the Millennia.
I spent time checking how the RME's displayed level matches actual dBu. I lack a good enough voltmeter, but by cross-checking I'm pretty sure the true level is just a few hundredths of a dB lower than displayed. I compensated for that.
I also compensated for temperature, since it's 29.5 °C in my room right now.
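That correction is tiny anyway: thermal noise power scales linearly with absolute temperature, so going from a 20 °C reference (my assumption here) to 29.5 °C shifts only the resistor's thermal-noise contribution, by about 0.14 dB:

```python
import math

t_room = 273.15 + 29.5   # measurement temperature, K
t_ref = 273.15 + 20.0    # assumed reference temperature, K

# Thermal noise power is proportional to absolute T (4kTRB),
# so the correction on that part of the noise is:
delta_db = 10 * math.log10(t_room / t_ref)
print(round(delta_db, 2))   # 0.14 dB
```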
Measuring the Millennia's gain, which is part of the formula and therefore needs the same accuracy, is easier, since it's just a comparison of the level with and without it. So any RME calibration error doesn't matter there, since it appears in both readings and cancels out in the end.
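In other words (hypothetical numbers, just to show the cancellation):

```python
# Hypothetical values for illustration only
e = 0.3                  # unknown constant RME level error, dB
gen_level = -60.0        # true generator level, dBu

# Gain: two readings of the same source, with and without the preamp.
read_through = (gen_level + 60.0) + e   # assume 60 dB true gain
read_direct = gen_level + e
gain = read_through - read_direct       # e cancels out -> 60.0 dB

# EIN: the absolute output-noise reading does need the dBu calibration.
noise_out_dbu = -68.7                   # hypothetical A-weighted output noise, dBu
ein = noise_out_dbu - gain              # EIN = output noise (dBu) - gain (dB)
print(round(gain, 6), round(ein, 6))    # 60.0 -128.7
```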
Any mistake here, in your opinion?