-40 dBV could work. But I prefer keeping the convention of using dBu.
Users will want to compare these figures to microphone sensitivity, which is usually given in mV/Pa (referenced to 1 Pa = 94 dB SPL).
Or to dBu indeed, when used with a line-level signal.
So dBu and mV both make sense.
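To make that comparison concrete, here's a minimal sketch (Python; the 2 mV/Pa figure is just an illustrative value, not from this thread) converting a mic sensitivity in mV/Pa to the dBV/dBu level it would output at 94 dB SPL:

```python
import math

DBU_REF = 0.7746  # 0 dBu = 0.7746 V RMS (1 mW into 600 ohms)

def sensitivity_to_dbv(mv_per_pa: float) -> float:
    """Mic output level in dBV for a 1 Pa (94 dB SPL) source."""
    return 20 * math.log10(mv_per_pa / 1000.0)

def sensitivity_to_dbu(mv_per_pa: float) -> float:
    """Same output level expressed in dBu."""
    return 20 * math.log10((mv_per_pa / 1000.0) / DBU_REF)

# Hypothetical 2 mV/Pa dynamic mic at 94 dB SPL:
print(f"{sensitivity_to_dbv(2):.1f} dBV")  # -54.0 dBV
print(f"{sensitivity_to_dbu(2):.1f} dBu")  # -51.8 dBu
```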
-34 dBV is maybe slightly too high in level. High gain is where the difference shows and where the noise becomes audible.
-35 dBu then?
That's -37 dBV, as you said.
Or max gain if the interface/preamp can't reach that level.
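For anyone following along, the dBu-to-dBV offset is a fixed constant; a quick sketch:

```python
import math

# 0 dBu = 0.7746 V RMS and 0 dBV = 1 V RMS, so the offset is constant:
DBU_TO_DBV = 20 * math.log10(0.7746)  # about -2.2 dB

def dbu_to_dbv(dbu: float) -> float:
    return dbu + DBU_TO_DBV

print(f"{dbu_to_dbv(-35):.1f} dBV")  # -37.2 dBV
```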
Remember, this is the max level. Average recording level should be -18 to -20 dBFS, with peaks around -6 dBFS, if you follow the usual rules.
MAX, -40 dBV, -20 dBV works for me.
For EIN, representing low level performance, one figure should be enough.
Of course, more values are welcome (and a full plot vs. all gain settings is very interesting, especially since sudden jumps are quite common), but that's time-consuming.
On top of that, we need the full analysis at low gain, high input level.
For which gain?
That's a good question.
If you look at microphones, their sensitivity ranges from around 0.8 mV/Pa (like the very good Audix OM7) to 40 mV/Pa (for the legendary DPA 4006). That's a 34 dB range.
And, of course, the source's SPL and the distance to the source can vary hugely.
So, at least 40 dB less gain than for the low-noise measurement.
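Just to show where the 34 dB comes from, using the two sensitivity extremes quoted above:

```python
import math

# Sensitivity extremes mentioned above, in mV/Pa
audix_om7 = 0.8
dpa_4006 = 40.0

print(f"{20 * math.log10(dpa_4006 / audix_om7):.0f} dB")  # 34 dB
```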
If we do it for 0 dBFS = 4 V (14 dBu), that may be a bit high for some lower-cost, 5 V powered interfaces.
And that's low for a pro interface.
So, maybe, the best would be to measure at min gain (without PAD), with a signal at -6 dBFS (which is supposed to be the peak level you'll be using).
This will favor high-end gear, but it will actually perform better in reality, given its higher input-level tolerance.
This is beneficial for the user, so it makes sense to show it, IMO.
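To tie the numbers together, a small sketch (assuming the 0 dBFS = 4 V example above; a real interface's full-scale voltage will differ) of the conversions involved:

```python
import math

DBU_REF = 0.7746  # V RMS

def v_to_dbu(v: float) -> float:
    return 20 * math.log10(v / DBU_REF)

full_scale_v = 4.0  # assumed 0 dBFS voltage, from the example above
print(f"0 dBFS = {v_to_dbu(full_scale_v):.1f} dBu")  # about 14.3 dBu

# Generator voltage needed for a -6 dBFS test signal at that full scale:
test_v = full_scale_v * 10 ** (-6 / 20)
print(f"-6 dBFS = {test_v:.2f} V = {v_to_dbu(test_v):.1f} dBu")  # 2.00 V, 8.3 dBu
```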
A similar approach could be used with ADCs:
measure noise at a standardized input-level setting if you have a choice, but do the rest of the measurements near max level.