Adaptation to the 800mV reference/clipping level of my amp is done via ~7dB precision resistor attenuators in the XLR connectors (>2kOhm input impedance, ~600Ohm output impedance), with the +13dBu output setting. This improves the S/N and channel balance a bit over using the volume pot.
I'm not sure there is a benefit to using the +13dBu range, though.
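For illustration, a simple series/shunt divider lands in that ballpark; the resistor values below (1.2kOhm series, 1kOhm shunt) are just assumed example values to show the math, not necessarily the actual parts in the connector:

```python
import math

def divider(r_series, r_shunt, r_load=float("inf")):
    """Series/shunt pad: attenuation, input impedance and output impedance."""
    r_shunt_eff = r_shunt if math.isinf(r_load) else (r_shunt * r_load) / (r_shunt + r_load)
    ratio = r_shunt_eff / (r_series + r_shunt_eff)
    atten_db = 20 * math.log10(ratio)
    z_in = r_series + r_shunt_eff                         # seen by the DAC output
    z_out = (r_series * r_shunt) / (r_series + r_shunt)   # seen by the amp input
    return atten_db, z_in, z_out

# Assumed example values: 1.2 kOhm series, 1 kOhm shunt
atten, z_in, z_out = divider(1200, 1000)
print(f"attenuation ~= {atten:.1f} dB, Zin = {z_in:.0f} Ohm, Zout ~= {z_out:.0f} Ohm")
# -> about -6.8 dB, Zin = 2200 Ohm, Zout ~= 545 Ohm
```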
Here is 800mV input, measured with the ADI-2 Pro fs R (black) and ADI-2 Pro fs (red) as ADC, using the +4dBu range
(+4dBu - 3.7dB = 0.3dBu = -1.9dBV = 0.8V), for different scenarios.
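(Just as a sanity check on that conversion, here is the same arithmetic in a couple of lines of Python, using the usual 0.7746V / 1V references for dBu / dBV:)

```python
import math

DBU_REF = 0.7746  # volts at 0 dBu
DBV_REF = 1.0     # volts at 0 dBV

def dbu_to_volts(dbu):
    return DBU_REF * 10 ** (dbu / 20)

def volts_to_dbv(v):
    return 20 * math.log10(v / DBV_REF)

level_dbu = 4.0 - 3.7                 # +4 dBu range minus 3.7 dB of attenuation
v = dbu_to_volts(level_dbu)
print(f"{level_dbu:+.1f} dBu = {volts_to_dbv(v):+.1f} dBV = {v:.3f} V")
# -> +0.3 dBu = -1.9 dBV = 0.802 V
```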
The DAC is the ADI-2 Pro fs R for all of them. (So when comparing black and red, we are comparing the ADCs.)
First,
straight from the DAC: +4dBu output range, with some digital attenuation (RME ADI-2 Pro fs R - new version)
Then with the
DAC set to the +13dBu output range, with passive attenuation
Finally, with the
DAC set to the +4dBu output range, full output, with a light passive attenuation
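To put the three setups on the same footing, here is roughly how much digital and/or passive attenuation each one needs to land at the same ~0.8V. The pad values below (~12.7dB and ~3.7dB) are back-of-the-envelope assumptions chosen to hit that level, and I'm assuming 0dBFS sits exactly at the nominal range level (see note 1 further down):

```python
import math

DBU_REF = 0.7746   # volts at 0 dBu
TARGET_V = 0.802   # ~0.3 dBu, the 800 mV test level

def dbfs_needed(range_dbu, passive_atten_db=0.0):
    """Digital level (dBFS) needed to reach TARGET_V, given the output
    range and any passive attenuation after the DAC."""
    full_scale_v = DBU_REF * 10 ** (range_dbu / 20)
    after_pad_fs = full_scale_v * 10 ** (-passive_atten_db / 20)
    return 20 * math.log10(TARGET_V / after_pad_fs)

print(f"+4 dBu range, digital attenuation only: {dbfs_needed(4.0):.1f} dBFS")   # ~ -3.7 dBFS
print(f"+13 dBu range, ~12.7 dB passive pad:     {dbfs_needed(13.0, 12.7):.1f} dBFS")  # ~ 0 dBFS
print(f"+4 dBu range, ~3.7 dB passive pad:       {dbfs_needed(4.0, 3.7):.1f} dBFS")    # ~ 0 dBFS
```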
That illustrates several things, I think
1. The ADC is working (slightly) better on the "Pro fs R" version than on the "Pro fs" (non-R) version
(That was actually why I was measuring this to begin with)
2. Using +13dBu to generate a signal close to 0dBu is less efficient than using the +4dBu range.
I guess this is true for the ADI-2 "Non Pro" fs too...
Notes:
1. I never "calibrate" the output.
So when the signal generator shows "Output amplitude" = 1V, that means 0dBFS for the DAC, i.e. the full range (+4dBu or +13dBu in these examples).
2. Those measurements are all loopback, which accumulates noise and distortion from the DAC and the ADC (and from the passive attenuators where they are used). So 114.5dB SINAD here, while attenuated by 3.7dB, means that both the DAC and the ADC have a SINAD better than that. How much better, we can't tell.
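To illustrate that last point: the noise and distortion powers of the DAC and the ADC add, so each converter's own SINAD is always better than the loopback number. A quick sketch (the split between the two converters is purely hypothetical):

```python
import math

def combined_sinad(sinad_dac_db, sinad_adc_db):
    """Loopback SINAD when DAC and ADC noise+distortion powers add."""
    p = 10 ** (-sinad_dac_db / 10) + 10 ** (-sinad_adc_db / 10)
    return -10 * math.log10(p)

# If both converters contributed equally, each would be ~3 dB better
# than the measured loopback figure:
print(f"{combined_sinad(117.5, 117.5):.1f} dB")   # ~114.5 dB
# If the DAC were much better, the ADC alone would set the limit:
print(f"{combined_sinad(130.0, 114.6):.1f} dB")   # ~114.5 dB
```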