restorer-john
Grand Contributor
I read an article on the Benchmark website that said you can't even get the SINAD values the DAC is capable of if you limit yourself to 2Vrms output. I think a selectable output voltage should be a standard feature, so you can match the gain of whatever amplification comes next. The standard output levels are outdated and don't match current technology.
Benchmark just want bigger numbers to advertise, so, like plenty before them, they play with output levels to push the signal further above the relatively fixed (on SOTA gear) residual noise floor.
Amplifier makers have likewise realized they can tout bigger numbers if they dispense with, or at least reduce the gain of (and the consequent noise from), their front-end voltage-amplifier (VA) stage.
It's just a case of playing hot-potato with the consumer stuck in the middle.
2V was roughly 13 times, or 22dB, above the standard consumer level of the time. It was a marketing-driven figure, changed at the absolute last second to get better numbers.
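For anyone who wants to sanity-check the 13x / 22dB claim, here's a quick back-of-envelope calculation (assuming the old 150mV consumer sensitivity as the reference, which is the figure I'm using below):

```python
import math

def db_gain(v_high: float, v_low: float) -> float:
    """Voltage ratio expressed in decibels: 20 * log10(Vhigh / Vlow)."""
    return 20 * math.log10(v_high / v_low)

old_standard = 0.150  # volts RMS; assumed pre-CD consumer line sensitivity
cd_output = 2.0       # volts RMS; CD-player full-scale output

ratio = cd_output / old_standard          # ~13.3x
headroom = db_gain(cd_output, old_standard)  # ~22.5 dB

print(f"{ratio:.1f}x, {headroom:.1f} dB")  # -> 13.3x, 22.5 dB
```

So "13 times and 22dB" checks out, give or take rounding.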
Now we seem to have few real preamplifiers out there, the 150mV sensitivity for full rated output is disappearing, and 'pro' levels and XLRs are creeping into the home. It's funny: for many decades, the ubiquitous Cannon XLR was seen as defining gear made for the sound-reinforcement scene, not of audiophile quality. The mere presence of an XLR immediately turned audiophiles off. In the 90s, makers started sticking them on anything and everything, mostly with degraded performance compared to the RCAs. Now audiophiles can't bring themselves to lust after products without the silly things. Swings and roundabouts.
So where will it end, and what's next? Until we have a D/A converter that can drive +/-70V into a separate power stage, the games will continue. And we'll need a new connector, of course: a safety-retractable multipin derivative of the XLR, because you don't want a fatal shock from your D/A converter, do you?