Something has gone horribly wrong here. This is not a rabbit hole the intelligent objectivists should go down.
Forget about all the papers and DBT tests for a moment and think about what actually happens and happened in the past in the recording studio. Better still, go to a studio and see for yourselves.
Pre-digital mixing desks.
Many of the microphones used in studios were response-limited and prone to distortion at the extremes.
What the microphones did pick up went to the mixing desk: a pile of electronics.
We know what electronics can do to signals; there are measurements here that show well-regarded electronics doing some pretty awful things to them.
A lot of music was recorded to tape, or even acetate, as the physical storage medium. I can’t prove this, but I doubt even the best gear and storage media matched the dynamic range a decent modern DAC can manage. Basically, the information isn’t on the master tapes or acetates to begin with.
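To put rough numbers on that dynamic-range point, here is a minimal sketch using the standard 6.02N + 1.76 dB figure for ideal quantization; the analogue tape figure is a commonly quoted ballpark, not a measurement:

```python
# Rough dynamic-range comparison (illustrative figures, not measurements).
# Ideal quantization SNR for an N-bit PCM converter: 6.02*N + 1.76 dB.

def ideal_snr_db(bits: int) -> float:
    """Theoretical best-case SNR for an N-bit PCM converter, in dB."""
    return 6.02 * bits + 1.76

cd = ideal_snr_db(16)        # Red Book CD
hires = ideal_snr_db(24)     # typical "hi-res" delivery format
analogue_tape = 70.0         # ballpark for good studio tape without noise reduction

print(f"16-bit ideal SNR:     {cd:.1f} dB")    # ~98 dB
print(f"24-bit ideal SNR:     {hires:.1f} dB") # ~146 dB
print(f"Analogue master tape: ~{analogue_tape:.0f} dB")
```

Even 16-bit Red Book comfortably exceeds the ballpark figure for tape, which is the point: headroom in the delivery format is no use if the master never had the information.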
The analogue version of hi-res back then was the twelve-inch record. I could definitely hear a difference on those, but even then I never knew whether I was listening to what an engineer had changed or to the advantages of the twelve-inch format.
More recently, more music has been recorded on digital desks, and these have ADCs built in. Do we think ADCs have superior specifications to DACs?
I’ve not seen one measured, but somehow I doubt it. Back to the state-of-the-art DAC tests then, where I think the performance may be comparable.
Of course, sound engineers are the absolute masters of “Oh I’ll just”. All those tools on their desk, they can’t help themselves.
Then you’ve got the interpolation algorithms, none of which are perfect.
The list goes on.
So, what is it you are listening to?
Unless the recording was made digitally in the first place, with equipment capable of capturing the full range of the instruments played, you may well be listening to noise, because that is what interpolation sounds like on its own. Interpolated detail only makes sense in the context of the adjacent samples, and even then it may not have been in the original recording.
I thought the idea was high fidelity. With replay equipment you can aim for fidelity to whatever medium you have. Unfortunately that doesn’t necessarily mean fidelity to what went on in the studio.
A point was made earlier that classical music fares better in general.
I can’t listen to recorded classical music, and no, it’s not because my replay gear is rubbish. Stereo just doesn’t hack it.
If I want to listen to high-resolution music, I go to a concert.
None of the above means I think aiming for the highest resolution possible on the master medium is a bad idea; it’s a goal I fully support. What it does mean, for me, is that if I have trouble distinguishing CBR 320 from Red Book in a DBT, while a recording engineer can tell which limiters were used on a recording, then I’m probably too deaf, or too untrained, to worry about it. I suspect many here would fall into the same category under test.