I've read several pieces from people in the recording industry on why I shouldn't be able to tell the difference between 16/44 and hi-res, and I generally believe them. Of course, there are also many articles saying the same about 320 kbps MP3/256 kbps AAC vs 16/44, yet I can absolutely tell that difference on quality speakers. So I decided to test 16/44 vs hi-res the other day, and I could hear a difference. I'm assuming it's my test that's flawed, but I'm not sure why (other than that it's not blind). On a Mac, using Audio MIDI Setup, you can change your audio output format on the fly between 16/44 and 24/192 (and other formats); it drops audio for about half a second while it switches. That's what I used to downsample.
I have the 75th anniversary Blue Note collection, and playing Kind of Blue at 24/192 through a NAD 2030 V2 into Revel M105s near field, I can tell a very, very subtle degradation when I switch from 24/192 down to 16/44. It's so subtle that I can only hear it when going down in resolution; I cannot tell when I switch it back up to 24/192. I also don't think I could tell if it were a 2-second gap instead of half a second. But I do hear it in this test, ever so barely, in the soundstage. I'd think that if I can hear it when going down, then I should also hear it get better when I flip back up, but I can't.
What I do know is that the change is so insanely subtle that I can't reliably tell with just a half-second gap, even while focusing my ears as hard as I can, listening to an album I know inside and out, looking for the change, and knowing it's going to happen (not blind). It convinced me that any differences there may be are so vanishingly subtle as to not be meaningful in any way. I no longer wonder whether I'm somehow missing out on anything by listening to 16/44 content most of the time.
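For what it's worth, the "not blind" part is the easiest flaw to fix. Here's a rough sketch of a single-blind trial runner in Python, assuming you've exported two level-matched WAV versions of the same track (the filenames here are made up) and using macOS's built-in afplay for playback:

```python
#!/usr/bin/env python3
import random
import subprocess

# Hypothetical filenames: two level-matched exports of the same track.
FILES = {
    "hires": "kind_of_blue_24_192.wav",
    "cd": "kind_of_blue_16_44.wav",
}
TRIALS = 10

correct = 0
for trial in range(1, TRIALS + 1):
    label = random.choice(list(FILES))  # hidden pick for this trial
    input(f"Trial {trial}/{TRIALS}: press Enter to play the mystery clip")
    subprocess.run(["afplay", FILES[label]], check=True)  # macOS built-in player
    guess = ""
    while guess not in FILES:
        guess = input("Your guess (hires/cd): ").strip().lower()
    if guess == label:
        correct += 1

print(f"{correct}/{TRIALS} correct ({TRIALS // 2} expected by pure guessing)")
```

If the difference is real, the score should land well above 50% over enough trials.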
I've been thinking that any difference I was hearing could be due to how the Mac was downsampling on the fly, but I have no idea. That's the most curious part to me: I really wonder why I can hear the change at all.
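One way to take the Mac's on-the-fly sample rate conversion out of the equation would be to downsample the file yourself with a known resampler and then compare the two files directly. A minimal sketch, assuming the soundfile and scipy Python packages and the same hypothetical filename as above:

```python
import numpy as np
import soundfile as sf
from scipy.signal import resample_poly

# Hypothetical input: a 24/192 WAV export of the track.
data, rate = sf.read("kind_of_blue_24_192.wav")  # floats in [-1, 1]
assert rate == 192000

# 192000 -> 44100 is a ratio of 147/640; resample_poly applies the
# required anti-alias low-pass filter before decimating.
down = resample_poly(data, up=147, down=640, axis=0)

# TPDF dither at 1 LSB of 16-bit, added before the write truncates to PCM_16.
lsb = 1.0 / 32768.0
dither = (np.random.uniform(-lsb / 2, lsb / 2, down.shape)
          + np.random.uniform(-lsb / 2, lsb / 2, down.shape))

sf.write("kind_of_blue_16_44.wav", down + dither, 44100, subtype="PCM_16")
```

If the difference disappears with a file downsampled this way, that would point at Core Audio's real-time converter rather than at 16/44 itself.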