DBT is just a double-blind test. The rest of it would include level matching, using the same media content, same equipment, same room, and basically the same conditions. It's difficult to do, but not that complicated.
In a real controlled DBT, there is no right or wrong, only whether one can identify the amps consistently.
In a sighted test, especially one that is not tightly controlled, the factors you cited in your post will have an effect; in that sense, I am in agreement. But then you cannot use such comparison tests to claim the amps tested sound different, let alone that one sounds "better" than the other, for the same reason you cited.
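For what it's worth, "consistently" can be put in numbers. Here's a minimal sketch (the 16 trials and 12 correct answers are just example figures, not from any actual test) of how unlikely a given ABX score would be under pure guessing:

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of getting at least `correct` answers right out of
    `trials` ABX trials by pure guessing (50% chance per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Example: 12 correct out of 16 trials
print(f"p = {abx_p_value(12, 16):.3f}")  # ~0.038, i.e. unlikely to be guessing
```

Anything around 12/16 or better is hard to write off as luck; hovering near 8/16 means the listener can't reliably tell the amps apart.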
I agree with you in many ways because I'm beginning to question a lot of things, and it wasn't just the DTS vs. DTS-HD Master Audio comparison (roughly 3x the resolution) that threw me off. That wasn't a blind test, but the AVRs were identical except that one supports DTS-HD Master Audio and the other doesn't.
Here are two more tests that have really thrown me off. They involve my eyes, which are much more precise than my ears, and they've made me question everything I take for granted.
Test #1
It turned out about a month ago that I had been watching Netflix in 720p (1 Mbps) for several years. I was livid because I was used to at least 1080p (6-7 Mbps), but once Netflix introduced their Premium plan, they downgraded the Standard plan's quality from 1080p to 720p. The bitrate had dropped to a dismal 1 Mbps for almost everything; I can download an entire movie at that bitrate in about 5 seconds on my connection. There is still some content, like Our Planet, that plays in 1080p at 6-7 Mbps.
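Just to put those bitrates in perspective, here's the back-of-the-envelope arithmetic (the two-hour runtime is an assumption for illustration):

```python
def stream_size_gb(bitrate_mbps: float, hours: float) -> float:
    """Approximate download size in gigabytes for a stream with the given
    average bitrate (megabits per second) and duration (hours)."""
    return bitrate_mbps * hours * 3600 / 8 / 1000  # megabits -> megabytes -> GB

for mbps in (1, 7, 16):
    print(f"{mbps:>2} Mbps for a 2-hour movie ~ {stream_size_gb(mbps, 2):.1f} GB")
# 1 Mbps ~ 0.9 GB, 7 Mbps ~ 6.3 GB, 16 Mbps ~ 14.4 GB
```

At 1 Mbps a whole feature film is under a gigabyte, which is why it downloads almost instantly on a fast connection.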
Needless to say, I needed to find out the difference, so I upgraded to Netflix's Premium plan and turned to my OLED laptop (Dell XPS). I launched Chrome and played a movie, but it was still 720p. I then switched to Edge (Chromium) and it started playing in 4K Dolby Vision - presumably a DRM/codec limitation in Chrome, since Edge supports Netflix's higher tiers on Windows. So now I have two windows open, one playing 4K Dolby Vision and the other 720p SDR. Well, let's test. I pull up everything and, after syncing the two streams, start alt-tabbing like crazy between the two browsers.
My hypothesis was that 4K at 16 Mbps would crush 720p at 1 Mbps - after all, it has nine times the pixels and sixteen times the bitrate ... and Dolby Vision vs SDR!!! Not even a contest, right?
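For reference, the actual ratios between the two streams (resolutions are the standard pixel counts, the bitrates are what the two streams reported, and 24 fps is assumed):

```python
# Pixel counts for the two resolutions being compared
px_4k  = 3840 * 2160   # 8,294,400 pixels
px_720 = 1280 * 720    #   921,600 pixels

print(f"Pixel ratio:   {px_4k / px_720:.0f}x")   # 9x the pixels
print(f"Bitrate ratio: {16 / 1:.0f}x")           # 16x the bits per second
print(f"Bits per pixel per frame at 24 fps: "
      f"4K {16e6 / (px_4k * 24):.3f} vs 720p {1e6 / (px_720 * 24):.3f}")
```

So the 4K stream even gets slightly more bits per pixel than the 720p one; on paper it should look better in every way.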
So I pause scenes and keep alt-tabbing until I can't tell what I'm looking at. I honestly could not tell the 4K version from the 720p. I got my glasses, which let me see individual pixels, and I looked close and I looked far. In Bullet Train, there was one still where I could detect more detail in the background in the 4K. I'll be honest, I would not be surprised if Netflix's 4K version is sub-1080p in terms of effective resolution. In Our Planet there was a shot of trees that was more detailed in the 4K.
The biggest shock was SDR vs Dolby Vision. I thought there was something wrong with the laptop, so I got my phone out and measured the Dolby Vision highlights at around 900 nits and the SDR at around 950 nits. DV looked less bright, which was quite impressive, and the measurements bore that out. SDR was brighter, and it also spread that brightness over a larger area, so it appeared much brighter overall. But DV's contrast was better because of darker areas sitting right next to brighter ones. I couldn't choose between them.
My conclusion was that 4K is not much of an improvement except in terms of bandwidth (not on the screen), and DV is cool, a nice alternative filter. What I definitely concluded was that SDR looks amazing on an OLED with proper SDR brightness.
Test #2 defies all logic
I've been dying to buy an OLED TV, but I can't get rid of my 3D TV. If I have a choice between OLED and LED, I always choose OLED (if I can afford it - well, now they are cheap, so I can buy one without a second thought). However, my main TV is a 2014 Sony IPS edge-lit set (their second quantum-dot LED). It's 1080p, it's IPS, it's edge-lit - it's crazy.
On paper it should be the worst TV ever made, given its technologies. And I still have not replaced it. When I got my OLED laptop, I put the laptop in my lap so its screen framed the same apparent viewport as the TV behind it, and played the same content on both. The first thing I noticed was how vibrant the OLED was compared to the LED. But then I changed the Sony's color temperature to Warm 2, bumped the color up to 72, and eventually the colors were identical. Okay, colors are now identical, so let's compare the quality.
But the OLED was brighter, which didn't make sense since OLEDs are supposed to be dim. I cranked the Sony's brightness from 7 to MAX and it got closer, but the OLED was still markedly brighter. I started thinking the OLED was not normal - it was clearly way over the 400 nits listed on Dell's website. Then I remembered the Sony has a light sensor. I turned it off and boom, equal brightness. So now I can watch two screens at the same time - one OLED, one LED, one 4K, one 1080p.
My expectation is that the OLED will destroy the 2014 dinosaur.
I played a lot of the opening of Casino Royale because of the colorful title sequence, the black-and-white bathroom scene, and the dark scene where Bond makes his first kill in the office at night. I'm looking at the OLED and I'm so impressed by it, but then I look at the LED and it looks near identical, or identical. How is that possible? I switch to YouTube demos where I can guarantee 4K vs 1080p. Same thing. It got crazy - I spent countless hours trying to salvage the OLED's honor.
During the day, the blacks are identical. It's impossible to tell the difference between the OLED and the LED.
In terms of contrast, vibrancy, and color volume, my family and friends can't see the difference between the two. I can, a little, but only because I've spent so much time comparing them. I'd give the Sony a 98 and the OLED a 100 - that's how close they are. At night I prefer the OLED, but because I always have a little bit of light and some bias lighting behind the TV, the effective contrast is limited to something like 3,000-5,000:1 anyway, and I'm fine with that. But this OLED laptop display is also much brighter than a regular OLED (no OLED TV can match its full-window brightness) - I can't buy a large TV that will be as bright.
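A rough sketch of why a bit of room light and bias lighting level the playing field; the white level, black levels, and reflected-light figures below are illustrative guesses, not measurements:

```python
def effective_contrast(white_nits: float, black_nits: float, reflected_nits: float) -> float:
    """On-screen contrast once reflected ambient light is added to both
    the white level and the black level."""
    return (white_nits + reflected_nits) / (black_nits + reflected_nits)

# Assumed panels: OLED with near-zero black vs edge-lit IPS with ~0.2 nit black,
# both showing a 300-nit white, in a dark room vs with a little reflected light.
for name, black in (("OLED", 0.0005), ("edge-lit IPS", 0.2)):
    dark = effective_contrast(300, black, 0.01)
    lit  = effective_contrast(300, black, 0.1)
    print(f"{name:13s} dark room: {dark:>8.0f}:1   with some ambient light: {lit:>6.0f}:1")
```

In a pitch-black room the OLED's advantage is enormous, but add even a tenth of a nit of reflected light and it collapses to a few thousand to one, which is roughly where those numbers land and why the two screens look so close in daylight.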
The fact that a 10-year-old edge-lit IPS 1080p panel can stand next to an OLED that hits 1,000 nits, and we can barely tell the difference, is a shock. The OLED destroys the 2020 Frame in my bedroom and, by extension, so does my 2014 Sony. It also destroys my calibrated computer monitor, which is only nominally HDR-capable (400 nits).
But it failed to destroy a 10-year-old IPS, edge-lit, 1080p screen despite having better specs across the board. My hat is off to whoever designed that Sony TV - an amazing accomplishment.
Naturally, I've started questioning all my assumptions about what's better and what's worse.