Well, it's true that it's easier to hear 0.1% of the 15th harmonic on a 100 Hz fundamental. But actual physical modern amps? The last commercial solid-state amp on my bench (a Class D unit) showed <0.01% across the audio spectrum with a loudspeaker load. Looking at some of Stereophile's measurements, this is not at all unusual.

I'm not sure I agree about 'which is the case for modern amplifiers'; most of the ones I've encountered are not. But I have encountered a few. IMO what this comes down to is exactly how low the distortion actually has to be before it's masked, and that is lower than many think. This is because the ear uses the higher-ordered harmonics to sense sound pressure, so if the amp adds any of its own, they can be detected. In a nutshell, we've been hearing this for the last 50 years, and most of it comes down to the simple fact (or maybe not so simple) that most amps lack the gain bandwidth product needed to really control distortion. If you graph distortion vs. frequency you see how this bears out; it can be a mistake to look only at the distortion curve at one (usually low, like 100 Hz) frequency. Almost any amp with feedback has enough feedback at 100 Hz to look good there.
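To put some rough numbers on the gain-bandwidth point: for the common case of a single-pole open-loop response, loop gain (and hence distortion suppression) falls 6 dB/octave, so an amp can have plenty of feedback at 100 Hz and far less in the upper midrange and treble. A minimal back-of-envelope sketch (my own illustration with made-up but plausible numbers, not any specific amp's figures):

```python
import math

def excess_loop_gain_db(gbw_hz, closed_loop_gain, f_hz):
    """Approximate feedback (in dB) left over at frequency f_hz for an
    amp with a single-pole open-loop response and the given
    gain-bandwidth product. Above the dominant pole, |A(f)| ~ GBW/f."""
    open_loop_gain = gbw_hz / f_hz
    loop_gain = open_loop_gain / closed_loop_gain
    return 20 * math.log10(loop_gain)

# Hypothetical example: 5 MHz GBW, closed-loop gain of 30x (~29.5 dB).
# Feedback shrinks by ~36 dB between 100 Hz and 7 kHz, which is why a
# distortion-vs-frequency plot tells you more than a single 100 Hz spot check.
for f in (100, 1_000, 7_000, 20_000):
    print(f"{f:>6} Hz: {excess_loop_gain_db(5e6, 30, f):5.1f} dB of feedback")
```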
Here's the first one I grabbed at random (because I've heard of the brand!), which is an old-fashioned AB type:
(from https://www.stereophile.com/content/rotel-michi-m8-monoblock-power-amplifier-measurements)
And, of course, non-gross levels of HD above, say, 6-7 kHz (the 3rd harmonic is ultrasonic for folks of our generation; the 4th and higher are for anyone) don't impact sound. I would certainly not consider levels like this "gross."
So, it would be good to have some actual controlled listening behind some of the audibility assertions.
Side note: I would have grabbed some results from one of your interesting tube amps, but Stereophile seems to have skipped over you. And I don't have one here to test, though it's tempting to build one just for curiosity's sake.
Edit: plus ça change, plus c'est la même chose ("the more things change, the more they stay the same")