The objective of every amplifier is--or should be--to amplify the incoming signal linearly, with the least distortion (for me, distortion is any departure from linearity). Any amplifier meeting that objective should sound the same as any other amplifier meeting it, no matter what the amplification technology.
There is a threshold at which non-linearity becomes detectable. That threshold varies with the type of distortion and the hearing acuity of the listener, and we know quite a lot about it. My own threshold for harmonic distortion becoming effectively undetectable is very low, perhaps under -40 dB relative to the fundamental (below about 1%).
All these challenges were limited to amplifiers operated within their linear range. The question is: at what point, and under what load, does a given amplifier lose linearity? Did it happen during the test? Does it happen in any stated use case? Any test of amplifiers should evaluate this, it seems to me. There was one article from the deep past suggesting that most amplifier evaluations were made with distortion levels elevated by pushing the amps into their nonlinear range. I think the figure that article suggested (I cannot now recall the citation) was 1%, or -40 dB. Most distortion at that level will be audible to most people, particularly in the way high-power transients are amplified (or not).
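As a quick sanity check on that 1% figure, here is a minimal sketch of the percent-to-dB conversion (the function names are just illustrative, not from any particular measurement tool):

```python
import math

def thd_percent_to_db(percent: float) -> float:
    """Convert a distortion figure in percent to dB relative to the fundamental."""
    return 20 * math.log10(percent / 100.0)

def thd_db_to_percent(db: float) -> float:
    """Convert dB (relative to the fundamental) back to percent."""
    return 100.0 * (10 ** (db / 20.0))

print(thd_percent_to_db(1.0))    # -40.0 dB: the level that old article cited
print(thd_percent_to_db(0.01))   # -80.0 dB: 0.01% expressed in dB
print(thd_db_to_percent(-40.0))  # 1.0 %
```

So 1% distortion sits only 40 dB below the signal, which is why pushing an amp into that territory during an evaluation can swamp whatever subtler differences the test was supposed to reveal.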
Of course, not all amplifiers are trying to be linear. Some apply a coloration very much on purpose. But what grinds my gears is when their creators then claim that they are linear.
That is what the gentleman should have explored, given his conviction that the differences were "ridiculously easy" to hear. If there were differences that were easy to hear, then one of the amps, at least, was nowhere close to being linear. I doubt the manufacturer would be willing to admit that.
Rick "thinking high-power transients happen a lot more often than people realize" Denney