The point is: can you actually hear distortion measured in microvolts? I suspect one could, if one uses high-sensitivity loudspeakers. Some of us, here and there in the world, still do, you see?
Thus measuring the ratio of distortion to signal voltage (% or dB) is probably more meaningful in most cases.
Full disclosure: I haven't measured the average output power (much less voltage) of my amplifier when driving the ca. 104 dB SPL/watt @ 1 meter Altec-driver-loaded loudspeakers I usually listen to, at the (low-ish) level at which I typically listen, but I reckon the absolute values of power and voltage are pretty low.
Fermi estimate: if 1 watt (2.83 VAC into 8 ohms) gives 104 dB SPL at 1 meter, and I might typically listen at 80 dB (i.e., 24 dB less, requiring ca. 251-fold less input power), we're talkin' (and, by all means, check my arithmetic!) on the order of 4 mW amplifier power. Since voltage scales with the square root of power, that's roughly 180 mVAC into 8 ohms, and 1% distortion would be on the order of 1.8 mV.
Truth be told, at these low (millivolt-ish and below) levels, noise is more of a problem (in my own amplifier's case) than harmonic distortion, though.
Are you absolutely sure?:
Dude. The tone comes from the speaker.
Sometimes, the laminations on the transformers may sing a little, too, though -- in full disclosure.