Now, the gold standard for a hi-fi amplifier worthy of its name is that it is capable of x watts at 0.1% THD+N (60 dB down)...
I would like to understand the reasoning behind certain criteria.
THD is a failed metric when it comes to predicting perception of distortion. Better metrics exist but they are much more difficult to derive, so THD persists.
How do you notice 0.1% THD if even the best speaker is going to produce, realistically speaking, 0.5-1% THD (that is, 14-20 dB higher distortion) at normal listening volumes? What I'm really asking is: is that 0.1% figure just a conservative engineering target, based on the real-world expectation that your listening level will be around 60 dB above the background noise, so that even with a perfect speaker system the amplifier still needs to achieve that figure?
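For anyone who wants to check the arithmetic above, here is a quick sketch of the percentage-to-dB conversions, using the usual 20·log10 convention for amplitude ratios:

```python
import math

def ratio_to_db(ratio):
    """Convert an amplitude ratio to decibels (20*log10 convention)."""
    return 20 * math.log10(ratio)

# 0.1% THD relative to the fundamental
print(ratio_to_db(0.001))          # -60.0 dB -> the "60 dB down" figure

# Speaker at 0.5% and 1% THD compared with an amp at 0.1%
print(ratio_to_db(0.005 / 0.001))  # ~14 dB higher
print(ratio_to_db(0.010 / 0.001))  # 20 dB higher
```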
Loudspeaker harmonic distortion is almost always very low order (predominantly 2nd and 3rd harmonic) and is therefore perceptually relatively benign.
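To make that concrete, here is a minimal sketch of the classic THD calculation. The harmonic amplitudes are made up purely for illustration; the point is that THD sums harmonics with no regard to their order, so a benign 2nd/3rd-harmonic spectrum and a much harsher high-order spectrum can score the same number:

```python
import math

def thd_percent(fundamental, harmonics):
    """Classic THD: RMS sum of harmonic amplitudes over the fundamental, in percent."""
    return 100 * math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Made-up amplitudes relative to a fundamental of 1.0, listed from the 2nd harmonic up
low_order  = [0.009, 0.004]                          # 2nd and 3rd harmonic
high_order = [0.0, 0.0, 0.0, 0.004, 0.0, 0.009]      # 5th and 7th harmonic

print(thd_percent(1.0, low_order))   # ~0.98% THD
print(thd_percent(1.0, high_order))  # ~0.98% THD -- same number, very different sound
```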
Earl Geddes and Lydia Lee conducted an investigation into distortion perception that resulted in the GedLee Metric, which correlates well with how distortion is actually perceived. Here is an interview with Earl Geddes cued up to where he begins to discuss correlations between metrics and perception:
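If memory serves (please check the Geddes/Lee papers before relying on this), the metric works on the system's static transfer curve T(x) and weights its curvature by a cos⁴ term, so nonlinearity near the zero crossing, where masking is weakest, is penalized most. A rough numerical sketch follows; both the quoted formula and the example nonlinearities are my own illustration, not anything taken from the papers:

```python
import numpy as np

def gedlee_gm(transfer, n=20001):
    # Numerically estimate Gm = sqrt( integral_{-1}^{1} cos(pi*x/2)^4 * T''(x)^2 dx )
    # Formula quoted from memory of the Geddes/Lee papers -- verify before relying on it.
    x = np.linspace(-1.0, 1.0, n)
    dx = x[1] - x[0]
    t = transfer(x)
    d2 = np.gradient(np.gradient(t, x), x)      # curvature of the transfer curve, T''(x)
    w = np.cos(np.pi * x / 2.0) ** 4            # weighting: heaviest near the zero crossing
    return float(np.sqrt(np.sum(w * d2 ** 2) * dx))

# Made-up example nonlinearities, purely for illustration:
soft_clip = lambda x: np.tanh(1.2 * x) / np.tanh(1.2)    # gentle low-order compression
crossover = lambda x: x - 0.02 * np.tanh(50 * x)          # small dead zone at the zero crossing

print(f"Gm, soft clip:  {gedlee_gm(soft_clip):.2f}")
print(f"Gm, crossover:  {gedlee_gm(crossover):.2f}")      # much larger, despite modest THD
```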