If your -xx dB above is greater than 60 dB, then music can be played as loud as you like. If -xx dB is the worst-case distortion at any level, then distortion will only ever be lower at levels below maximum.

You missed the point of the thread.
It wasn't:
"At what level is distortion free"
It was:
"If this is the dynamic range of one's hearing, and harmonic distortion is discernable from music, during music, then up to how loud can music be played with 0% chance of harmonic distortion being audible, if peak distortion is at -xx dB"
Purely theoretical. The logic tracked. We had an interesting conversation otherwise, though.
(I switched things around a bit for your explanation, but only because summarizing exactly what I wrote would take a lot longer, and this means the same thing [plus I'm explaining what you didn't read in the first sentences of the OP].)
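The arithmetic behind the question can be sketched in a few lines. The specific numbers (a 60 dB effective dynamic range during music, and a 120 dB SPL upper limit of hearing) and the function name are illustrative assumptions of mine, not figures from the thread:

```python
# Sketch of the question's logic: if harmonic distortion peaks at
# -x dB relative to the signal, the distortion products always sit
# x dB below whatever level the music is played at.

def max_inaudible_level(distortion_db_below_signal, audibility_floor_spl,
                        upper_limit_spl=120.0):
    """Loudest playback level (dB SPL) at which distortion products
    stay below an assumed audibility floor.

    Returns None ("as loud as you like") when even playback at the
    assumed upper limit of hearing keeps distortion under the floor.
    """
    limit = audibility_floor_spl + distortion_db_below_signal
    if limit >= upper_limit_spl:
        return None  # distortion never clears the floor within the hearing range
    return limit

# Distortion at -60 dB with a (masked) audibility floor of 60 dB SPL:
# 60 + 60 = 120, at the assumed top of the hearing range, so unlimited.
print(max_inaudible_level(60, 60))  # -> None

# Distortion at -40 dB under the same assumptions caps playback at 100 dB SPL.
print(max_inaudible_level(40, 60))  # -> 100
```

This is why, under those assumptions, distortion more than 60 dB down means music can be played "as loud as you like": the distortion products can never rise above the floor within the hearing range.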
What you're asking, I think, assumes that at some peak level, distortion will be audible, and therefore greater than 1% or -40 dB. That's just not the case with any modern (say, post-1960, and definitely post-1980) amplifier design, provided the amplifier isn't clipping. Earlier amplifiers (and modern ones built to early designs, like SETs) can produce enough distortion to be audible, but I don't think that's what you mean.
The dynamic range of one's hearing isn't really relevant here because, again, with decent designs, distortion either remains a constant percentage with level or falls as the level falls. Only in a few early transistor amplifiers did percentage distortion increase as level decreased, due to crossover distortion, which is why they made a Big Thing about only quoting distortion at high level.
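For reference, the "1% or -40 dB" equivalence above is just the standard amplitude-ratio conversion, 20·log10 of the distortion fraction; a minimal sketch:

```python
import math

# Convert distortion expressed as a percentage of signal amplitude
# to dB relative to the signal, and back.

def percent_to_db(percent):
    """e.g. 1% distortion -> -40 dB relative to the signal."""
    return 20.0 * math.log10(percent / 100.0)

def db_to_percent(db):
    """e.g. -60 dB relative to the signal -> 0.1% distortion."""
    return 100.0 * 10.0 ** (db / 20.0)

print(percent_to_db(1.0))   # -> -40.0
print(db_to_percent(-60))   # -> 0.1
```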
S.