
I have a harmonic distortion question, or maybe it's a semantics question ;)

mhardy6647
... or maybe I am just far too naive. ;) Let me see if I can ask this question clearly...

I "realize" (or maybe "I believe" :)) that, for sine wave testing (i.e., single frequency, steady state test signals), harmonic distortion in an amplifier would manifest itself as any added harmonic content to the pure fundamental (e.g., a 2 kHz signal of any magnitude appearing in the output of a pure 1 kHz sine wave signal after passing through an amplifier).

But what about a signal that (in a hypothetical case) contains both 1 kHz and 2 kHz components -- let's say an admixture at a 4:1 ratio (by amplitude) of 1 kHz and 2 kHz sinusoidal components? If the amplifier under test added harmonic distortion to the signal, wouldn't the output then contain proportionally more of the 2 kHz component relative to the 1 kHz component ("all else being equal", e.g., blithely ignoring any intermodulation distortion)? Say from 4:1 in to 3:1 out (yeah, that'd reflect a lot of harmonic distortion; it's a Gedankenexperiment, OK? :)) -- not to mention, of course, higher-order harmonic distortion components.

Here's why I ask. I guess my belief is that any increase in a harmonic relative to a fundamental in the output of an amplifier would represent harmonic distortion by definition.
Isn't that, in a grossly simplified sense, what a tone control does (if output above a certain frequency is boosted)? E.g., suppose I turn up a treble control on a hypothetical amplifier (you know... a spherical amplifier of mass m and radius r, etc.) such that it boosted the output at 2 kHz by 3 dB relative to 1 kHz. Why is the effect of that tone control not harmonic distortion?

I am sure I must be missing something very fundamental (ahem, no pun intended) here, so, if anyone has the patience and gumption to help me understand, it'd be appreciated. Thanks for reading!
 
I think it is pretty simple: just think about each tone separately. Take the 1 kHz tone and say its 2nd-harmonic distortion is 1%. Your 4:1 ratio means the 2 kHz signal is at -12 dB, i.e. 25% of the 1 kHz amplitude. The 1 kHz tone's 2nd harmonic lands at 2 kHz at 1% of the fundamental's amplitude, which is 4% of the 2 kHz signal. If the two were exactly in phase, the amplitudes add and the output ratio would not be 100% : 25% but 100% : 26%; exactly out of phase, the harmonic partially cancels the 2 kHz signal and you'd get 100% : 24%. In practice the phase will be somewhere in between, so in dB terms the 2 kHz component shifts by at most about ±0.35 dB. As you can see, once distortion gets much lower than 1%, the effect is not going to be audible at all.
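To make that arithmetic concrete, here is a quick sketch using the same hypothetical numbers as above (1 kHz at full scale, 2 kHz at 25%, 1% second-harmonic distortion -- illustration, not a measurement):

```python
import math

# Hypothetical numbers from the thought experiment (not measurements):
fund = 1.0          # 1 kHz amplitude (100%)
second = 0.25       # 2 kHz component at a 4:1 ratio, i.e. -12 dB
h2 = 0.01 * fund    # 1% 2nd-harmonic distortion of the 1 kHz tone, landing at 2 kHz

in_phase = second + h2    # amplitudes add: 0.26, so 100% : 26%
out_phase = second - h2   # amplitudes cancel: 0.24, so 100% : 24%

for label, amp in (("in phase", in_phase), ("out of phase", out_phase)):
    delta_db = 20 * math.log10(amp / second)
    print(f"{label}: 2 kHz amplitude {amp:.2f}, shift {delta_db:+.2f} dB")
```

With an arbitrary real-world phase the shift lands somewhere between those two extremes, which is why the ±0.35 dB figure is a bound rather than an exact value.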
 
harmonic distortion in an amplifier would manifest itself as any added harmonic content to the pure fundamental (e.g., a 2 kHz signal of any magnitude appearing in the output of a pure 1 kHz sine wave signal after passing through an amplifier).
Yes, but I THINK most measurements (THD+N) include EVERYTHING that's not 1 kHz, including noise.

But what about a signal that (in a hypothetical case) contains both 1 kHz and 2 kHz components -- let's say an admixture at a 4:1 ratio (by amplitude) of 1 kHz and 2 kHz sinusoidal components? If the output of the amplifier under test added harmonic distortion to the signal, wouldn't the output then contain proportionally more of the 2 kHz component relative to the 1 kHz component
Yes, plus any additional higher-harmonic distortion from the 2kHz signal.

Here's why I ask. I guess my belief is that any increase in a harmonic relative to a fundamental in the output of an amplifier would represent harmonic distortion by definition.
Isn't that, in a grossly simplified sense, what a tone control does (if output above a certain frequency is boosted)? E.g., suppose I turn up a treble control on a hypothetical amplifier (you know... a spherical amplifier of mass m and radius r, etc.) such that it boosted the output at 2 kHz by 3 dB relative to 1 kHz. Why is the effect of that tone control not harmonic distortion?
Yes, boosting the treble boosts the distortion. But if you're talking about the 2kHz signal, that's NOT distortion.

With a signal that contains both 1 kHz and 2 kHz components, you aren't going to hear any added 2 kHz distortion: the much larger 2 kHz signal component swamps it.
 
The goal of a tone control is to introduce only linear distortion: purely a change in the amplitude of frequencies already in the signal. For example, a pure 5 kHz tone at 80% of full scale should come out as a pure 5 kHz tone at 85% of full scale after a treble boost, without any non-linear (harmonic) distortion being added.

Non-linear distortion results in a change in the spectral composition of the signal between input and output. Non-linear distortion can be harmonic or non-harmonic, but even harmonic distortion changes the spectral composition.

If a tone control added harmonic distortion, it would change the timbre of the sound and not only the tone.
 
It is possible for harmonic distortion to be effectively increased by linear filters later in the signal path, much like a bass reflex tube can amplify internal box resonances. That is effectively what causes the distortion peaks in speakers related to breakup modes. If a breakup mode creates a +20 dB frequency response peak, internally generated distortion will be amplified accordingly when a harmonic lands in that region.
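As a numeric illustration (the -60 dB and +20 dB figures are made up, not measurements): if a driver's 3rd harmonic sits 60 dB below the fundamental with a flat response, but the harmonic's frequency happens to land on a +20 dB breakup peak while the fundamental sees flat response, the radiated distortion ratio worsens by the full height of the peak:

```python
# Hypothetical numbers for illustration only.
harmonic_rel_db = -60.0  # 3rd harmonic vs fundamental, flat response
peak_boost_db = 20.0     # breakup-mode peak at the harmonic's frequency

# The harmonic alone gets the +20 dB boost, so the ratio degrades by 20 dB:
radiated_rel_db = harmonic_rel_db + peak_boost_db
print(radiated_rel_db)               # -40.0 dB

# As amplitude ratios: 0.1% distortion becomes 1%.
print(10 ** (harmonic_rel_db / 20))  # 0.001
print(10 ** (radiated_rel_db / 20))  # 0.01
```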
 