... or maybe I am just far too naive. Let me see if I can ask this question clearly...
I "realize" (or maybe "I believe") that, for sine wave testing (i.e., single-frequency, steady-state test signals), harmonic distortion in an amplifier would manifest itself as any added harmonic content to the pure fundamental (e.g., a 2 kHz signal of any magnitude appearing in the output after a pure 1 kHz sine wave signal passes through the amplifier).
But what about a signal that (in a hypothetical case) contains both 1 kHz and 2 kHz components -- let's say an admixture, at a 4:1 amplitude ratio, of 1 kHz and 2 kHz sinusoidal components? If the amplifier under test added harmonic distortion to the signal, wouldn't the output then contain proportionally more of the 2 kHz component relative to the 1 kHz component ("all else being equal", e.g., blithely ignoring any intermodulation distortion)? Say from 4:1 in to 3:1 out (yeah, that'd reflect a lot of harmonic distortion; it's a Gedankenexperiment, OK?) -- not to mention, of course, higher-order harmonic distortion components.
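To make the thought experiment concrete, here's a quick numerical sketch. This is a toy model I made up (a weak square-law nonlinearity standing in for "an amplifier with harmonic distortion"); the 0.1 coefficient, the sample rate, and the helper names are all just illustrative assumptions, not any real amplifier:

```python
# Toy model only: a weak square-law nonlinearity standing in for
# "an amplifier with harmonic distortion". All numbers are made up.
import numpy as np

fs = 48000                 # sample rate; a 1-second signal gives 1 Hz FFT bins
t = np.arange(fs) / fs

def amp(x):
    return x + 0.1 * x**2  # the x**2 term is what creates the harmonics

def level(signal, freq_hz):
    # magnitude of the FFT bin at freq_hz (bins are 1 Hz wide here)
    return np.abs(np.fft.rfft(signal))[int(freq_hz)] / len(signal)

# Case 1: pure 1 kHz in -> new 2 kHz content appears at the output
pure = np.sin(2 * np.pi * 1000 * t)
print("2 kHz out of a pure 1 kHz tone:", level(amp(pure), 2000))

# Case 2: 4:1 mix of 1 kHz and 2 kHz in -> the ratio shifts toward 2 kHz
mix = pure + 0.25 * np.sin(2 * np.pi * 2000 * t)
out = amp(mix)
print("1k:2k ratio in: 4.0  out:", level(out, 1000) / level(out, 2000))
```

In this toy, the pure tone does pick up 2 kHz content, and the mix does come out with a ratio a bit below 4:1, which is the effect I'm describing (less dramatic than my 3:1 example, of course).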
Here's why I ask. I guess my belief is that any increase in a harmonic relative to a fundamental in the output of an amplifier would represent harmonic distortion by definition.
Isn't that, in a grossly simplified sense, what a tone control does (if output above a certain frequency is boosted)? E.g., suppose I turn up a treble control on a hypothetical amplifier (you know... a spherical amplifier of mass m and radius r, etc.) such that it boosts the output at 2 kHz by 3 dB relative to 1 kHz. Why is the effect of that tone control not harmonic distortion?
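And here's the tone-control half of the thought experiment, again as a toy: a crude "treble boost" built as input plus extra high-passed signal. The ~1.4 kHz corner and the 0.8 gain are arbitrary choices of mine (not tuned to exactly 3 dB), and I'm assuming scipy here:

```python
# Toy tone control: input + extra high-passed signal (a crude treble
# shelf). It is a linear filter, so it rescales existing components
# but creates no new frequencies. Corner and gain are arbitrary.
import numpy as np
from scipy.signal import butter, lfilter

fs = 48000
t = np.arange(fs) / fs
mix = np.sin(2 * np.pi * 1000 * t) + 0.25 * np.sin(2 * np.pi * 2000 * t)

def level(signal, freq_hz):
    return np.abs(np.fft.rfft(signal))[int(freq_hz)] / len(signal)

b, a = butter(2, 1400 / (fs / 2), btype="high")  # high-pass, ~1.4 kHz corner
boosted = mix + 0.8 * lfilter(b, a, mix)

print("1k:2k ratio in: 4.0  out:", level(boosted, 1000) / level(boosted, 2000))
print("anything new at 3 kHz?", level(boosted, 3000))  # stays ~0
```

In this sketch the 1k:2k ratio shifts just like it did in the distortion case; the only difference my toy shows is that nothing new appears at 3 kHz.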
I am sure I must be missing something very fundamental (ahem, no pun intended) here, so, if anyone has the patience and gumption to help me understand, it'd be appreciated. Thanks for reading!