Hi John. You've asserted in your posts that damping factor is a critical determinant of amplifier quality in an audio system. For years I've treated low output impedance as important when shopping for amplifiers, but to be honest I've always felt a little silly about it after reading many articles which assert technical reasons for the opposite view. Like this one:
https://www.audiofrog.com/community/tech-tips/damping-factor-and-why-it-isnt-much-of-a-factor-2/
I've owned speakers with low impedance combined with capacitive phase angles, which, at least for my buying decisions, has been sufficient justification for demanding very low output impedance. I'm not planning to change my buying criteria at all, even though my current speakers (Salon2s) are an easy load, but how do you respond to these damping-factor-isn't-important assertions?
The paper above is technically correct, but it is focused on the wrong problem.
The real issue is that a low damping factor can cause significant frequency response errors. As the paper points out, the slight changes in driver damping are relatively insignificant. Unfortunately, the paper completely misses the real issue, which is frequency response.
This confusion is caused by an unfortunate choice of terminology.
The term "damping factor" is very misleading. It has very little to do with the damping of the driver motion. Instead, it has everything to do with maintaining a flat frequency response and linear phase response (measured at the amplifier terminals). With adequate speaker cables, this frequency and phase response will be delivered to the speaker terminals.
It would have been better if the audio industry had called this the "impedance ratio" instead of the "damping factor".
Damping_Factor = 8/Output_Impedance, where 8 Ohms is the nominal speaker impedance.
Damping Factor can also be specified at other nominal impedances, and this adds to the confusion.
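This definition is easy to express in code. Here is a minimal sketch (Python, with the function names my own; the 8-Ohm reference is the nominal impedance from the formula above):

```python
def damping_factor(output_impedance_ohms, nominal_impedance_ohms=8.0):
    """Damping factor = nominal speaker impedance / amplifier output impedance."""
    return nominal_impedance_ohms / output_impedance_ohms

def output_impedance(df, nominal_impedance_ohms=8.0):
    """Invert the definition to recover the amplifier's output impedance."""
    return nominal_impedance_ohms / df

# A damping factor of 60 (8-Ohm reference) implies roughly 0.133 Ohms:
print(round(output_impedance(60), 3))
```

Note that the same amplifier quoted against a 4-Ohm nominal impedance would show half the damping factor, which is exactly the confusion described above.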
Headphone amplifiers are specified differently:
Fortunately it is common practice to specify the output impedance of headphone amplifiers (instead of the damping factor). The audio industry should do the same with power amplifiers.
So, in answer to your question, the output impedance of an amplifier must be low to maintain a predictable frequency and phase response. This means the "damping factor" must be much higher than an analysis of damping would suggest.
The frequency response of a given speaker is only repeatable, from amplifier to amplifier, when all of the amplifiers have a high damping factor. If you connect the speaker to an amplifier with a low damping factor, you will get a different frequency response. This is especially problematic at the low end of the response and at each of the crossover frequencies.
Here is some math:
Let's suppose a speaker with an 8-Ohm nominal impedance has one or more frequencies where the impedance drops to 2 Ohms (this is not uncommon).
At a damping factor of 60, the output impedance of the amplifier is 8/60 = 0.133 Ohms.
The attenuation at the 2-Ohm impedance point will be 2/(2+0.133) = 0.938
Converting to dB: 20*Log(0.938) = -0.56 dB
At a damping factor of 370, the output impedance of the amplifier is 8/370 = 0.0216 Ohms.
The attenuation at the 2-Ohm impedance point will be 2/(2+0.0216) = 0.989
Converting to dB: 20*Log(0.989) = -0.093 dB
In this example, the higher damping factor keeps the frequency response variations just under 0.1 dB. With the lower damping factor, the variation is 0.56 dB.
0.56 dB may not sound like a lot, but keep in mind that it has been shown that A/B and A/B/X tests should be level matched to +/- 0.1 dB.
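The arithmetic above can be checked with a few lines of Python. This is just the amplifier/speaker voltage divider evaluated at the impedance minimum:

```python
import math

def attenuation_db(load_ohms, output_impedance_ohms):
    """Attenuation of the amplifier/speaker voltage divider, in dB."""
    ratio = load_ohms / (load_ohms + output_impedance_ohms)
    return 20 * math.log10(ratio)

for df in (60, 370):
    rout = 8 / df                   # output impedance from the 8-Ohm damping factor
    db = attenuation_db(2.0, rout)  # dip at the 2-Ohm impedance minimum
    print(f"DF {df}: {db:.2f} dB")
```

Running this reproduces the two results above: about -0.56 dB at a damping factor of 60, and under -0.1 dB at 370.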
But there are more issues:
At the crossover frequencies, the speaker impedance can change very rapidly. If the amplifier output impedance is too high, this variation in the load impedance can impact the phase alignment between the two drivers at the crossover point. This change in phase alignment can have a big impact on the amplitude response at the crossover frequency.
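The phase effect can be illustrated with the same voltage divider and a complex load. The load below is a made-up capacitive impedance, not a measurement of any real speaker; it simply shows that the phase shift grows as the amplifier output impedance grows:

```python
import cmath
import math

def divider_phase_deg(z_load, r_out):
    """Phase shift (degrees) of the amplifier/speaker voltage divider."""
    return math.degrees(cmath.phase(z_load / (z_load + r_out)))

z = 4 - 3j  # hypothetical capacitive load near a crossover frequency
for df in (60, 370):
    print(f"DF {df}: {divider_phase_deg(z, 8 / df):+.2f} degrees")
```

With this load, the low-damping-factor amplifier shifts the phase several times more than the high-damping-factor amplifier, and that shift lands exactly where the two drivers must stay aligned.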
How high does the damping factor need to be?
Find your speaker's minimum impedance (lowest impedance on the impedance vs. frequency curve) and then apply one of my personal rules of thumb:
Output Impedance Rule of Thumb:
For a 0.1 dB amplitude variation, the output impedance of the amplifier must be 1/86th of the speaker's minimum impedance.
Example:
If the minimum speaker impedance = 2 Ohms:
2/86 = 0.023 Ohms, maximum output impedance
Lower output impedances will reduce the amplitude variation below 0.1 dB, but the difference should not be audible.
Damping Factor Rule of Thumb:
For a 0.1 dB amplitude variation, the required 8-Ohm damping factor = 688/(minimum_speaker_impedance).
Example:
If the minimum speaker impedance = 2 Ohms:
The minimum 8-Ohm damping factor = 688/2 = 344
Higher damping factors will reduce the amplitude variation below 0.1 dB, but the difference should not be audible.
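Where does the factor of 86 come from? Solving Z/(Z + R) = 10^(-0.1/20) for R gives R ≈ Z/86.4, which the rules of thumb round to Z/86 (and 688 = 8 × 86). A small sketch that computes both rules directly, rather than from the rounded constants:

```python
def max_output_impedance(z_min, max_variation_db=0.1):
    """Largest amplifier output impedance that keeps the voltage-divider
    dip at the speaker's impedance minimum within max_variation_db."""
    ratio = 10 ** (-max_variation_db / 20)  # about 0.9886 for 0.1 dB
    return z_min * (1 / ratio - 1)

def min_damping_factor(z_min, max_variation_db=0.1, nominal=8.0):
    """Minimum 8-Ohm damping factor meeting the same 0.1 dB target."""
    return nominal / max_output_impedance(z_min, max_variation_db)

# For a 2-Ohm minimum: about 0.023 Ohms, and a damping factor near 345
# (the rounded 688/2 rule of thumb gives 344)
print(round(max_output_impedance(2.0), 3), round(min_damping_factor(2.0)))
```

The small disagreement between 344 and 345 is just the rounding of 86.4 down to 86 in the rule of thumb.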