First of all, I wish all ASR-members a happy and healthy 2024!
The "Damping factor" difference between the NAD298 and the AHB2 is not the reason for the difference in bass reproduction that Kai heard. In the following I try to explain this in a (simplified) way.
"Damping factor" is merely a (confusing) way of specifying the output impedance of the amplifier (it is defined as nominal speaker impedance divided by output impedance), and its name hints at a phenomenon it has (some) influence on: the damping of the loudspeaker cone.
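To make the definition concrete, here is a minimal sketch; the impedance values are illustrative assumptions, not measurements of any particular amplifier:

```python
# "Damping factor" is simply nominal speaker impedance divided by the
# amplifier's output impedance. The numbers below are assumed examples.
def damping_factor(z_nominal_ohm: float, z_out_ohm: float) -> float:
    return z_nominal_ohm / z_out_ohm

print(damping_factor(8, 0.04))  # modern transistor amp: 200.0
print(damping_factor(8, 2.0))   # typical tube amp: 4.0
```

Note how the same speaker yields wildly different "damping factors" purely because of the amplifier's output impedance, which is all the number really encodes.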
Let's have a look at this cone damping:
Suppose the amplifier sends a pure sine wave to the loudspeaker, causing the cone to move at a certain frequency. Suppose now the amp suddenly stops feeding the loudspeaker this signal. Ideally the cone would stop moving at once as well. It doesn't, however, due to the inertia of the cone, voice coil and other mechanical components attached to it. There will always be some purely mechanical damping due to the mechanical resistance of the spider etc., which slows down the movement.
There is an additional source of damping: the moving voice coil in the permanent magnetic field produces a voltage that tends to create a current, and an accompanying magnetic force opposing its movement. This is Lenz's law at work, and it helps to dampen the movement. The magnitude of this damping is determined by the current that is able to flow; the larger the current, the faster the movement decays.
This means that the total impedance of the electrical circuit formed by the amplifier output, speaker cable, passive crossover and voice coil needs to be as low as possible.
To see which of these elements matter, we need to look at the relative magnitudes of the impedances. For simplicity I assume that the impedance of the loudspeaker itself is on the order of a few Ohms. The speaker cable adds a few hundredths of an Ohm at most. To substantially influence the total impedance of the circuit, the output impedance would need to be of the same order of magnitude as the loudspeaker impedance or larger. This probably used to be the case in the era of tube amps, but with modern transistor amps the output impedance is a few tenths of an Ohm at most in most situations.
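A quick back-of-the-envelope sketch of the series loop makes the point; all three resistances below are assumed illustrative values:

```python
# The electromagnetic damping current flows through the whole series loop:
# voice coil + crossover, cable, and amp output impedance.
# Illustrative (assumed) values:
r_voice_coil = 6.0   # Ohm, typical DC resistance of an 8-Ohm driver
r_cable      = 0.03  # Ohm, a few metres of ordinary speaker cable
r_out_amp    = 0.04  # Ohm, modern transistor amp output impedance

r_total = r_voice_coil + r_cable + r_out_amp
share_amp = r_out_amp / r_total

print(f"total loop resistance: {r_total:.2f} Ohm")
print(f"amp output impedance share: {share_amp:.1%}")
```

With numbers like these the amplifier contributes well under one percent of the loop resistance, which is why its influence on cone damping is negligible.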
The conclusion is that with modern transistor amplifiers the main factor determining the amount of electromagnetic cone damping is the impedance of the voice-coil/crossover circuit in the speaker itself: the influence of the output impedance is negligible.
This is an important reason to abandon the term "Damping Factor" and shows that whatever difference Kai heard between the NAD and the Benchmark was not caused by the difference in output impedance (damping factor).
The second and most important aspect where the output impedance plays a role is that it forms a voltage divider between the amp output and the loudspeaker. Because the loudspeaker impedance is complex and therefore varies with frequency, it is important to keep the combined impedance of speaker cable and amplifier output sufficiently low; otherwise the divider ratio, and thus the frequency response at the speaker terminals, varies with frequency as well. As mentioned in an earlier thread, this is described in an excellent article by John Siau of Benchmark ("Damping Factor Isn't Much of a Factor", benchmarkmedia.com).
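To illustrate the voltage-divider effect with a small sketch (the impedance swing and source impedances are assumed, round numbers): if a speaker's impedance varies between, say, 3 and 30 Ohms across the band, the source impedance determines how much response ripple the divider produces.

```python
import math

def level_deviation_db(z_speaker_min: float, z_speaker_max: float,
                       z_source: float) -> float:
    """Peak-to-peak frequency-response deviation (dB) caused by the
    voltage divider between source impedance and speaker impedance."""
    hi = z_speaker_max / (z_speaker_max + z_source)  # least attenuation
    lo = z_speaker_min / (z_speaker_min + z_source)  # most attenuation
    return 20 * math.log10(hi / lo)

# Assumed impedance swing of 3..30 Ohm across the audio band:
print(f"{level_deviation_db(3, 30, 0.05):.2f} dB")  # transistor amp + cable
print(f"{level_deviation_db(3, 30, 2.0):.2f} dB")   # tube-amp-like source
```

With a 0.05-Ohm source the ripple stays near a tenth of a dB; with 2 Ohms it approaches 4 dB, which is clearly audible. This, and not cone damping, is where a low output impedance genuinely matters.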
In view of the above, it is my strong wish that the industry stop specifying the output impedance as "Damping Factor". Perhaps it was once a brilliant marketing concept to distinguish transistor amps from tube amps, but today it is a source of confusion and misunderstanding.