
How much bass do I lose for DAC into low input impedance preamp?

Rantenti

Member
Joined
Mar 11, 2021
Messages
93
Likes
19
Hello, I've read that one would lose bass if the input impedance of a preamp is less than 10 times the output impedance of the DAC.

Why is the general recommendation ten times? Is there a calculator or chart I can use to understand the actual loss at different frequencies for various output-to-input impedance ratios?

Thanks
 

wwenze

Major Contributor
Joined
May 22, 2018
Messages
1,331
Likes
1,883
Well, the first thing that definitely needs clarification is what "output impedance" is referring to here. In this case it is not just the "impedance @ 1kHz" number that is often thrown around, but the impedance at all frequencies of concern: 20Hz, 100Hz... etc.
The reason I'm calling this point out as important is that a typical lineout will have an output impedance of <100ohm at 1kHz, but >1kohm at 20Hz.
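To see where that frequency dependence comes from, here is a minimal Python sketch of the impedance magnitude of a series resistor plus coupling capacitor (the 50 ohm and 4.7 µF values are purely hypothetical example numbers, not measurements of any particular DAC):

```python
import math

def source_impedance(f_hz, r_out_ohm, c_farad):
    """Magnitude of a series R + C output impedance at frequency f_hz."""
    x_c = 1.0 / (2 * math.pi * f_hz * c_farad)  # capacitive reactance
    return math.hypot(r_out_ohm, x_c)           # R and Xc add in quadrature

# Hypothetical line-out: 50 ohm output resistor with a 4.7 uF coupling cap
for f in (20, 100, 1000, 10000):
    print(f"{f:>5} Hz: {source_impedance(f, 50, 4.7e-6):8.1f} ohm")
```

With those assumed values it comes out to roughly 1.7kohm at 20Hz but only about 60ohm at 1kHz, which is exactly the kind of behaviour described above.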

Next, understand the typical circuit and the source of the bass loss: there is almost always an output capacitor in series with the signal, which is needed to block DC. However, a capacitor in series with the load resistance creates a high-pass filter.

In most cases, knowing the value of the capacitor and the value of the load resistor gives you the cutoff frequency: fc = 1 / (2*pi*R*C). In reverse, you can choose your desired cutoff frequency and calculate the required R and C combination.
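Here is a minimal sketch of both directions of that calculation, assuming a simple first-order RC high-pass (the 10 µF, 10 kohm and 600 ohm figures are just illustrative, not values from any specific device):

```python
import math

def cutoff_hz(r_ohm, c_farad):
    """-3 dB corner of a first-order RC high-pass: fc = 1 / (2*pi*R*C)."""
    return 1.0 / (2 * math.pi * r_ohm * c_farad)

def required_cap_farad(r_ohm, fc_hz):
    """Coupling cap needed for a chosen -3 dB corner into a given load."""
    return 1.0 / (2 * math.pi * r_ohm * fc_hz)

# Illustrative numbers only
print(cutoff_hz(10e3, 10e-6))       # 10 uF into 10 kohm -> ~1.6 Hz
print(required_cap_farad(600, 2))   # 2 Hz corner into 600 ohm -> ~133 uF
```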

I would believe the above method is more intuitive and is what most people would do. Basically find the size of the output capacitor and the load resistance.

Now to tackle this statement, "one would lose bass if the input impedance of a preamp is less than 10 times the output impedance of the DAC"

A capacitor's impedance is a function of frequency: the lower the frequency, the higher the impedance. At low frequencies your output impedance is dominated by the capacitor's impedance.

This impedance forms a voltage divider with the load. If your load impedance is just 0.1 times the capacitor impedance at that frequency, then you only get around 10% of the voltage, or in decibel terms, -20dB. That in itself is not yet a problem, but because capacitor impedance decreases with increasing frequency, at a higher frequency your load might receive 99% of the voltage, which means your bass will be 20dB lower than your treble.
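Since the original question asked for a calculator or chart, here is a rough Python sketch of that divider (the 10 µF cap and the two load values are assumed purely for illustration). It prints the level into the load in dB, relative to frequencies where the cap is effectively a short:

```python
import math

def loss_db(f_hz, c_farad, r_load_ohm):
    """First-order high-pass formed by coupling cap C into load R:
    |gain| = R / sqrt(R^2 + Xc^2), expressed in dB."""
    x_c = 1.0 / (2 * math.pi * f_hz * c_farad)
    gain = r_load_ohm / math.hypot(r_load_ohm, x_c)
    return 20 * math.log10(gain)

# Assumed example: 10 uF coupling cap into a 10 kohm vs a 600 ohm input
for r_load in (10e3, 600):
    print(f"Load {r_load:.0f} ohm:")
    for f in (20, 50, 100, 1000):
        print(f"  {f:>5} Hz: {loss_db(f, 10e-6, r_load):6.2f} dB")
```

With those assumed values the 10 kohm load loses almost nothing at 20Hz (about 0.03dB), while the 600 ohm load is already down about 4.4dB there.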

Frankly, this is the first time I have seen people recommend things this way, in the form of impedances, and I'm impressed with its technical correctness. I can't say I agree with the suggested values, however, because the typical definition of cutoff is -3dB, and anything quieter than that is audibly reduced. So if I were to rewrite the statement, I would say "load impedance needs to be at least equal to source impedance across all frequencies of concern".
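As a quick check on that -3dB figure, here is a tiny sketch assuming the source impedance at the frequency in question is essentially just the coupling cap's reactance, so resistance and reactance add in quadrature:

```python
import math

def divider_loss_db(ratio):
    """Loss when the load resistance is `ratio` times the capacitive
    source reactance (first-order RC, quadrature sum)."""
    return 20 * math.log10(ratio / math.hypot(ratio, 1.0))

print(divider_loss_db(1))    # ~ -3.01 dB: load equal to source impedance
print(divider_loss_db(10))   # ~ -0.04 dB: the "10 times" rule, capacitive case
```

(For comparison, a purely resistive 10:1 divider works out to 20*log10(10/11), about -0.8dB.)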
 

AnalogSteph

Major Contributor
Joined
Nov 6, 2018
Messages
3,397
Likes
3,348
Location
.de
I would believe the above method is more intuitive and is what most people would do. Basically find the size of the output capacitor and the load resistance.
More specifically, R should be the sum of load and output resistance, although the latter is generally much smaller than the former and as such doesn't make a great deal of difference. (The same is not necessarily true when we're talking headphones, where an added 75 ohms in series can save otherwise marginal bass reproduction with 32 ohm headphones when coupling capacitors are no more than 100 µF - a typical scenario for onboard audio outputs. It also makes the headphone driver's job easier. This "fix" also increases interaction with non-constant headphone impedance though, so is not generally Hi-Fi approved.)
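To put rough numbers on that headphone scenario (a sketch that treats the coupling cap as working into the added series resistor plus the headphone, and ignores the amp's own output resistance):

```python
import math

def cutoff_hz(r_total_ohm, c_farad):
    """-3 dB corner of the coupling-cap high-pass: fc = 1 / (2*pi*R*C)."""
    return 1.0 / (2 * math.pi * r_total_ohm * c_farad)

C = 100e-6                     # 100 uF coupling cap, as in the post above
print(cutoff_hz(32, C))        # ~50 Hz into 32 ohm headphones alone
print(cutoff_hz(75 + 32, C))   # ~15 Hz with the added 75 ohm in series
```

The obvious cost is level: 32/(75+32) is roughly -10dB, on top of the interaction with the headphone's non-constant impedance mentioned above.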
 