
DAC/Amp Impedance Matching?

FourT6and2

I understand the relationship between headphone amp output impedance and headphone impedance.

But what is the relationship between DAC and headphone amp? For example, RCA or XLR output on the DAC and the impedance rating of the RCA or XLR input on the headphone amp? Do they need to be matched? Does one need to be higher than the other? Do you want them both as low as possible? Just trying to figure out if the DACs and the amps I'm considering will play nicely with each other.
 
You'd like the amp input impedance to be at least 10 times the output impedance of the DAC. Usually this is not a problem.
 
It's the same idea. You want the input impedance (load; amp) to be significantly greater than the output impedance of the source (dac), neglecting the impedance of their interconnect.
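
If it helps to see the arithmetic: the source's output impedance and the amp's input impedance form a simple voltage divider, so the only penalty for a low ratio is a small level loss. A quick sketch (plain Python, my own illustration rather than anything from a spec sheet):

```python
import math

def bridging_loss_db(z_out_ohms: float, z_in_ohms: float) -> float:
    """Level loss (dB) from the voltage divider formed by a source's
    output impedance driving an amplifier's input impedance."""
    return 20 * math.log10(z_in_ohms / (z_in_ohms + z_out_ohms))

# A 10:1 ratio costs under 1 dB; 100:1 costs a negligible ~0.09 dB.
print(bridging_loss_db(100, 1_000))    # ~ -0.83 dB
print(bridging_loss_db(100, 10_000))   # ~ -0.09 dB
```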
 
Just about any line output (3.5mm or RCA) on a DAC will be able to work with just about any line input (3.5mm or RCA) on an amplifier.
Just about any balanced (XLR or 1/4" TRS) output on a DAC will work with just about any balanced (XLR or 1/4" TRS) input on an amplifier.
So the chance of any DAC not working with any amplifier is very remote.
 
Hello everyone,

This is my first post in the forums. I am not knowledgeable about audio technologies, which is why I found this site interesting and decided to sign up. Like the OP, I was aware of the rule of thumb about headphone impedance vs. amp output impedance, but I did not know that the output impedance of a DAC and the input impedance of an amplifier follow the same logic. Glad I saw this topic. :)

Currently I have a portable headphone amp connected to the analog output of the sound card in my PC, an Asus Xonar Essence STX. The reason for this apparently unnecessary connection is that the output impedance of the portable headphone amp, a Fiio E12 Mont Blanc, is much lower than that of the headphone output on the STX, and therefore better suited to driving low-impedance headphones (I have a few of them). I do not know the output impedance of the sound card's analog output (stereo RCA), but according to the specs of the portable headphone amp, its input impedance is greater than 5,000 ohms. So I guess I have nothing to worry about. Hehe.
 
The output impedance of a line-output jack (like RCA or 3.5mm) and the impedance of a line-input jack do not follow the same guidelines (or logic) as headphone amplifier output impedance versus headphone impedance (ohms).
So the 8-to-1 or 10-to-1 impedance guideline for headphone amp to headphone does not apply to a line-level signal between output and input jacks.
I've never seen impedance guidelines for connecting line jacks.
 
Where did you read this?

You do not have to read something somewhere that can be stated using just common sense and some experience.

Actually it is ≥ 9.869604401 (π²) times, because that sounds so much more scientific with pi and a 'square' in the calculation :D.
Personally I like to keep it > 100x but that's just my line of thinking and merely to be safe.
 
Where is this common sense learned from?
About the impedance (ohm) issues between a DAC output and an amplifier's input?
In all my years on the Head-Fi forum, I've never seen any real talk about impedance issues with line jacks.
 
That may be because there aren't any issues regarding impedances between line-level sources and line-level inputs, as long as the manuals don't call for specific load impedances and cable lengths don't become far too long.

The common sense could be a combination of practical experience and electronics theory picked up over an unspecified period of time.
I never read the >10x rule anywhere myself, nor do I suspect Blumlein88 has. So... common sense and no rule.
It's more like a guideline for those who want to go by numbers. >10x is probably OK; >100x is better unless noise becomes an issue.

As said, I usually keep it at >100x unless there is a technical reason to deviate.

FourT6and2 as well as Jumbotron have nothing to worry about.
This is something we all agree about.
 
Where did you read this?
It is the way solderdude described it. My preference is 20 to 1 or better, but 10 is okay.

My experience is in regard to using passive volume controls between DACs and power amps. Normally there just isn't a problem between line-level sources and amp inputs, which is likely why it isn't a topic you often see discussed.

Now, you may be thinking that one reason to keep output impedance low with speakers or headphones is their impedance varying with frequency, which you usually don't have between line outs and amp inputs. That part is true. For the benefit of running long lines without high-frequency issues, keeping a 10-to-1 ratio is good advice. Doing so is easy and certainly causes no problems.
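
To put a number on the long-line point: the source's output impedance and the cable's capacitance form an RC low-pass filter. A rough sketch, assuming roughly 100 pF per metre of interconnect capacitance (my assumption, not a figure from this thread):

```python
import math

def corner_freq_hz(z_out_ohms: float, cable_pf: float) -> float:
    """-3 dB corner of the low-pass formed by source output impedance
    and total cable capacitance."""
    return 1.0 / (2 * math.pi * z_out_ohms * cable_pf * 1e-12)

# A 100 ohm source into 2 m of cable (~200 pF): corner far above audio.
print(corner_freq_hz(100, 200))      # ~ 8 MHz
# Even 1 kohm into a long 10 m run (~1000 pF) stays well above 20 kHz.
print(corner_freq_hz(1_000, 1_000))  # ~ 160 kHz
```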
 
The only reason I asked is that I didn't know whether it mattered. It seems to matter with headphones, so maybe it matters (or doesn't) with the DAC-to-amp connection. And since all (or most) manufacturers list the DAC output impedance and the amp input impedance, I want to make sure before I buy.

That said, I have come across some specs that are very near the edge. For example, two of the DACs I'm considering have output impedances on the XLR jacks of (A) 135 ohms and (B) 1,000 ohms respectively, while two of the amps I'm considering have input impedances of (1) 50,000 ohms and (2) 10,000 ohms.

If the relationship should be 10x, then DAC B with Amp 2 only just scrapes by at exactly 10:1.
If the relationship is 100x, then only DAC A with Amp 1 clears it.

And by "work" I mean meet whatever rule of thumb exists.
 
What DAC specifies its output resistance as 1k?
Perhaps they meant minimum load resistance?
A 10k to 100k load is typical for amplifiers.
This is a resistive load.
There is no problem connecting sources with output impedances anywhere between 0 ohms and 1,000 ohms to it.

It will work perfectly and flawlessly in all situations.
 
Cayin iDAC-6MK2 has two modes: solid state and tube (for some reason). SS output impedance = 20 ohms and tube output impedance = 1,000 ohms. In any event, it seems like a non-issue for the majority of "normal" DACs out there.
 
There are no 10x or 100x "issues" with a normal line signal between a DAC and a headphone amplifier.
As long as the DAC's line-output signal has a decent voltage (which I'm guessing 99% of them do), you're OK.
 
Impedance matching is pointless at audio frequencies. You instead want impedance bridging (low Zout and at least 10x higher Zin), as already pointed out many times in this thread. It gives good signal integrity (maximum voltage transfer) for your line-level signals and a nice high damping factor between your amp and drivers.
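
To put rough numbers on the difference (my own sketch, not from the post above): matching the impedances halves the voltage, costing 6 dB, while a typical bridged connection costs almost nothing.

```python
import math

def divider_loss_db(z_out: float, z_in: float) -> float:
    """Voltage loss across the source-output / amp-input divider, in dB."""
    return 20 * math.log10(z_in / (z_in + z_out))

print(divider_loss_db(600, 600))     # matched 600-into-600: ~ -6.0 dB
print(divider_loss_db(100, 10_000))  # bridged 100-into-10k: ~ -0.09 dB
```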

That's my limited understanding, at least. I bet someone else can explain it a lot better.

Impedance matching is only good for RF applications (including digital signals). If anybody tries to sell you the idea that impedance matching is the recently uncovered key to audio nirvana, it should be a major red flag to you. Don't trust the technical expertise of people who claim that.

There's more than one very good reason as to why impedance bridging is the industry standard.
 
Cayin iDAC-6MK2 has two modes: solid state and tube (for some reason). SS output impedance = 20 ohms and tube output impedance = 1,000 ohms. In any event, it seems like a non-issue for the majority of "normal" DACs out there.

Yes, a tube output would do that. Depending on how the tube is used (as a gain device or as a follower), the 10k amp may lower the output signal by about 0.8 dB. The 50k input amp would give 0.17 dB of attenuation, which is negligible.
Higher-capacitance RCA cables will not audibly affect the frequency response in the case of the tube output either.
So aside from that 0.8 dB, the full-scale output from the 1k output into the 10k amp will still be about 2 V (instead of 2.2 V).
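
Those figures fall straight out of the same divider arithmetic; a quick check (the 2.2 V unloaded full-scale level is taken from the example above):

```python
import math

z_out = 1_000      # tube output impedance, ohms
v_unloaded = 2.2   # full-scale output with no load, volts

for z_in in (10_000, 50_000):
    gain = z_in / (z_in + z_out)
    print(f"into {z_in} ohm: {20 * math.log10(gain):.2f} dB, "
          f"{v_unloaded * gain:.2f} V full scale")

# into 10000 ohm: -0.83 dB, 2.00 V full scale
# into 50000 ohm: -0.17 dB, 2.16 V full scale
```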

No problem, even in this particular case.

EDIT: Given the high amount of added harmonics (avoiding the negative term "distortion"), I suspect there will be an audible difference between the two output modes, and given that amount it is most likely a tube used in amplification mode with an attenuator after it to get back down to line level.
 
There is no problem connecting sources with output impedances anywhere between 0 ohms and 1,000 ohms to it.

It will work perfectly and flawlessly in all situations.


So even if the output impedance of a DAC is nearly 0 ohms, would there be no difference compared with the same DAC design having 500 ohms of output impedance?

A commenter on Reddit thinks about it in a different way:
 