
Help me choose the output voltage from my DAC

Purpl3n3ss
Tagging @DonH56 because of his understanding of this topic.

My DAC output voltage has 4 discrete settings: 0.252 Vrms, 0.797 Vrms, 2.52 Vrms, and 7.957 Vrms.

The input sensitivity of my integrated amplifier is listed as 0.6 Vrms.

Now, between the preamp and power amp circuits of the integrated amp, there is a stepped attenuator for volume control. I understand the input sensitivity of the power amp section is 1.2 Vrms. So, it seems to me that the preamp circuit (which cannot be bypassed) amplifies a 0.6 Vrms signal to 1.2 Vrms. Anything above 1.2 Vrms is "thrown away" by the stepped attenuator (not sure if this happens before or after the volume control). My listening volume setting is about 60% when using the 0.797 Vrms input.

My question: In this scenario, should I use the 0.797 Vrms DAC output, or is there any advantage to using 2.52 Vrms? (I assume the other two choices are totally unsuitable.)
 
Aside: All voltages are RMS so I am leaving that qualifier off the units (lazy).

The attenuator does not "throw away" voltage; assuming the maximum it can handle is well over 1.2 V, it just reduces (attenuates) whatever voltage is applied. But if more than 1.2 V is applied to the power amp, and 1.2 V is the max input for rated output, the amp may clip. There is probably a little headroom above the sensitivity number, but we've no way to tell without tests and measurements, so assume it clips over 1.2 V input.

A passive volume control is an attenuator; thus the stepped attenuator is the volume control. The stepped part means it changes the volume in discrete steps (e.g. 0.1, 0.5, 1.0 dB or whatever) instead of being a continuous control. That does not make it digital (though the steps may be under digital control); it just means you change the volume in steps. The signal remains analog throughout, but attenuation is in discrete steps.

The preamplifier normally includes a gain stage to increase small input levels to meet the needs of the power amplifier, so unless you are connecting the DAC directly to the power amp's input, the 1.2 V does not really matter, assuming the preamp has sufficient internal gain to take 0.6 V to 1.2 V: a voltage gain of 2 V/V, or 6 dB. The gain stage can be before, after, or there can be gain on both sides of the volume control (a design choice by the manufacturer of your integrated amp).
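For reference, the dB and voltage-ratio conversions used throughout this post work like this (a minimal Python sketch; the helper names are mine, the voltages are from this thread):

```python
import math

def db_to_ratio(db: float) -> float:
    """Convert decibels to a voltage ratio (20*log10 convention)."""
    return 10 ** (db / 20)

def ratio_to_db(ratio: float) -> float:
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(ratio)

# Gain needed to take the 0.6 V spec to the 1.2 V power-amp input:
print(f"{ratio_to_db(1.2 / 0.6):.1f} dB")  # 6.0 dB
print(f"{db_to_ratio(6.0):.2f} V/V")       # 2.00 V/V
```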

60% of 0.797 V is 0.478 V, but that does not mean that is the average voltage through the amp. The average is probably significantly less than that, and peaks likely a little above that. The peak-to-average ratio is about 17 dB for very dynamic music, a voltage ratio of about 7.1. In other words, your average level is likely to be about 1/7 the peak level. But just knowing the volume control is at about 60% doesn't tell us what the actual signal levels are, since we do not know how much gain there is inside the integrated amp, nor the actual average and peak levels of the source (music). It is still interesting because it provides insight into the gain structure of the amp and where you like to listen; the middle of the volume range is not a bad place to be.
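A quick check of those numbers (a small sketch; 0.60 and 0.797 V come straight from the posts above, and the linear-control reading is an assumption):

```python
# A 17 dB peak-to-average ratio expressed as a voltage ratio:
crest_ratio = 10 ** (17 / 20)
print(f"{crest_ratio:.1f}")         # ~7.1, so average ~ 1/7 of peak

# 60% of the 0.797 V setting, read as a linear control:
print(f"{0.60 * 0.797:.3f} V")      # 0.478 V -- a maximum, not an average
```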

Given the input sensitivity is 0.6 V, I would use either the 0.252 V or 0.797 V output from the DAC. Note 0.252 V is 42% of the 0.6 V sensitivity rating, or 7.5 dB below it (20*log10(0.42)), so the preamp must provide that much gain (7.5 dB, 2.37 V/V) for the amp to be driven to maximum output. Most preamps offer at least 10 dB, so that should work, but it also means you'll be turning up the volume control, and potentially hear more noise. Using 0.797 V, the max DAC output is 2.5 dB more than the 0.6 V sensitivity spec, a very minor loss in max dynamic range. Or no loss at all: as long as the input does not clip and the volume control simply attenuates the signal by 2.5 dB, the dynamic range will be set by the amplifier's noise floor (assuming it is higher than the DAC's, a reasonable assumption IME). The 2.52 V max output level is 12.5 dB above the input spec of 0.6 V, so you're losing a lot of signal by attenuating that much, and may well be overdriving (clipping) the preamplifier's input with loud signals.
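The same arithmetic applied to all four DAC settings against the 0.6 V sensitivity spec (a sketch; positive numbers mean the DAC can drive the input past its rating by that much):

```python
import math

SENSITIVITY = 0.6  # integrated amp input sensitivity, Vrms

for vout in (0.252, 0.797, 2.52, 7.957):
    delta_db = 20 * math.log10(vout / SENSITIVITY)
    print(f"{vout:5.3f} V -> {delta_db:+5.1f} dB vs. sensitivity")

# 0.252 V ->  -7.5 dB  (preamp must make up this much gain)
# 0.797 V ->  +2.5 dB  (minor attenuation needed)
# 2.520 V -> +12.5 dB  (risks overdriving the preamp input)
# 7.957 V -> +22.5 dB  (clearly unsuitable)
```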

I would stick with the 0.797 V setting. It is unlikely to clip the input stage, allows for full output power, and probably stays well above the noise floor. You are just above the middle of your volume control, leaving plenty of margin to increase or decrease the level as desired.

HTH/IMO/etc. - Don
 
This is such a wonderfully detailed answer - thanks @DonH56! I will stick to the 0.797 V setting.

The gain stage can be before, after, or there can be gain on both sides of the volume control (a design choice by the manufacturer of your integrated amp).

They told me the preamp is before the volume control (stepped attenuator).

60% of 0.797 V is 0.478 V, but that does not mean that is the average voltage through the amp. The average is probably significantly less than that, and peaks likely a little above that.

Say the average voltage is 0.35 V. It seems like the preamp has a (fixed?) gain factor of 2 V/V. Wouldn't that send a 0.7 V signal to the power amp, which would be much less than its 1.2 V input sensitivity?

Note 0.252 V is 42% of the 0.6 V sensitivity rating, or 7.5 dB below it (20*log10(0.42)), so the preamp must provide that much gain (7.5 dB, 2.37 V/V) for the amp to be driven to maximum output.

But 0.6 V is the input sensitivity of the integrated amp as a whole, not the input sensitivity of the power amp section, which is 1.2 V. Wouldn't this mean the preamp would have to provide 7.5 dB + 6 dB = 13.5 dB, to reach 1.2 V? And that if the preamp gain factor is fixed at 2 V/V (6dB), it would not be able to provide this?
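(Checking that arithmetic with a quick sketch -- the voltages are the ones discussed above:)

```python
import math

# Gain needed to take the 0.252 V DAC output to the 1.2 V power-amp input:
print(f"{20 * math.log10(1.2 / 0.252):.1f} dB")  # 13.6 dB (~7.5 dB + 6 dB)
```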
 
This is such a wonderfully detailed answer - thanks @DonH56! I will stick to the 0.797 V setting.
You are welcome.
They told me the preamp is before the volume control (stepped attenuator).
I don't know who "they" are but that's fine. Gain can be before, after, or both places without substantially impacting the sound.

Say the average voltage is 0.35 V. It seems like the preamp has a (fixed?) gain factor of 2 V/V. Wouldn't that send a 0.7 V signal to the power amp, which would be much less than its 1.2 V input sensitivity?
I have no idea of the preamp's gain; 2 V/V would be 6 dB. There's no good way to determine the average voltage without a lot more information, and to prevent clipping you want the maximum (peak) voltage to reach 0.6 V (or 1.2 V at the power amp inputs), not the average voltage. I would expect your average to be well below 0.35 V. I could go into a big long discussion, but my advice is to not worry about it.

A little more: I do not know if 60% volume is a fraction of a linear or logarithmic (e.g. in dB) control. If it is linear, as I assumed above (but I could very well be wrong), then 60% of 0.6 V is 0.36 V, but frankly this is a case of precision without accuracy (or more than passing relevance). We know the peak-to-average ratio for dynamic music is around 17 dB, with much music coming in around 10 dB, and a max of maybe 20 dB. That means your average level is about 1/3 to 1/10 the peak level. Say you are peaking at 0.36 V; then the average level is probably somewhere between 10 and 20 dB less, or between 0.036 V and 0.114 V. OTOH, peaks could be well above 0.6 V depending upon the DAC's output level and the preamp's gain. There's really no way to tell from outside without measurements. If you are not hearing clipping then it is not an issue. If it is loud enough for you, there is nothing wrong with not driving the amp to max output as long as it is not so quiet that all you hear is noise (hiss).
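For what it's worth, that range works out like this (a sketch; the 0.36 V peak is the assumption from above, not a measurement):

```python
peak = 0.36  # assumed peak voltage, Vrms

for crest_db in (10, 17, 20):  # typical peak-to-average ratios for music
    avg = peak / 10 ** (crest_db / 20)
    print(f"crest {crest_db:2d} dB -> average ~ {avg:.3f} V")

# crest 10 dB -> average ~ 0.114 V
# crest 17 dB -> average ~ 0.051 V
# crest 20 dB -> average ~ 0.036 V
```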

But 0.6 V is the input sensitivity of the integrated amp as a whole, not the input sensitivity of the power amp section, which is 1.2 V. Wouldn't this mean the preamp would have to provide 7.5 dB + 6 dB = 13.5 dB, to reach 1.2 V? And that if the preamp gain factor is fixed at 2 V/V (6dB), it would not be able to provide this?
Again, nobody knows exactly what the peak voltage is from your DAC under your playing conditions with your music. We know the maximum the DAC will output is about 0.8 V, more than enough for full output power. Nor do we know if the preamp's gain is fixed at 6 dB (I would expect it to be higher), or exactly what the attenuation is with the volume control set at 60% of max. This is a case of GIGO -- garbage in, garbage out -- because there are too many unknowns. If it plays loudly enough for you without clipping, and you do not hear excessive noise, then you are done. Not worth obsessing about IMO.

Is there a problem you are trying to address? I assumed since you said you were at 60% volume that it was loud enough for you, and if so, IMO nothing more need be done. Just enjoy it!
 
I don't know who "they" are but that's fine.

Should have specified: "they" = the manufacturer of the amp (Linear Tube Audio).

If it is loud enough for you, there is nothing wrong with not driving the amp to max output as long as it is not so quiet that all you hear is noise (hiss).
If it plays loudly enough for you without clipping, and you do not hear excessive noise, then you are done. Not worth obsessing about IMO.

Is there a problem you are trying to address? I assumed since you said you were at 60% volume that it was loud enough for you, and if so, IMO nothing more need be done. Just enjoy it!

Got it, thanks!

No, no problem at all. The amp sounds great. I posted the question because I was wondering if I "should" use the 2.5V output (instead of 0.8V) for some reason. I came across stuff about higher voltage = higher SNR, and wondered if it might be important for me.
 
No, no problem at all. The amp sounds great. I posted the question because I was wondering if I "should" use the 2.5V output (instead of 0.8V) for some reason. I came across stuff about higher voltage = higher SNR, and wondered if it might be important for me.
Higher voltage may yield higher SNR if the larger signal is that much further above the noise floor. Doubling the signal voltage (a 6 dB increase) yields a 6 dB SNR increase, all else (including the noise floor) being equal. In practice the noise floor of DACs today is so low that this is not an issue. Higher output usually means higher distortion and heat, though for a DAC those considerations are again negligible. Using the 2.5 V output is much more likely to overdrive the integrated amplifier's input, which would cause way more distortion, or you will have to turn down the volume control a lot more, leading to effectively higher noise (and thus lower SNR) at the speakers. You are generally better off keeping the input and output levels matched as best as possible.
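To put rough numbers on the trade-off (a sketch; the 10 µV noise floor is purely illustrative, not a measured value):

```python
import math

def snr_db(signal_v: float, noise_v: float) -> float:
    """SNR in dB for RMS signal and noise voltages."""
    return 20 * math.log10(signal_v / noise_v)

NOISE = 10e-6  # hypothetical fixed noise floor, 10 uV rms

print(f"{snr_db(0.797, NOISE):.0f} dB at the 0.8 V setting")  # ~98 dB
print(f"{snr_db(2.52, NOISE):.0f} dB at the 2.5 V setting")   # ~108 dB

# On paper the 2.5 V setting gains ~10 dB of SNR, but if it forces ~10 dB
# more attenuation at the volume control for the same loudness, that gain
# is handed straight back -- and peaks may now clip the preamp input.
```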

For a deeper dive (and a very long read) see e.g. https://www.audiosciencereview.com/...opagate-through-my-system.33358/#post-1165118
 