
Vrms in DACs

Joined
Mar 13, 2019
Messages
28
Likes
5
#1
What is the Vrms analog output, and how does it affect the sound? Will a DAC with nearly 5 Vrms output deliver better performance than a DAC with 2.1 Vrms output?
The DAC will be connected to an analog amplifier with an input sensitivity of 500 mV.
 

restorer-john

Major Contributor
Joined
Mar 1, 2018
Messages
3,042
Likes
5,441
Location
Gold Coast, Queensland, Australia
#2
The input sensitivity of your amplifier is the voltage required to develop its rated power output. If you significantly exceed the rated sensitivity, you will lose volume control range, and you may overload the front end more easily, introducing distortion and degrading S/N.

D/A converters can gain several dB in S/N and DR by having ridiculously high rated outputs. Whether that is at all beneficial depends on the sensitivity of the following stages. In your case, a 2.1V rated output D/A will still need some attenuation to enable correct range of a typical volume pot.
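To put a number on that, here's a quick illustrative sketch (Python) that simply converts the two rated output voltages into a dB difference; the implied S/N and DR advantage only holds if the analog noise floor stays where it is:

```python
import math

def db(voltage_ratio: float) -> float:
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(voltage_ratio)

# Rated 0 dBFS outputs discussed in this thread (Vrms)
dac_hot = 5.0
dac_standard = 2.1

# On paper, the hotter output buys this much extra S/N and dynamic
# range -- assuming the analog noise floor does not rise with it.
print(f"Level difference: {db(dac_hot / dac_standard):.1f} dB")  # ~7.5 dB
```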
 
Joined
Mar 13, 2019
Messages
28
Likes
5
#3
I read somewhere that amplifier manufacturers often don't list the maximum input voltage in their specs. Would a DK Design Reference VS-1 MkII / CAV A10 handle 2.1V from a DAC better?
 

restorer-john

Major Contributor
Joined
Mar 1, 2018
Messages
3,042
Likes
5,441
Location
Gold Coast, Queensland, Australia
#4
Many amplifiers and preamplifiers can handle input voltages greatly in excess of their rated sensitivities. For instance, a 150mV for 1V out preamplifier may be able to take >5-10V on an input before overload/clipping. Unfortunately, there are also plenty of modern amplifiers that can't deal with much more than a few volts on line inputs before overloading.

The best rule of thumb is to match, or be a little above, the rated sensitivity. So your 2.1V @ 0dBFS rated D/A converter will likely be fine, although your volume range will be less. You will achieve maximum rated power lower in the rotation/range of your volume pot, that's all. The 5V rated one is way too much and would need significant attenuation to have a usable volume range IMO.
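As a rough illustration of the volume-range point, this sketch (assuming the 500 mV rated sensitivity quoted in the opening post) works out how far down the volume control must already sit before each DAC drives the amp to full rated power on a 0 dBFS signal:

```python
import math

def db(voltage_ratio: float) -> float:
    return 20 * math.log10(voltage_ratio)

amp_sensitivity = 0.5  # Vrms for rated power, per the opening post

for name, v_dac in [("2.1 Vrms DAC", 2.1), ("5 Vrms DAC", 5.0)]:
    # Attenuation the volume control must provide so a 0 dBFS signal
    # just reaches the amp's rated sensitivity (i.e. full rated power).
    needed = db(v_dac / amp_sensitivity)
    print(f"{name}: volume control at least {needed:.1f} dB down")
# 2.1 Vrms -> ~12.5 dB of attenuation; 5 Vrms -> ~20 dB
```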
 
Joined
Mar 13, 2019
Messages
28
Likes
5
#5
Is there a way to adjust the rated voltage? My first plan was to use a Raspberry Pi with a Khadas Tone Board as a DAC and media player. The Khadas has a 2.05 Vrms output. Will it do the job, or do you recommend a different DAC board?
 

MZKM

Senior Member
Joined
Dec 1, 2018
Messages
452
Likes
377
Location
Land O’ Lakes, Florida
#7
you will lose volume control range,
Do you mean simply that the lowest volume setting (one above mute) would be louder than if fed its rated input?

Also, what happens when you use a 2Vrms DAC output with an amp with an input sensitivity of say 1.5Vrms? Does the amp simply reach its rated max output at a lower volume control level, and then clip after that (whereas it wouldn’t clip if fed 1.5Vrms)?
 

restorer-john

Major Contributor
Joined
Mar 1, 2018
Messages
3,042
Likes
5,441
Location
Gold Coast, Queensland, Australia
#8
Yes, a high level input (considerably above rated) will mean you get all your volume range in a small rotation. Right near the lowest point it may be louder than you expect. Also, conventional pots track (L-R) very poorly down that far.

HiFi Amplifiers (generally) have a fixed gain with an attenuator up front to scale the input voltage to the following stage/s to give a semblance of level control.
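As a hypothetical worked example of that fixed-gain arrangement (the 100 W / 8 ohm rating below is assumed purely for illustration; only the 500 mV sensitivity figure comes from this thread):

```python
import math

# Hypothetical amplifier: 100 W into 8 ohm, rated sensitivity 500 mV.
rated_power_w = 100.0
load_ohm = 8.0
sensitivity_vrms = 0.5

# Output voltage at rated power, then the fixed gain that sits after
# the input attenuator (volume control).
v_out_rated = math.sqrt(rated_power_w * load_ohm)   # ~28.3 Vrms
voltage_gain = v_out_rated / sensitivity_vrms       # ~56.6x
gain_db = 20 * math.log10(voltage_gain)             # ~35 dB

print(f"Output at rated power: {v_out_rated:.1f} Vrms")
print(f"Fixed gain after the attenuator: {voltage_gain:.1f}x ({gain_db:.1f} dB)")
```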

D/A converters (generally) have a fixed output level for 0dBFS. Back in the earliest days of CD, before the machines had even hit the market, the 'standard' was set at 2.0V (after being revised up from a previously agreed output of 1.4V; see the linked post below). There are various reasons for this, not the least of which was marketing and a desire to get S/N numbers as big as possible, and also to be able to actually use and demonstrate the full 16-bit capability of the format.

https://www.audiosciencereview.com/forum/index.php?threads/sinad-measurements.4071/#post-95590

A 2V (at 0dBFS) rated D/A converter into a 1.5V rated sensitivity amplifier would be a reasonable match, depending on the type of content you listen to. With heavily compressed modern recordings, where the peaks and averages are within a few dB of 0dBFS, your amp would be unlikely to be overdriven regularly, just sometimes. A fan of wide dynamic range classical music, however, would be left wanting, as the output would be insufficient most of the time and only 'right' at crescendos. That said, it'd be a safe option.
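For that 2V-into-1.5V case, a one-line calculation (illustrative only) shows where the amp runs out of headroom with the volume wide open:

```python
import math

dac_full_scale = 2.0   # Vrms at 0 dBFS
amp_sensitivity = 1.5  # Vrms for rated power

# Digital level at which the amp reaches rated output with the volume
# control at maximum (no attenuation). Anything hotter risks clipping.
clip_level_dbfs = 20 * math.log10(amp_sensitivity / dac_full_scale)
print(f"Rated output reached at about {clip_level_dbfs:.1f} dBFS")  # ~-2.5 dBFS
```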

A typical 2.0V rated output digital device is fine into anything from 150mV to 1.5V as long as you exercise judgement with the volume control. Bear in mind, there is nothing wrong (and a whole lot of right) in having a volume control up near its maximum if your levels are appropriate. S/N is often very much improved near max position.
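As a toy model of why that is (the fixed post-attenuator noise figure is an assumption for illustration, not a measurement):

```python
import math

# Toy model: assume the power amp contributes a fixed 20 uV (input-
# referred) of noise *after* the volume control, so every dB you
# attenuate the signal ahead of it costs a dB of S/N.
noise_after_pot = 20e-6  # Vrms, assumed
source_level = 2.0       # Vrms from a typical DAC at 0 dBFS

for pot_setting_db in (0, -10, -20, -30):
    signal = source_level * 10 ** (pot_setting_db / 20)
    snr_db = 20 * math.log10(signal / noise_after_pot)
    print(f"Volume at {pot_setting_db:>3} dB -> S/N about {snr_db:.0f} dB")
```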
 
