
Output voltage specifications make no sense to me

Superfly

New Member
Joined
Dec 15, 2020
Messages
1
Likes
1
Location
USA
There are a lot of past threads on source or preamp output voltages and amp pairings. My question is much more basic.

When a spec sheet for a CD player says "Output Voltage (RCA) - 2V", that makes no sense to me. First, output voltage is constantly swinging up and down in response to the music signal. There is no constant voltage unless you're playing a test signal, like a 1kHz tone.

You might think that the 2V being quoted is RMS (average). But that makes no sense either, since the average voltage actually being output depends on the setting of the volume control (in the case of a preamp or variable output source) as well as the level of the recording being played. So an RMS voltage spec can only be relevant at a specific volume setting and a specific recording level.

About a month ago I wanted to use the preamp outputs from an integrated amp to drive a separate amplifier. The manufacturer's quoted spec for the preout voltage is 1V. The outboard amp requires 2.1V to reach full output. So I asked the manufacturer of the integrated amp what the 1V spec means, but received no response.

So I got out my DVM and played a 400 Hz test tone through the system while working the volume control and measuring the voltage at the pre-outs (which were not connected to anything). It got up to about 3.5V before triggering the amp's protection circuitry, which shut it down.

So the spec obviously isn't a max voltage, and if it's RMS then how is anyone but the manufacturer supposed to know what assumptions or parameters led to the quoted spec?

Here's a problem. Suppose someone like me wants to buy a power amp that has a stated input sensitivity of 1.6V, but if the preamp (or pre-out in the above case) has a stated output voltage of 1V, then that might cause a lot of people to avoid that pairing. And that would be a shame, since it's highly likely that a preamp with a spec of 1V can easily output enough voltage to drive that power amp to full output.
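
For scale, here is that gap in decibels, as a quick Python sketch using the spec-sheet numbers above (20*log10 of the voltage ratio; both figures are quoted specs, not measurements):

Code:
import math

# Spec-sheet values from the scenario above (V RMS):
preamp_out = 1.0       # preamp's rated output voltage
amp_sensitivity = 1.6  # input the power amp needs for full output

# The apparent shortfall, expressed in decibels:
shortfall_db = 20 * math.log10(amp_sensitivity / preamp_out)
print(f"shortfall: {shortfall_db:.1f} dB")  # about 4.1 dB

A 4 dB shortfall is small, which is why such a pairing would very likely still work in practice.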

It's commonly accepted that the "standard" for digital devices is an output of 2V for single-ended and 4V for balanced. But there is no actual standard set by any trade group or governing body. I've seen plenty of sources that output anywhere from 1V to 6V, at least according to their dubious specs.

Somebody please help me understand where these output voltage specs are coming from, and how they can be useful when the real output voltage is constantly variable.
 

alex-z

Addicted to Fun and Learning
Joined
Feb 19, 2021
Messages
914
Likes
1,694
Location
Canada
The CD player is a digital source. When it says 2 volts output, that means when the digital signal is 0dBFS (maximum possible level), the output voltage will be 2V RMS. Most CD players should have a digital output, allowing you to use an external DAC which provides your preferred output voltage.
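
As a minimal sketch of that relationship (assuming a hypothetical device specified at 2V RMS for 0dBFS), the analog output simply tracks the digital level:

Code:
def output_vrms(level_dbfs: float, v_full_scale: float = 2.0) -> float:
    """Analog output (V RMS) for a sine at the given digital level."""
    return v_full_scale * 10 ** (level_dbfs / 20)

print(output_vrms(0))    # 2.0  -> 0 dBFS gives the rated 2 V RMS
print(output_vrms(-6))   # ~1.0 -> 6 dB below full scale is about half
print(output_vrms(-20))  # 0.2  -> quiet material sits far below the spec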

For a sine wave, the RMS value is 0.707 of the peak. So when you checked the RCA output of your integrated amp and saw 3.5V, the RMS value was likely about 2.47V, although that depends on the operating mode of your digital voltmeter.
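
Here are the sine conversions as a small sketch (whether the 3.5V reading above was a peak or RMS figure depends on the meter, as noted):

Code:
import math

def rms_from_peak(v_peak: float) -> float:
    return v_peak / math.sqrt(2)   # RMS = 0.707 * peak, sine only

def peak_from_rms(v_rms: float) -> float:
    return v_rms * math.sqrt(2)    # peak = 1.414 * RMS, sine only

print(rms_from_peak(3.5))  # ~2.47 V RMS, if 3.5 V was a peak reading
print(peak_from_rms(2.0))  # ~2.83 V peak for a 2 V RMS sine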

As for why the manufacturer's spec was inaccurate: they may have purposefully listed a lower voltage at which their distortion profile was better. Much like speaker manufacturers claiming a 20-20,000Hz frequency response, there is no regulatory body that compels companies to meet standards.

This is why third party measurements matter.
 

rationaltime

Member
Joined
Jan 30, 2023
Messages
68
Likes
55
In my opinion the question is not well stated.

The connected system has a source impedance and a load impedance, which together form a kind of voltage divider (sketched below). The measurement you took with the outputs open is only one data point. Further, when we read reviews we find the output signal to noise ratio (abbreviated "SINAD" on this site) is measured at particular output levels and frequencies. If there were a single data sheet number, that output level would have to be measured while meeting all of the other specifications. Again referring to reviews, we see various graphs showing performance. Despite a preference for simplicity, they do not reduce to one number.
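
A minimal sketch of that divider (the impedance values here are hypothetical; actual output and input impedances vary from device to device):

Code:
def loaded_voltage(v_open: float, z_source: float, z_load: float) -> float:
    """Voltage at the load, given the open-circuit (unloaded) voltage."""
    return v_open * z_load / (z_source + z_load)

# E.g. a 1 kohm output impedance driving a 10 kohm power amp input:
print(loaded_voltage(3.5, 1_000, 10_000))  # ~3.18 V, about 9% below the open reading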

You are not even measuring the primary outputs of the amplifier. I see your frustration. However, expecting every pin to be characterized to interface protocol standards seems to me an unrealistic expectation.
 

solderdude

Grand Contributor
Joined
Jul 21, 2018
Messages
16,051
Likes
36,427
Location
The Netherlands
CD and DAC output voltages are RMS at full scale (0dBFS, so the maximum voltage swing) and can be anything from 1V (for dongles and some USB-fed devices) up to several volts.
When there is an output level control, it is set to maximum level or to a standard level for the spec.

Amir uses 2V and 4V to level the playing field (so measurements are comparable); if a device can go higher, or cannot reach those levels, that is mentioned or shown.
2V RMS = 5.66V peak-to-peak (2.83V peak).

Balanced output is usually double the RCA output voltage.

I have not seen any recent pre-amps with a maximum of 1V out. It is usually much, much higher.
In the old DIN plug era 1V could have been the maximum (0.4V was the norm).
we find the output signal to noise ratio (abbreviated "SINAD" on this site)

SINAD is not S/N ratio; the S/N ratio is shown in 'the dashboard', by the way.
SINAD is referenced to 2V (or 4V) and shows distortion plus noise against a 1kHz sine wave at that specific level only.
It does not show the S/N ratio, but SINAD can be worse because of a poor S/N ratio (noise dominant over distortion).
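
To make the distinction concrete, a toy calculation with made-up residual levels, showing that when noise dominates, SINAD lands close to the plain S/N ratio:

Code:
import math

def db(ratio: float) -> float:
    return 20 * math.log10(ratio)

signal = 2.0          # V RMS, 1 kHz tone at the 2 V reference
noise = 0.0002        # V RMS, hypothetical noise floor
distortion = 0.00005  # V RMS, hypothetical harmonic residual

print(f"S/N   = {db(signal / noise):.1f} dB")                          # 80.0 dB
print(f"SINAD = {db(signal / math.hypot(noise, distortion)):.1f} dB")  # ~79.7 dB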
 