There are a lot of past threads on source or preamp output voltages and amp pairings. My question is much more basic.
When a spec sheet for a CD player says "Output Voltage (RCA) - 2V", that makes no sense to me. First, the output voltage is constantly swinging up and down with the music signal. There is no single constant voltage unless you're playing a steady test signal, like a 1 kHz tone.
You might think that the 2V being quoted is an RMS value. But that makes no sense either, since the RMS voltage actually being output depends on the setting of the volume control (in the case of a preamp or variable-output source) as well as the level of the recording being played. So an RMS voltage spec can only be meaningful at a specific volume setting and a specific recording level.
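To illustrate why a bare RMS number is ambiguous, here's a rough Python sketch (the 1 kHz tone, the 48 kHz sample rate, and the three volume settings are just made-up illustrations, not anything from a real spec sheet). The RMS of the same sine scales directly with whatever level it happens to be played at:

```python
import numpy as np

fs = 48_000                         # illustrative sample rate, Hz
t = np.arange(fs) / fs              # one second of samples
peak = 2.0 * np.sqrt(2)             # peak volts of a sine whose RMS is 2.0 V

for gain_db in (0, -10, -20):       # made-up volume-control settings
    v = peak * 10 ** (gain_db / 20) * np.sin(2 * np.pi * 1000 * t)
    rms = np.sqrt(np.mean(v ** 2))
    print(f"volume at {gain_db:>4} dB -> RMS = {rms:.3f} V")
```

With those made-up gains it prints 2.000 V, 0.632 V, and 0.200 V, so quoting a single RMS figure only means something if the measurement conditions come with it.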
About a month ago I wanted to use the preamp outputs from an integrated amp to drive a separate amplifier. The manufacturer's quoted spec for the pre-out voltage is 1V. The outboard amp requires 2.1V to reach full output. So I asked the manufacturer of the integrated amp what the 1V spec means, but received no response.
So I got out my DVM, played a 400 Hz test tone through the system, and measured the voltage at the pre-outs (which were not connected to anything) while sweeping the volume control. It got up to about 3.5V before the integrated amp's protection circuitry tripped and shut it down.
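For what it's worth, if the meter was showing a true RMS value for that tone (an assumption on my part; for a clean sine, meters are normally calibrated to display RMS), the 3.5V reading works out to roughly 4.95V peak:

```python
import math

rms_reading = 3.5                   # V, roughly what the DVM showed at the pre-outs
peak = rms_reading * math.sqrt(2)   # peak of a sine with that RMS value
print(f"{rms_reading} V RMS is about {peak:.2f} V peak")   # ~4.95 V
```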
So the spec obviously isn't a max voltage, and if it's RMS then how is anyone but the manufacturer supposed to know what assumptions or parameters led to the quoted spec?
Here's the problem. Suppose someone like me wants to buy a power amp with a stated input sensitivity of 1.6V. If the preamp (or pre-out, as in the case above) has a stated output voltage of 1V, that mismatch on paper might cause a lot of people to avoid the pairing. And that would be a shame, since it's highly likely that a preamp with a 1V spec can easily output enough voltage to drive that power amp to full output.
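To put rough numbers on that gap (using only the figures from this thread, which may not generalize), the 1V spec falls about 4 dB short of the amp's 1.6V sensitivity, while the roughly 3.5V I actually measured would be almost 7 dB more than the amp needs:

```python
import math

def db(v_have, v_need):
    """Express a voltage ratio in decibels."""
    return 20 * math.log10(v_have / v_need)

amp_sensitivity = 1.6    # V at the power amp's input for full output
preamp_spec     = 1.0    # V, the quoted pre-out spec
preamp_measured = 3.5    # V, roughly what I measured before protection tripped

print(f"spec vs. sensitivity:     {db(preamp_spec, amp_sensitivity):+.1f} dB")      # about -4.1 dB
print(f"measured vs. sensitivity: {db(preamp_measured, amp_sensitivity):+.1f} dB")  # about +6.8 dB
```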
It's commonly accepted that the "standard" for digital devices is an output of 2V for single-ended and 4V for balanced. But as far as I can tell there is no actual standard set by any trade group or governing body. I've seen plenty of sources that output anywhere from 1V to 6V, at least according to their dubious specs.
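Just to put that spread in level terms (using only the round numbers mentioned above, not measurements of real gear), a 1V source sits 6 dB below the informal 2V RCA figure and a 6V source sits about 9.5 dB above it:

```python
import math

ref = 2.0    # V, the informal single-ended "standard"
for label, volts in [("low spec", 1.0), ("RCA 'standard'", 2.0),
                     ("XLR 'standard'", 4.0), ("high spec", 6.0)]:
    print(f"{label:>15}: {volts} V is {20 * math.log10(volts / ref):+.1f} dB re 2 V")
```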
Somebody please help me understand where these output voltage specs are coming from, and how they can be useful when the real output voltage is constantly variable.