
Why are XLR outputs typically 4V and unbalanced RCA 2V?

VientoB

Why is there this difference? And doesn't it mean that you have to turn your volume knob down when using XLR, and isn't that not so good, as the pot is attenuating the signal more?
 
Back when I started, 'line out' was typically half a volt, but the gain structures of those amps meant that their volume controls were set well over half-way before there was any real risk of clipping, as I remember.

'Red Book' CD players decided to standardise on 2V as the output, and the volume controls of these amps were then cramped down to just off minimum, usually never more than 25% up before the amp began to clip. The old gain structures of this type are still seen too often today in new products, as measured by Amir when he tries to set a 2V maximum preamp output level and finds they barely make 1.6V before distortion rises. Other amps had circuit buffers before the volume control, and these often clipped irrespective of the volume control setting.

Over the years, some amp designs have adjusted their internal gain settings, and later sources could offer higher outputs too, but not all, and a power amp sensitivity of half a volt for full output is still seen, I gather.

As for balanced XLR at 4V or so, I think that's an established 'pro' level, and I for one have always liked the concept of pushing as much 'signal' down the cables as possible. Maybe the CD single-ended level of 2V is half that for a genuine reason, but maybe the snail-pace adaptation of domestic audio (using the same old circuits in tweaked form for decades) has prevented many amps from catching up. And anyway, some tech-ignorant 'audiophiles' think that full volume at 25% off 'zero' makes their amp more 'powerful' than adjusting gain for full output at 25% off *maximum.*
 
So we could really do with amps that have lower gain these days? Or at least a "low gain" setting?
 
It all depends on the design of the electronics before the XLR output and the design of the electronics after the XLR input.

There is no substitute for proper specifications or a circuit description provided by the manufacturer to know precisely how an output or input works.

Usually, a balanced XLR* output delivers twice the voltage of the corresponding unbalanced output, because most balanced outputs are simply made by putting the unbalanced output on the hot pin of the XLR connector and an inverted copy of that signal (usually obtained with an inverting operational amplifier) on the cold pin. Hence twice the voltage is available from this kind of XLR output compared to the unbalanced RCA output. This is usual, but far from universal.

And yes, if you keep the gain the same, you have to turn the volume down if the input voltage is increased.

* Note that it is possible to use XLR connectors to output or input something other than a balanced signal, such as an unbalanced signal.
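A minimal sketch of that doubling, assuming the simple "unbalanced signal on hot, inverted copy on cold" scheme described above (the 2 Vrms figure and the variable names are only illustrative):

```python
import math

# One instantaneous sample of a 2 Vrms sine at its positive peak (peak = 2 * sqrt(2) V).
v_peak = 2.0 * math.sqrt(2)

v_hot = v_peak      # XLR pin 2: the unbalanced signal as-is
v_cold = -v_peak    # XLR pin 3: inverted copy (e.g. from an inverting op-amp)

# A differential receiver responds to the voltage between the two pins:
v_diff = v_hot - v_cold
print(v_diff / v_peak)                    # 2.0 -> twice the single-ended swing
print(20 * math.log10(v_diff / v_peak))   # ~6.02 dB of "free" gain from this topology
```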
 
Red Book made 2V the unofficial standard. RCA is a single signal: a ground and a positive (hot) side varying around that zero-volt ground. Balanced XLR is two signals in opposite polarity to each other, as if you had two hot RCA signals with one inverted. So the difference becomes 4 volts instead of two: +2 volts and -2 volts.

So a simple way to do a balanced output is going to give twice the voltage. It doesn't have to be this way, and sometimes it's not, but it is the most common way.
 
I don't know where the 4V number came from, as the UK normal peak level is +8dBu or about 2V. This allows 10dB of headroom to the EBU standard of 0dBFS = +18dBu or about 6V (or 12dB of headroom if peak level is +6dBu, as is common elsewhere in Europe).

In the US, 0VU is +4dBu, or about 1.2V, so that's different again, and many US pro digital products have 0dBFS set to +20dBu (about 8V) or even +24dBu.

So, no, there's no standard, except that CD unbalanced has commonly been 2V for 0dBFS and balanced is often 2x unbalanced, so who knows.
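For anyone wanting to sanity-check those figures, dBu is referenced to 0.7746 Vrms (1 mW into 600 ohms), so a quick conversion sketch gives roughly:

```python
import math

DBU_REF = math.sqrt(0.6)   # 0 dBu = 0.7746 Vrms (1 mW into 600 ohms)

def dbu_to_vrms(dbu: float) -> float:
    """Convert a level in dBu to Vrms."""
    return DBU_REF * 10 ** (dbu / 20)

for level in (4, 8, 18, 20, 24):
    print(f"{level:+3d} dBu = {dbu_to_vrms(level):5.2f} Vrms")

#  +4 dBu =  1.23 Vrms  (US 0 VU)
#  +8 dBu =  1.95 Vrms  (the "2 V" UK peak level)
# +18 dBu =  6.15 Vrms  (EBU 0 dBFS)
# +20 dBu =  7.75 Vrms
# +24 dBu = 12.28 Vrms
```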

S.
 
… it mean that you have to turn your volume knob down when using XLR, and isn't that not so good, as the pot is attenuating the signal more?
Not if the amplifier with the XLR input is designed to recognise that the likely incoming signal will be twice the voltage. It should have half the sensitivity of the unbalanced inputs.

I agree with comments above that it’s all about the implementation of both the source outputs and amplifier inputs, however.
 
Balanced outputs in recording studios often went to low-impedance transformer inputs of 150, 200, or 600 ohms, capable of running hundreds of feet. Single-ended outputs went to bridging inputs of 10 kilohms on up, and most people would keep them short, under 10 feet.
 
Balanced outputs in recording studios often went to low-impedance transformer inputs of 150, 200, or 600 ohms, capable of running hundreds of feet. Single-ended outputs went to bridging inputs of 10 kilohms on up, and most people would keep them short, under 10 feet.
I'm not sure if you quite meant what I understood you to write:-

Balanced outputs have for something like the past 60 years been low impedance outputs, going into bridging (10k or thereabouts) inputs.

600 ohm terminating-impedance outputs and inputs haven't been used for the past 60 years, except for the very rare long-distance (hundreds of km) analogue connections. These have mostly been replaced with digital feeds, originally something like NICAM, then later some IP-based method.

As for unbalanced connections, these have always similarly been a low sending impedance into a high receiving impedance for longer connections. At home, much valve equipment had a high output impedance, several kilohms, going into a 1M input impedance, as few had cathode-follower outputs due to cost and complexity. Those did indeed require interconnections to be kept to a few feet. Quad and Leak, I recall, both specified around 5 feet (1.5m) of cable between pre and power amps or tape recorders to preserve HF levels.
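A quick back-of-envelope sketch of why those lengths mattered, assuming ~100 pF per metre of cable capacitance and a few representative source impedances (my figures, for illustration only): the source impedance and the cable capacitance form a simple first-order low-pass filter.

```python
import math

def cutoff_hz(r_source_ohms: float, c_cable_farads: float) -> float:
    """-3 dB corner of the RC low-pass formed by source impedance and cable capacitance."""
    return 1 / (2 * math.pi * r_source_ohms * c_cable_farads)

C_PER_METRE = 100e-12   # assume ~100 pF per metre of typical interconnect cable

# Unbuffered valve-era output (~5 kohm) into 1.5 m versus 10 m of cable:
print(round(cutoff_hz(5_000, 1.5 * C_PER_METRE) / 1e3))   # ~212 kHz: no audible HF loss
print(round(cutoff_hz(5_000, 10 * C_PER_METRE) / 1e3))    # ~32 kHz: roughly 1.4 dB down at 20 kHz

# A modern low-impedance output (~100 ohm) into the same 10 m run:
print(round(cutoff_hz(100, 10 * C_PER_METRE) / 1e3))      # ~1592 kHz: cable length is a non-issue
```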

S.
 
'Red Book' CD players decided to standardise on 2V as the output, and the volume controls of these amps were then cramped down to just off minimum, usually never more than 25% up before the amp began to clip
Well, no. Nominal (average) unbalanced line level in the olden days was 300 mV (-10 dBV). 2 Vrms then provided a rather sensible 16.4 dB of headroom (for pop music, anyway). Amplifiers were typically designed to provide up to 6 dB gain to spare for quieter cartridges or whatnot, so they would have a 150-200 mV input sensitivity.

It was the CDs that got louder and ruined the whole deal. 10 dB often is enough to get amplifier volume pots into "touchy with declining channel balance" territory.

Whenever you can choose between "home" or "pro" levels these days, it tends to be between a nominal -10 dBV and +4 dBu, a roughly 12 dB difference. Now that would mean CD peak levels would have to be at +20.5 dBu, and even semi-pro audio interfaces tend to max out around +18 dBu, with +12 to +16 dBu being common around home studio gear. 4 Vrms (~+14 dBu) is right in that ballpark, and as previously mentioned it happens to line up with the 6 dB gain of common balanced line driver circuits.
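Putting rough numbers on that (a quick sketch; dBV is referenced to 1 Vrms, dBu to 0.7746 Vrms):

```python
import math

def db(ratio: float) -> float:
    """Voltage ratio expressed in dB."""
    return 20 * math.log10(ratio)

DBU_REF = math.sqrt(0.6)          # 0 dBu = 0.7746 Vrms
v_nominal = 10 ** (-10 / 20)      # -10 dBV "home" nominal = 0.316 Vrms (~300 mV)
v_cd_max = 2.0                    # Red Book full-scale output, Vrms

print(round(db(v_cd_max / 0.3), 1))        # 16.5 dB headroom over a literal 300 mV
print(round(db(v_cd_max / v_nominal), 1))  # 16.0 dB over -10 dBV exactly

v_pro = DBU_REF * 10 ** (4 / 20)           # "pro" +4 dBu = 1.23 Vrms
print(round(db(v_pro / v_nominal), 1))     # 11.8 dB "home" vs "pro" difference

print(round(db(4.0 / DBU_REF), 1))         # 4 Vrms = ~14.3 dBu
```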
 
Well, no. Nominal (average) unbalanced line level in the olden days was 300 mV (-10 dBV). 2 Vrms then provided a rather sensible 16.4 dB of headroom (for pop music, anyway). Amplifiers were typically designed to provide up to 6 dB gain to spare for quieter cartridges or whatnot, so they would have a 150-200 mV input sensitivity.

It was the CDs that got louder and ruined the whole deal. 10 dB often is enough to get amplifier volume pots into "touchy with declining channel balance" territory.
And indeed, playing some of my CDs from the early 1980s, there was typically something like 10dB of headroom left. None of this slamming the audio to 0dBFS and keeping it there for a long time with compression and limiting.

All my Dire Straits and Moody Blues CDs, which I bought at the time they came out on CD, were like that. Classical music similarly never peaked above -10dBFS.

S.
 
Your peak meters must have been of the slow analog variety; I don't think any of my '80s classical CDs have peaks below -5 dBFS, and a whole bunch are reaching -1 dBFS or above. I even have one with an "over" peak (the exact magnitude of which depends upon the resampler I use during RG scanning, from about 1.03 to 1.277), and that's a Telarc release from 1981!
 