
Why is line level low single-digit volts (or, why do amps have gain?)

Fryguy

Member
Joined
Mar 19, 2018
Messages
8
Likes
2
So, after having a satisfactory audio system for a while, I'm back shopping for some new headphone stuff, and I've decided to get involved in some forums, do some research, learn some more from where I left off.

I've been re-reading a lot of NwAvGuy O2 stuff, and the new material that's come out about objective-based sources and amplification, and generally learning more about how amplifiers work. While shopping, I remembered that the last O2 I bought was a 1x gain model, and so here I am learning more about gain stages. Since 1x gain doesn't actually amplify, that got me wondering.

I have a few questions. Let's assume we have a fixed speaker system with something like 90 dB/1 W sensitivity and 8 ohm nominal impedance. And we have a digital source that outputs a 2 V signal (seems pretty typical in the range of unbalanced signals). So the variables we are going to talk about here are the pre-amp (if any) and the amplifier.

To summarize the questions:

1. Why is 2 V (or thereabouts) the typical output voltage from a source? Why isn't it much higher, and then attenuated down the line? We instead take this 2 V signal, attenuate THAT, and then apply gain.

2. Essentially the same as number 1. Why is the gain in the amplifier section of power amps? Why don't "true separates" exist that separate the pre-amp gain stage from a 1x gain power amp? I *think* this is sort of what we see in tube hybrid amps, but all in one box, so no room for experimentation.

3. Given the same 2 V source and speakers, why does a +30 dB gain 500 W amplifier get louder than a +30 dB gain 250 W amplifier?


If we want to get a very low-current, very high-voltage signal out of a source, we could do this fairly easily, I think? DACs already have the ability to turn *something* into 2 V, so why don't they turn that *something* into 30, 40, 50+ volts? Then a power amplifier's job is providing sufficient current to drive the lower-impedance speakers.

For example, let's say we wanted to drive our 90 dB speakers with 100 watts. Doing some quick math, that needs something like 28.28 volts RMS. Instead of generating this gain at the power amp, let's output a 30 V signal from our source instead of 2 V and attenuate it a tiny bit. Now our unity-gain power amplifier's job is to take that super-high-impedance 30 V signal and generate enough current for it (which is ~3.5 amps here, I think?).
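Here's that arithmetic as a quick Python sanity check, using the hypothetical 100 W / 8 ohm numbers from above (so nobody has to trust my mental math):

Code:
import math

power_w = 100.0       # target power into the speaker (hypothetical)
impedance_ohm = 8.0   # nominal speaker impedance (hypothetical)

# P = V^2 / R  ->  V = sqrt(P * R), in RMS volts
v_rms = math.sqrt(power_w * impedance_ohm)

# Ohm's law: I = V / R, in RMS amps
i_rms = v_rms / impedance_ohm

print(f"{v_rms:.2f} Vrms, {i_rms:.2f} Arms")   # 28.28 Vrms, 3.54 Arms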

Is it just that there are no benefits to having the gain stage in a separate box from the power stage? 2 V seems kind of arbitrary as the basis for a gain stage.

I'm sure I'm missing plenty of fundamental concepts here. Look forward to learning some stuff about amplifier design!
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,202
Likes
16,982
Location
Riverview FL
Given the same 2 V source and speakers, why does a +30 dB gain 500 W amplifier get louder than a +30 dB gain 250 W amplifier?

I'll answer the easy one:

It doesn't (shouldn't) get louder.

The 500W amp with 30dB gain will have a higher "sensitivity" than the 250W amp.

Sensitivity being the input voltage that will drive it to its specified output.

Example:

Krell Spec:

[Attached image: Krell specification table showing three amps in the series with the same gain but different rated power and input sensitivity.]


Note that a 2.53 V input would (should) drive all three amps in this series to the same output level; only the little one (mine) would be driven to its rated maximum output.
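Rough sketch of how sensitivity falls out of gain and rated power. The 30 dB gain here is just an assumed round number for illustration, not Krell's published figure:

Code:
import math

gain_db = 30.0                      # assumed gain, same for all three amps
gain_linear = 10 ** (gain_db / 20)  # dB to voltage ratio (~31.6x)

# Same gain, different rated power into 8 ohms:
for rated_w in (250, 500, 1000):
    v_out_max = math.sqrt(rated_w * 8.0)   # RMS voltage at rated output
    sensitivity = v_out_max / gain_linear  # input voltage for rated output
    print(f"{rated_w:4d} W amp: rated output at {sensitivity:.2f} V in")

Below clipping, the same input drives all three to the same output voltage; sensitivity just marks where each one runs out of headroom.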
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,524
Likes
37,057
Lots of reasons. Not all are due to what is possible electronically. Some are holdovers from earlier times, when they made sense for technical reasons that are no longer valid.

As an outlier amp example, I have an amp that takes digital inputs and nothing else. Everything in the amp is digital, and the output stage is a high-voltage sigma-delta DAC. The only analog in the device is the output of the amp, where the output stage does the digital-to-analog conversion. 150 wpc into 8 ohms.

About question #1: output voltages were lower before CD. Something like 150 millivolts was sort of a standard for many source devices.

As for amps usually having a gain stage, part of it is not wanting interconnects carrying 84 volts, as some high-powered amps would need per Ray's chart above. Also, think about that chart: if the power amps were all unity gain, which output voltage would be correct when customers may have anywhere from 15 wpc to 1000 wpc? The range of required voltages is too wide to be practical. Letting the amp provide some gain, which varies with the size and design of the amp, works out better, so all source components can have something close to a standard 2 volt output. It could have been pegged at some other voltage, higher or lower. Opamps are common and high-performance now, and many of them run on 15 volt power rails. They can do the job at a few volts well enough, but you would run into problems trying to output signals of 10 volts or higher, whereas at the lower voltage the opamp can do the entire job itself.
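To put rough numbers on that, a minimal sketch assuming unity-gain power amps and 8 ohm loads:

Code:
import math

# If power amps were all unity gain, the source itself would have to
# swing the full output voltage. Required voltage by rated power:
for rated_w in (15, 100, 250, 500, 1000):
    v_rms = math.sqrt(rated_w * 8.0)   # P = V^2/R -> V = sqrt(P*R)
    v_peak = v_rms * math.sqrt(2)
    print(f"{rated_w:5d} W into 8 ohms: {v_rms:5.1f} Vrms ({v_peak:6.1f} V peak)")

# A typical opamp on +/-15 V rails swings maybe 9-10 Vrms at best,
# good for only ~10 W into 8 ohms even if it could supply the current.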

One could design some very different gear, but various customs, standards and necessities have all been mixed together.

Another outlier example: I use electrostatic speakers. Typically they have transformers so a regular amp works with them. They actually run on signals up to around 2,000 volts. There have been a couple of products that used radio transmitting tubes at high voltages to directly drive the panels at 2 kV. I don't think we want a DAC outputting 2 kV for the input signal.
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,202
Likes
16,982
Location
Riverview FL
Why is the gain in the amplifier section of power amps? Why don't "true separates" exist that separate the pre-amp gain stage from a 1x gain power amp?

I'm no expert, but the "power" stage (as in current capability to drive a handful of ohms at higher voltages) needs some "drive".








I don't know if the particular device below is suitable for audio work...

It takes up to 15 amps of "signal" to tickle it (base current).

That's the current capacity of a standard U.S. AC wall outlet: 120 VAC at 15 amps.

OK, extreme example, and a big honking transistor with 50 amp output...

You'd be using speaker cables for interconnects.

(A Darlington would make more sense, unless it didn't sound as good... 2 amps of base current on a similarly spec'd Darlington.)

[Attached image: datasheet excerpt for the high-current power transistor described above.]
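Backing the implied current gain out of the numbers quoted above:

Code:
output_current_a = 50.0   # output (collector) current we want

# Base drive figures as quoted above: 15 A for the single transistor,
# ~2 A for a similarly rated darlington.
for name, base_a in (("single BJT", 15.0), ("darlington", 2.0)):
    hfe = output_current_a / base_a   # implied worst-case current gain
    print(f"{name}: hFE ~ {hfe:.0f}, needs up to {base_a:g} A of drive")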
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,579
Likes
38,280
Location
Gold Coast, Queensland, Australia
1. Why is 2 V (or thereabouts) the typical output voltage from a source? Why isn't it much higher, and then attenuated down the line? We instead take this 2 V signal, attenuate THAT, and then apply gain.

2 volts was arrived at due to the technical limitations of reliably and linearly resolving the LSB (and the 65,535 other levels) in a 16-bit DAC at the time, resulting in a step of around 30 µV after I/V conversion. Any lower, and the LSB could not be reliably retrieved, as it would disappear into noise and non-linearities.
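The arithmetic, assuming the 2 V figure is full scale:

Code:
bits = 16
levels = 2 ** bits             # 65,536 levels
full_scale_v = 2.0             # nominal CD full-scale output

lsb_v = full_scale_v / levels  # size of one step
print(f"LSB step: {lsb_v * 1e6:.1f} uV")   # ~30.5 uV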

Prior to digital, we had standard 150 mV line inputs.

2. Essentially the same as number 1. Why is the gain in the amplifier section of power amps?

The power section of power amps usually has no voltage gain, just massive amounts of current gain. The VAS (voltage amplifier stage) contains all the voltage gain.

For example, let's say we wanted to drive our 90db speakers with 100 watts. Doing some quick math, that needs something like 28.28 volts. Instead of generating this gain at the power amp, let's output a 30v signal from our source instead of 2 and attenuate it a tiny bit. Now our unity gain power amplifiers job is to take that super high impedance 30v signal and generate enough current for it (which is ~3.5 amps here I think?)

Firstly, your 100 W (at 8 ohms) would need a lot more than 28.28 volts (30 V). It would need 80 V peak-to-peak to give you 28.28 V RMS for your 100 W.

So we'd have the equivalent of a tube output stage (high voltage, high impedance), and quite dangerous IMO, as we are now running high voltage swings on what were previously line-level sources. RCA plugs would have to go the way of the dodo. A 200 W amp would have an output swing the same as your power point (mains outlet), from source to speaker...
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,202
Likes
16,982
Location
Riverview FL
Firstly, your 100 W (at 8 ohms) would need a lot more than 28.28 volts (30 V). It would need 80 V peak-to-peak to give you 28.28 V RMS for your 100 W.


Huh?

28.28 Vrms × 1.414 = 39.98 Vpk = 79.97 Vpk-pk
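Same thing spelled out:

Code:
import math

v_rms = 28.28                  # RMS voltage for 100 W into 8 ohms
v_peak = v_rms * math.sqrt(2)  # RMS -> peak, for a sine wave
v_pp = 2 * v_peak              # peak -> peak-to-peak

print(f"{v_rms} Vrms = {v_peak:.2f} Vpk = {v_pp:.2f} Vpk-pk")
# 28.28 Vrms = 39.99 Vpk = 79.99 Vpk-pk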
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,579
Likes
38,280
Location
Gold Coast, Queensland, Australia
100 W @ 8 ohms is exactly 80 V pk-pk.

Anyway, what's 30mV between friends? :)
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,202
Likes
16,982
Location
Riverview FL
Sorry, I misinterpreted your (apparent) objection to his 28.28 V.

I'll go back to sleep now.
 
OP

Fryguy

Member
Joined
Mar 19, 2018
Messages
8
Likes
2
Cool, this all mostly makes sense. I sort of suspected that the 2 V convention was largely due to earlier technical limitations, and that momentum has kept it in place. Combined with the fact that there are no real benefits to separating the gain stage from the amp (and, as documented, some downsides), it appears there are no reasonable gains to be had here.

As for the amplifier wattage versus gain, that Krell image showing the same gain but different sensitivities did not help me understand any more. I thought I somewhat understood sensitivity from back in my car audio days: you match the input sensitivity of your amp to your source. So with our hypothetical 2 V source, both of these Krell amps would be under-driven, meaning I can't push them fully, and so ultimately they would produce the same volume output through our hypothetical 90 dB, 8 ohm speakers.

If instead we gave the Krell amps the ability to receive up to a 4 V signal, how does that change things? What specifically does having the higher input sensitivity change that allows the amp to produce more volume at the same gain?
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,579
Likes
38,280
Location
Gold Coast, Queensland, Australia
The sensitivity being lower (a higher voltage needed for full power) means the residual noise is kept further away from the signal, resulting in better S/N and dynamic range numbers.

Balanced is a higher level too.

There was a push a while back to lift car audio line levels to several volts. I'm not in that field, but it was to elevate the signal sufficiently above the noisy environment of modern cars, IIRC. Someone else can weigh in on that one.
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,524
Likes
37,057
Sensitivity in amps is related to power output: it is the input voltage needed to reach full rated output. So if three amps have the same gain but different power, they will have different sensitivity ratings. Krell could have given each amp a different gain so that they all had the same sensitivity.

You are correct that if we only have 2 volts, then the higher-powered amps will not be driven to full output capability. Which actually may not hurt anything. As restorer-john said, that decision is related to SNR. The higher-power amps with the same gain have higher SNR. If Krell had upped the gain to maintain the same sensitivity, they would have lost SNR, because the extra gain amplifies noise as well as signal.
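A minimal sketch of that trade-off; the input-referred noise figure is an arbitrary assumption, purely to show the direction of the effect:

Code:
import math

input_noise_v = 20e-6          # assumed input-referred noise, same in both cases
v_full = math.sqrt(500 * 8.0)  # ~63.2 Vrms at full output for a 500 W / 8 ohm amp

for gain_db in (30.0, 33.0):   # same amp, lower vs higher gain
    gain = 10 ** (gain_db / 20)
    noise_out = input_noise_v * gain   # gain amplifies the noise too
    snr_db = 20 * math.log10(v_full / noise_out)
    print(f"gain {gain_db} dB: sensitivity {v_full / gain:.2f} V, "
          f"SNR {snr_db:.1f} dB at full output")

Every dB of extra gain used to hit a lower sensitivity number costs a dB of SNR.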

The Krell amps in Ray's post above are from a time when many people still had active pre-amps. Most of those could take 1 volt or less and raise it to several volts for feeding the power amp, so 2 volts was no problem. Also, many current DAC/preamps will put out 3 or 4 volts, so again no problem.

Krell's current Duo series of power amps all have sensitivity ratings a small amount below 2 volts. One exception being the largest which has 2.4 volts for full output.

Looking at things like this can make a difference in how good and quiet a system sounds. It is referred to as gain staging. Putting together amps that aren't grossly over-powered with sources that aren't mismatched gives you the widest effective dynamic range and the best system SNR. Most modern gear is so quiet that you can often get away with being sloppy about this, but the best performance is obtained if you gain-stage your gear carefully.
 


restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,579
Likes
38,280
Location
Gold Coast, Queensland, Australia
I came across an interesting excerpt the other day whilst reading addenda to a first-generation CD player service manual (yep, I relax with a glass of red and a random service manual...).

It seems the 'industry standard' was going to be 1.4 V for CD, and it was changed early, prior to the worldwide release of the format. I've never seen a reference to the 1.4 V standard before, so this is an interesting development. My feeling is the level needed to be raised to ensure the bottom LSBs didn't disappear into residual noise, as I have previously read and considered true. Some of the earliest Toshiba machines using 14-bit D/As had a 1.0 V output, but I'm pretty sure that was for interfacing with some of their small shelf systems.

[Attached image: excerpt from a Hitachi CD player service manual.]
 