So, after being happy with my audio system for a while, I'm back shopping for some new headphone gear, and I've decided to get involved in some forums, do some research, and pick up where I left off.
I've been re-reading a lot of the NwAvGuy O2 material, plus the newer writing on objective-based sources and amplification, and generally learning more about how amplifiers work. While shopping, I remembered that the last O2 I bought was a 1x gain model, so here I am learning more about gain stages. Since 1x gain doesn't actually amplify anything, that got me wondering.
I have a few questions. Let's assume a fixed speaker system with something like 90 dB/1 W sensitivity and 8 ohm nominal impedance, and a digital source that outputs a 2 V signal (which seems pretty typical for unbalanced outputs). So the variables we're going to talk about are the pre-amp (if any) and the amplifier.
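(For reference, here's the rough power-to-loudness math I'm working from, using the standard sensitivity formula at 1 m and ignoring distance and room effects; just a sanity-check sketch, so correct me if I've got it wrong:)

```python
# Rough SPL-vs-power sanity check for a 90 dB/1 W speaker.
# Standard sensitivity formula (anechoic, 1 m): SPL = sensitivity + 10 * log10(P)
import math

sensitivity = 90.0  # dB SPL at 1 W / 1 m
for watts in (1, 10, 100):
    spl = sensitivity + 10 * math.log10(watts)
    print(f"{watts:>3} W -> {spl:.0f} dB SPL")
# 1 W -> 90 dB, 10 W -> 100 dB, 100 W -> 110 dB
```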
To summarize the questions:
1. Why is 2 V (or thereabouts) the typical output voltage from a source? Why isn't it much higher, and then attenuated down the line? Instead we take this 2 V signal, attenuate THAT, and then apply gain.
2. Essentially the same as number 1. Why is the gain in the amplifier section of power amps? Why aren't there "true separates" that split the pre-amp gain stage from a 1x gain power amp? I *think* this is sort of what we see in tube hybrid amps, but all in one box, so there's no room for experimentation.
3. Given the same 2 V source and speakers, why does a +30 dB gain, 500 W amplifier get louder than a +30 dB gain, 250 W amplifier? (See the quick numbers after this list.)
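For what it's worth, here are the numbers I have in mind for question 3; a rough sketch assuming an 8 ohm resistive load and that each amp runs out of voltage swing right at its rated power:

```python
# Numbers behind question 3: +30 dB gain on a 2 V input vs. amp voltage limits.
import math

gain_db = 30.0
v_in = 2.0
gain = 10 ** (gain_db / 20)        # +30 dB voltage gain is ~31.6x
v_out = v_in * gain                # ~63.2 V, if the amp could swing it
print(f"ideal output: {v_out:.1f} V -> {v_out**2 / 8:.0f} W into 8 ohms")

# Max voltage each amp can swing before clipping (assuming rated power into 8 ohms):
for rated_watts in (250, 500):
    v_max = math.sqrt(rated_watts * 8)
    print(f"{rated_watts} W amp clips at ~{v_max:.1f} V")
```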
If we want to get a very low-current, very high-voltage signal out of a source, couldn't we do that fairly easily? DACs already have the ability to turn *something* into 2 V; why don't they turn that *something* into 30, 40, 50+ volts? Then a power amplifier's job is providing sufficient current to drive the lower-impedance speakers.
For example, let's say we wanted to drive our 90 dB speakers with 100 watts. Doing some quick math (V = sqrt(P × R)), that needs something like 28.28 volts. Instead of generating this gain at the power amp, let's output a 30 V signal from our source instead of 2 V and attenuate it a tiny bit. Now our unity-gain power amplifier's job is to take that very high-impedance 30 V signal and generate enough current for it (which is ~3.5 amps here, I think?)
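(Checking my own arithmetic there, again assuming an 8 ohm resistive load:)

```python
# Voltage and current needed for 100 W into an 8 ohm (resistive) load.
import math

power = 100.0  # target watts
load = 8.0     # nominal impedance in ohms

v = math.sqrt(power * load)  # P = V^2 / R  ->  V = sqrt(P * R)
i = v / load                 # Ohm's law: I = V / R
print(f"{v:.2f} V, {i:.2f} A")  # ~28.28 V, ~3.54 A
```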
Is it just that there's no benefit to having the gain stage in a separate box from the power stage? 2 V seems kind of arbitrary as the starting point for a gain stage.
I'm sure I'm missing plenty of fundamental concepts here. Looking forward to learning some stuff about amplifier design!