
Voltage input needed for Amplifiers with variable or low Gain

JeremyFife

Major Contributor
Joined
Jan 8, 2022
Messages
1,091
Likes
1,348
Location
Scotland
Hi All,
Hoping for some explanations and advice, or just a "don't worry about it" sense check.

I'm gradually realising that Amplifiers with low gain may need higher than 'standard' (4V balanced, 2V single-ended) input voltage in order to reach full power. The recent review of the IOM 500s https://audiosciencereview.com/forum/index.php?threads/iom-500s-stereo-amplifier-review.56152/ shows that at the second lowest gain setting of 16dB, 5V is required to reach full power.

I didn't find this easy to find or to understand. There is a degree of knowledge required to read these reviews - perhaps as there should be - and I'm learning. I would ask @amirm if this sort of thing could be included as a table in future reviews - it would help. One for discussion.

Anyway. This led me to realise that I don't know how to estimate whether my preamp (miniDSP Flex) would drive a future new amplifier properly. That IOM, for example, would have to be set at a higher gain setting than 16dB - which is fine, and I'd be able to experiment myself with settings. Without my new understanding I'd probably start off at the lowest gain and wonder why my Amp didn't sound as monstrous as I'd expected - I would mistrust my ears, at least to start with.

This being ASR, I'd prefer to know, instead of fiddle, and to be able to assess potential new purchases better.

Is there any rule of thumb or estimation that works to help understand the likely input voltage required at a specified gain in order to achieve an amplifier's full power?
I believe that the actual calculations are complex and need more information, more than I might easily get without measuring (with equipment I don't have).

Any thoughts?

Many thanks
 
Is there any rule of thumb or estimation that works to help understand the likely input voltage required at a specified gain in order to achieve an amplifier's full power?
The calculation is: output power = (input voltage × 10^(gain in dB / 20))² / loudspeaker impedance.

As an example, 2V into an Amp with 24dB gain and 6Ω speakers can give you a maximum output power of (2*10^(24/20))^2/6=167W per channel.
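That formula is easy to wrap in a few lines of code if you want to try different combinations. A minimal sketch (the function name is mine, not from the thread):

```python
def max_output_power(v_in, gain_db, impedance_ohms):
    """Maximum output power in watts for a given input voltage,
    amp gain, and speaker impedance: P = (V_in * 10^(gain/20))^2 / Z."""
    v_out = v_in * 10 ** (gain_db / 20)
    return v_out ** 2 / impedance_ohms

# The example above: 2 V into a 24 dB amp driving 6-ohm speakers
print(round(max_output_power(2, 24, 6)))  # prints 167
```

Note this is the power the input signal could drive the amp to; the amp's own rated power is still the hard ceiling.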

You can also use this spreadsheet:
[attached screenshot of the spreadsheet]

You can change the gain and the graph will refresh automatically.
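Running the formula the other way round answers the original question directly: how much input voltage does a given amp need to reach its rated power? A sketch, with an illustrative (hypothetical) 200 W / 8 Ω amp:

```python
import math

def required_input_voltage(rated_power_w, impedance_ohms, gain_db):
    """Input voltage needed to drive an amp to its rated power:
    V_in = sqrt(P * Z) / 10^(gain/20)."""
    v_out = math.sqrt(rated_power_w * impedance_ohms)
    return v_out / 10 ** (gain_db / 20)

# A 200 W / 8-ohm amp with 26 dB of gain needs roughly 2 V at the input
print(round(required_input_voltage(200, 8, 26), 2))  # prints 2.0
```

That is why a 2 V single-ended source and a ~26 dB amp have historically been a comfortable pairing; drop the gain and the required input voltage climbs accordingly.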
 
The calculation is: output power = (input voltage × 10^(gain in dB / 20))² / loudspeaker impedance.
Ideal thanks - I can handle that sort of maths :) Appreciated
 
Some of the recent Class D amplifiers have low gain. It helps their SINAD. The usual solution is to hook them up to a balanced DAC.
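To put a number on why the balanced DAC helps: balanced outputs typically carry double the voltage (e.g. 4 V vs 2 V), and since power goes with voltage squared, that is four times the available power (+6 dB) at the same gain setting. A quick check using the power formula given earlier in the thread (the 16 dB / 4 Ω figures here are just an illustration):

```python
def max_power(v_in, gain_db, z):
    # P = (V_in * 10^(gain/20))^2 / Z
    return (v_in * 10 ** (gain_db / 20)) ** 2 / z

# Low-gain (16 dB) amp into 4-ohm speakers:
se = max_power(2, 16, 4)   # single-ended 2 V source
bal = max_power(4, 16, 4)  # balanced 4 V source
print(round(se), round(bal), round(bal / se))  # prints 40 159 4
```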
 
Is there any rule of thumb or estimation that works to help understand the likely input voltage required at a specified gain in order to achieve an amplifier's full power?
Static has given a great answer.

It might also be useful to know that for many years domestic power amplifiers mostly had gains in the high 20dBs. Recently I noted Amir has reduced his default from 29dB to 25dB, probably reflecting the more common higher balanced outputs these days.

So a power amplifier with, say, 13dB of gain is actually quite a strange beast. I've found that my ADI 2 Pro in balanced mode works best with such a low gain, but other unbalanced "preamplifier" or similar devices I've owned have needed something a lot more domestically common such as 27dB gain.
 
There are different line level "standards" for pro and consumer equipment. They are rather loose standards but if you stick with one or the other, line outputs usually have a little extra signal and line inputs usually have a little extra gain and everything works together.

If you use a "pro" amplifier with a "consumer" source you need to make sure it's got enough gain. Plus, you'll usually need an RCA-to-XLR adapter.
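For reference, those nominal levels are usually quoted as +4 dBu (pro) and -10 dBV (consumer). Converting both to volts with the standard references (0 dBu = 0.775 Vrms, 0 dBV = 1.0 Vrms) shows the gap is roughly 12 dB, which is the extra gain a pro-leveled input effectively expects:

```python
import math

def dbu_to_volts(dbu):
    return 0.775 * 10 ** (dbu / 20)   # 0 dBu = 0.775 Vrms

def dbv_to_volts(dbv):
    return 1.0 * 10 ** (dbv / 20)     # 0 dBV = 1.0 Vrms

pro = dbu_to_volts(4)         # nominal "pro" level
consumer = dbv_to_volts(-10)  # nominal "consumer" level
gap_db = 20 * math.log10(pro / consumer)
print(round(pro, 2), round(consumer, 2), round(gap_db, 1))  # prints 1.23 0.32 11.8
```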
 