
What is the right interpretation of input sensitivity for an amp?

waynel

Major Contributor
Forum Donor
Once the current hardware-only dinosaurs in the electronics industry who are petrified of software become extinct, as they inevitably will sooner or later, a smarter set of devices would be self-configuring:

Power amps self-configure to be aware of nominal impedance of their speaker connections. They attenuate the input automatically to be below the clipping voltage for the load if the source upstream is not smart-enabled.

If the source connected to the amp is smart-enabled, there would be a handshake between the amp and the source on connection to exchange parameters. The source unit would calibrate its output to be limited by the input sensitivity reported by the amp and calibrate its volume display within that range, so that 70% is always 70% of the maximum possible output and there is never any danger of clipping.

But it may require a Steve Jobs of audio to drag the audio engineers kicking and screaming like he did at Apple to build smarter designs or fire them. ;)

Just buy a Sonos if this is what you want and stop trying to reinvent USB. I’ll take the stuff designed by engineers myself.
 

March Audio

Master Contributor
Audio Company
People using flip phones said the same thing about smartphones. Very few using flip phones now.

You are arguing by extremes. They don't need to be totally incompatible to be a problem. Just make it so that people don't have to think about it. The number of questions that keep cropping up on input sensitivity matching here and elsewhere indicates that it is not a non-issue for consumers. Otherwise, why even provide that number in the specs? As audio porn? ;)

I will stop before you get into your self-confessed Doc Martin mode. :p
No, seriously, take it from someone who designs this stuff. It would create totally unnecessary complexity, which would increase cost. It's not intelligent product design to "over-engineer" in this way, especially when there is no actual requirement.

Smartphones have far more functionality than non-smart phones. Your comparison is erroneous.

As I asked, when was the last time you heard of anyone actually having a problem with this?

It's a non-problem. The reality is that with some pres and some amps you might have to turn the volume dial to different settings, nothing more than that.

It's you who is arguing extremes, by definition, because no one in reality is actually having a problem.
 
Last edited:

waynel

Major Contributor
Forum Donor
No, seriously, take it from someone who designs this stuff. It would create totally unnecessary complexity, which would increase cost. It's not intelligent product design to "over-engineer".

As I asked, when was the last time you heard of anyone having a problem with this?

It's a non-problem.

A non-problem, plus there are standards. As long as the DAC or preamp puts out 2 V unbalanced or 4 V balanced, it will work with almost any amp.
 

restorer-john

Grand Contributor
Power amps self-configure to be aware of nominal impedance of their speaker connections. They attenuate the input automatically to be below the clipping voltage for the load if the source upstream is not smart-enabled.

This was done in the 1980s by Matsushita (Technics). Extremely effective. During the power up sequence, the amplifier measured the connected speaker impedance and adjusted the rail voltage to ensure the amplifier wasn't overdriven.

Panasonic (Technics again) are doing it in 2020 with certain GaN FET class D amplifiers, except this time, taking it to a whole new level...

[attached image]
 
Last edited:

RayDunzl

Grand Contributor
Central Scrutinizer
Just make it so that people don't have to think about it.

Isn't that how the little round knob with "infinite" variability upon the application of circular motion came into being?

PS:

I still haven't figured out how to use the phone in my phone under different conditions reliably. Don't get many calls though, so not much practice.
 
Vasr (OP)

Major Contributor
This was done in the 1980s by Matsushita (Technics). Extremely effective. During the power up sequence, the amplifier measured the connected speaker impedance and adjusted the rail voltage to ensure the amplifier wasn't overdriven.

Panasonic (Technics again) are doing it in 2020 with certain GaN FET class D amplifiers, except this time, taking it to a whole new level.

That is very cool. They seem to have some real engineers (and a great heritage).

You know what is going to happen. Bruno Putzeys or some visionary like him (in an evolutionary way) is going to start including a feature like this, and increasingly smarter things, in his modules. Then all the people who slap these together and don't think it's a problem now are going to crow about how this is the smartest thing to happen since smartphones and write copious amounts in their marketing brochures about how it has revolutionized the industry. And why they are better than all the old class A/B amps which don't have it. :p
 

Mnyb

Major Contributor
Forum Donor
The natural solution would be to simply adhere to a standard 2/4 volts for full output voltage; that's it, done.

To cater for all situations, power amps should simply have adjustable gain, like most pro amps. Then you can adjust accordingly for vintage sources or an AVR.

The lack of standards is an issue.

AVRs are a problem; they simply output too low a voltage.

In reality most consumer power amps have too high gain, so you can drive them to clipping with an AVR or any source, but it's not optimal for signal-to-noise ratio.

To get SOTA performance, the source, which in most cases is a DAC, should generate the signal at as high a voltage as possible, like Benchmark does, and the power amp should then have low gain.
So there really is a need for a new consumer standard level; professionals do have some standard levels, adopt one of those?
The SOTA performance is limited by physics; the thermal self-noise of electronics is what it is, so the solution is to begin at a higher level.
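
As a rough, hypothetical illustration of that gain-structure argument (the 5 µV input-referred noise figure below is made up purely for the example, it is not from any real amp):

```python
import math

def snr_at_full_power(v_full_out, gain_db, input_noise_v=5e-6):
    """SNR at full output, assuming the power amp's input-referred noise
    (a made-up 5 uV RMS here) dominates the noise budget."""
    gain = 10 ** (gain_db / 20)          # dB -> voltage ratio
    noise_out = input_noise_v * gain     # the amp amplifies its own noise too
    return 20 * math.log10(v_full_out / noise_out)

V_FULL = 34.6  # volts RMS, roughly 150 W into 8 ohms

# Same output, reached from a 2 V source (higher amp gain) vs a 4 V source (lower gain)
print(round(snr_at_full_power(V_FULL, 20 * math.log10(V_FULL / 2.0))))  # ~112 dB
print(round(snr_at_full_power(V_FULL, 20 * math.log10(V_FULL / 4.0))))  # ~118 dB
```

Halving the amp's gain and doubling the source level buys about 6 dB, which is essentially the argument for hot sources plus low-gain power amps.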
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,741
Likes
38,984
Location
Gold Coast, Queensland, Australia
To cater for all situations, power amps should simply have adjustable gain, like most pro amps.

Most pro amps and domestic amps don't have adjustable gain. They have a fixed gain and use attenuators up front. Not the same.

Some amplifiers do have genuinely adjustable gain, but they are rare.

So there really is a need for a new consumer standard level; professionals do have some standard levels, adopt one of those?

It's completely unnecessary to change consumer level from what it is. 2V was an enormous departure from the existing levels in 1983, which were 150mV and had been for a generation. We had all the same problems back then with amplifiers overloading, clipping front ends and the volume range being squashed into the first few degrees of rotation.

High levels are only needed for long runs in professional balanced systems and really have no place in the home.

The fact that modern amplifier "designers" have passed the buck back to the stages before in order to obtain their so-called SOTA numbers is just typical of the world we live in. They know full well that D/A converters are primarily current output and need an active I/V stage and buffer, which will introduce just as much noise as their own front ends would, if they designed for typical 28-30dB gains.

The standard for power amplifiers has been 1.0V to 2.0V for full power for many decades, be that single ended or balanced. Proper balanced outputs were not double the single ended level, they were exactly the same. Why? Because they used a proper 1:1 transformer with a 600 ohm impedance. I know, I have plenty of source gear with a true balanced output, not ground referenced, fully differential and properly isolated. Trouble is, manufacturers don't want to spend a ton of money on LCOFC mu-metal shielded transformers with a ruler flat response; they'd rather use a few 10c opamps.
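
To put rough numbers on why the old 150 mV level squashed the volume range, here is a quick sketch (assuming a hypothetical 150 W into 8 ohm amplifier, about 34.6 V RMS at full power; the figures are illustrative, not from any specific product):

```python
import math

def gain_for_sensitivity(power_w, load_ohms, v_sens):
    """Voltage gain (dB) needed if v_sens volts at the input must give rated power out."""
    v_out = math.sqrt(power_w * load_ohms)   # rated output voltage, RMS
    return 20 * math.log10(v_out / v_sens)

print(gain_for_sensitivity(150, 8, 2.0))    # ~24.8 dB for a 2 V source
print(gain_for_sensitivity(150, 8, 0.150))  # ~47.3 dB for the old 150 mV level
```

At the old 150 mV level the amp would need over 47 dB of gain just to reach full power, which is why normal listening got crammed into the first few degrees of the control; a 2 V source needs only about 25 dB.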
 

March Audio

Master Contributor
Audio Company
That is very cool. They seem to have some real engineers (and a great heritage).

You know what is going to happen. Bruno Putzeys or some visionary like him (in an evolutionary way) is going to start including a feature like this, and increasingly smarter things, in his modules. Then all the people who slap these together and don't think it's a problem now are going to crow about how this is the smartest thing to happen since smartphones and write copious amounts in their marketing brochures about how it has revolutionized the industry. And why they are better than all the old class A/B amps which don't have it. :p

Nothing cool about something that is totally unnecessary. It's why no manufacturers bother doing it.

You need to actually show this is a problem that needs solving.

The Technics LAPC looks like some fancy marketing blurb for "feedback" ;)
 
Last edited:

March Audio

Master Contributor
Audio Company
Most pro amps and domestic amps don't have adjustable gain. They have a fixed gain and use attenuators up front. Not the same.

Some amplifiers do have genuinely adjustable gain, but they are rare.



It's completely unnecessary to change consumer level from what it is. 2V was an enormous departure from the existing levels in 1983, which were 150mV and had been for a generation. We had all the same problems back then with amplifiers overloading, clipping front ends and the volume range being squashed into the first few degrees of rotation.

High levels are only needed for long runs in professional balanced systems and really have no place in the home.

The fact that modern amplifier "designers" have passed the buck back to the stages before in order to obtain their so-called SOTA numbers is just typical of the world we live in. They know full well that D/A converters are primarily current output and need an active I/V stage and buffer, which will introduce just as much noise as their own front ends would, if they designed for typical 28-30dB gains.

The standard for power amplifiers has been 1.0V to 2.0V for full power for many decades, be that single ended or balanced. Proper balanced outputs were not double the single ended level, they were exactly the same. Why? Because they used a proper 1:1 transformer with a 600 ohm impedance. I know, I have plenty of source gear with a true balanced output, not ground referenced, fully differential and properly isolated. Trouble is, manufacturers don't want to spend a ton of money on LCOFC mu-metal shielded transformers with a ruler flat response; they'd rather use a few 10c opamps.


All things being equal I would rather have a higher voltage than a lower voltage; it just takes you further away from the inherent noise levels. 2/4 volts seems far more sensible than 150mV, although going higher than this probably has diminishing returns and may introduce other issues.

John, the gain has to go somewhere, so not sure what you mean by "pass the buck".

There is only a distinction in balanced/unbalanced voltage if you measure against some arbitrary reference (ground). A differential input doesn't care or need to be ground referenced as long as you don't exceed CM voltage limits. You don't need a transformer for this.

Hypex and Purifi modules are differential; they don't distinguish between balanced and single-ended inputs, they just need X volts difference between hot and cold.

https://www.diyaudio.com/archive/bl...d1460406090-bruno-putzeys-micropre-g-word.pdf

The "double voltage" has only arisen because due to the common use of a second unity gain inverting op amp to turn a fundamentally single ended output into a balanced output.

[attached image]
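
A toy numerical sketch of that point (arbitrary signal values, just to show the arithmetic): when the cold leg is simply the hot leg through a unity-gain inverter, a differential receiver, which only responds to hot minus cold, sees twice the single-ended swing.

```python
import numpy as np

t = np.linspace(0, 1e-3, 1000)               # 1 ms of a 1 kHz test tone
hot = 2.0 * np.sin(2 * np.pi * 1000 * t)     # single-ended output, 2 V peak

cold = -hot                                  # second op amp: unity-gain inverter

differential = hot - cold                    # what a differential input responds to
print(differential.max() / hot.max())        # -> 2.0, the "double voltage"
```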
 
Last edited:

restorer-john

Grand Contributor
The Technics LAPC looks like some fancy marketing blurb for "feedback" ;)

It's way more than that. The amplifier measures the connected speaker (every time it's turned on or the speaker switch is operated, IIRC) for impedance/phase, then applies DSP processing to perfect (so they say) the impulse response. It appears to work for low frequencies only, based on what I have read. There are some patents floating around in relation to the concept.

Anything that attempts to overcome the myriad issues in the amplifier-speaker interface is a good thing in my book.
 

March Audio

Master Contributor
Audio Company
It's way more than that. The amplifier measures the connected speaker (every time it's turned on or the speaker switch is operated, IIRC) for impedance/phase, then applies DSP processing to perfect (so they say) the impulse response. It appears to work for low frequencies only, based on what I have read. There are some patents floating around in relation to the concept.

Anything that attempts to overcome the myriad issues in the amplifier-speaker interface is a good thing in my book.
Ok, fair enough, but considering that good class D response is load-independent (within reason), I'm not sure what they are selling here.

Again that's a lot of complexity and expense to solve a very minor issue.
 
Last edited:

Mnyb

Major Contributor
Forum Donor
Ok, let's rephrase a couple of things.

Is the consumer 2 V "standard" a real standard, or just something manufacturers do, a de facto standard?

Input sensitivity of power amps is all over the place. In the few I owned, I had to use inline attenuators with some, and one actually had jumpers, which was good.
Sources have been better.
Preamps and integrateds usually had too much gain, to accommodate vintage sources down at the 0.1 V level as you say.
I've had my share of crummy cassette decks :)

Would be cool if all 100 watt amps produced the same output voltage swing for 2 volts in.

Would be cool with a generally sane gain structure in amps, and dedicated higher-gain inputs for vintage components, similar to phono.

I would think 2/4 volts is good enough if it actually was enforced better.
A good agreement on something is preferable imho.
 

Mnyb

Major Contributor
Forum Donor
Why the hang-up on 4 and 8 ohms? That's a "nominal" and frequently abused speaker spec; a real speaker can vary from roughly 2 to 30 ohms in impedance over the frequency spectrum.
If you have a transistor amp, output impedance is usually smaller than 0.1 ohm.
The amp is linear until it reaches voltage or current limits, possibly also thermal limits if abused :D I don't think I've ever driven my stuff to shutdown?
 

mhardy6647

Grand Contributor
well -- I mean the "line" standard used to be... you know... a standard.

0.775 Vrms into 600 ohms (1 mW)... if memory serves.

[Flickr image: keysandampexforak, by Mark Hardy]

OK they were tapin' this remote -- but when they were live, they were usin' 600 ohm lines for their feed.
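
As a quick arithmetic check, that reference level does work out:

```python
# 0.775 V RMS across a 600 ohm load dissipates 1 mW (0 dBm, the old line reference)
v_rms, r_load = 0.775, 600.0
print(f"{1000 * v_rms**2 / r_load:.3f} mW")  # -> 1.001 mW
```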
 

sergeauckland

Major Contributor
Forum Donor
The 600 ohm standard goes back a very long way to when balanced lines were baseband analogue, basically telephone lines.

Sending signals across a country, or indeed a continent, as the US had to do, meant that cable lengths were very long, hundreds of miles. That required accurate impedance matching and equalising especially once FM radio extended the bandwidth requirement all the way up to a heady 15kHz. Stereo made it even more difficult as not only did the frequency response have to be flat, but both channels had to be matched for amplitude and phase. It's no wonder that even in a small country like the UK, equalised 'music lines' were very expensive and hard to get hold of. Circuits had to be made and measured, and it would take several hours for a pair of lines to be available and re-equalising during a long broadcast was common.

Digital lines were seen as a massive improvement, and with them came the end of impedance matching and the move to the now pretty much universal standard of low output impedance into a high 'bridging' impedance. Life's so much easier now!

S.
 

peng

Master Contributor
Forum Donor
Good way to make products far more complex and more costly than they need to be. :facepalm:

You are worrying about things that don't need to be worried about. When was the last time you came across an amp that was totally incompatible with a pre due to widely disparate outputs and sensitivities?

Trust me, the engineers and designers have given thought to this. ;)

I trust you on this :), but there is one example I can give, the ATI amps, that I thought might require closer scrutiny. Below are some specs for two of their very similar products; the AT4000 seems to be a newer-generation model that replaced the AT2000.

AT4000:
Rated 200 W, 8 ohms, 300 W 4 ohms, input sensitivity: 1.6 V, Gain: 28 dB, same for both unbalanced and balanced

AT2000:
Rated 200 W, 8 ohms, 300 W, 4 ohms, input sensitivity: 1.6 V, same for both unbalanced and balanced, Gain: 32 dB unbalanced (RCA), 28 dB balanced (XLR).

How would you reconcile the two? Both have the same 1.6 V input sensitivity spec for both RCA and XLR inputs, but the AT2000's gain is 6 dB higher for the RCA inputs, yet the input sensitivity is the same 1.6 V for both RCA and XLR. If you do the math, 34 dB gain with a 1.6 V input will result in an output of 800 W! So shouldn't the input sensitivity for the AT2000 be 0.8 V?

Now if you look at the AT4000, input sensitivity is again 1.6 V, the same for both RCA and XLR, but now the gain is 28 dB for both inputs. So in this case, it would seem that the AT4000 would work better with an AV processor/preamp that has its XLR output at 2X its RCA output, if the AVP measured much better in SINAD at 2 to 2.4 V than at 4 V output, such as the AV7705?

ATI's explanation for the difference in input sensitivity and gain between the AT2000 and 4000 (see below) seems fine, as they claimed they were able to "normalize the input voltage" for the AT4000 (the newer gen.), but I still have a little trouble with the specified input sensitivity and can't help wondering if there is a typo or two somewhere.

As someone from a reputable audio equipment manufacturer, I hope you can take a good look and shed some light on this. I have no trouble understanding the input sensitivity specs of Marantz, Yamaha, McIntosh, Parasound, and many other power amps; it is just ATI's, for now, that I find confusing.

When asked, ATI's explanation for the different specs between the 2000 and 4000 was apparently as follows:

https://forums.audioholics.com/foru...-to-be-a-denon-avr.119013/page-6#post-1417210

The AT2000/3000 circuitry didn't allow us to normalize input voltage. The AT4000/6000 design allowed us to make input voltage the same for RCA and XLR, so that the same gain can be applied to both.

XLR output on most (not all) pre-amps is 6dB louder than the RCA output. But that's just a result of having two signal wires instead of one. That doubles the signal (and noise), so it's not like XLR has a better signal than the RCA, just twice as much of the same signal.

We haven't decided on output voltage for the ATP-16 or whether XLR and RCA will be the same.

I would much prefer March Audio's specs for the P252: no input sensitivity specs, just gain, and that's good enough for me.
https://www.marchaudio.net.au/product-page/p252-stereo-250-watt-power-amplifier

Power Output
  • 2 Ohms - 180 W rms
  • 4 Ohms - 250 W rms
  • 8 Ohms - 150 W rms
Voltage Gain 26dB

That's because I can then calculate the input sensitivities easily, based on the rated output into 2, 4, and 8 Ohms.

That would be:
  • 2 Ohms - 180 W rms (*Input sensitivity: 0.95 V)
  • 4 Ohms - 250 W rms (*Input sensitivity: 1.58 V)
  • 8 Ohms - 150 W rms (*Input sensitivity: 1.74 V)
* The input sensitivity figures are from my calculations only, not part of March Audio's specs. I could be wrong if by chance I made some silly mistakes.

With ATI's, I can do the same kind of calculations, but the way they specified, I can't help but wonder if I interpreted their specs correctly.

Edit: For some reason, I must have keyed in something wrong (the formulas in my Excel sheet were right) so the sensitivity results were all wrong; corrected now, and of course they are the same as those calculated by restorer-john and March Audio.
 

Attachments

  • AT2000.png
  • AT4000.png
Last edited:

restorer-john

Grand Contributor
Power Output
  • 2 Ohms - 180 W rms
  • 4 Ohms - 250 W rms
  • 8 Ohms - 150 W rms
Voltage Gain 26dB

That's because I can then calculate the input sensitivities easily, based on the rated output into 2, 4, and 8 Ohms.

That would be:
  • 2 Ohms - 180 W rms (*Input sensitivity: 0.76 V)
  • 4 Ohms - 250 W rms (*Input sensitivity: 1.26 V)
  • 8 Ohms - 150 W rms (*Input sensitivity: 1.38 V)

A voltage gain of 26dB is ~20 (19.95) times.

150W RMS@8R is 34.64V RMS.
Divided by 20 is 1.73V

250W RMS@4R is 31.62V RMS
Divided by 20 is 1.58V

The amplifier is fairly linear until 2R where it hits the wall really hard, most likely in the form of current limiting and/or the PSU rails collapsing.

As we are seeing with the testing of the Class Ds, they are pretty much on the money for rated power outputs, so the specification numbers regarding gain are OK to base calculations on.

In the past, amplifiers were much more conservatively rated, often by around 20%, as they were rated for the full bandwidth (20Hz-20kHz) at full rated power, with very low THD figures from 250mW to rated power. A 150W/ch rated amplifier in the 1980s and 90s was typically 180W@8R, nearly double its official 8R rating into 4R, and often reached huge numbers for 2R. So although the "X volts for rated power output" specifications were usually valid, you had a lot more in reserve should you input more.
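
The same arithmetic, as a small sketch (the function is just the calculation above, applied to the P252 figures quoted earlier):

```python
import math

def input_sensitivity(power_w, load_ohms, gain_db):
    """Input voltage (RMS) needed to reach the rated power into a given load."""
    v_out = math.sqrt(power_w * load_ohms)   # rated output voltage, RMS
    return v_out / 10 ** (gain_db / 20)      # divide by the voltage gain

# March Audio P252 figures quoted above, 26 dB gain
for load, watts in [(8, 150), (4, 250), (2, 180)]:
    print(f"{load} ohms, {watts} W -> {input_sensitivity(watts, load, 26):.2f} V")
# 8 ohms, 150 W -> 1.74 V
# 4 ohms, 250 W -> 1.58 V
# 2 ohms, 180 W -> 0.95 V
```

The 2 ohm figure is where the simple voltage math stops applying, per the limiting noted above.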
 
Last edited: