OK, I'm tired of threads on various sites where people ask about amp ratings, speaker ratings, and whether they need an external amp, and everyone posts distance formulas, rules of thumb, and (often-exaggerated) speaker sensitivity figures to essentially GUESS whether the amplifier is clipping.
Let's improve on that!
Most folks do not have a storage oscilloscope on hand (though if someone knows a good phone, PC/Mac, or tablet app that can do this, please post!).
Here's my idea of a procedure to measure if the amp is close to clipping or not (for digital sources at least):
(1) Note the maximum volume setting you play at.
(2) Get a good AC voltmeter. Connect it across the front right or left speaker.
(3) You need a test tone at roughly -20 dB, at a few hundred Hertz.*
(4) Turn the volume DOWN a chunk, play the track, then move quickly up to your maximum volume and read the AC voltage.
(5) Compare to this chart (the watts being the 8Ω rating of the amplifier):
30W: 1.5V 50W: 2.0V 60W: 2.2V 80W: 2.5V 100W: 2.8V 120W: 3.1V 150W: 3.5V 200W: 4.0V
The chart is somewhat approximate, but since power goes as voltage squared (doubling the power only raises the voltage by about 40%), just use the closest number. With a maximum 0 dB digital signal the actual voltages would be 10X higher, since -20 dB is one-tenth the voltage.
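If you want the exact number for your amp's rating instead of the closest chart entry, the math is just V = √(P × 8Ω), knocked down to one-tenth for the -20 dB tone. Here's a quick Python sketch of that (the function name is my own; the chart above can be reproduced from it):

```python
import math

def clipping_threshold_volts(rated_watts, impedance_ohms=8, tone_dbfs=-20):
    """Speaker voltage a tone at tone_dbfs should read if a full-scale
    (0 dB) peak would just reach the amp's rated power."""
    # RMS voltage at full rated power: V = sqrt(P * R)
    v_full = math.sqrt(rated_watts * impedance_ohms)
    # A tone at tone_dbfs is 10^(dB/20) of the full-scale voltage
    return v_full * 10 ** (tone_dbfs / 20)

# Reproduce the chart:
for watts in (30, 50, 60, 80, 100, 120, 150, 200):
    print(f"{watts}W: {clipping_threshold_volts(watts):.1f}V")
```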
- If you're below the voltage listed for your amp, you should not be clipping with music. (Every AVR I ever saw tested drooped in power with all channels driven, but someone here posted data showing that even on big peaks the surround channels stay below the front LCR power. And music is not continuous tones.)
- Above the voltage, you might be clipping.
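The same comparison works in reverse: given your measured voltage, you can compute the full-scale power it implies and compare that to your amp's 8Ω rating. A small Python sketch (function name is mine):

```python
def implied_peak_power(measured_volts, impedance_ohms=8, tone_dbfs=-20):
    """Full-scale power (watts) implied by the measured test-tone voltage."""
    # Scale the -20 dB reading up to the 0 dB full-scale voltage...
    v_full = measured_volts / 10 ** (tone_dbfs / 20)
    # ...then convert to power: P = V^2 / R
    return v_full ** 2 / impedance_ohms

# Example: a 2.5 V reading implies full-scale peaks of about 78 W
print(f"{implied_peak_power(2.5):.0f} W")
```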
Any comments/corrections please fire away!
*This is the hard part. The Stereophile 1st CD track 27 might work https://www.discogs.com/release/3692778-Various-Stereophile-Test-CD2
If someone could generate and upload suitable test tone(s) you would be the thread HERO!!
300 or 400 Hz at -20 dB is probably good: it goes into the woofers without huge excursion, and I dimly recall most voltmeters should be OK at those frequencies even if they are not accurate across the whole audio band.
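Until someone uploads proper files, here's a sketch that generates such a tone using only the Python standard library (the filename and 30-second duration are my own choices; the level is set so the sine peak is one-tenth of digital full scale):

```python
import math
import struct
import wave

def write_test_tone(path, freq_hz=400, dbfs=-20, seconds=30, rate=44100):
    """Write a mono 16-bit WAV sine tone at the given dBFS peak level."""
    # Peak amplitude in 16-bit sample units: -20 dB -> 0.1 x full scale
    amp = 10 ** (dbfs / 20) * 32767
    with wave.open(path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        frames = b"".join(
            struct.pack("<h", int(amp * math.sin(2 * math.pi * freq_hz * n / rate)))
            for n in range(int(seconds * rate))
        )
        w.writeframes(frames)

write_test_tone("tone_400hz_-20dbfs.wav")
```

Burn that to a CD or play it from a USB stick or streamer, same as any other track.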