I have two more questions that I'm having a hard time understanding and can't find answers to on the web.
1. I saw the specs on this amp:
The Coda S5.5 amplifier delivers 50 watts into 8 ohms, 100 watts into 4 ohms, and 200 watts into 2 ohms, all in class A. It can output 100 amps.
My speakers dip down to 3 ohms. What is the benefit of having 100 amps available over, say, the 29 amps my 100-watt amp puts out?
If I have this right, W = I × V and W = I² × R. So a 3-ohm load at 150 watts yields a current of I = √(W/R) = √(150/3) ≈ 7.1 amps. Why would this amp need to output 100 amps if the draw would only be 7.1 amps? Even at a 1-ohm load, W = I² × R gives 150/1 = 150, and √150 ≈ 12.2 amps.
And with W = I × V, if the watts are 100 and the current is 100 amps, then the amp's output would be only 1 volt? That seems too low for a real-world audio situation.
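Here's a quick Python sketch of my arithmetic above, in case I'm doing the math wrong (current_for_power is just a helper I made up; it assumes a purely resistive load and RMS values):

```python
# Minimal sketch of the current math above, assuming resistive loads and RMS values.
import math

def current_for_power(watts: float, ohms: float) -> float:
    """From W = I^2 * R: the current needed to deliver a given power into a load."""
    return math.sqrt(watts / ohms)

for ohms in (8, 4, 3, 2, 1):
    amps = current_for_power(150, ohms)
    volts = 150 / amps  # from W = I * V
    print(f"150 W into {ohms} ohm: {amps:.1f} A at {volts:.1f} V")
```

At 3 ohms that prints about 7.1 A, and at 1 ohm about 12.2 A, which is where my numbers above come from.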
2. Is the output voltage of an amp based on its wattage? For example, the AHB2 lists its max voltage output at about 29 volts. Would a 200-watt amp then have a higher max voltage output?
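And here's the same kind of sketch for question 2, assuming the max voltage follows from the 8-ohm power rating via V = √(W × R), which is just W = I × V and Ohm's law combined (max_voltage is again my own made-up helper):

```python
# Minimal sketch relating rated wattage to max output voltage,
# assuming an 8-ohm rating and RMS values.
import math

def max_voltage(watts: float, ohms: float = 8.0) -> float:
    """From W = V^2 / R: V = sqrt(W * R), the voltage needed for rated power."""
    return math.sqrt(watts * ohms)

print(f"100 W into 8 ohm: {max_voltage(100):.1f} V")  # ~28.3 V, close to the AHB2's ~29 V spec
print(f"200 W into 8 ohm: {max_voltage(200):.1f} V")  # ~40.0 V
```

If that's right, a 200-watt amp would need roughly 40 volts at its rated 8-ohm output, not 29.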
Thanks.