Vasr
Major Contributor
- Joined
- Jun 27, 2020
This came out of a discussion in another thread between @peng and me, and a subsequent PM exchange. We agreed that this is a confusing issue.
Assume unbalanced inputs only for the following, since the unbalanced-vs-balanced spec is unrelated to this fundamental question; bringing it in would only obfuscate.
How do you understand input sensitivity with respect to a specific amp driving 8-ohm or 4-ohm speakers?
Informally, input sensitivity is understood as the voltage required at the input to drive the amp to its maximum rated power. So if the 0 dB setting on your preamp is matched in output voltage to this, you get maximum power out of the amp at the 0 dB volume setting. It is never that ideal in practice (unless the amp has a gain control), but you try to get it somewhere in the ballpark. Is this accurate?
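To make the "ballpark matching" concrete, here is a quick sketch with hypothetical numbers (a 2.0 V source and a 1.0 V sensitivity are assumptions for illustration, not any particular product's specs):

```python
import math

# If your source puts out 2.0 V RMS at full scale and the amp's input
# sensitivity is 1.0 V RMS (both hypothetical), the amp reaches full
# power with the volume control roughly this many dB below maximum:
source_full_scale_v = 2.0
amp_sensitivity_v = 1.0
headroom_db = 20 * math.log10(source_full_scale_v / amp_sensitivity_v)
print(headroom_db)  # about 6 dB of "excess" source level
```

In other words, with those numbers the amp clips about 6 dB before the source runs out of voltage, which is the kind of mismatch people try to keep small.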
But power at what load? Typically, the specified input sensitivity works out, for the published gain, to the maximum power spec at 8 ohms, but never at 4 ohms (unless the 4-ohm power is exactly double the 8-ohm power, which is usually not the case).
Power, gain and input sensitivity are related by a fixed equation: P = (gain × V_in)² / R_load, so the sensitivity is V_in = sqrt(P × R_load) / gain. If maximum power depends on the load, then gain, input sensitivity, or both must also depend on the load for that equation to hold.
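Working that equation through with made-up but typical spec-sheet values (29 dB gain, 100 W into 8 ohms, 160 W into 4 ohms are assumptions, not a real amp) shows exactly the load dependence in question:

```python
import math

def sensitivity_v(p_max_w, load_ohm, gain_db):
    """RMS input volts needed to reach p_max_w into load_ohm at gain_db."""
    gain = 10 ** (gain_db / 20)            # dB -> voltage ratio
    v_out = math.sqrt(p_max_w * load_ohm)  # RMS volts at the speaker terminals
    return v_out / gain                    # V_in = sqrt(P * R) / gain

# Hypothetical amp: 29 dB gain, 100 W @ 8 ohms, 160 W @ 4 ohms.
print(round(sensitivity_v(100, 8, 29), 2))  # 8-ohm sensitivity, ~1.0 V
print(round(sensitivity_v(160, 4, 29), 2))  # 4-ohm sensitivity, ~0.9 V
```

Because the 4-ohm power is not double the 8-ohm power, the voltage needed to hit full power at 4 ohms is lower, i.e. the "sensitivity" figure differs with load even though the gain is the same.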
If it is input sensitivity (as understood above) that varies with load, then its usefulness when you happen to have 4-ohm speakers is limited, unless you do your own calculation using the stated gain and the maximum power at 4 ohms. If so, why isn't the input sensitivity explicitly specified as being for 8 ohms in particular?
Can someone with the right fundamentals unravel these technical measures: what they mean, and how they are to be used depending on whether you have 8-ohm or 4-ohm speakers? Thanks.