But I don't think a 10 V test, or any other voltage-based test, is the right method. We don't listen based on voltage level; we listen based on output. So if Amir is to do further multi-level non-linear distortion testing, it should stick with the basis on which we listen: dB. Distortion at 90 dB and 96 dB equivalent at 1 m, with 102 dB as an optional test if you want to push things a bit more.
Absolutely. Not basing the tests on output level simply advantages speakers with lower sensitivity, which is useless.
No one would consider a test of an amplifier based only on input level to be reasonable. Distortion versus output has never been questioned for amplifiers, so the same should hold for speakers.
ETA: I see Amir has already answered. But even a rough tweak to the drive levels once the sensitivity is known would be enough. We can assume the distortion is reasonably insensitive to small changes in input level, so the results will still be valid even if the levels are a little off.
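For anyone wanting to do the level tweak themselves: a minimal sketch of the arithmetic, assuming a speaker rated in the usual dB SPL at 2.83 V / 1 m convention and roughly linear behaviour around the test level (the 86 dB sensitivity figure below is just an example, not any particular speaker):

```python
def drive_voltage(target_db: float, sensitivity_db: float,
                  ref_volts: float = 2.83) -> float:
    """Voltage needed to produce target_db SPL at 1 m.

    sensitivity_db is the rated output in dB SPL at ref_volts (2.83 V,
    i.e. 1 W into 8 ohms) measured at 1 m. Every +20 dB of SPL needs
    10x the voltage, hence the divide-by-20 in the exponent.
    """
    return ref_volts * 10 ** ((target_db - sensitivity_db) / 20)

# Example: drive levels for the proposed 90/96/102 dB test points,
# for a hypothetical 86 dB / 2.83 V / 1 m speaker.
for spl in (90, 96, 102):
    print(f"{spl} dB -> {drive_voltage(spl, sensitivity_db=86.0):.2f} V")
```

A lower-sensitivity speaker simply gets a proportionally higher drive voltage, which is exactly why a fixed-voltage test flatters it.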