I'm actually trying to understand this subject. I'll use a tube amp as the example, for simplicity... What actually drives the speaker is the voltage, so I think you're right that the maximum voltage is probably 20 V at 8 ohms.
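As a quick sanity check on that figure, here is a minimal Python sketch, assuming a purely resistive load so that P = V^2/R holds. The power levels are just illustrative; I don't know what power the earlier post had in mind:

```python
import math

# Sanity check, assuming a purely resistive load:
# P = V^2 / R, so V = sqrt(P * R).
R = 8.0                        # speaker impedance in ohms
for P in (50.0, 100.0):        # illustrative powers in watts
    V = math.sqrt(P * R)       # voltage across the load
    print(f"{P:.0f} W into {R:.0f} ohms -> {V:.1f} V")
# 50 W  -> 20.0 V
# 100 W -> 28.3 V
```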
With P=I*V in mind: the output transformer in the power amp needs to lower the very high impedance at the plate of the power tube to that of the speaker (say, 4 ohms). The tube is also driven by a very high voltage (say, 600 V). A 100 W amplifier should then be handling, at peaks, a current of about 0.17 A and, by Ohm's law, an impedance of some 3600 ohms.
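Here is that primary-side arithmetic as a small Python sketch, just to check my own numbers. The 600 V and 100 W figures are the assumptions above, and the stage is treated as ideal and lossless:

```python
# Primary-side arithmetic, assuming an ideal (lossless) output
# stage so that P = V * I and Ohm's law both hold.
P = 100.0          # amplifier power in watts
V_plate = 600.0    # assumed plate voltage from above

I_plate = P / V_plate        # current at the plate
Z_plate = V_plate / I_plate  # impedance seen at the plate

print(f"plate current: {I_plate:.3f} A")       # 0.167 A
print(f"plate impedance: {Z_plate:.0f} ohms")  # 3600 ohms
```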
Then the output transformer, for those same 100 W of power and the 4-ohm load, should be delivering some combination of V*I totaling 100 W. What determines whether this is done with a 2:1 voltage step-down (300 V, 0.33 A, given by a 2:1 ratio between the primary and secondary windings) or a 10:1 step-down (60 V, 1.67 A, with a 10:1 primary-to-secondary winding ratio)?
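The two scenarios side by side, as a sketch under the same ideal-transformer assumption (power is conserved, so voltage and current scale by the turns ratio in opposite directions):

```python
# The two hypothetical winding ratios, assuming an ideal
# transformer: voltage scales down by the turns ratio n while
# current scales up by the same factor, so power is unchanged.
V_primary = 600.0            # primary voltage from above
I_primary = 100.0 / 600.0    # primary current for 100 W

for n in (2, 10):  # the two primary:secondary ratios in question
    V_sec = V_primary / n    # secondary voltage
    I_sec = I_primary * n    # secondary current
    print(f"{n}:1 -> {V_sec:.0f} V, {I_sec:.2f} A, "
          f"{V_sec * I_sec:.0f} W")
# 2:1  -> 300 V, 0.33 A, 100 W
# 10:1 -> 60 V, 1.67 A, 100 W
```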
Is it preferable to lower the voltage as much as possible, so as to have the current as high as possible, which some speakers prefer? Is it much more expensive to do it that way?