Some of this might be redundant... I just quickly scanned the comments above...
Loudness depends on sensitivity (usually rated as X-dB SPL at 1 Watt and 1 meter). Then of course more wattage means louder, up to whatever the speaker can handle. Sensitivity is unrelated to the power rating.
If you want to calculate the maximum SPL, note that there are different dB formulas for voltage & power, so make sure to use the power version (dB = 10 x log10 of the power ratio). The calculation gives you the difference relative to the reference... If it comes out to "10dB", that's 10dB louder than the speaker's 1W@1M rating.
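If you want to actually run the numbers, here's a rough Python sketch (the 87dB speaker, 100W, and 3 meters are just made-up example numbers, and it ignores room reflections and speaker compression):

```python
import math

def max_spl_db(sensitivity_db, power_w, distance_m=1.0):
    """Rough max SPL estimate from a speaker's 1W/1M sensitivity rating."""
    power_gain_db = 10 * math.log10(power_w)        # power version of the dB formula
    distance_loss_db = 20 * math.log10(distance_m)  # inverse-square: ~6dB quieter per doubling of distance
    return sensitivity_db + power_gain_db - distance_loss_db

# Hypothetical 87dB @ 1W/1M speaker with 100W at 3 meters:
print(max_spl_db(87, 100, 3))   # 87 + 20 - ~9.5 = roughly 97.5dB SPL
```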
This stuff gets a little "fuzzy", but a speaker rated for 100W is supposed to be safe with an amplifier playing regular program material and putting out 100W on the peaks.
You can't really trust manufacturers' speaker ratings or amplifier power ratings!

There is an IEC standard that makes some standardized assumptions about the "fuzzy" parts but I almost never see "IEC" in the specifications.
With regular program material the average power from the amplifier is only a fraction of the peak power, so a speaker that's fine with music might still get fried by continuous test-tones at the same level.
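To put a number on "a fraction of the peaks": assuming something like a 12dB peak-to-average ratio (just a typical-ish guess for music, not a spec), 100W peaks only average a few watts:

```python
def average_power_w(peak_power_w, crest_factor_db):
    """Average power for program material with a given peak-to-average ratio (crest factor) in dB."""
    return peak_power_w / 10 ** (crest_factor_db / 10)

print(average_power_w(100, 12))   # ~6W average from 100W peaks
```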
Speakers are rarely fried "at home". We are usually listening at a few watts, and in most cases you can use an amplifier with a lot more power than the speaker rating because you don't have it cranked up all the way... you're only running a few watts before it gets uncomfortably loud. Speakers usually get fried at parties when a drunk person has access to the volume control, or by teenagers...
On the other hand... Once you get "loud" it takes exponentially more power to get more loudness, so once you exceed the speaker's power rating, things get dangerous fast. +6dB is 4X the power and +10dB is 10X the power... So if 100W isn't loud enough you're probably going to want 500W or more, and in most cases that means new/different speakers.
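Here's that math as a quick sketch, starting from a hypothetical 100W amp:

```python
def power_for_db_increase(current_power_w, db_louder):
    """Power needed to play db_louder dB louder than you can with current_power_w."""
    return current_power_w * 10 ** (db_louder / 10)

print(power_for_db_increase(100, 3))    # ~200W for +3dB
print(power_for_db_increase(100, 6))    # ~400W for +6dB
print(power_for_db_increase(100, 10))   # 1000W for +10dB (roughly "twice as loud")
```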
In "pro" environments like with live music or in a dance club you have to be more careful to make sure the speakers can handle the amplifier power.
Or if you drive the amplifier into clipping: most amplifiers can put out more than their rated power when clipping. In the "worst case" a fully-clipped sine wave turns into a square wave, and a square wave has twice the power of a sine wave with the same peak voltage.
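The 2X comes straight from the peak-voltage math (ignoring power-supply sag and other real-world limits):

```python
def sine_power_w(v_peak, load_ohms):
    return v_peak ** 2 / (2 * load_ohms)   # RMS of a sine is Vpeak / sqrt(2)

def square_power_w(v_peak, load_ohms):
    return v_peak ** 2 / load_ohms         # RMS of a square wave equals its peak

# Example: 40V peaks into 8 Ohms
print(sine_power_w(40, 8))     # 100W (the "rated" sine-wave power)
print(square_power_w(40, 8))   # 200W if it's fully clipped into a square wave
```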
Also, normal program material has more low-frequency energy than high-frequency energy, so it's easier to fry a tweeter with test-tones than it is to fry a woofer.
And clipping generates harmonics, so that's more high-frequency energy to potentially damage the tweeter. There is a myth that it's safer to run a high-power amp than to drive a lower-power amp into clipping. But in reality a 400W amp running at full power (not clipping) is sending more power to the tweeter than a 100W amp with 6dB of clipping. Both are dangerous (for the speaker), but the high-power amp is more dangerous than the clipping lower-power amp. Plus, you are likely to turn it up higher if it's not clipping.
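If you want to sanity-check that, here's a rough numerical sketch: it hard-clips a 100Hz test tone driven 6dB past an idealized 100W/8-Ohm amp's limit, then looks at the total output power and how much of it lands in the harmonics. It's a very simplified model (pure hard clipping, a single tone, no crossover), so treat the numbers as ballpark only:

```python
import numpy as np

rated_power_w = 100.0
load_ohms = 8.0
clip_v = np.sqrt(2 * rated_power_w * load_ohms)     # peak voltage the amp can swing (40V here)

# Ask for 6dB more than the amp can do: 2X the voltage, 4X the power
fs = 2 ** 16                                        # 1-second window, so FFT bins land on whole Hz
t = np.arange(fs) / fs
f0 = 100                                            # test-tone frequency (Hz)
wanted = 2 * clip_v * np.sin(2 * np.pi * f0 * t)
clipped = np.clip(wanted, -clip_v, clip_v)          # hard clipping -> heavily flat-topped wave

total_power_w = np.mean(clipped ** 2) / load_ohms   # comes out around 150-160W, well short of 400W
spectrum = np.abs(np.fft.rfft(clipped)) ** 2
harmonic_fraction = 1 - spectrum[f0] / spectrum[1:].sum()   # share of power NOT at the fundamental

print(f"total output power: {total_power_w:.0f} W")
print(f"power in harmonics: {harmonic_fraction:.1%}")       # HF energy a clean sine wouldn't have
```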
It's rare to fry a speaker but you can't really be 100% safe unless your speakers are way over-rated, like maybe 100W speakers and a 10W amplifier.
With impedance you can usually "assume" the speaker impedance is constant at its rated value. ...An amplifier rated for 4-Ohms is usually fine with a speaker rated at 4-Ohms, even if the actual speaker impedance dips to 2 Ohms at some frequencies. Of course there are exceptions to everything.
...I actually have an amplifier rated for 8-Ohm speakers driving my 4-Ohm subwoofers... So I could burn up the amp, but I'm not pushing it that hard and it's been OK for a few years. I could buy a properly-rated amp now, or I can wait till mine dies and then buy one, if that ever happens. I'm not really risking anything since it's an old amp and I wouldn't have a use for it if I replaced it.