Myth or not, it has a thread of truth. Play a 100 Hz sine wave and turn the volume up to a loud level, close to the limits of the speaker. Now keep the volume knob where it is and replace the sine with a 100 Hz square wave. Do you think the tweeter will survive? I doubt it. Sure, actual clipping may not be a perfect square wave, and the added high frequencies may be lower in amplitude, but they will be introduced. So the degree of relevance depends on the clipping behavior of any particular amp.
It does have a small thread of truth... we know the harmonics of a square wave will produce HF content. So will clipping, though not as severely as a square wave.
Here's a line level measurement of a 100Hz square wave sent through a DSP crossover, showing relative drive levels for a two-way speaker.
Crossover is a Linkwitz-Riley 12 dB/oct @ 3000 Hz, so the tweeter is handling 3 kHz and up.
The top meter is the tweeter side of the crossover, measuring the level of the 100 Hz square wave's harmonics passing through it... it reads -36 dB.
The bottom is the woofer side, measuring the 100 Hz fundamental and its harmonics below the crossover... it reads -16 dB.
That 20 dB difference equals a 100-to-1 ratio in the power going to the woofer vs the power going to the tweeter.
So a 100W amp, sending a 100Hz square wave through said crossover, would pass only about 1W to the tweeter.
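Both the 100-to-1 figure and the 1 W number fall out of the same relation: a level difference in dB corresponds to a power ratio of 10^(dB/10). A quick sketch in Python, just restating the arithmetic above:

```python
def db_to_power_ratio(db):
    # A level difference in dB corresponds to a power ratio of 10^(dB/10)
    return 10 ** (db / 10)

print(db_to_power_ratio(20))        # 100.0 -> the 100-to-1 woofer/tweeter split
print(100 / db_to_power_ratio(20))  # 1.0   -> watts reaching the tweeter from a 100 W amp
```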
If the crossover is dropped to 1000 Hz, there is still a 15 dB difference between woofer and tweeter with the 100 Hz square wave: about a 32-to-1 ratio.
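Those measured differences can be sanity-checked on paper. Here's a rough sketch, assuming an ideal square wave (odd harmonics at amplitude 4/(pi*n)) and ideal LR2 filter magnitudes; this idealization is mine, not a model of the actual DSP used, but it lands within about half a dB of both measured differences:

```python
import math

def lr2_power(x, highpass=False):
    # Squared magnitude of an ideal 2nd-order Linkwitz-Riley filter
    # (two cascaded 1st-order sections), x = f / f_crossover.
    # Lowpass: |H|^2 = 1/(1+x^2)^2, highpass: |H|^2 = x^4/(1+x^2)^2.
    return (x ** 4 if highpass else 1.0) / (1.0 + x * x) ** 2

def woofer_tweeter_split_db(f0, fc, n_max=20001):
    # An ideal unit square wave at f0 is the sum of its odd harmonics n,
    # each with amplitude 4/(pi*n), i.e. relative power 8/(pi^2 * n^2).
    woofer = tweeter = 0.0
    for n in range(1, n_max, 2):
        p = 8.0 / (math.pi ** 2 * n ** 2)
        x = n * f0 / fc
        woofer += p * lr2_power(x)
        tweeter += p * lr2_power(x, highpass=True)
    return 10.0 * math.log10(woofer / tweeter)

print(f"3 kHz crossover: {woofer_tweeter_split_db(100, 3000):.1f} dB")  # close to the measured 20 dB
print(f"1 kHz crossover: {woofer_tweeter_split_db(100, 1000):.1f} dB")  # close to the measured 15 dB
```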
But the lower a tweeter is designed to run, the greater its power handling capability is.
I don't think any amp can clip worse than a square wave, can it?
If not, it seems to me the 'clipping can take out tweeters' idea can only play out in carefully crafted situations.