Class A, like any amplifier, will have other distortion products even if there is no crossover distortion.
Class AB amplifiers operate in class A at low levels, so there is no crossover distortion at low power; as the signal swings higher into class B operation, feedback reduces the distortion. Some of the best-measuring amplifiers use class AB output stages (e.g. Benchmark). "Switchover" effects are grossly misrepresented in marketing literature IME, just like the influence of cables and other issues.
Class D output is basically a PWM signal, so crossover distortion in the conventional sense does not occur: the signal is constantly switching and is then filtered to produce a continuous output, much as in a delta-sigma DAC. Modern self-oscillating designs incorporate far more feedback than most conventional class A or AB designs, resulting in much lower distortion.
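A minimal numerical sketch of that switch-and-filter idea (all rates and amplitudes here are my own illustrative choices, not from any particular design): compare the audio against a triangle carrier to generate a two-level PWM stream, then low-pass filter it to recover the tone.

```python
import numpy as np

# Illustrative values only, not from any specific amplifier design.
fs = 10_000_000        # 10 MHz simulation rate
f_sig = 1_000          # 1 kHz audio tone
f_sw = 100_000         # 100 kHz switching (carrier) frequency

t = np.arange(0, 0.002, 1 / fs)
audio = 0.8 * np.sin(2 * np.pi * f_sig * t)

# Natural-sampling PWM: compare the audio against a triangle carrier.
carrier = 4 * np.abs((t * f_sw) % 1.0 - 0.5) - 1.0   # triangle wave in [-1, 1]
pwm = np.where(audio > carrier, 1.0, -1.0)           # two-level switched output

# Crude output filter: moving average spanning one switching period.
n = fs // f_sw
recovered = np.convolve(pwm, np.ones(n) / n, mode="same")

# Away from the edges, the filtered PWM tracks the original tone closely.
core = slice(n, -n)
err = np.max(np.abs(recovered[core] - audio[core]))
```

A real class D stage uses an LC output filter instead of a moving average and wraps feedback around the whole loop, but even this crude version recovers the 1 kHz tone with no crossover notch, because the output is always switching.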
The advantage of class D operation is high efficiency at high power, something that does not matter for a preamplifier. Class A is also arguably the simplest way to design a preamplifier's circuits; conventional design serves well and is what most manufacturers know. I also suspect there would be consumer pushback based on a (false) perception of poor class D performance, like the one you exhibit for power amplifiers. All the discrete-device preamplifiers I have seen use class A circuits, not because AB or D wouldn't work, but because class A is generally simpler, works adequately, and at such low power levels nothing more is required. Note that many op amps, often used in preamplifiers and in the low-level circuits of a power amplifier, include class AB output stages within the IC.
When using an oscilloscope for this you are looking at a very small signal: either a very low output level, or a large signal with the fundamental notched out so you can observe fine details of the waveform. Most high-speed 'scopes are 8 bits, but low-speed 'scopes are usually 12 to 16 bits, with a few incorporating 24-bit delta-sigma ADCs. At such low signal levels that is usually more than enough. It would not be enough to measure distortion 100+ dB down on a full-scale waveform, but when the waveform is only a fraction of maximum, a 'scope can provide a very good look. With a signal of, say, 100 mVpp (about 0.16 mW into 8 ohms), an 8-bit converter can resolve about 400 uV (about 2.5 nW into 8 ohms), and a 12-bit converter about 24 uV (~9 pW). Just how low do you want to go?
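The arithmetic above is easy to sanity-check. A few lines of Python (the helper name is mine), taking Vpp as full scale, Vpp/(2√2) as the RMS of a sine, and P = V²/R:

```python
def lsb_resolution(vpp, bits, r_load=8.0):
    """One LSB in volts, and the power of a sine that size into r_load (watts)."""
    lsb = vpp / 2 ** bits              # smallest resolvable step, full scale = vpp
    v_rms = lsb / (2 * 2 ** 0.5)       # peak-to-peak -> RMS for a sine
    return lsb, v_rms ** 2 / r_load

p_sig = (0.100 / (2 * 2 ** 0.5)) ** 2 / 8.0   # 100 mVpp signal: ~0.16 mW
lsb8, p8 = lsb_resolution(0.100, 8)           # ~390 uV LSB, ~2.4 nW
lsb12, p12 = lsb_resolution(0.100, 12)        # ~24 uV LSB, ~9.3 pW
```

The results land on the same figures as the post, give or take rounding.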