And besides, is there a measuring device that actually measures down to those claimed levels?
A 16-bit PicoScope, after adding gain and/or notching out the test sine.
I think there should be at least 14 bits of accuracy in the 24-bit DAC -> SAR ADC signal path, after magnitude scaling, time-offset correction and DC removal.
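For what it's worth, that residual check can be sketched as a three-parameter sine fit (IEEE Std 1057 style): the fitted amplitude, phase and offset are exactly the magnitude scaling, time offset and DC removal above. A rough sketch assuming NumPy; the `residual_bits` metric and the 16-bit self-check are my own illustration, not any particular analyzer's method:

```python
import numpy as np

def residual_bits(x, f, fs, full_scale=1.0):
    # Three-parameter sine fit (IEEE Std 1057 style): least-squares solve
    # for in-phase term, quadrature term and DC offset at known frequency f.
    n = np.arange(len(x))
    w = 2.0 * np.pi * f / fs
    M = np.column_stack([np.sin(w * n), np.cos(w * n), np.ones(len(x))])
    coef, *_ = np.linalg.lstsq(M, x, rcond=None)
    resid = x - M @ coef              # what's left after scaling/offset/DC removal
    rms = np.sqrt(np.mean(resid ** 2))
    return np.log2(full_scale / rms)  # crude "bits of accuracy" figure

# Self-check on synthetic data: a sine quantized to 16 bits should land
# somewhere around 16-17 "bits" by this metric (quantization noise floor).
fs, f, N = 48000.0, 997.0, 8192
t = np.arange(N)
sig = 0.9 * np.sin(2.0 * np.pi * f / fs * t) + 0.01
q = np.round(sig * 2**15) / 2**15
print(residual_bits(q, f, fs))
```

The sine fit needs no coherent sampling or windowing, which is why it is a common way to pull a residual out of a swept-sine capture.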
ADCs and DACs intended for measurement have a "no missing codes" feature.
Delta-sigma converters have idle tones, limit cycles and, after multi-stage noise shaping, "all missing codes".
"To give one example, you could take a DAC and do something very classical, like sweep the level of a sinusoidal signal from full scale to nothing, and then look to see how distortion changes with signal level. You might find some minuscule squiggles at lower levels and shrug them off as measurement errors, like, “OK, that is just the machine not correctly measuring noise.” But I got suspicious at some point and said, “Hang on, let me try to find explicitly whether something happens in the noise floor with the signal modulation, but then I have to do so without a signal present. How do you do that?” Well, you sweep a DC input to a DAC. You feed it a constant code, some small value, and measure the noise. Increase that code and repeat. Suddenly you’ll find that some of these D-to-A converters will do these frightening things, like the noise floor suddenly shooting up or an audible whistle actually just walking through the audioband as you sweep, going from supersonic down to zero and then back up."
https://www.soundstageultra.com/ind...s-of-mola-mola-hypex-and-grimm-audio-part-one
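That DC-sweep test is easy to reproduce in simulation. A toy sketch assuming NumPy, with a first-order 1-bit delta-sigma modulator as a deliberately crude stand-in for a real DAC's modulator: feed it a constant code, FFT the bitstream, and discrete tones sit in the "noise" floor and move as the code changes.

```python
import numpy as np

def ds1(u, n=16384):
    # First-order 1-bit delta-sigma modulator driven by a constant (DC) input u.
    acc, out = 0.0, np.empty(n)
    for i in range(n):
        y = 1.0 if acc >= 0.0 else -1.0
        out[i] = y
        acc += u - y
    return out

def inband_peak(bits, band=0.05):
    # Strongest spectral line below band*fs (DC bin excluded), plus its
    # peak-to-median ratio in that band: >> 1 means discrete tones, not noise.
    spec = np.abs(np.fft.rfft(bits - bits.mean()))
    hi = int(band * len(bits))
    k = int(np.argmax(spec[1:hi])) + 1
    return k / len(bits), spec[k] / np.median(spec[1:hi])

# "Sweep a DC input": the in-band peak is a tone whose frequency walks
# around as the code changes, exactly the behavior described above.
for u in (0.005, 0.01, 0.02, 0.04):
    f_norm, ratio = inband_peak(ds1(u))
    print(f"code {u:5.3f}: tone at {f_norm:.4f} * fs, {ratio:6.1f}x above median")
```

A real multi-bit, dithered modulator is far better behaved than this toy, but the mechanism is the same: with a rational DC input the loop settles into a periodic limit cycle, so the "noise" is actually a set of spectral lines.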