To illustrate that this analysis went wrong, here is the spectrum of the above test with the 24-bit signal truncated to 16 bit (the J-test signal is one of the few signals that are immune to truncation by design), using no DUT at all, simply loading the generator files directly into the analyzer:
View attachment 417992
Exactly the same as with the real DUT.
The only thing we can deduce from that plot is that there is a truncation from 24 bit to 16 bit somewhere, which could be anything from a simple human setup mishap to a real bug (the DUT actually receiving 24-bit data via USB but truncating it to 16).
This has nothing to do with clocks; clocks aren't even involved in this test. Remember, this is a digital loopback.
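For anyone wanting to reproduce the effect at home, here is a minimal numpy sketch of the same idea: synthesize a J-test-like signal, chop off the low 8 bits, and compare the spectra. The exact levels and phases of the AP's generator file are assumptions on my part, but the truncation step is the whole point:

```python
import numpy as np

fs = 48000
n = 1 << 16

# J-test-like stimulus (levels/phases are illustrative assumptions):
# fs/4 tone at -3 dBFS plus a square wave toggling the 24-bit LSB at fs/192.
t = np.arange(n)
tone = np.round((2**23 - 1) * 10**(-3 / 20) * np.sin(2 * np.pi * (fs / 4) * t / fs))
square = (t // 96) % 2                  # fs/192 square wave, 0/1 LSB
x24 = (tone + square).astype(np.int64)

# Truncation to 16 bit: simply drop the low 8 bits, exactly what a
# non-bit-transparent device does to 24-bit input data.
x16 = (x24 >> 8) << 8

def spectrum_db(x):
    """Windowed FFT magnitude in dBFS (24-bit full scale)."""
    w = np.blackman(len(x))
    spec = np.abs(np.fft.rfft(x * w)) / (np.sum(w) / 2) / 2**23
    return 20 * np.log10(spec + 1e-16)

# Any difference between spectrum_db(x24) and spectrum_db(x16) comes
# from truncation alone: no DUT, no clock, pure integer arithmetic.
```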
Truncation / bit-transparency would have been easier to check with a simple bit-level compare, because data out must equal data in; if not, something is fishy. For convenience, a simple dithered 24-bit sine would do, comparing the loopback capture with and without the DUT inserted; the two should be 100% identical. With the DUT, one would instead see the typical distortion pattern of a (practically) undithered 16-bit sine wave.
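A sketch of such a bit-compare, assuming the two loopback passes were captured to 24-bit WAV files and are already time-aligned (the file names and the soundfile dependency are my choices, purely for illustration):

```python
import numpy as np
import soundfile as sf  # assumption: captures stored as 24-bit WAV

# Reference pass (no DUT) vs. pass with the DUT inserted; both are
# hypothetical file names. soundfile left-justifies 24-bit samples in
# int32, so the 16-bit boundary sits at bit 16.
ref, fs_ref = sf.read("loopback_reference.wav", dtype="int32")
dut, fs_dut = sf.read("loopback_with_dut.wav", dtype="int32")
assert fs_ref == fs_dut and ref.shape == dut.shape, "align the captures first"

diff = ref != dut
if not diff.any():
    print("bit transparent: data out == data in")
else:
    print(f"{diff.sum()} differing samples, first at frame {np.argwhere(diff)[0]}")
    # A 24->16 truncation confines the damage to the bits below bit 16:
    truncation_only = np.all(((ref ^ dut) & ~0xFFFF) == 0)
    print("difference limited to the truncated low bits:", truncation_only)
```

If the DUT is bit transparent this prints a clean pass; a 24-to-16 truncation fails the compare but passes the low-bits check, which pins down the failure mode immediately.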
-------:-------
Therefore,
@amirm, it would be nice if you could at least change the caption below the plot to reflect what is really happening there: a truncation rather than a clocking issue.
And the real deal would be the mentioned true jitter analysis with the AP, scanning the analog waveform of that digital output (vs. the AP's own) and looking for time irregularities, even an eye pattern etc., to really see whether that adapter is putting out a totally jittery signal... which it could well be, of course. Sometimes engineers think they are being clever by trying to achieve the exact nominal sample rate by occasionally stretching or compressing the timing. The Raspberry Pi's I2S output is famous for this strategy, introducing huge amounts of jitter.
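For the time-domain part, the basic idea can be sketched in a few lines: capture the electrical output with something fast (a scope, not a sound card), find the zero-crossing instants, and see how far the transition intervals deviate from an ideal grid. This is only a crude approximation of what the AP does; the biphase-mark assumption and all names below are mine:

```python
import numpy as np

def crossing_times(x, fs):
    """All zero-crossing instants (both polarities), linearly interpolated."""
    i = np.nonzero(np.signbit(x[:-1]) != np.signbit(x[1:]))[0]
    frac = x[i] / (x[i] - x[i + 1])       # sub-sample crossing position
    return (i + frac) / fs

def timing_irregularity(x, fs):
    """Crude time-interval-error estimate for a biphase-coded signal."""
    t = crossing_times(x, fs)
    iv = np.diff(t)
    # In biphase-mark coding, transitions land on multiples of one unit
    # interval (UI); estimate one UI from the shortest interval class.
    ui = np.mean(iv[iv < 1.5 * np.percentile(iv, 5)])
    err = iv - np.round(iv / ui) * ui     # deviation from the ideal grid
    return np.sqrt(np.mean(err**2)), np.max(np.abs(err))

# e.g. rms_s, peak_s = timing_irregularity(scope_capture, 1e9)  # 1 GS/s capture
```

A real analysis would recover the clock with a PLL model and plot an eye pattern on top, but even this crude interval histogramming would expose the kind of stretch-and-compress rate trimming described above.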