Just note that these devices were never meant to provide a super clock; they are built so studios can sync up various pieces of equipment. Also note that the 10 MHz these units output isn't what the DACs need, so further processing is required to derive the actual master-clock frequency. This is done, once again, via a PLL, which again acts as a jitter-reduction barrier. So in general these master clocks don't really need to be that good anyway, and the long cable runs in a studio will certainly degrade clock jitter, which makes a PLL at the receiver basically mandatory.
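To make the "further processing" concrete: a DAC typically wants a master clock such as 22.5792 MHz (512 × 44.1 kHz) or 24.576 MHz (512 × 48 kHz), and neither is an integer multiple of 10 MHz, so the receiver has to synthesize a fractional ratio. A minimal sketch of that arithmetic (the 512× master-clock figures are a common convention, not something specific to any of these devices):

```python
from fractions import Fraction

REF_HZ = 10_000_000  # the 10 MHz reference a studio master clock provides

# Typical DAC master clocks: 512 x the base sample rate (illustrative convention)
for base in (44_100, 48_000):
    mclk = 512 * base
    ratio = Fraction(mclk, REF_HZ)  # exact multiply/divide ratio the PLL must realize
    print(f"{base} Hz family: MCLK = {mclk / 1e6} MHz, "
          f"ratio = {ratio.numerator}/{ratio.denominator}")

# 44100 Hz family: MCLK = 22.5792 MHz, ratio = 7056/3125
# 48000 Hz family: MCLK = 24.576 MHz, ratio = 1536/625
```

Those awkward ratios are exactly why a PLL sits between the 10 MHz input and the DAC, and its loop filter is what does the jitter filtering.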
Very true. And I am currently evaluating the SMSL G1 clock. It gives me a headache trying to pin down its true performance, which seems to be below what I can measure, i.e. 0.03 ppm. But at least I could measure a positive influence when processing optical data coming from a CD player whose clock has a precision of 15 ppm, measured from its analog outputs. Without the G1, that CD player sending SPDIF to an SMSL D200 DAC shows the same imprecision (15 ppm). With the G1 used as the external clock by the D200, it falls to -6 ppm. This is not a big change, but it illustrates what you explained.
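For clarity, the ppm numbers above are just the relative frequency error of a known test tone measured at the analog output. A trivial sketch of that calculation (the 1 kHz tone and the exact measured values are made up for illustration):

```python
def clock_error_ppm(measured_hz: float, nominal_hz: float) -> float:
    """Relative frequency error in parts per million."""
    return (measured_hz - nominal_hz) / nominal_hz * 1e6

# Hypothetical captures of a nominal 1 kHz tone through the chain:
print(clock_error_ppm(1000.015, 1000.0))  # +15 ppm, like the CD player on its own
print(clock_error_ppm(999.994, 1000.0))   # -6 ppm, like with the G1 as external clock
```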
And finally: a lot of music was probably recorded at a time when clocks were nowhere near as good as today. So claiming that some miracle jitter reduction from -139 dB to -143 dB would be audible is ludicrous. Never mind if the source is tape...
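To put those levels in perspective: with the usual small-angle approximation, a sinusoidal-jitter sideband at L dBc around a carrier f0 corresponds to a peak jitter of 10^(L/20)/(π·f0). Treating the dBFS readings as roughly dBc (the J-Test carrier sits near full scale), the claimed improvement amounts to about one picosecond:

```python
import math

def sideband_to_jitter_ps(level_dbc: float, carrier_hz: float) -> float:
    """Peak sinusoidal jitter (ps) implied by a sideband at level_dbc below
    the carrier, using the small-angle approximation."""
    return 10 ** (level_dbc / 20) / (math.pi * carrier_hz) * 1e12

f0 = 11_025  # J-Test carrier in the 44.1 kHz family
print(sideband_to_jitter_ps(-139, f0))  # ~3.2 ps
print(sideband_to_jitter_ps(-143, f0))  # ~2.0 ps
```

Hard to argue that a change of about a picosecond is audible.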
It is funny because when I saw the two results in that video, I thought they were the same, and then he says one shows an improvement. That made me look closer, because a difference that small could just be run-to-run variation. And actually... well... they are not the same jitter tests:
Since the J-Test tone is placed at 1/4 of the sampling rate, the two tests are not the same, and that alone is enough to explain the difference seen. As an example, here are the same two tests with the SMSL D200 DAC. The red trace is the J-Test at 44.1 kHz, and the green trace is at 48 kHz:
Nothing else changed, yet we see a small difference. Some artifacts are present at 44.1 kHz: the plot shows a spike at -143.6 dBFS at 44.1 kHz (red) and nothing (-158.3 dBFS) at 48 kHz.
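For anyone who wants to reproduce this: the usual description of the Dunn J-Test is an Fs/4 sine at -3 dBFS plus a square wave at Fs/192 toggling the least significant bit, so the test tone necessarily moves with the sample rate (11.025 kHz vs 12 kHz here). A minimal sketch of that construction, assuming the 16-bit variant:

```python
import numpy as np

def j_test(fs: int, seconds: float = 1.0, bits: int = 16) -> np.ndarray:
    """J-Test signal: fs/4 sine at -3 dBFS plus an LSB square wave at fs/192."""
    n = np.arange(int(fs * seconds))
    tone = 10 ** (-3 / 20) * np.sin(2 * np.pi * (fs / 4) * n / fs)
    lsb = 1 / 2 ** (bits - 1)                         # one LSB in full-scale units
    square = lsb * np.where(n % 192 < 96, 1.0, -1.0)  # period of 192 samples = fs/192 Hz
    return tone + square

for fs in (44_100, 48_000):
    print(f"fs = {fs}: tone at {fs / 4} Hz, undertone at {fs / 192} Hz")
# fs = 44100: tone at 11025.0 Hz, undertone at 229.6875 Hz
# fs = 48000: tone at 12000.0 Hz, undertone at 250.0 Hz
```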
EDIT: I forgot to mention that 4 averages are not enough to analyze at such a low level. 32 would be better to average out the differences between two captures. Look at the below:
Nothing has changed in the setup between the two runs. So stating there's a change in resolution between the two would be incorrect.
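The underlying reason: each bin of the measured noise floor is itself a random variable, and averaging N spectra only shrinks its spread by roughly 1/√N, so two 4-average captures can legitimately disagree by a couple of dB per bin with nothing changed. A quick simulation of just that effect (pure white noise, arbitrary parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
NFFT = 4096

def noise_floor_spread_db(n_avg: int, trials: int = 200) -> float:
    """Std dev (dB) of a single averaged-spectrum noise bin across repeated captures."""
    levels = []
    for _ in range(trials):
        x = rng.standard_normal((n_avg, NFFT))
        psd = np.mean(np.abs(np.fft.rfft(x, axis=1)) ** 2, axis=0)  # n_avg-average spectrum
        levels.append(10 * np.log10(psd[NFFT // 4]))  # watch one arbitrary bin
    return float(np.std(levels))

print(noise_floor_spread_db(4))   # ~2.3 dB of run-to-run wobble per bin
print(noise_floor_spread_db(32))  # ~0.8 dB: now two captures are comparable
```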
I think that video needs an update.