A few years ago I tested the audibility of clock jitter using an Audiophilleo USB-to-S/PDIF converter. This unit came with a convenient switch that changed the clock jitter/noise profile as follows:
Blue is its normal, high-precision clock. Pink is the degraded "jitter simulator." We see that above 1 kHz or so, the two clocks already have phase noise/jitter below -100 dB, so what happens above that point is inconsequential from an audibility standpoint. As we get lower, though, the simulator heaps on some 40-60 dB of extra noise.
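To put those dB figures in perspective, here is a small sketch of the standard small-angle approximation for the sideband a given amount of sinusoidal jitter produces on a pure tone (the numbers here, 1 ns and 10 kHz, are just illustrative, not measurements from this unit):

```python
import math

def jitter_sideband_db(f_signal_hz, jitter_s):
    """Level of each jitter-induced sideband relative to the carrier,
    for sinusoidal jitter of peak amplitude jitter_s (seconds) on a
    full-scale tone at f_signal_hz, using the small-angle
    approximation: sideband/carrier = pi * f * dt."""
    return 20 * math.log10(math.pi * f_signal_hz * jitter_s)

# Illustrative: 1 ns of sinusoidal jitter on a 10 kHz tone
print(round(jitter_sideband_db(10_000, 1e-9), 1))  # about -90 dB
```

Even a full nanosecond of jitter on a 10 kHz tone only produces sidebands around -90 dB, which is why noise that is already below -100 dB is of no audible consequence.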
I did a sighted listening test first. The high-precision clock sounded smoother, warmer, more analog-like.
I then closed my eyes, randomly switched back and forth, and tried to identify the better clock. No luck. All the differences I thought I had heard were no longer there.
I then took the test to work and tried it on two other people. They perceived no difference between the two clocks and gave me a blank look when I asked if they heard an improvement.
This of course agrees with psychoacoustics (and shows the perils of sighted tests): close-in jitter is subject to very strong masking and is therefore largely inaudible.
Mind you, there may still be an audible difference, but given the effort I put in, the test revealed no such problem.
Furthermore, the "good clock" in the Audiophilleo is still much worse than the clocks being advocated in this thread. I use that unit all the time for my music and it displays none of the audible ills that people attribute to such close-in jitter. Great music sounds superb, as good as on any DAC I have heard, in any room, at any location.
I have a better unit (Berkeley) but can't use it due to incompatibility with Dirac Live EQ.