Now let's hit this claim.
It is true that the J-test signal was produced to show jitter induced by the serial digital interfaces used for S/PDIF and AES/EBU. The signal itself is a quarter-sample-rate tone with a single low-order bit that toggles back and forth. Due to the encoding of PCM ("two's complement"), that one tiny toggle of the lowest-order bit causes all the bits in the audio sample to change. The frequency of change for this square wave is the sampling rate divided by 192 (Fs/192).
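To make that concrete, here is a rough sketch in Python of how such a signal can be constructed. This is my own illustration, not the exact published Dunn spec: the 16-bit width and the amplitude are placeholders.

```python
# Rough sketch (not the exact Dunn spec): 16-bit two's complement,
# an Fs/4 square wave plus a one-LSB square wave at Fs/192.
BITS = 16
AMP = 0x2000                     # placeholder amplitude for the Fs/4 component

def twos(v, bits=BITS):
    """Two's-complement bit pattern of an integer, as an unsigned value."""
    return v & ((1 << bits) - 1)

def jtest(n_samples):
    out = []
    for n in range(n_samples):
        main = AMP if n % 4 < 2 else -AMP    # Fs/4 square wave: +, +, -, -
        lsb = 0 if n % 192 < 96 else -1      # Fs/192 square wave, one LSB tall
        out.append(main + lsb)
    return out

# The one-LSB step flips nearly every bit of the two's-complement word:
assert format(twos(8192), "016b") == "0010000000000000"
assert format(twos(8191), "016b") == "0001111111111111"
# and between 0 and -1, *all* bits flip:
assert format(twos(-1), "016b") == "1" * 16
```

That last assertion is the key point: in two's complement, a one-LSB change near zero flips every bit in the word, which is what stresses the interface.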
Here is the waveform in digital domain:
View attachment 6726
The dots are the actual samples. The sine wave is an interpolation/simulation by the audio workstation program (Audition in this case), but is representative of what comes out of the DAC after its filtering. The tiny LSB undulations explained above are too small to be visible.
The precise quarter-sampling-rate tone ("Fs/4") eliminates the need for quantization/dithering, allowing the displayed noise to be that of the system under test rather than of the signal itself.
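To see why no dither is needed: the digital +1, +1, -1, -1 pattern is itself an exactly sampled sinusoid at Fs/4, so every sample lands exactly on a code value. A quick check (my own illustration, normalized to +/-1):

```python
import math

# The Fs/4 square-wave pattern +1, +1, -1, -1 is exactly the sampled sinusoid
# sqrt(2) * cos(2*pi*n/4 - pi/4): each sample falls on an exact value,
# so there is no quantization error to dither away.
pattern = [1, 1, -1, -1]
for n in range(4):
    ideal = math.sqrt(2) * math.cos(2 * math.pi * n / 4 - math.pi / 4)
    assert math.isclose(pattern[n], ideal, abs_tol=1e-12)
```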
For a 48 kHz sample rate, the main tone will be at 12 kHz. For 44.1 kHz, it will be at 11.025 kHz. This signal is presented at full amplitude (or near it, to avoid clipping), which is far higher than we have in music. That, and its high frequency, means that jitter effects will be amplified and captured in the output of the DAC.
This signal is played (as a WAV file these days, from a computer) and we then perform a spectrum analysis. The spectrum analysis is performed in software and can benefit from multiple runs/averaging and a high-resolution DFT (discrete Fourier transform) to substantially reduce the measurement system's noise level. This then allows the jitter components of the device under test to show up even when they are at very low levels.
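As a toy illustration of the averaging trick (not my actual analyzer chain, and the noise level is made up): averaging N coherent captures in the time domain before the FFT suppresses uncorrelated noise by roughly 10*log10(N) dB while the tone stays put.

```python
import numpy as np

# Illustrative sketch: averaging 64 phase-coherent captures drops the
# uncorrelated noise floor ~18 dB while leaving the tone bin unchanged.
rng = np.random.default_rng(0)
fs, n = 48000, 4096
t = np.arange(n) / fs
tone = np.sin(2 * np.pi * 12000 * t)            # Fs/4 tone for 48 kHz

def capture():
    """One simulated measurement: the tone plus analyzer noise (assumed level)."""
    return tone + 0.05 * rng.standard_normal(n)

single = np.abs(np.fft.rfft(capture()))
averaged = np.abs(np.fft.rfft(np.mean([capture() for _ in range(64)], axis=0)))

noise_bins = np.r_[100:900]                     # bins well away from the tone
```

Note this assumes the captures are synchronized (phase-coherent); averaging magnitude spectra instead only smooths the floor rather than lowering it.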
View attachment 6727
Notice the extreme sensitivity of the test. At 8 kHz we have a spike. That is 4 kHz below our main tone of 12 kHz. Add 4 kHz to the main tone and we land at 16 kHz, where we see another spike. We would need to do an additional test to make sure that is not reference voltage modulation but for now, let's assume it is jitter. We are seeing a spike rising from the amazingly low level of -130 dB to -125 dB. The system is that sensitive. This is despite my analyzer being a generation older than current ones on the market. Using the above signal processing technique we are able to still dig deep -- super deep -- into the output of the DAC.
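The symmetric pair of spikes is exactly what phase modulation predicts: a jitter component at fj puts sidebands at f0 - fj and f0 + fj. A toy model (the 4 kHz jitter frequency and modulation index here are made-up numbers for illustration):

```python
import numpy as np

# Toy model: sinusoidal jitter acts like phase modulation of the tone,
# producing sidebands at f0 +/- fj (8 kHz and 16 kHz for a 12 kHz tone).
fs, n = 48000, 4800                  # 0.1 s capture -> 10 Hz per FFT bin
t = np.arange(n) / fs
f0, fj = 12000, 4000                 # 12 kHz main tone, hypothetical 4 kHz jitter
beta = 1e-3                          # small modulation index (illustrative)

x = np.sin(2 * np.pi * f0 * t + beta * np.sin(2 * np.pi * fj * t))
spec = np.abs(np.fft.rfft(x)) / (n / 2)

bin_of = lambda f: round(f * n / fs)
# each sideband comes out at roughly beta/2 in amplitude relative to the tone
```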
Even tinier spikes are visible as you see in red. The two DACs are distinguished even though the difference is incredibly small.
For some of my testing I just use a simple high-frequency tone, i.e. without the Fs/192 toggling component. Here is an example of that:
View attachment 6728
So once again we see how revealing a simple, high-frequency tone at high amplitude is in showing all that ails the DAC. For the low-frequency random jitter which triggered this thread, look at the spreading (skirt) around the main tone.
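A toy simulation of that spreading (my own sketch with made-up jitter numbers): model low-frequency random jitter as a slowly wandering phase error and compare the skirts of the clean and jittered spectra.

```python
import numpy as np

# Illustrative only: low-frequency random jitter modeled as a random-walk
# phase error on a 12 kHz tone. Unlike sinusoidal jitter, it smears energy
# into a skirt around the tone rather than discrete sideband spikes.
rng = np.random.default_rng(1)
fs, n = 48000, 16384
t = np.arange(n) / fs

phase_noise = np.cumsum(1e-3 * rng.standard_normal(n))   # random-walk phase (rad)
clean = np.sin(2 * np.pi * 12000 * t)
jittered = np.sin(2 * np.pi * 12000 * t + phase_noise)

w = np.hanning(n)
s_clean = np.abs(np.fft.rfft(clean * w))
s_jit = np.abs(np.fft.rfft(jittered * w))

k0 = round(12000 * n / fs)           # main-tone bin
skirt = np.r_[k0 - 60 : k0 - 20]     # bins just below the tone
# the skirt rises sharply with random jitter; the tone bin itself barely moves
```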
The simplicity of the test, i.e. a single tone, makes it easy to see all the distortion products, whether as spikes or as changes in the noise floor. It is for this reason that the J-test continues to get heavy use. It works, and works simply, to show incredibly small distortions including clock jitter.
While I have not seen it often in my own measurements, JA's tests in Stereophile frequently show components of the Fs/192 square wave embedded in the J-test. Those are not due to interface jitter but to crosstalk inside the DAC from all the bits transitioning at once.
OK this got long.
For a quick summary,
The J-test signal is highly revealing of any jitter induced in the signal at the output of the DAC. It is the ultimate test, as it measures the final, analog output of the DAC. Measuring the clock jitter going into the DAC is not representative of anything useful, because we don't know how much of that jitter is filtered, attenuated, or exaggerated by the DAC.