
A Case of the Jitters

watchnerd

Grand Contributor
Joined
Dec 8, 2016
Messages
12,449
Likes
10,414
Location
Seattle Area, USA
An article in which the author claims that "classic jitter" isn't much of a problem in today's better DACs, but RFI in the GHz range can introduce errors in the bitstream:

http://www.psaudio.com/article/a-case-of-the-jitters-2/

It doesn't seem bonkers, but I'm failing to see how this would show up as anything other than part of the noise floor, and it possibly already does, given how saturated the GHz bands are in most domestic environments.
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,658
Likes
240,909
Location
Seattle Area
Don can explain at a deeper level, but for now: the computed limit of 340 picoseconds of timing error is based on a sine wave running at full amplitude at 20 kHz and computing an error equivalent to 1 bit. So in theory, yes, we need to have jitter that is less than 340 picoseconds (I think that is RMS; peak-to-peak is more like 500+ picoseconds).

Of course music doesn't have full amplitude at 20 kHz, nor does missing the equivalent of a bit mean we hear it. Julian Dunn, who originally came up with the above computation, later modified it to show the effect of masking:

[Chart: Julian Dunn's audibility threshold for jitter versus jitter frequency, showing the effect of masking]


So we need to perform a spectrum analysis of the jitter and compare it to the chart above. At lower frequencies, as you can see, the threshold is hugely above the 340 picoseconds he mentions.
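For reference, here is a quick Python sketch of where a number in that ballpark comes from, assuming (as the linked article appears to) that you simply divide one 44.1 kHz sample period by the 65,536 possible 16-bit values; treat that formula as an assumption rather than the definitive derivation:

Code:
# One sample period divided by the number of 16-bit codes (assumed basis
# for the article's ~346 ps figure; not the only possible derivation).
SAMPLE_RATE = 44_100          # Hz
CODES = 2 ** 16               # 65,536 possible 16-bit values

t_limit = 1.0 / (SAMPLE_RATE * CODES)      # seconds
print(f"{t_limit * 1e12:.0f} ps")          # ~346 ps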
 
OP
watchnerd

Grand Contributor
Joined
Dec 8, 2016
Messages
12,449
Likes
10,414
Location
Seattle Area, USA
amirm said:
Don can explain at a deeper level, but for now: the computed limit of 340 picoseconds of timing error is based on a sine wave running at full amplitude at 20 kHz and computing an error equivalent to 1 bit. So in theory, yes, we need to have jitter that is less than 340 picoseconds (I think that is RMS; peak-to-peak is more like 500+ picoseconds).

Of course music doesn't have full amplitude at 20 kHz, nor does missing the equivalent of a bit mean we hear it. Julian Dunn, who originally came up with the above computation, later modified it to show the effect of masking:

[Chart: Julian Dunn's audibility threshold for jitter versus jitter frequency, showing the effect of masking]

So we need to perform a spectrum analysis of the jitter and compare it to the chart above. At lower frequencies, as you can see, the threshold is hugely above the 340 picoseconds he mentions.

All of the above matches what I've learned before.

It's his statements along these lines that raised my eyebrows:

"The process of recognizing a trigger event occurs in an analog circuit, and in order to have the capability to recognize the trigger with a precision of 346ps it needs to have a bandwidth of at least 3GHz. That bandwidth lies deep in the RF frequency spectrum, which means that it will be susceptible to noise and desperately sensitive to external interference. Circuits with GHz bandwidth and above become exponentially more difficult to design, and exponentially more expensive to construct as you attempt to reduce their noise. "

"The actual D-to-A conversion requires the output voltage to be set to a target level as soon as possible after the clock pulse is detected, and held to that level until the next clock pulse is detected. Once again, that circuit is going to need to have a bandwidth close to 3GHz if it is going to respond quickly enough and accurately enough to the clock pulse, and consequently is going to be highly susceptible to noise."
 

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,894
Likes
16,710
Location
Monument, CO
After a quick skim it reads like a number of other audio articles that twist the science to support an erroneous conclusion. For starters, you don't need 3 GHz of bandwidth to produce a clock with jitter below 346 ps, nor does the receiver need that much bandwidth; you need that much bandwidth to support a signal with a period of 346 ps. Pretty sure, though somebody will have to check my math, that the period of a 44.1 kHz clock is greater than 346 ps. I get 1/44,100 = 22.7 µs, or about 65,536 (2^16, imagine that) times greater. ;) I really doubt even those fancy DACs with ps-level jitter have 3 GHz clock bandwidths. All you need to do is make sure the noise (jitter) on the actual sampling clock is low enough and the sampling point accurate enough. And note dither makes noise out of the lower few LSBs anyway.
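A quick sanity check of that math in Python, using only the numbers in the paragraph above:

Code:
# Period of a 44.1 kHz clock versus the article's ~346 ps timing figure.
t_article = 1.0 / (44_100 * 2 ** 16)           # ~346 ps
period = 1.0 / 44_100                          # one clock period
print(f"period = {period * 1e6:.1f} us")       # 22.7 us
print(f"ratio  = {period / t_article:,.0f}")   # 65,536, i.e. 2**16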

Below is the aperture time versus frequency, charted for a number of converter (ADC or DAC) resolutions. The sampling rate actually falls out of the equation, assuming a single sinusoidal signal. Random jitter greater than the aperture window will cause a 1-LSB error.
[Chart: aperture time versus signal frequency for several converter resolutions]
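Here is a small Python sketch of the relationship behind that chart, assuming the usual criterion that jitter must not move a full-scale sine by more than 1 LSB at its steepest point; the resolutions and the 20 kHz test frequency below are just example values, not necessarily the points plotted:

Code:
import math

def aperture_time(freq_hz: float, bits: int) -> float:
    """Largest timing error (s) that keeps the error on a full-scale sine
    below 1 LSB at the point of maximum slew: t = 1 / (pi * f * 2**bits)."""
    return 1.0 / (math.pi * freq_hz * 2 ** bits)

# Example: 20 kHz full-scale sine at several converter resolutions.
for bits in (8, 12, 16, 20, 24):
    print(f"{bits:2d} bits: {aperture_time(20_000, bits) * 1e12:9.2f} ps")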


Another way to look at this is in terms of SNR for an ideal converter. The chart below shows the SNR for a full-scale input signal at the specified frequencies, and how jitter degrades it.
[Chart: jitter-limited SNR versus jitter for full-scale signals at several frequencies]


Finally, the loss in SNR for a certain level of jitter:
[Chart: loss in SNR as a function of jitter]
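For both SNR charts, the usual expressions are the jitter-limited SNR, -20*log10(2*pi*f*tj), and the ideal quantization SNR, 6.02*N + 1.76 dB, combined as noise powers. A Python sketch follows; the 20 kHz / 346 ps / 16-bit values are placeholder inputs, not the points on the charts:

Code:
import math

def snr_jitter_db(freq_hz: float, tj_rms_s: float) -> float:
    """Jitter-limited SNR for a full-scale sine: -20*log10(2*pi*f*tj)."""
    return -20.0 * math.log10(2.0 * math.pi * freq_hz * tj_rms_s)

def snr_quant_db(bits: int) -> float:
    """Ideal quantization-limited SNR: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

def snr_total_db(freq_hz: float, tj_rms_s: float, bits: int) -> float:
    """Sum the two noise powers and convert back to dB."""
    noise = 10 ** (-snr_jitter_db(freq_hz, tj_rms_s) / 10) \
          + 10 ** (-snr_quant_db(bits) / 10)
    return -10.0 * math.log10(noise)

f, tj, bits = 20_000, 346e-12, 16
print(f"jitter-limited SNR : {snr_jitter_db(f, tj):.1f} dB")
print(f"quantization SNR   : {snr_quant_db(bits):.1f} dB")
print(f"combined SNR       : {snr_total_db(f, tj, bits):.1f} dB")
print(f"SNR loss           : {snr_quant_db(bits) - snr_total_db(f, tj, bits):.1f} dB")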


Correlated jitter is a bigger problem, since it can cause larger fixed spurs, but even that is below audibility for most of us when buried in music. Amir has lots of data showing the impact of both, I believe. I wrote a couple of jitter threads over on WBF that I could copy here at some point; they explain it a little better.

HTH - Don
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,759
Likes
37,612
The whole article is in error anyway. He just treats 1/(44,100 x 65,536) as the amount of time it takes for the signal to change by one LSB. He actually should divide his answer by 2pi, which gives roughly 55 picoseconds, which is more or less correct. 1/(sample rate x number of bit values x 2pi) is the correct formula.

You have to remember that a sine wave is changing most rapidly around the midpoint of the wave, and most slowly at the peaks and troughs. In fact, the peaks and troughs are so much slower that at 22,050 Hz it takes about 37,800 picoseconds for the value to change by an amount equal to one LSB.

The timing accuracy is nevertheless set by that of the midpoint crossing of the sine wave. Remember, there is one and only one set of sample values that fits any given wave at any given timing, as long as all signal content is below half the sample rate.
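Here is a Python sketch of those two figures, assuming (as above) that "number of bit values" means 2^16 and that the signal is a full-amplitude sine at half the 44.1 kHz sample rate; the near-peak number uses a simple quadratic approximation and lands in the same ballpark as the ~37,800 ps quoted above rather than matching it exactly:

Code:
import math

FS = 44_100            # sample rate, Hz
CODES = 2 ** 16        # "number of bit values" as used above
F_SIG = FS / 2         # 22,050 Hz sine, the fastest-changing case

# Fastest point: the mid-crossing, where the slope of the sine is at its
# maximum.  Time to move one LSB there, per the formula above.
t_mid = 1.0 / (FS * CODES * 2 * math.pi)
print(f"mid-crossing: {t_mid * 1e12:.0f} ps")    # ~55 ps

# Slowest point: at the peak the slope is ~zero, so estimate the time for
# one LSB of change from the quadratic term of the cosine.
t_peak = math.sqrt(2.0 / CODES) / (2 * math.pi * F_SIG)
print(f"near peak   : {t_peak * 1e12:.0f} ps")   # tens of thousands of ps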
 