
What is jitter in "jitter noise measurement"-section

Witteg

New Member
Joined
Feb 10, 2021
Messages
1
Likes
1
We all know that jitter is bad and no jitter is good, but lately I've been starting to wonder what kind of jitter this section is actually measuring.

When people ask what jitter is, we usually get the Wikipedia answer: it's a shift in signal timing, or clock fluctuation. This I understand, and I also understand what it does when sigma-delta conversion is performed. And this seems to be the thing that the jitter noise measurement actually measures. Where things get tricky, and for me hard to understand, is how SPDIF or USB factor in.

Let's start with SPDIF, where jitter is more understandable. The SPDIF data stream has the clock embedded in it, and if the source clock has jitter in it, well, then the whole stream is screwed, but only if the DAC is using the SPDIF clock as its master clock. This of course never happens with modern DACs, because the SPDIF stream is reclocked; no engineer who isn't a complete tool would ever trust that the SPDIF clock is any good. What puzzles me is that Biphase Mark Code, which is used as the data stream format, should be relatively easy to read jitter-free into a buffer. Because a buffer doesn't have a clock in it, we have also eradicated all SPDIF jitter. I haven't actually tried to design such a BMC decoder (mainly because I'm a software engineer, not a hardware one), but it seems a relatively simple task, and all the SPDIF source clock issues would be solved. We would just have bits in the buffer that would be reclocked with the DAC's internal clock, and all the jitter in the delta-sigma stage would be jitter generated by the DAC internals. But that doesn't seem to be the case, so what am I missing here?
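As an illustration of how simple the bit-level part really is, here is a toy BMC round-trip in Python. This is a sketch only (my own illustration, not anyone's actual receiver design): it ignores preambles, subframe framing, and, crucially, the real problem of recovering the cell boundaries from edge timing in the first place, which is exactly where the source clock sneaks back in.

```python
def bmc_encode(bits, level=0):
    """Encode bits as biphase-mark half-cell levels, for round-trip testing.

    In BMC every bit cell starts with a transition; a '1' bit has an extra
    transition in the middle of the cell, a '0' bit does not.
    """
    halves = []
    for b in bits:
        level ^= 1            # transition at the start of every cell
        halves.append(level)
        if b:
            level ^= 1        # extra mid-cell transition encodes a '1'
        halves.append(level)
    return halves

def bmc_decode(halves):
    """Decode BMC data from a list of half-cell levels (0/1).

    Comparing the two halves of each cell recovers the bit:
    different halves -> 1, equal halves -> 0.
    """
    bits = []
    for i in range(0, len(halves) - 1, 2):
        bits.append(1 if halves[i] != halves[i + 1] else 0)
    return bits
```

The decode step is indeed trivial once you already know where the half-cells are; the hard part is deciding that from the analogue edge timing of the incoming stream.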

With USB we have basically the same thing. USB has its own clock line. We read data into a buffer and reclock it with an accurate internal clock, and again the jitter should have vanished and all remaining jitter should just be the DAC's internal jitter, but again we measure jitter with the USB connection as part of the mix. With async USB, we can just transfer the PCM data and again we have no jitter.

So there must be some jitter component that is not part of the pure SPDIF/USB data stream clock, or the stream buffering is much more complicated than I think it is. Still, to me this seems like a quite trivial problem to solve, but the measurements, and the emphasis that DAC designers put on jitter removal, suggest that it is not.

Any insights on what I'm missing here?
 

mansr

Major Contributor
Joined
Oct 5, 2018
Messages
4,685
Likes
10,703
Location
Hampshire
Let's start with SPDIF, where jitter is more understandable. The SPDIF data stream has the clock embedded in it, and if the source clock has jitter in it, well, then the whole stream is screwed, but only if the DAC is using the SPDIF clock as its master clock. This of course never happens with modern DACs, because the SPDIF stream is reclocked; no engineer who isn't a complete tool would ever trust that the SPDIF clock is any good.
On the contrary, the S/PDIF clock _must_ be recovered in order for the data to be received at all. Moreover, the DAC must match the average clock frequency of the incoming stream, or else its buffers will either underflow or overflow. The residual jitter in the output depends on the quality of the PLL in the receiver. Some are much better than others.
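The underflow/overflow point can be made concrete with a back-of-the-envelope simulation (my own illustration; the numbers are made up but realistic): if the DAC's clock free-runs and does not track the source, even a modest 100 ppm rate mismatch empties or fills a small FIFO in a couple of minutes.

```python
def buffer_runtime(source_fs, dac_fs, depth, max_seconds=3600):
    """Simulate a FIFO fed at source_fs samples/s and drained at dac_fs.

    Returns the number of whole seconds until the buffer under- or
    overflows, or max_seconds if it survives that long.  Illustrates why
    a DAC cannot simply 'buffer and reclock' without matching the
    average incoming rate.
    """
    fill = depth / 2                  # start half full
    rate = source_fs - dac_fs         # net samples accumulated per second
    for t in range(1, max_seconds + 1):
        fill += rate
        if fill <= 0 or fill >= depth:
            return t
    return max_seconds
```

With a 1024-sample FIFO and a source running 4.8 samples/s fast (100 ppm at 48 kHz), the buffer overflows after roughly 107 seconds; a slow source underflows it on the same timescale. Hence the PLL: the DAC's output rate must follow the source's average rate, and the PLL's quality decides how much of the source's jitter leaks through.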

With USB we have basically the same thing. USB has its own clock line. We read data into a buffer and reclock it with an accurate internal clock, and again the jitter should have vanished and all remaining jitter should just be the DAC's internal jitter, but again we measure jitter with the USB connection as part of the mix. With async USB, we can just transfer the PCM data and again we have no jitter.
USB doesn't have a separate clock signal, but that's beside the point. With adaptive/asynchronous transfers, the receiving DAC dictates the data rate and the host computer adjusts the number of samples per USB packet to match. The received data is placed in some kind of FIFO buffer and removed at a rate determined by the local clock. As you say, this method completely avoids any influence from the upstream clock as this is only used for the USB link itself.
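To illustrate how the host adapts the packet size, here is a sketch of the accumulator scheme (hypothetical code, not any real USB stack): with fs = 44100 and 1 ms frames the host must average 44.1 samples per packet, which it achieves by sending a 45-sample packet every tenth frame. In asynchronous mode, feedback from the DAC would additionally nudge the effective rate up or down.

```python
def samples_per_frame(fs, frame_rate=1000, n_frames=10):
    """Spread fs samples/s across 1 ms USB frames.

    A Bresenham-style accumulator keeps the long-run average exactly fs
    even when fs / frame_rate is not an integer.
    """
    sizes, acc = [], 0
    for _ in range(n_frames):
        acc += fs
        n = acc // frame_rate          # whole samples owed this frame
        acc -= n * frame_rate          # carry the fractional remainder
        sizes.append(n)
    return sizes
```

For 44.1 kHz this yields nine 44-sample packets followed by one 45-sample packet; for 48 kHz every packet is exactly 48 samples.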

So there must be some jitter component that is not part of the pure SPDIF/USB data stream clock, or the stream buffering is much more complicated than I think it is. Still, to me this seems like a quite trivial problem to solve, but the measurements, and the emphasis that DAC designers put on jitter removal, suggest that it is not.
All clocks have some non-zero amount of jitter. With a high-quality crystal oscillator it can be very low but never zero. Digital circuits are also not perfect. Every gate the clock signal traverses between the oscillator and the D/A conversion stage adds a small amount of jitter. How much depends on various factors including the IC design and the power supply quality. Keeping the jitter below audible levels is easy. Avoiding it entirely is impossible.
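The "never zero" part is easy to demonstrate numerically. Here's a sketch (illustrative numbers only: 100 ps RMS of clock jitter, far worse than a decent crystal): sampling a sine at slightly wrong instants produces an error proportional to the signal's slope, so the resulting noise grows with signal frequency.

```python
import math, random

def jitter_noise_rms(freq, fs=48000, jitter_rms=100e-12, n=48000, seed=0):
    """RMS error from sampling a full-scale sine at jittered instants.

    The error of evaluating sin(2*pi*f*t) at t + dt is roughly the
    signal slope times dt, which is why jitter noise is proportional
    to the tone frequency.
    """
    rng = random.Random(seed)
    err2 = 0.0
    for i in range(n):
        t = i / fs
        dt = rng.gauss(0.0, jitter_rms)     # timing error of this sample
        e = (math.sin(2 * math.pi * freq * (t + dt))
             - math.sin(2 * math.pi * freq * t))
        err2 += e * e
    return math.sqrt(err2 / n)
```

Running this for a 12 kHz tone gives roughly 12 times the RMS error of a 1 kHz tone with the same clock, and neither is ever exactly zero.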

The reason jitter is so much talked about is that it has become an audiophile bogeyman, like this:
[attached image: 1612966787583.png]
 

pedrob

Active Member
Joined
Oct 1, 2020
Messages
138
Likes
45
I've seen low-jitter cables, which I guess are less likely to compromise timing.

Then I've seen jitter in graphs that show harmonic and inter-harmonic frequencies added to the original frequency, or frequencies with 32 test tones.

Are there two types of jitter, or is there mixed usage of the term?
 

AnalogSteph

Major Contributor
Joined
Nov 6, 2018
Messages
3,385
Likes
3,335
Location
.de
https://www.audiosciencereview.com/...nese-optical-spdif-switches.10051/post-276392
A crabby Realtek codec has little jitter to begin with, only limited by noise floor.
That said, it no doubt has some help from its DAC architecture (switched capacitor filters and stuff to get jitter down to multibit levels). I rather suspect the old CMedia 8738 and 8768 were pretty much plain 1-bit converters without any special adornments and that's why they sounded (and measured) so lousy... not like their CMOS opamps are likely to have been anything exciting, mind you, so that wouldn't have been helping.

Fun fact: If you want to bother a classic Realtek HDA codec, use 44.1 kHz (or 88.2 if possible). DAC performance will be pretty much unaffected, but the ADC won't be a happy camper. (ADCs are way more reliant on a clean clock.) In a loopback test using my Audigy FX (ALC898), noise level went up from -99.7 dB(A) in 24/48 to -85.8 dB(A) in 24/44. I don't think their anti-alias filters are particularly ideal for 44.1 either...
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,633
Likes
240,671
Location
Seattle Area
We all know that jitter is bad and no jitter is good, but lately I've been starting to wonder what kind of jitter this section is actually measuring.
Let me be clear that the measurement system does NOT attempt to measure jitter. Indeed, no jitter specification is provided. All we are doing is digitizing the output of the DAC using a special test signal (J-test) and seeing what we find there. What is there routinely includes non-jitter components: noise, spurious tones, etc. And yes, jitter, if it exists (which has to be symmetrical around 12 kHz).
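For reference, the Dunn-style J-test stimulus is easy to sketch (this is my approximation for illustration, not the exact Audio Precision implementation): an fs/4 tone at around -3 dBFS, plus an fs/192 square wave that toggles the least significant bit. Any jitter then appears as sidebands mirrored around the fs/4 tone, which is 12 kHz at 48 kHz sampling.

```python
import math

def jtest(fs=48000, bits=16, n=48000):
    """Sketch of a Dunn-style J-test signal as integer samples.

    The fs/192 LSB square wave (period: 192 samples at any fs) exercises
    data-dependent jitter in the interface under test.
    """
    full = 2 ** (bits - 1) - 1
    amp = round(full * 10 ** (-3 / 20))        # fs/4 tone at about -3 dBFS
    samples = []
    for i in range(n):
        tone = round(amp * math.sin(2 * math.pi * (fs / 4) * i / fs))
        lsb = (i // 96) % 2                    # fs/192 square wave, 1 LSB high
        samples.append(tone + lsb)
    return samples
```

The fs/4 tone only ever takes the values 0, +amp, 0, -amp, so anything else that shows up around 12 kHz in the analysis is noise, spurious content, or jitter sidebands.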

For the same reason, other tests can also include the effect of jitter, as you mentioned. That said, jitter is proportional to frequency, so a 1 kHz tone usually doesn't show jitter in the dashboard unless it is quite high in level.
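The "proportional to frequency" point follows from the narrowband FM approximation: for sinusoidal jitter, each sideband sits at roughly 20·log10(π·f·tj) relative to the carrier. A quick sanity check (illustrative 1 ns of peak jitter, which would be a poor clock):

```python
import math

def jitter_sideband_dbc(f_signal, jitter_peak_s):
    """Approximate level of each jitter sideband relative to the carrier,
    for sinusoidal jitter of the given peak amplitude (narrowband FM
    approximation: sideband/carrier = pi * f * tj)."""
    return 20 * math.log10(math.pi * f_signal * jitter_peak_s)
```

With 1 ns of sinusoidal jitter, a 12 kHz tone gets sidebands near -88 dBc, while a 1 kHz tone's sit near -110 dBc, about 21.6 dB lower, which is why the 1 kHz dashboard rarely shows them.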
 

bennetng

Major Contributor
Joined
Nov 15, 2017
Messages
1,634
Likes
1,693
That said, it no doubt has some help from its DAC architecture (switched capacitor filters and stuff to get jitter down to multibit levels). I rather suspect the old CMedia 8738 and 8768 were pretty much plain 1-bit converters without any special adornments and that's why they sounded (and measured) so lousy... not like their CMOS opamps are likely to have been anything exciting, mind you, so that wouldn't have been helping.

Fun fact: If you want to bother a classic Realtek HDA codec, use 44.1 kHz (or 88.2 if possible). DAC performance will be pretty much unaffected, but the ADC won't be a happy camper. (ADCs are way more reliant on a clean clock.) In a loopback test using my Audigy FX (ALC898), noise level went up from -99.7 dB(A) in 24/48 to -85.8 dB(A) in 24/44. I don't think their anti-alias filters are particularly ideal for 44.1 either...
I have similar results (raised noise), but the 44.1k issue can be easily solved by using the Windows SRC like this:
https://www.audiosciencereview.com/forum/index.php?threads/interface-mystery.13115/post-392905
 

ninetylol

Addicted to Fun and Learning
Joined
Dec 7, 2019
Messages
686
Likes
651
I've got a follow-up question. Can different optical outputs from different devices be less good than others?

If I buy the best DAC in the world and my motherboard's optical out has problems with jitter, would the DAC be a waste?

It's hard for me to fathom that there won't be any quality differences between different optical outputs.
 

voodooless

Grand Contributor
Forum Donor
Joined
Jun 16, 2020
Messages
10,383
Likes
18,317
Location
Netherlands
If I buy the best DAC in the world and my motherboard's optical out has problems with jitter, would the DAC be a waste?

Wouldn’t the best DAC in the world eliminate jitter? There are varying degrees of quality in clock recovery; some are just better than others. HDMI audio is notoriously bad regarding jitter, so much so that it sometimes causes some SPDIF receivers to lose sync (for instance via an HDMI audio extractor).
 

bennetng

Major Contributor
Joined
Nov 15, 2017
Messages
1,634
Likes
1,693
I've got a follow-up question. Can different optical outputs from different devices be less good than others?

If I buy the best DAC in the world and my motherboard's optical out has problems with jitter, would the DAC be a waste?

It's hard for me to fathom that there won't be any quality differences between different optical outputs.
Just return the problematic DAC that doesn't work well with your mouse. It is not about jitter, but about loss of connectivity and functionality. The "best" DAC is rather pointless anyway; a few months later another "best DAC" would show up.
 

ninetylol

Addicted to Fun and Learning
Joined
Dec 7, 2019
Messages
686
Likes
651
Just return the problematic DAC that doesn't work well with your mouse. It is not about jitter, but about loss of connectivity and functionality. The "best" DAC is rather pointless anyway; a few months later another "best DAC" would show up.
Actually, I have no problems with optical, but I've always wondered if it could be the weak link in a good system.
 

pedrob

Active Member
Joined
Oct 1, 2020
Messages
138
Likes
45
Those are a scam. Cables do not add jitter in the first place.

Are you suggesting that any type of cable is able to pass any frequency at any voltage without introducing timing differences? No one would ever suggest that a house mains cable would be suitable for speakers.
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,250
Likes
17,182
Location
Riverview FL
A little footnote from ESS on the sound of jitter:

Technical Details of the Sabre Audio DAC
Martin Mallinson and Dustin Forman, ESS Technology Technical Staff

http://www.esstech.com/files/4314/4095/4318/sabrewp.pdf -- footnote 15


"The noise that jitter induces is not easily described: it is not a harmonic distortion
but is a noise near the tone of the music that varies with the music: it is a noise that
surrounds each frequency present in the audio signal and is proportional to it.
Jitter noise is therefore subtle and will not be heard in the silence between audio
programs. Experienced listeners will perceive it as a lack of clarity in the sound
field or as a faint noise that accompanies the otherwise well defined quieter
elements of the audio program."
 

bennetng

Major Contributor
Joined
Nov 15, 2017
Messages
1,634
Likes
1,693
Actually, I have no problems with optical, but I've always wondered if it could be the weak link in a good system.
Talking about loss of connectivity and functionality: it would be similar to buying a monitor with HDMI and DP inputs where one of them doesn't work; IMO that is not a trivial issue. What if you need to connect more than one source to the DAC in the future? You've lost the USB input, and that part is not cheap. For instance, RME offers the ADI-2 series without USB connectivity, and it is significantly cheaper. Motherboard optical output to the DAC is stable just because it doesn't involve USB. You can't evade the problem forever.
 

ninetylol

Addicted to Fun and Learning
Joined
Dec 7, 2019
Messages
686
Likes
651
Talking about loss of connectivity and functionality: it would be similar to buying a monitor with HDMI and DP inputs where one of them doesn't work; IMO that is not a trivial issue. What if you need to connect more than one source to the DAC in the future? You've lost the USB input, and that part is not cheap. For instance, RME offers the ADI-2 series without USB connectivity, and it is significantly cheaper. Motherboard optical output to the DAC is stable just because it doesn't involve USB. You can't evade the problem forever.
You are right, of course. I've actually always preferred optical anyway, but I wanted to test full MQA decoding via USB. After 24 hours of testing I'm still not sure if it's a DAC problem, though. It could also be a driver, software, or hardware problem with the rest of my PC.
 

Wes

Major Contributor
Forum Donor
Joined
Dec 5, 2019
Messages
3,843
Likes
3,790
Some have described jitter as a veiling of the music (yes, really).


and BTW, I suggest that a house mains cable would be suitable for speakers.
 

BDWoody

Chief Cat Herder
Moderator
Forum Donor
Joined
Jan 9, 2019
Messages
7,062
Likes
23,375
Location
Mid-Atlantic, USA. (Maryland)

the_hamster 2

Member
Joined
Jan 20, 2021
Messages
87
Likes
86
Given the SOTA of signal transmission and processing, can an ordinary non-“Golden Ear” listener actually truly detect “jitter” during music playback, when such music is streamed through contemporary kit? How does one “remove” jitter if it can’t be reasonably heard, if that’s the case? Just curious.
 