We all know that jitter is bad and no jitter is good, but lately I've been starting to wonder what kind of jitter this section is actually measuring.
When people ask what jitter is, we usually get the Wikipedia answer: it's a deviation in the timing of clock edges, i.e. clock fluctuation. This I understand. I also understand what it does when sigma-delta conversion is done, and this seems to be the thing that jitter noise measurements actually measure. Where things get tricky, and hard for me to understand, is how SPDIF or USB factor in.
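To make sure I'm not confused about the basic mechanism, here's a toy Python simulation of what I mean by jitter turning into noise at conversion time: sampling a sine at slightly wrong instants produces an amplitude error proportional to the signal's slew rate. The 1 ns RMS jitter figure and the 1 kHz tone are just numbers I picked for illustration.

```python
import math
import random

random.seed(0)

fs = 48_000          # sample rate (Hz)
f = 1_000            # test tone (Hz)
jitter_rms = 1e-9    # assumed 1 ns RMS timing error

# Sample one second of a full-scale sine at ideal vs. jittered instants.
ideal, jittered = [], []
for n in range(fs):
    t = n / fs
    ideal.append(math.sin(2 * math.pi * f * t))
    jittered.append(math.sin(2 * math.pi * f * (t + random.gauss(0, jitter_rms))))

# RMS of the sample-value error caused purely by the timing error.
err_rms = math.sqrt(sum((a - b) ** 2 for a, b in zip(ideal, jittered)) / fs)

# Linearized theory: error ~ slew rate * timing error, so for a sine
# the error RMS is 2*pi*f * jitter_rms / sqrt(2).
theory = 2 * math.pi * f * jitter_rms / math.sqrt(2)
print(err_rms, theory)  # the two agree closely
```

So jitter at the conversion step really is just a noise floor whose size scales with signal frequency, which matches my mental model.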
Let's start with SPDIF, where jitter is easier to understand. The SPDIF data stream has the clock embedded in it, and if the source clock has jitter, well, then the whole stream is screwed, but only if the DAC uses the SPDIF clock as its master clock. This of course never happens with modern DACs, because the SPDIF stream is reclocked; no engineer who isn't a complete tool would ever trust the SPDIF clock to be any good. What puzzles me is that Biphase Mark Code, the format used for the data stream, should be relatively easy to read jitter-free into a buffer. Since a buffer has no clock in it, we would also have eradicated all SPDIF jitter. I haven't actually tried to design such a BMC decoder (mainly because I'm a software engineer, not a hardware one), but it seems like a relatively simple task, and it would solve all the SPDIF source-clock issues. We would have just bits in the buffer, reclocked with the DAC's internal clock, and all the jitter at the delta-sigma stage would be jitter generated by the DAC's internals. But that doesn't seem to be the case, so what am I missing here?
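To be concrete, here's roughly what I imagine that BMC decoder doing, as a toy Python sketch. In BMC every bit cell starts with a transition and a logical 1 gets an extra mid-cell transition, so classifying the gaps between edges as "half cell" or "full cell" recovers the bits; the 0.75-cell threshold and the jitter model are my own assumptions, not anything from a real receiver.

```python
import random

def bmc_edges(bits, cell, jitter_rms=0.0, seed=0):
    """Generate biphase-mark transition timestamps for a bit list.
    Every bit cell begins with a transition; a logical 1 adds an extra
    transition at mid-cell. Gaussian jitter models a wobbly source clock."""
    rng = random.Random(seed)
    edges, t = [0.0], 0.0
    for b in bits:
        if b:
            edges.append(t + cell / 2 + rng.gauss(0, jitter_rms))
        t += cell
        edges.append(t + rng.gauss(0, jitter_rms))
    return edges

def bmc_decode(edge_times, cell):
    """Decode biphase-mark edges back into bits.
    A half-cell gap means the first half of a 1 (skip its mid-cell edge);
    a full-cell gap means a 0. Thresholding at 0.75 * cell tolerates
    timing errors of up to a quarter cell per gap."""
    bits, i = [], 1
    while i < len(edge_times):
        gap = edge_times[i] - edge_times[i - 1]
        if gap < 0.75 * cell:
            bits.append(1)
            i += 2
        else:
            bits.append(0)
            i += 1
    return bits

# Even with 2% RMS edge jitter, the bits land in the buffer intact.
bits = [1, 0, 1, 1, 0, 0, 1, 0]
noisy_edges = bmc_edges(bits, cell=1.0, jitter_rms=0.02)
print(bmc_decode(noisy_edges, cell=1.0))
```

Which is exactly my point: as long as the jitter stays well under a quarter cell, the recovered bits are identical to the transmitted ones, and the buffer contents carry no timing information at all.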
With USB we have basically the same thing, except that USB doesn't even have a separate clock line; the receiver recovers the clock from the NRZI-encoded data. We read the data into a buffer and reclock it with an accurate internal clock, and again the jitter should have vanished, leaving only the DAC's internal jitter, yet we still measure jitter with the USB connection as part of the mix. With async USB the DAC's own clock is the master and we just transfer the PCM data, so again there should be no interface jitter.
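The async case, as I understand it, would look something like this toy model: the host's packet timing can wobble all it likes, because samples only sit in a FIFO, and the DAC drains it with its own steady clock while a feedback value steers the host's packet size. All the numbers and the steering rule here are made up for illustration; real UAC2 feedback is a fractional samples-per-frame value, not a plus-or-minus-one nudge.

```python
from collections import deque

fifo = deque()
nominal = 48           # samples per 1 ms USB frame at 48 kHz
feedback = nominal     # how many samples the host sends next frame
target = 2 * nominal   # keep roughly two frames of slack queued

for frame in range(1000):
    # Host side: deliver 'feedback' samples this frame. Arrival-time
    # wobble is irrelevant because the samples just queue in the FIFO.
    fifo.extend([0.0] * feedback)

    # DAC side: consume exactly 'nominal' samples per frame of its
    # own clock, independent of when the packet arrived.
    for _ in range(nominal):
        if fifo:
            fifo.popleft()

    # Feedback: nudge the host to send more or fewer samples so the
    # FIFO hovers at the target depth instead of under/overrunning.
    depth = len(fifo)
    feedback = nominal + (1 if depth < target else -1 if depth > target else 0)

print(len(fifo))  # settles at the target depth and stays there
```

The FIFO absorbs all the transport timing, and the only clock that ever touches the conversion is the DAC's own, which is why I'd expect the interface to contribute nothing.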
So either there is some jitter component that is not part of the pure SPDIF/USB data-stream clock, or the stream buffering is much more complicated than I think it is. To me this still seems like a fairly trivial problem to solve, but the measurements, and the emphasis DAC designers put on jitter removal, suggest that it is not.
Any insights into what I'm missing here?