
Excuse me... Do you have the correct time? Do you care?

JRT (Member, joined Sep 17, 2019)
In case some others might find this interesting... Here is a simple video recently posted on YouTube that shows some timing error differences in USB D/A converters.

 
No, 192kHz tracks do not have higher timing accuracy.

No, it does not matter if a 600 seconds track ends up as 600.028 seconds after conversion.

No, give neither this man nor the manufacturer your money by purchasing a master clock through the affiliate link.

BTW, since the clock accuracy of the ADC used in this test was not established, it's entirely possible that the cheap Douk is the more precise unit and the expensive DAC runs too fast, compensated for by a theoretically slow-running ADC.

We just don't know.

(Not that it would matter given the meaningless differences)
 
It doesn’t matter in practice, but it’s still surprising to see 600 seconds stretch by 28 ms - that’s sloppier than you’d expect, since even cheap master clocks in phones and other mass-produced devices are usually tighter.
 
It doesn’t matter in practice, but it’s still surprising to see 600 seconds stretch by 28 ms - that’s sloppier than you’d expect, since even cheap master clocks in phones and other mass-produced devices are usually tighter.
Really? That is about 50ppm - which is quite a typical spec for a crystal oscillator.
 
Really? That is about 50ppm - which is quite a typical spec for a crystal oscillator.
Exactly the point - why use the cheapest of oscillators in a supposedly precision device? The two don’t really have much to do with each other, but it’s still hilarious: 24-bit resolution implies precision about a thousand times better than a 50-ppm clock.
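For anyone who wants to check the arithmetic behind both posts, a quick back-of-the-envelope script (the 28 ms figure is from the video; the 24-bit comparison just treats one LSB of a full-scale 24-bit sample as a relative precision):

```python
# Clock drift expressed in parts per million (ppm)
measured = 600.028   # seconds reported after conversion
nominal = 600.0      # seconds in the original track
drift_ppm = (measured - nominal) / nominal * 1e6
print(f"drift: {drift_ppm:.1f} ppm")      # 46.7 ppm, within a typical 50 ppm crystal spec

# Relative precision of one LSB in a 24-bit sample
lsb_ppm = 1 / 2**24 * 1e6
print(f"24-bit LSB: {lsb_ppm:.3f} ppm")   # 0.060 ppm
print(f"ratio: {drift_ppm / lsb_ppm:.0f}x")  # ~800x, i.e. roughly "a thousand times"
```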
 
Thanks for sharing the video, it is an interesting test, with equally interesting results.

28ms out of 10mins, very likely inaudible pitch differences, IMO.

Still a very good test…
 
24-bit resolution implies precision about a thousand times better than a 50-ppm clock.
That is no issue. The clock doesn’t have 50ppm jitter, it’s just apparently 50ppm slow. You’ll still have the timing accuracy, or rather, we don’t know because it wasn’t measured.

And as @staticV3 pointed out, for all we know, the ADC clock ran way too fast, and actually, the Douk is the most accurate. You really don’t know if you don’t measure.
 
I had someone tell me that digital timing errors result in the musicians not playing in correct time on the recording.

I explained why that was not the case but I don't think he believed me.
 
Exactly the point - why use the cheapest of oscillators in a supposedly precision device? The two don’t really have much to do with each other, but it’s still hilarious: 24-bit resolution implies precision about a thousand times better than a 50-ppm clock.
Why is it a precision device? On what planet does 50ppm tolerance on a DAC clock cause any problems whatsoever? Why should I pay one penny more for a DAC with a tighter-tolerance clock?
 
Why is it a precision device? On what planet does 50ppm tolerance on a DAC clock cause any problems whatsoever? Why should I pay one penny more for a DAC with a tighter-tolerance clock?
A tighter clock means nicer engineering (of the clock and wherever it feeds).

But as always we mix engineering with audibility.
We shouldn't.
 
For those curious: the absolute frequency accuracy of an oscillator (e.g. 24.000 MHz vs. 24.010 MHz) is a different specification from the cycle-to-cycle frequency variation.

The first (accuracy) can make playback slightly faster or slower, while the second (short-term variation, or jitter) is what really matters for audio quality, since it can modulate the signal and distort the waveform.

Some clocks are better in one area than the other, and designers choose which matters most for a given application. A clock with low jitter, not high accuracy, is the correct choice here. Want both? You will pay an exceptional price for no practical audio gain.
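To put absolute accuracy in perspective: a constant clock offset shifts pitch by the same ratio, which is easy to express in cents (the usual unit for pitch deviation, 100 cents per semitone). A minimal sketch, assuming the commonly cited pitch JND of a few cents:

```python
import math

def ppm_to_cents(ppm):
    """Pitch shift in cents caused by a constant clock offset of `ppm`."""
    return 1200 * math.log2(1 + ppm * 1e-6)

# A 50 ppm clock offset shifts pitch by well under a tenth of a cent,
# far below the few-cent deviations listeners can typically detect.
print(f"{ppm_to_cents(50):.4f} cents")   # 0.0866 cents
```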
 
You do need synchronized clocks in a studio environment where you're mixing the outputs of multiple devices. If you make your multichannel audio system out of separate, independent DACs, you'll have to consider clock synchronization. Otherwise, a clock difference of 50ppm between two converters has no effect when listening (assuming it's a constant difference). Well, maybe the same track will take some milliseconds longer to play.

Modern software tools report clock differences very precisely. The great REW and my own Multitone both do it. DeltaWave not only computes the clock drift but also corrects for it when comparing tracks. Any time I measure using a separate DAC and ADC, I find some amount of clock drift. It's expected. No big deal.
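The basic idea behind these drift reports can be sketched in a few lines: capture a signal of known nominal duration and compare sample counts. (The tools named above do this far more precisely, e.g. by tracking phase over the whole recording; the sample count here is a made-up illustrative value.)

```python
def estimate_drift_ppm(n_samples, sample_rate, nominal_seconds):
    """Estimate relative clock drift from the length of a captured recording.

    Only the *difference* between the playback and capture clocks is
    observable: a positive result means the capture ran long relative
    to the nominal duration.
    """
    measured_seconds = n_samples / sample_rate
    return (measured_seconds - nominal_seconds) / nominal_seconds * 1e6

# Hypothetical example: a 600 s track captured at a nominal 48 kHz
# came back 1344 samples long (28 ms), i.e. 46.7 ppm of drift.
print(f"{estimate_drift_ppm(28_801_344, 48_000, 600):.1f} ppm")
```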
 
Dare I mention that major, world-class symphony orchestras can differ in their tuning by two percent or more.
 
Why is it a precision device. On what planet does 50ppm tolerance on a DAC clock cause any problems whatsoever? Why should I pay 1 penny more for a DAC with tighter tolerance clock?
It does not cause any problems, just like 16-bit vs 24-bit DACs or 32-bit DACs. Today 24-bit and 32-bit DACs are commonplace, and people are fine paying a bit more for them, even though they offer no practical improvement.
 