Yes, I know that a master clock makes no difference if you are using a single digital device, like a DAC. However, my question is about multiple digital devices.
I read this article (on a pro audio blog) about whether your studio needs a master clock or not, and my take-away from it is summarized further down.
The application is for a friend of mine who has a system configured like this:
- Turntable --> Phono stage --> ADC
- ADC --> MiniDSP
- MiniDSP --> 3 Topping DACs --> rest of the system
My understanding of using multiple digital devices is that each device might latch on to the signal at a different time, and variations in clock accuracy between the devices can cause clock drift over time. After a while, the difference in timing between DACs might become audible, particularly if one DAC is driving the tweeter, another the midrange, and so on. When I was configuring my own system, a friend of mine who is an audio engineer told me NOT to use multiple DACs for multichannel digital output because of clock drift, or, if I wanted to, I had to slave them all to a master clock. Because of his advice, I purchased an 8-channel DAC.
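To make sure I am picturing the mechanism correctly, here is a throwaway Python sketch of it (the 44.1 kHz rate and 50 ppm mismatch are round numbers I assumed, not specs of his actual hardware):

```python
# Two free-running DACs whose sample clocks differ by a fixed ppm error
# gradually end up playing different samples of the same stream.
NOMINAL_RATE = 44_100   # Hz; assumed nominal sample rate
PPM_MISMATCH = 50       # assumed worst-case clock mismatch between the DACs

def sample_offset(elapsed_s: float) -> float:
    """Samples of misalignment between the two DACs after elapsed_s seconds."""
    fast_rate = NOMINAL_RATE * (1 + PPM_MISMATCH / 1e6)  # the slightly-fast DAC
    return (fast_rate - NOMINAL_RATE) * elapsed_s

for t in (60, 600, 1800):
    print(f"after {t:>4} s: {sample_offset(t):7.1f} samples apart")
# after   60 s:   132.3 samples apart
# after  600 s:  1323.0 samples apart
# after 1800 s:  3969.0 samples apart
```

If that picture is right, the misalignment grows linearly for as long as both DACs keep running without resynchronizing.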
However, that article I linked to mentions that a master clock is not needed in "simple" studio setups, and is only essential for complex setups involving multiple ADCs or where video is involved. Because that is a pro audio blog, I am guessing that they do not sit down for an hour listening to a single album played from start to finish, so clock drift may be less of an issue in a "simple" studio where they stop and start tracks, which gives all the digital equipment in the chain a chance to resynchronize.
As far as I am aware, my friend's ADC (I don't know what brand) and his Topping DACs do not have clock outputs or inputs, so it would be impossible to slave the DACs to the ADC, or even to slave all the digital devices to an external master clock. I suppose this might not be a problem if he were using a digital source, because the signal would stop and start at the beginning of each track, giving the DACs an opportunity to resynchronize. But he is using vinyl, which means noise may be transmitted to the DACs even between tracks, so they might never get that opportunity.
I did some back-of-the-napkin math, and this is what I came up with. Assume we have a DAC with a deviation of 50 ppm, and a "worst case" scenario where the difference between the first and second DAC is 50 ppm (I also double-check the arithmetic with a short script after this list).
- 44.1 kHz x 16 bits x 2 channels = 44,100 * 16 * 2 = 1,411,200 bits per second
- DAC clock runs at double speed = 1,411,200 * 2 = 2,822,400 cycles per second (about 2.8 MHz)
- 50 ppm at 2.8224 MHz = 50/1,000,000 * 2,822,400 ≈ 141 surplus cycles per second; 141 cycles at 2.8224 MHz is 0.00005 seconds (0.05 ms) of drift every second. (The clock frequency cancels out, so a 50 ppm difference is simply 50 µs per second of playback.)
- Clock drift in 1 minute = 0.05 ms * 60 = 3 ms
- Clock drift in 30 minutes = 3 ms * 30 = 90 ms
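To double-check that, here is the same arithmetic in a few lines of Python (again assuming a constant 50 ppm difference, i.e. a linear worst case):

```python
PPM_DIFF = 50  # assumed clock difference between the two DACs

def drift_ms(elapsed_s: float) -> float:
    # A ppm frequency error gains (or loses) ppm/1e6 seconds per second,
    # so the bit-clock frequency cancels out and the drift grows linearly.
    return elapsed_s * PPM_DIFF / 1e6 * 1000

for label, t in [("1 second", 1), ("1 minute", 60), ("30 minutes", 1800)]:
    print(f"{label:>10}: {drift_ms(t):6.2f} ms")
#   1 second:   0.05 ms
#   1 minute:   3.00 ms
# 30 minutes:  90.00 ms
```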
I did not study math beyond high school so there is a very high probability that I made a mistake somewhere. I would appreciate correction, because clock drift of 90 ms at 30 minutes still seems astoundingly high to me. I am no match for you engineering types! So please be nice to me if I got my math wrong! Also, according to my calculations, the difference after just a few minutes of playback is already around 10 ms, which I would expect to be audible between drivers. Yet I subjectively did not hear anything amiss after listening for several minutes.
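One more sanity check I tried: how long until the drift crosses some notional audibility threshold? I do not know what the real threshold for delay between drivers is, so the 1 ms below is purely a placeholder I made up, not a researched figure:

```python
PPM_DIFF = 50       # assumed clock difference between the two DACs
THRESHOLD_MS = 1.0  # made-up placeholder for an audible inter-driver delay

# Time for the accumulated drift to reach the threshold:
seconds_to_cross = (THRESHOLD_MS / 1000) / (PPM_DIFF / 1e6)
print(f"drift reaches {THRESHOLD_MS} ms after {seconds_to_cross:.0f} s")
# drift reaches 1.0 ms after 20 s
```

By this estimate I should have heard problems within the first minute of playback, yet I did not, which is part of why I doubt my own math.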
Advising him to change all his equipment to allow slaving to a master clock would be a major expense for him, so I want to check with ASR whether there is any truth to the assertion that clock drift between DACs can cause group delay, what the magnitude of the problem is, and what it can potentially add up to over time. I do not wish to give bad advice, so your input is welcome.