Keith_W
Major Contributor
Yes, I know that a master clock makes no difference if you are using a single digital device, like a DAC. However, my question is about multiple digital devices.
I read this article (in a pro audio blog) about whether your studio needs a master clock or not. My take-away from that article was contained in this quote:
SoundonSound said:
In most compact project studios, there's little need for a master clock. The required system clocking can usually be achieved by interconnecting the equipment directly and, as explained above, where there's only one A‑D in the system, it's generally best to use that as the clock master anyway.
Typically, with a stand-alone A‑D configured as the master, its digital output would be passed on to the audio interface as either an S/PDIF or AES3 signal (for a stereo A‑D), or an ADAT signal (for a multi-channel A‑D). All of those protocols include embedded clock information which the interface can be configured to accept via the appropriate audio input as its slave clock reference. Alternatively, a word clock output could be taken from the A‑D and connected to a word clock input on the DAW interface (remembering to ensure the correct 75Ω termination is in place — see the 'Interface-induced Jitter' box), with the interface set to use the external word clock as the slave reference.
However, in more elaborate and expansive systems, where there are several A‑Ds and lots of other digital outboard, it's often more convenient and practical to have a centralised master clock source, and to distribute clocks from that to all of the other devices, all of which are configured as slaves. All master clock units provide numerous word clock outputs, and often several AES11 clocks too (AES11 is basically a silent AES3 signal, intended specifically for clocking purposes). In this kind of system, though, it would be worth ensuring that the A‑D converters all work well when operating on external clocks, to maximise their audio quality.
The only situation where a dedicated master clock unit is truly essential is in systems that have to work with, or alongside, video, such as in music-for-picture and audio‑for‑video post‑production applications. It's necessary here because there must be a specific integer number of samples in every video picture‑frame period, and to achieve that, the audio sample rate has to be synchronised to the picture frame rate. The only practical way to achieve that is to use a master clock generator that is itself sync'ed to an external video reference, or which generates a video reference signal to which video equipment can be sync'ed.
The application is for a friend of mine who has a system configured like this:
- Turntable --> Phono stage --> ADC
- ADC --> MiniDSP
- MiniDSP --> 3 Topping DACs --> rest of the system
My understanding of using multiple digital devices is that each device might latch on to the signal at a different time, and variations in clock accuracy between devices might cause clock drift over time. After a while, the difference in timing between DACs might become audible, particularly if one DAC is driving the tweeter, another the midrange, and so on. When I was configuring my own system, a friend of mine who is an audio engineer told me NOT to use multiple DACs for multichannel digital output because of clock drift, or, if I insisted, to slave them all to a master clock. Because of his advice, I purchased an 8-channel DAC.
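To give a feel for why a timing offset between DACs matters in an active speaker, here is a minimal Python sketch (my own illustration; the 2.5 kHz tweeter/midrange crossover is a made-up example, not my friend's actual setup) that converts a time offset between two drivers into a phase error at the crossover frequency:

```python
# Purely illustrative: a fixed timing offset between two drivers becomes a
# frequency-dependent phase shift, which is what would make inter-DAC drift
# audible around the crossover region.

def phase_error_degrees(offset_ms: float, freq_hz: float) -> float:
    """Phase difference between two drivers at freq_hz for a given time offset."""
    return (offset_ms / 1000.0) * freq_hz * 360.0

# A hypothetical 2.5 kHz tweeter/midrange crossover:
for offset_ms in (0.01, 0.1, 1.0):
    deg = phase_error_degrees(offset_ms, 2500.0)
    print(f"{offset_ms:5.2f} ms offset -> {deg:7.1f} degrees at 2.5 kHz")
```

Even 0.1 ms of offset is a quarter cycle (90 degrees) at 2.5 kHz, so any drift that keeps accumulating would quickly wreck the crossover summation.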
However, the article I linked to says that a master clock is not needed in "simple" studio setups, and is only essential for complex setups involving multiple ADCs or where video is involved. Because it is a pro audio blog, I am guessing they do not sit down for an hour listening to a single album from start to finish, so clock drift may be less of an issue in a "simple" studio: stopping and starting tracks gives all the digital equipment in the chain a chance to resynchronize.
As far as I am aware, my friend's ADC (I don't know what brand) and his Topping DACs do not have clock outputs or inputs, so it would be impossible to slave the DACs to the ADC, or even to slave all the digital devices to an external master clock. I suppose this may not be a problem if he were using a digital source, because the signal would stop and start at the beginning of each track, giving the DACs an opportunity to resynchronize. But he is using vinyl, which means noise might be transmitted to the DACs even between tracks, so the DACs might never get an opportunity to resynchronize.
I did some "back of the napkin" math, and this is what I came up with. Assume we have a DAC with a deviation of 50ppm, and a "worst case" scenario where the difference between the first and second DAC is 50ppm.
- 44.1/16 * 2 channels = 44100 * 16 & 2 = 1,411,200 bits per second
- DAC clock runs at double speed = 1,411,200 * 2 = 2,822,400 cycles per second (or 2.8MHz)
- 50ppm variability at 2.8mHz = 50/1,000,000 * 2.8224 = 0.00014112 seconds (or roughly 0.1ms) every second.
- Clock drift in 60 seconds = 6ms
- Clock drift in 1 minute = 360ms
- Clock drift in 30 minutes = 21,600ms (or 21.6 seconds)
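For anyone who wants to check me, here is the same arithmetic as a short Python script (the 50 ppm worst-case mismatch between the two DACs is my assumption from above):

```python
# The napkin math above as code. Key point: a ppm clock error converts to
# time drift directly (50 ppm = 50 microseconds gained/lost per second);
# the absolute clock frequency cancels out of the drift calculation.

SAMPLE_RATE = 44_100        # Hz
BIT_DEPTH = 16              # bits per sample
CHANNELS = 2
PPM_MISMATCH = 50           # assumed worst-case difference between two DACs

bit_rate = SAMPLE_RATE * BIT_DEPTH * CHANNELS             # 1,411,200 bits/s
bit_clock_hz = bit_rate * 2                               # 2,822,400 Hz
freq_offset_hz = bit_clock_hz * PPM_MISMATCH / 1_000_000  # ~141 Hz

print(f"bit clock: {bit_clock_hz:,} Hz, frequency offset: {freq_offset_hz:.1f} Hz")
for label, seconds in (("1 second", 1), ("1 minute", 60), ("30 minutes", 1800)):
    drift_ms = PPM_MISMATCH * 1e-6 * seconds * 1000
    print(f"drift after {label:>10}: {drift_ms:8.3f} ms")
```

This prints 0.05 ms after 1 second, 3 ms after 1 minute, and 90 ms after 30 minutes, matching the list above.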
I did not study maths beyond high school, so there is a very high probability that I made a mistake somewhere. I would appreciate correction, because clock drift of 90 ms after 30 minutes seems astoundingly high to me. I am no match for you engineering types, so please be nice to me if I got my math wrong! Also, according to my calculations, the difference at just 1 minute of playback is 3 ms, which should already be audible between drivers, yet I subjectively did not hear anything amiss after listening for several minutes.
Advising him to change all his equipment to allow slaving to a master clock would be a major expense for him, so I want to check with ASR whether there is any truth to the assertion that clock drift between DACs causes a growing delay between channels, what the magnitude of the problem is, and what it can potentially add up to over time. I do not wish to give bad advice, so your input is welcome.