
SMSL G1 Review (Clock Generator)

Nice and very well written.

I especially liked the fact you compared the digital player ppm deviation to that of a tuning fork! That gives a very useful perspective.

The Revox seems to be on par with a good tuning fork...
 
Good, interesting report. If you want the most accurate 10 MHz frequency standard, just buy a caesium-based one. Cost is around $50k.
These are also installed in the GPS satellites.
 
Not all. My DAC does not use a PLL.
Interesting, but the site is confusing:

« The MCLK, 64fs, and 1fs clocks required for PCM playback will use the Direct Digital Synthesizer AD9852 of Analog Devices to generate MCLK, and the built-in PLL without using 10MHz is directly calculated to generate a high-precision playback clock. »

So it can use an external clock to feed the AD9852, which generates the internal master-clock reference. But since the DAC accepts many sampling rates, other components will use PLLs to divide or multiply the frequency provided by the AD9852, I guess. I won't read the full datasheet of that AD chip, but I saw it has a phase noise of -140 dBc/Hz @ 1 kHz offset, while SMSL claims -151 dBc/Hz at the same offset.
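As a rough illustration of the DDS approach discussed above: per the AD9852 datasheet, the chip derives its output from a 48-bit frequency tuning word, f_out = FTW × f_sysclk / 2^48. A minimal sketch follows; the 10 MHz reference, the ×20 reference multiplier, and the 44.1 kHz-family MCLK are my own illustrative assumptions, not SMSL's actual design.

```python
# Sketch of DDS frequency synthesis as done by a chip like the AD9852:
# the output frequency is set by a 48-bit frequency tuning word (FTW),
# f_out = FTW * f_sysclk / 2**48 (per the AD9852 datasheet).
# All concrete frequencies below are illustrative assumptions.

ACC_BITS = 48  # phase accumulator width of the AD9852

def tuning_word(f_out: float, f_sysclk: float) -> int:
    """Closest FTW for a desired output frequency."""
    return round(f_out / f_sysclk * 2**ACC_BITS)

def actual_frequency(ftw: int, f_sysclk: float) -> float:
    """Frequency actually produced by a given FTW."""
    return ftw * f_sysclk / 2**ACC_BITS

f_sysclk = 10e6 * 20        # 10 MHz reference times an assumed x20 multiplier
f_mclk   = 512 * 44100      # 22.5792 MHz master clock for 44.1 kHz audio

ftw = tuning_word(f_mclk, f_sysclk)
err_hz = actual_frequency(ftw, f_sysclk) - f_mclk
print(f"FTW = {ftw}, residual error = {err_hz:.9f} Hz")
```

The point of the sketch: the frequency step, f_sysclk / 2^48, is well below a microhertz, which is why a DDS can hit audio master-clock rates essentially exactly by arithmetic alone.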
 
Interesting, but the site is confusing:

« The MCLK, 64fs, and 1fs clocks required for PCM playback will use the Direct Digital Synthesizer AD9852 of Analog Devices to generate MCLK, and the built-in PLL without using 10MHz is directly calculated to generate a high-precision playback clock. »

So it can use an external clock to feed the AD9852, which generates the internal master-clock reference. But since the DAC accepts many sampling rates, other components will use PLLs to divide or multiply the frequency provided by the AD9852, I guess. I won't read the full datasheet of that AD chip, but I saw it has a phase noise of -140 dBc/Hz @ 1 kHz offset, while SMSL claims -151 dBc/Hz at the same offset.
PLLを用いない高精度DDSクロックマネージャーを搭載

PCM再生に必要なMCLK、64fs、1fsのクロックはMCLKの生成にアナログデバイセズ社のDirect Digital Synthesizer AD9852を採用する事とし、PLLを使わずに内蔵の10MHzをダイレクトに演算処理して高精度な再生クロックを生成しています。

内蔵のクロックジェネレーターにはNDK社製の超低ジッター発振器を搭載しています

Translation: Equipped with a high-precision DDS clock manager that does not use a PLL.

The MCLK, 64fs, and 1fs clocks needed for PCM playback are generated by using Analog Devices' Direct Digital Synthesizer AD9852 for the MCLK; without using a PLL, the built-in 10MHz is processed directly to generate a high-precision playback clock. The built-in clock generator uses an ultra-low-jitter oscillator made by NDK. (Also, yeah, I just half-a**ed the translation; I could do better, but I won't.)

Japanese is notoriously difficult to machine-translate
 
Ok, thanks for the info, it does change the translation indeed :)
But that won't change the rest of my comments. If it generates a 10MHz clock, or can use an external 10MHz clock to create an internal one without a PLL (and that's what the AD9852 does), then the other components referring to it will use a PLL to divide or multiply that rate (SPDIF receiver, oversampler, delta-sigma modulator, DAC, …).
 
Ok, thanks, but that won't change the rest of my comments. If it generates a 10MHz clock, or can use an external 10MHz clock to create an internal one without a PLL (and that's what the AD9852 does), then the other components referring to it will use a PLL to divide or multiply that rate (SPDIF receiver, oversampler, delta-sigma modulator, DAC, …).
You need to take a closer look at K2 technology.
 
Very interesting...
However, I wonder if pitch errors could be reported in absolute terms, in Hz.
As suggested by others, it's hard to tell which of this clock generator and your audio interface is more accurate, so relative pitch errors (%) would make more sense, imho.

I still wonder how the whole chain is supposed to work. Block diagrams with buffers, SRCs, and clock sources would help.

In my view, there is no way to properly and consistently correct pitch errors in the kind of setups you use.
Let's use a comparison: the drive is a water source, but its flow is slightly too low. You could store a large bucket of that water and let it out at the right pace, but this only works for a while: when the bucket is empty, you just have to wait for more water.
In a continuous process, if you consistently output faster than the input, there will be glitches at some point: you will have to either repeat or skip some samples.
The only solution is to synchronize the source.

The reference clock can still help to reduce jitter, but the pitch would remain that of the source.

I really wonder how you achieved those consistent 4 ppm pitch errors?
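The bucket analogy above can be put into numbers. A minimal sketch, with the buffer size and ppm mismatch as illustrative assumptions, of how long a finite buffer survives a small rate mismatch:

```python
# Sketch of the "bucket" argument: a sink consuming at a fixed rate from a
# source running a few ppm slow will eventually drain any finite buffer.
# Buffer size and ppm figures are illustrative.

def seconds_until_underrun(buffer_samples: int, f_source: float, f_sink: float) -> float:
    """Time until a buffer pre-filled with buffer_samples empties,
    when the sink consumes faster than the source produces."""
    net_drain = f_sink - f_source      # samples lost per second
    if net_drain <= 0:
        return float("inf")            # buffer never empties
    return buffer_samples / net_drain

f_sink   = 44100.0                 # DAC clock, taken as exact
f_source = 44100.0 * (1 - 4e-6)    # transport running 4 ppm slow
t = seconds_until_underrun(4096, f_source, f_sink)
print(f"{t:.0f} s until a 4096-sample buffer runs dry")
```

With these numbers the buffer lasts a bit over six hours, which matches the observation in the thread that glitches are rare but not absent: the mismatch is tiny, so the repeat/skip events are far apart.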
 
If this refers to JVC's K2 technology, then it seems like nonsense to me. What would be the point of creating frequencies beyond 20 kHz from a CD, where nothing exists above that limit? Nobody can hear it.
 
Wondering: if you used a chain connection instead of a star, would the result be the same?

I know the star connection is recommended and a very long chain can degrade the signal, but syncing more DACs (4+) with this generator would be an interesting thing to try as well, or at least knowing that the possibility is there.
I could try with longer cables, for fun, but I guess even 10m would be ok.
 
Very interesting...
However, I wonder if pitch errors could be reported in absolute terms, in Hz.
As suggested by others, it's hard to tell which of this clock generator and your audio interface is more accurate, so relative pitch errors (%) would make more sense, imho.

(…)

I really wonder how you achieved those consistent 4 ppm pitch errors?
I've been testing CD players this way for a bit more than 2 years. That's how I discovered that their internal pitch error carries over to an external DAC via the SPDIF connection.

The SMSL D200 and the Topping D50III reported the same pitch error from the different CD players, as I'm used to seeing. But when the G1 provides the clock to the D200, the pitch error becomes a flat 4 ppm indeed.

I also regularly measure the digital output via the digital input of my Motu. It can either sync to the incoming digital signal, getting the clock from SPDIF, or refer to its internal clock when sending a digital signal via its digital outputs, so that the external device can sync to it. When I use the second option but read the incoming data from the digital input of the Motu, I can see the glitches in the FFT analyzer (and rapid variations of the frequency offset), because the Motu expects the external device to synchronize. The more stable the source, the fewer the glitches, as I explained here.
In that context, the SMSL PL200T had some of the best results I have measured so far, nearly 0 glitches. But if I feed it with the G1 clock, then I see a few, and this correlates with the 1 ppm vs 4 ppm that I measured. In the past, all CD players that I measured at 1 ppm were also generating the fewest glitches in that test.

To me, what I measured when using the G1 seems to indicate a very low phase noise helping the PLLs of the DACs reject fast frequency offsets from the source. But I guess not all DACs would benefit the same way.
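Since an earlier poster asked for pitch errors in absolute Hz, the conversion from the ppm figures used in this thread is trivial; the example frequencies below are just illustrative:

```python
# Sketch: converting relative pitch errors (ppm) into absolute frequency
# offsets in Hz. A ppm clock error shifts every reproduced frequency
# proportionally.

def offset_hz(f_nominal: float, ppm: float) -> float:
    """Absolute offset in Hz of a nominal frequency with a ppm-scale error."""
    return f_nominal * ppm * 1e-6

print(f"1 kHz tone at 4 ppm:      {offset_hz(1_000, 4):.4f} Hz")   # 0.0040 Hz
print(f"44.1 kHz clock at 4 ppm:  {offset_hz(44_100, 4):.4f} Hz")  # 0.1764 Hz
```

This also shows why absolute Hz figures are less comparable across tests than ppm: the same clock error produces a different Hz offset at every tone frequency and sample rate.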
 
You need to take a closer look at K2 technology.
From what I know, this tech was used to create some of the least dynamic, most compressed and distorted masters. It was a story similar to the recent MQA business case: license studios and make everyone buy again what they already had.

Bit-depth increase has been offered "for free" by Denon in their CD players since 1990 or so, via their Alpha filter. Maybe that's what this tech does in this DAC.

An example with this old test from Stereophile that I still use: at -90.31 dBFS, the smallest symmetrical signal in 16-bit PCM, a sine should show as a square wave, such as the one below:

IMG_4953.jpeg


And Denon increases the bit depth to recreate a sine:

IMG_4952.jpeg
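The Stereophile test described above can be reproduced numerically: a sine at -90.31 dBFS has an amplitude of exactly 1 LSB in 16-bit PCM, so plain quantization collapses it to three levels. A minimal sketch, with the tone frequency and sample rate as arbitrary choices:

```python
# Sketch: a sine at -90.31 dBFS has an amplitude of ~1 LSB in 16-bit PCM
# (10^(-90.31/20) * 32768 ≈ 1.0), so straight quantization leaves only the
# levels -1, 0, +1: the "square-ish" staircase seen in the Stereophile test.
import math

fs, f0, n = 44100, 1000, 4410                  # arbitrary test tone
amplitude = 10 ** (-90.31 / 20) * 32768        # ~1.0 LSB

samples = [round(amplitude * math.sin(2 * math.pi * f0 * t / fs))
           for t in range(n)]
print(sorted(set(samples)))                    # prints [-1, 0, 1]
```

Recreating the sine from these three levels, as the Denon plots show, requires inferring the sub-LSB waveform, which is what bit-depth-expansion filters claim to do.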
 
I also regularly measure the digital output via the digital input of my Motu. It can either sync to the incoming digital signal, getting the clock from SPDIF, or refer to its internal clock when sending a digital signal via its digital outputs, so that the external device can sync to it. When I use the second option but read the incoming data from the digital input of the Motu, I can see the glitches in the FFT analyzer (and rapid variations of the frequency offset), because the Motu expects the external device to synchronize. The more stable the source, the fewer the glitches, as I explained here.
In that context, the SMSL PL200T had some of the best results I have measured so far, nearly 0 glitches. But if I feed it with the G1 clock, then I see a few, and this correlates with the 1 ppm vs 4 ppm that I measured. In the past, all CD players that I measured at 1 ppm were also generating the fewest glitches in that test.
Most likely the glitches are buffer over-/under-runs.
The closer the clock frequency of the player matches that of the DAC, the longer it takes to overrun the buffer.
Because of this issue (it does not happen with USB), the PLL in a DAC matches its clock speed to the clock derived from the source, thereby preventing buffer overruns.
As the PLL is a 'flywheel' type, synced to the incoming clock (derived from the SPDIF stream), that frequency can carry some jitter.
That jitter is removed by a small FIFO clocked by the DAC chip's own clock.
I suspect that clock is being 'replaced' by the external clock.

When the speed of the source clock is too high or too low, buffer over-/under-runs can happen.
 

From what I know, this tech was used to create some of the least dynamic, most compressed and distorted masters. It was a story similar to the recent MQA business case: license studios and make everyone buy again what they already had.

Bit-depth increase has been offered "for free" by Denon in their CD players since 1990 or so, via their Alpha filter. Maybe that's what this tech does in this DAC.

An example with this old test from Stereophile that I still use: at -90.31 dBFS, the smallest symmetrical signal in 16-bit PCM, a sine should show as a square wave, such as the one below:

View attachment 488908

And Denon increases the bit depth to recreate a sine:

View attachment 488909
I think you mean something else, because CDs with music recorded using K2 technology are some of the best-sounding.
 
These types of devices bring me to a question that has been bugging me lately: is all this optimization we seek on the reproduction side really worth it, given the source-material issues? In this case, does 3 ppb (and other ultra-low specs) really do anything if the source was recorded with tens of ppm of error?
Consider, for example, music produced by electric instruments, such as guitars, synthesizers, etc. Does anyone believe that those instruments, as well as the entire recording and processing path for acoustic instruments, truly have even the 96 dB of SNR and the low distortion/jitter to match 16-bit playback, let alone more for 24-bit playback? And what about the (unintentional) distortion of a typical Marshall amp+speaker being close-mic'ed during studio recordings -- doesn't it exceed that of the cheapest reproduction equipment by orders of magnitude?
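For reference on the 96 dB figure above: the ideal quantization SNR of a full-scale sine in n-bit PCM is the standard 6.02n + 1.76 dB result, a sketch of which is:

```python
# Sketch: ideal quantization SNR of a full-scale sine in n-bit PCM,
# the textbook 6.02*n + 1.76 dB result (the often-quoted "96 dB" for
# 16 bits is the 6.02*n dynamic-range part alone).

def ideal_snr_db(bits: int) -> float:
    """Ideal SNR of a full-scale sine quantized to the given bit depth."""
    return 6.02 * bits + 1.76

print(f"16-bit: {ideal_snr_db(16):.1f} dB")   # ~98.1 dB
print(f"24-bit: {ideal_snr_db(24):.1f} dB")   # ~146.2 dB
```

These are ceilings for the playback chain; the poster's point is that the capture chain rarely gets anywhere near them.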
 
Most likely the glitches are buffer over-/under-runs.
The closer the clock frequency of the player matches that of the DAC, the longer it takes to overrun the buffer.
Because of this issue (it does not happen with USB), the PLL in a DAC matches its clock speed to the clock derived from the source, thereby preventing buffer overruns.
As the PLL is a 'flywheel' type, synced to the incoming clock (derived from the SPDIF stream), that frequency can carry some jitter.
That jitter is removed by a small FIFO clocked by the DAC chip's own clock.
I suspect that clock is being 'replaced' by the external clock.

When the speed of the source clock is too high or too low, buffer over-/under-runs can happen.
I like the comparison with the flywheel, very illustrative.
 
I am glad this discussion is taking place here, and I hope it continues, because there is a lot more to dig into.

I know it is off-topic here, but I am personally more interested in word-clock synchronization of parallel devices in multichannel systems (via an external word clock) and interfaces, from a DIY point of view, and there isn't a lot of information out there on how this is done. One needs to dig into Analog Devices tech notes and schematics of their DSP evaluation boards; that is a trove of material, but not easy for me to understand.
As already mentioned, when one uses a master clock reference, inevitably, samples will be dropped.
 
I am glad this discussion is taking place here, and I hope it continues, because there is a lot more to dig into.

I know it is off-topic here, but I am personally more interested in word-clock synchronization of parallel devices in multichannel systems (via an external word clock) from a DIY point of view, and there isn't a lot of information out there on how this is done. One needs to dig into Analog Devices tech notes and schematics of their DSP evaluation boards; that is a trove of material, but not easy for me to understand.
As already mentioned, when one uses a master clock reference, inevitably, samples will be dropped.
You could daisy-chain the word clock output from one device to the next; this method works fine if you don't have many pieces of equipment to synchronize.
The best approach is to use a word clock distribution amplifier, such as this one from Ross Video.

Some advice:
Use proper video cable with 75 Ohm connectors and wire; don't mix in 50 Ohm BNC connectors. BNC connectors must be crimped, not soldered.
Always put a 75 Ohm terminator on the last output.
It's good practice to monitor the word clock output with an oscilloscope.
The square signal should be pristine, with minimal noise; measure both the output of the clock and the last output.
 