
CD player: measurement of the accuracy of the clock

Vintage02

Hi,

I humbly open this topic in order to share with you an aspect that can easily go unnoticed in the specifications of a CD player or even a DAC.

Having NTTY (Florent) among my acquaintances, and thanks to his test CD 7.2, a Focusrite Scarlett 4i4, and the REW software, I have access to an effective measurement tool. I also share his interest in CD players.

I retrieved a broken Technics SL-PG480A CD player. It would power on but no longer recognized CDs. A quick glance at the service manual showed that the laser head was a CDM12 / VAM1201, which unfortunately has a nasty tendency to age poorly, due to a laser diode with poor durability. Fortunately, it is possible to find new, identical replacement heads for around 10€ ($12), and the replacement is relatively simple to perform. But that is not the subject: once the CDM12 was changed, the player recognized CDs again, so I decided to take some measurements.

On the NTTY test CD 7.2, track 32 allows for measuring the accuracy of the clock.
Is it important, you may ask?

I noticed, while recording a CD track via Audacity, that although the duration displayed on the player matched the one printed on the cover, the actual duration of the recording was different. Looking into why, I found it was due to drift in the CD player's internal clock.
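To illustrate that observation with some back-of-the-envelope arithmetic (a hypothetical 4-minute track, not a specific measurement from this thread):

```python
# Hypothetical illustration: how a clock error changes the recorded
# duration of a track. A player whose master clock runs fast plays
# samples faster, so the captured recording of the track is shorter.
def recorded_duration(nominal_s: float, ppm: float) -> float:
    """Duration observed when the player's clock is off by `ppm`."""
    return nominal_s / (1 + ppm / 1e6)

# A 4-minute (240 s) track on a player running +1271 ppm fast:
fast = recorded_duration(240.0, 1271.0)
print(round(240.0 - fast, 3))  # shortfall in seconds, ~0.305 s
```

A third of a second on a 4-minute track is easily visible when comparing a recording's length against a rip, even though it is hard to hear.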

Track 32 tests the 'pitch error', which is the accuracy of the clock: it plays a sinusoidal signal at 0 dBFS at precisely 19997 Hz.

Here is what the measurement looks like with the original clock:

[Image: 32-SL-PG480A 32 ceramique.jpg]


The measurement shows that the 19997 Hz tone is reproduced at 20022.42 Hz, a difference of +25.42 Hz, corresponding to +1271.19 ppm, which to me is excessive drift.

Looking again at the service manual, we see that there is a ceramic oscillator at 16.9344 MHz (X701), and without much questioning we are sure we have found the culprit. Indeed, ceramic oscillators are not very precise and, above all, tend to drift over time. Ceramic oscillators come with 3 pins or with 2 pins; here we are dealing with a 2-pin part, which makes it even easier to replace with a crystal of the same frequency. I therefore intervened on the board and replaced the oscillator with a crystal.

Here is what the measurement with the quartz crystal yields:

[Image: 32-SL-PG480A 32 quartz.jpg]


We can see that now the 19997 Hz tone is reproduced at 19996.94 Hz, a difference of -0.06 Hz, corresponding to -3.00 ppm, which is truly excellent.
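For reference, the ppm figures above boil down to a one-line formula; here is a small Python sketch reproducing both numbers from this thread:

```python
# Pitch-error arithmetic for the clock-accuracy test: track 32 of the
# test CD plays a 19997 Hz tone, and the error in ppm is simply the
# relative frequency deviation times one million.
def pitch_error_ppm(measured_hz: float, nominal_hz: float) -> float:
    return (measured_hz - nominal_hz) / nominal_hz * 1e6

print(round(pitch_error_ppm(20022.42, 19997.0), 1))  # ceramic: 1271.2 ppm
print(round(pitch_error_ppm(19996.94, 19997.0), 1))  # quartz:  -3.0 ppm
```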

I don't know whether the improvement in clock accuracy will have any impact on listening, but intellectually and technically I am satisfied with it.
Following my intervention, the clock accuracy may even be better than it was when the device left the factory.

Thank you for reading me and thanks to Florent for his test CD and all his advice.

(I used a translation tool because my English is a bit rusty, I am writing to you from France.)
 
That is much, much better indeed; I'd like to measure 3 ppm every time :)
I experienced the same with the Yamaha CDX-393. In my case I got "only" 70 ppm as the end result, which is good enough for audio, but it could have been improved by paying more attention to the load capacitance of the crystal.
 
If the clock is off, the pitch and tempo will be off. But it has to be off by quite a bit before the "average listener" notices anything. If it's off by enough, a musician trying to play along might be out of tune with the recording. It's no problem for a singer, because they will simply sing along in tune with what they hear.

Some consumer soundcards are off by enough to cause problems for musicians trying to play in tune. But instruments aren't tuned "perfectly" either. I'd bet the average quartz crystal is a lot more accurate than the average piano, the timing/tempo more accurate than an old-fashioned metronome, and far more accurate than a conductor's sense of timing.
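To put clock ppm in musical terms: a frequency error in ppm converts to cents (hundredths of a semitone) via 1200·log2(1 + ppm/1e6). A quick sketch (the 1271 ppm figure is from the first post; the rest is illustration):

```python
import math

# Convert a clock error in ppm to a musical pitch error in cents
# (100 cents = one semitone, 1200 cents = one octave).
def ppm_to_cents(ppm: float) -> float:
    return 1200 * math.log2(1 + ppm / 1e6)

print(round(ppm_to_cents(1271.0), 2))  # ceramic oscillator: ~2.2 cents
print(round(ppm_to_cents(3.0), 4))     # quartz: ~0.005 cents
```

Even the ceramic oscillator's sizeable 1271 ppm error amounts to only about 2 cents of pitch shift, which is why pitch alone rarely gives these errors away.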

Usually it's the pitch that's noticed first. A common scenario is two collaborating musicians using different computers/equipment, and when they try to mix, somebody is off-key. Or sometimes they record vocals with a USB mic while listening to a backing track from their soundcard. They are singing in tune with the backing track, but when mixed it's off, because the clock in the USB device doesn't match the clock in the soundcard.

Or if two devices are used to record a concert, they may not be far enough off for a noticeable pitch error, but by the end of the concert they are out of sync with each other.
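As a rough illustration of that drift (hypothetical numbers, not measurements from this thread):

```python
# Two recorders whose clocks differ by some ppm slowly drift apart:
# the accumulated offset is simply elapsed time times the relative error.
def drift_seconds(elapsed_s: float, ppm_difference: float) -> float:
    return elapsed_s * ppm_difference / 1e6

# Over a 2-hour concert (7200 s), a 50 ppm clock mismatch gives:
print(round(drift_seconds(7200, 50), 2))  # 0.36 s offset at the end
```

A 50 ppm mismatch is far below any pitch-audibility threshold, yet a third of a second of misalignment between two recordings is glaringly obvious.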

Pros often use a master clock, sometimes an atomic clock, along with equipment that has an external clock input. Besides keeping the time accurate, multiple devices can be synchronized to the exact sample-clock count.
 
Bonjour Florent,

I didn't expect to get 3 ppm... We will see what influence this may have on the other measurements.
 
Considerations of music production aside, how does the clock accuracy of domestic CD players have any relevance to what one hears? We've been using turntables for over 100 years, and a speed accuracy of 0.1%, i.e. 1000 ppm, is perfectly acceptable. Even the best turntables struggle with 0.01%, or 100 ppm. Ditto the tape machines that were used for the original recordings, then the different machines used for subsequent editing and mastering. What the overall speed error is from first capture to final as-issued recording is anyone's guess. (There were the well-known speed errors on Miles Davis's Kind of Blue.)

Do I care if my CD plays back 0.1% fast? Would I even notice?

And don't get me started on jitter...

S.
 
Let's just say that as clock errors accumulate, we risk arriving at something that could begin to be audible.
I had already noticed, when recording via Audacity a song played from a CD player with clock drift, that the duration was different from that of the same song ripped.

Otherwise, when technology allows it, why deprive yourself of it... personally, even if it changes nothing in listening, I prefer 3 ppm to 1271 ppm.
 
I respect every opinion but please note music is not physics. Music that people enjoy is art.
That is the only approach that leads to pleasure.

Long before we had digital Arthur Rubinstein praised the option people have to enjoy music in the comfort of their home at the moment it was convenient for them to listen to recordings.

CD technology as registered in the Red Book came a decade too early for the hardware that was available.
CDs have audible limitations. Still, the average reproduction quality was improved over vinyl.
Please note I say average: it means many listeners were able to enjoy an improvement in reproduction quality for a modest investment.

Those who were after optimum reproduction quality were prepared to pay more for improvements.
The music industry invested large sums to improve the quality of recordings.

I had more than a fair share in the improvement of recordings. I admit even leading industries made some large errors in that process.
It is a learning curve where the effort and talent of gifted engineers needs feedback from balance engineers and producers to improve the quality of recordings. It took several decades to develop a good communication between those disciplines.

Enjoy music!
 
Also consider that most audio sales are generated by fantasy rather than fact, let alone art. Chasing imaginary things in the name of audio, meh. Expanding on your actual, useful (and provable) contributions would perhaps be more interesting than your general assertion.
 
I do not follow your suggestion that most audio sales are based on fantasy.
Improvements can only be made through the hard work of people interested in improving the quality of recordings and replay equipment.
If you are not interested in, or able to enjoy, this effort, keep the money in your pocket and do not spend it on equipment.
 
It would be difficult to spot unless doing a direct A/B comparison.

But why would we not care? All things being otherwise equal… going from 1000 ppm to 3 ppm is a tangible improvement, as a THD improvement from -90 dB to -110 dB would be. Do we hear it? Well… :)

Funny you mentioned jitter: it is actually while running hearing-threshold tests of different flaws with @Vintage02 that we discovered this unrelated issue. For jitter, we repeated the exact same tests as described in this AES convention paper: Theoretical and Audible Effects of Jitter on Digital Audio Quality. Measurements were ugly well before we could hear anything. More info here.

There is quite a gap between what we can measure and what we can hear :) But when we measure a tangible improvement, I think it's cool.
 
Perhaps you should found a new community: audioartreview dot com
 
Thank you for this suggestion.
As it is I have enough to handle here.
 
Indeed. And perhaps 40 years ago I would have agreed with you. However, certainly in the past 25 years if not longer, I've only been interested in technical improvements I can hear. One of the benefits of having been in the broadcast industry pretty much all my working life is the pragmatism that industry adopts. As an example, in 2017 the BBC streamed the Proms in FLAC rather than the 320 kbps AAC of previous years. Research done after the Proms indicated that the change wasn't audible, at least not to the vast majority of listeners, so the BBC took the pragmatic decision not to increase their streaming costs, and stuck with AAC rather than moving to FLAC. The HiFi community was unhappy with this, but nobody else cared, or even noticed.

Similarly, I couldn't give two hoots for an amplifier with THD at -110 dB compared with one at -80 dB. That's already 40 dB better than audible on programme material, and 20 dB better than it ever needs to be. A lot of today's specs come about because the devices used achieve them; it's hard now for them not to. No problem with that, but for equipment to be marked down in reviews because it 'only' does -90 dB is, to me, idiocy, when its other qualities, like reliability (harder to quantify in a review, I accept), are barely mentioned.

S.
 
Knowing the BBC, I guess it was indeed taken into consideration after extensive testing. The pro world has other needs to satisfy, starting with not spending when the spend can be avoided or reduced.

As for the rest, I think it's good that we continue to chase innovation, even if we don't directly (audibly) benefit from it. If we stop the chase (or I should say the "trust and verify"), then we'll see bad engineering coming back very quickly, not that it's completely gone anyway.

I agree on the reliability side, we lack data. That said, in today's world we replace equipment because we want to (new toy, new feature, new look, …), much less because we need to (wear, failure, obsolescence, …).
 
Good evening,
(8:15 pm in France)

[Image: 19-SL-PG480A 32 quartz.jpg]


The excellent precision of the clock is confirmed: the 999.91 Hz signal is reproduced at 999.91 Hz. This will not tell us whether the sound will be good, but this technical data pleases me.

For those who would like to hear what this player sounds like, here is a recording:

00 - SL-PG480a - Diana Krall - Peel Me A Grape (24-48)
 