
Hardware Mods - the ugly story

andreasmaaan

Master Contributor
Forum Donor
Joined
Jun 19, 2018
Messages
6,652
Likes
9,406
Yes, as of about 1985 or so. Single chips from the last 30-35 years have two separate D/A converters.

Generally, the "problem" was just a constant time delay in one channel, which wasn't really a problem, since it could literally be fixed by shifting one speaker a couple of millimeters. The first CD players also only had 14 bit converters, but again, that "problem" was solved within a year or two as new chips became available. That could be audible as a slightly increased noise floor if you boost the volume during extremely quiet passages.

Aaaaaah ok. This is very clear now, thanks!
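As a rough sanity check of that "couple of millimeters" claim, here is a minimal sketch (assuming the ~11.3 µs interchannel delay discussed later in the thread and a speed of sound of 343 m/s):

```python
# Rough check: what speaker (or head) offset corresponds to the interchannel
# delay of an early single-DAC CD player?
# Assumptions: delay = half of a 44.1 kHz sample period (~11.34 us), c = 343 m/s.

SPEED_OF_SOUND = 343.0        # m/s at room temperature
delay = 0.5 / 44_100          # seconds, ~11.34 us

offset_mm = SPEED_OF_SOUND * delay * 1000
print(f"delay = {delay * 1e6:.2f} us -> equivalent offset = {offset_mm:.1f} mm")
# Prints roughly: delay = 11.34 us -> equivalent offset = 3.9 mm
```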
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,705
Likes
38,855
Location
Gold Coast, Queensland, Australia
The first CD players also only had 14 bit converters

Not quite.

Sony had the 16 bit CX-20017 D/A from day one, released in Japan (Oct 1, 1982) six months before the official worldwide release (March 1st, 1983). Philips had the 14 bit 4x OS filter chipset & TDA1540 ceramic pack. According to Philips literature I have, the ENOB was 15.6 bits with the 4x OS chipset.

Philips weren't ready for the official release, so Sony agreed to delay it and, as a consolation prize, released only in its home market. As such, Sony gained a massive first-mover advantage. Toshiba had existing 14 bit D/As, which it supplied to the low end.
 
Last edited:

SIY

Grand Contributor
Technical Expert
Joined
Apr 6, 2018
Messages
10,506
Likes
25,336
Location
Alfred, NY
I was thinking of the Philips CD100, which used the 14 bit converter. The CDP101 was, as you say, 16 bit from the get-go.
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,705
Likes
38,855
Location
Gold Coast, Queensland, Australia
The CDP101 was, as you say, 16 bit from the get-go.

I'd really like to see a linearity or stair-step measurement for one of my CDP-101s. My CBS1 test disc is no longer with me, unfortunately. :( I have Denon/Sony test discs, but the low-level ILSB tests bottom out at -60 dB.

I have read that the consistency of the CX-20017 was poor, with considerable variation between chips. Yields apparently improved as time went on (or demand for product meant fewer were 'rejected'...)
 

Jakob1863

Addicted to Fun and Learning
Joined
Jul 21, 2016
Messages
573
Likes
155
Location
Germany
@restorer-john,

It is not a 'given', never was.

What 100% functional 1st-gen machines do you have that can demonstrate those 'audible differences'?

Although it is only loosely related, even the ABX crew found in 1995 that they could differentiate between a Philips first-generation player and a later, more sophisticated Sony player:
http://djcarlst.provide.net/abx_cd.htm

Wrt my statement, imo it isn't important whether _I_ hear something, although my very first CD player was indeed the CDP 101.


A single D/A converter does not and did not mean a constant 11.3 µs interchannel time delay.

Some machines, including the CDP-101, attempted to adjust time constants in the S/H to minimise that time delay; consequently, back in the day, decent reviews typically tested interchannel phase differences at various spot frequencies up to 20 kHz and displayed them as a Lissajous CRO image.

Single D/A machines were not all the same in that regard.

You are surely confusing the CDP-101 with another one, as you already stated before (correctly) that it uses a two-channel DAC (i.e. the CX20017).
It might be that some were different in this regard.

@andreasmaaan,

I was well aware of the research showing audibility of very small ITDs, but in fact it is news to me that any stereo DAC that uses a single DAC IC for both channels has an inherent ITD of 11.3µs.

This is probably not the thread to delve into this directly, but if you would be able to quickly link some further info for me to do my own reading, I'd much appreciate it.

Andreas

Not much to delve into, but see, as a typical example, a review of the Mitsubishi DP-103 in Stereo Review:

[Attachment: Mitsubishi_DP103_1.gif, interchannel phase-shift measurement]

(Source: Stereo Review, January 1984, page 45)

They measured the phase shift, and it corresponds quite well to the delay mentioned.
A constant delay produces a phase angle that increases linearly with frequency. Formula: phi = 360 × f × td (phi is the phase angle in degrees, f the frequency in Hz, and td the delay in seconds)
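
As a quick illustration of that formula, a small sketch (assuming a delay of half a 44.1 kHz sample period, i.e. the single-DAC case discussed above):

```python
# Interchannel phase angle from a constant time delay:
#   phi = 360 * f * td   (phi in degrees, f in Hz, td in seconds)

td = 0.5 / 44_100   # ~11.34 us, half of one 44.1 kHz sample period

for f in (1_000, 5_000, 10_000, 20_000):
    phi = 360 * f * td
    print(f"{f / 1000:>4.0f} kHz: {phi:5.1f} degrees")

# Roughly 4, 20, 41 and 82 degrees: a phase shift that grows linearly with
# frequency, in the same ballpark as the plots published in such reviews.
```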
 

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,894
Likes
16,705
Location
Monument, CO
Interesting. At 44.1 kS/s the sampling period is 22.7 us so I would have expected there to be a one-sample delay from one side to the other. The data indicates about 1/2 that time period, so they must be using an output T/H to get down to 11.35 us or so. Been a long, long time since I read any technical details about those early converters.
 

sergeauckland

Major Contributor
Forum Donor
Joined
Mar 16, 2016
Messages
3,460
Likes
9,158
Location
Suffolk UK
Not much to delve into, but see, as a typical example, a review of the Mitsubishi DP-103 in Stereo Review:

[Attachment: Mitsubishi DP-103 interchannel phase-shift measurement (Source: Stereo Review, January 1984, page 45)]

They measured the phase shift, and it corresponds quite well to the delay mentioned. A constant delay produces a phase angle that increases linearly with frequency. Formula: phi = 360 × f × td (phi is the phase angle in degrees, f the frequency in Hz, and td the delay in seconds)
But exactly the same thing happens with any two-channel source unless your head is clamped rigidly. ANY movement off the centre line will create phase differences in a central image. The interchannel time difference between your ears creates a dip in response around 2 kHz, and yet we're perfectly happy with that.
That report got it right: there is no audible effect in stereo. There IS an effect for a mono listener, which is why radio stations go to a lot of trouble to have identical phase shift between the two channels for mono compatibility, but at home, in stereo, there is no issue whatsoever.

S
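
To put a rough number on the mono-compatibility point, a small sketch (assumptions: equal-level channels summed to mono with a fixed ~11.34 µs interchannel delay; not a measurement of any particular player):

```python
import math

# Summing L and R to mono with a fixed interchannel delay td acts as a comb
# filter. For equal-level channels the loss relative to a perfectly aligned
# sum is 20*log10(|cos(pi * f * td)|).

td = 0.5 / 44_100   # ~11.34 us

for f in (1_000, 10_000, 20_000):
    loss_db = 20 * math.log10(abs(math.cos(math.pi * f * td)))
    print(f"{f / 1000:>4.0f} kHz: {loss_db:6.2f} dB")

print(f"first comb-filter null at about {1 / (2 * td) / 1000:.1f} kHz")
# Roughly -0.01 dB at 1 kHz, -0.6 dB at 10 kHz, -2.4 dB at 20 kHz, with the
# first null at 44.1 kHz, above the audio band.
```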
 

Jakob1863

Addicted to Fun and Learning
Joined
Jul 21, 2016
Messages
573
Likes
155
Location
Germany
<snip> The only difference was the fixed interchannel time difference, which was comparable to moving one's head a few millimeters.

<snip>
S

<snip>

Generally, the "problem" was just a constant time delay in one channel, which wasn't really a problem, since it could literally be fixed by shifting one speaker a couple of millimeters. <snip>

Both, "moving one´s head a few millimeters" and "wasn´t really a problem, since it could literally be fixed by shifting one speaker a couple of millimeters" are from the ad hoc argument type and seem to be reasonable, but a headphone user might have thought differently about these explanations/solutions. :)

But it misses the point which was the asserted impossibility of any audible difference......
 

Kal Rubinson

Master Contributor
Industry Insider
Forum Donor
Joined
Mar 23, 2016
Messages
5,303
Likes
9,865
Location
NYC
Interesting. At 44.1 kS/s the sampling period is 22.7 us so I would have expected there to be a one-sample delay from one side to the other. The data indicates about 1/2 that time period, so they must be using an output T/H to get down to 11.35 us or so. Been a long, long time since I read any technical details about those early converters.
If the sampling period for a stereo sample is 22.7 us, then the interchannel sample delay is half that: 11.35 us. At least, that is how I understood it.
 

Jakob1863

Addicted to Fun and Learning
Joined
Jul 21, 2016
Messages
573
Likes
155
Location
Germany
If the sampling period for a stereo sample is 22.7 us, then the interchannel sample delay is half that: 11.35 us. At least, that is how I understood it.

That's correct. The CD delivers a serial data stream that contains both channels' data; if processed in real time, it leads to the delay mentioned for the single-DAC-IC case.
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,705
Likes
38,855
Location
Gold Coast, Queensland, Australia
You are surely confusing the CDP-101 with another one, as you already stated before (correctly) that it uses a two-channel DAC (i.e. the CX20017).

I don't confuse the CDP-101 with anything. I also never stated it had a two-channel D/A converter. The CX-20017 is a multiplexed, single D/A. It switches between channels, one after the other.

By all means, study the schematic below, and don't trust IC descriptions on the internet; they are often incorrect and, sadly, repeated ad infinitum.

[Attachment: cdp-101.JPG, schematic section of the CDP-101 D/A and output switching]


Note in the above schematic section from the CDP-101 the L/R clock signal, which feeds the CMOS CD-4053 switch. Note also that it alternates the channels on a 22 µs period (11 µs intervals). Also note that R530 and R531 have slightly different values, hence a slight change in the time constant, an attempt to reduce the phase delay at higher frequencies.

These are scope captures of the left and right channels of one of my CDP-101s, showing the interchannel phase difference at 20 kHz from the multiplexed single D/A:


[Attachment: 20KHz.jpeg, left and right channel traces at 20 kHz]


[Attachment: 20KHz liss.jpeg, Lissajous display at 20 kHz]


Don't take the L/R current outputs from the CX-20017 as implying there are two 16-bit R-2R or ladder networks. There is only one current source.
 
Last edited:

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,894
Likes
16,705
Location
Monument, CO
If the sampling period for a stereo sample is 22.7 us, then the interchannel sample delay is half that: 11.35 us. At least, that is how I understood it.

OK, I understand the incoming bit stream is at twice the rate, but the fundamental sampling rate is 44.1 kS/s per channel (22.7 us/sample), so there must be something at the output of the DAC (like a T/H circuit) to hold the output for the full sampling period. Then you could ping-pong the output between the two T/H's to reduce the lag to 11.35 us (or whatever). It was stated earlier that there was such a circuit so I think I'm good.

Thanks Kal - Don
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,705
Likes
38,855
Location
Gold Coast, Queensland, Australia
Don, don't forget the single D/A is processing two channels' worth of data, so 88.2 kS/s, toggling between channels every 11 µs.

Also, see IC 508/09 in the schematic above.

:)
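
A small timing sketch of that arrangement, purely illustrative (assuming the left channel is converted first; not taken from the actual CDP-101 logic):

```python
# One multiplexed DAC running at 88.2 kS/s, alternating L and R every half
# sample period, while each channel's sample-and-hold keeps its value for the
# full 22.68 us. Every right-channel update lands one slot after the left.

FS = 44_100                       # per-channel sample rate
slot = 1 / (2 * FS)               # conversion slot, ~11.34 us

for n in range(3):                # first few stereo samples
    t_left = 2 * n * slot         # assumed: left channel converted first
    t_right = (2 * n + 1) * slot  # right channel follows one slot later
    print(f"sample {n}: L at {t_left * 1e6:6.2f} us, "
          f"R at {t_right * 1e6:6.2f} us, skew {(t_right - t_left) * 1e6:.2f} us")
# Each channel still updates every 22.68 us, but the right channel always lags
# the left by ~11.34 us: the constant interchannel delay discussed above.
```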
 

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,894
Likes
16,705
Location
Monument, CO
Duh. Thanks. Analog guy, so despite decades of experience designing data converters, factors of two still befuddle me. :oops:

I saw your post after I had written mine; at least I didn't get the hold circuit wrong.

2 + 2 = 5 for very large values of 2.
 

egellings

Major Contributor
Joined
Feb 6, 2020
Messages
4,064
Likes
3,309
"POOGE" came for a DIY Magazine call The Audio Amateur. It's an acronym for "Progressive Optimization Of Generic Equipment". TAA published lots of mods that were done on existing equipment, especially the analog sections of early CD players.
 