
Battle of S/PDIF vs USB: which is better?

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
The question you raise is a potential blow to many tests. In quantitative science, and the softer sciences as well, you try to control for other factors to distill an essence that consists of only one ingredient.

When a cable goes through a (black...) box, you lose control in the strict statistical sense. Alas, even the connectors, male and female, complicate the picture.

So what is a cable tester to do?

Do test equipment and procedures exist that test the cable only, with and without connectors?
If we consider S/PDIF or isochronous USB and how we would build a DAC to work with it, we can imagine all kinds of schemes that would have various limits on their performance.

For example, if we knew that the source clock was likely to be within a known fraction of our sample clock, we could half fill a buffer and allow it to drain or accumulate over a listening session, ending up with a second of latency, say. But no clock adjustments would need to have been made so the system would be as 'pure' as asynchronous USB.
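As a rough back-of-envelope illustration of that scheme (all numbers here are made up for the example, not from any specific DAC), a Python sketch of how long a half-filled buffer can absorb a fixed clock offset:

```python
# How long can a half-filled buffer absorb a source/DAC clock offset
# before it runs empty or full? Illustrative numbers only.

fs = 44_100           # nominal sample rate, Hz
offset_ppm = 75       # source clock offset relative to DAC clock
buffer_seconds = 1.0  # total buffer; half full = 0.5 s headroom each way

# Samples gained or lost per second of playback:
drift_per_second = fs * offset_ppm / 1e6

# Seconds of playback before the 0.5 s of headroom is exhausted:
headroom_samples = (buffer_seconds / 2) * fs
session_limit = headroom_samples / drift_per_second
print(f"drift: {drift_per_second:.2f} samples/s, "
      f"buffer lasts ~{session_limit / 60:.0f} minutes")
```

With these particular numbers the half-full buffer survives just under two hours, so the "one listening session with no clock adjustments" idea is plausible when the offset is known to be small.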

But if we can't guarantee that then we need a different approach. What latency can we accommodate? Do we change the DAC's timing, or resample the audio? Do we perform corrections all the time, or larger corrections occasionally? Do we attempt to ascertain the source clock average frequency over several minutes and attempt to emulate it, changing our PLL's time constant over time? What if the source suddenly changes its frequency by a non-negligible amount? How do we cope with that in such a scheme? etc. Various schemes will respond to jitter in the source/cable differently, with systems that appear marginally 'worse' on some measures actually behaving better in other circumstances.
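One of the schemes asked about above, estimating the source clock's average frequency and pulling the DAC rate toward it, can be sketched as a toy simulation (all constants are hypothetical, and jitter is crudely modelled as uniform noise):

```python
# Toy model: estimate the source rate from samples arriving each
# second, then pull the DAC rate toward that estimate with a
# single-time-constant loop. Hypothetical constants throughout.
import random

true_source = 44_100 * (1 + 75e-6)   # source runs 75 ppm fast
fs_dac = 44_100.0                    # DAC starts at its nominal rate
alpha = 0.1                          # loop "time constant" knob

random.seed(0)
for second in range(120):
    # samples counted this second, +/- a few samples of jitter:
    arrived = true_source + random.uniform(-3, 3)
    fs_dac += alpha * (arrived - fs_dac)  # drift toward source rate

print(f"DAC rate: {fs_dac:.2f} Hz vs source {true_source:.2f} Hz")
```

A smaller `alpha` rejects more of the incoming jitter but takes longer to follow a source whose frequency suddenly changes, which is exactly the trade-off the questions above describe.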

Only by knowing what the DAC is trying to do can we hope to understand what the measurement means. Measurements alone can't tell us what the system will do in the case of a source whose clock has non-ideal characteristics - a clever scheme that works with a good source may fall over very badly with one whose frequency is off by more than a certain amount.

Basically, measurements don't really tell us anything definitive or lift the lid on what the DAC is doing. My take on it is that knowing how the system works, combined with very low level measurements of eye patterns or what have you would allow us to simulate the performance in software which would be much more useful.
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
If we consider S/PDIF or isochronous USB and how we would build a DAC to work with it, we can imagine all kinds of schemes that would have various limits on their performance.

For example, if we knew that the source clock was likely to be within a known fraction of our sample clock, we could half fill a buffer and allow it to drain or accumulate over a listening session, ending up with a second of latency, say. But no clock adjustments would need to have been made so the system would be as 'pure' as asynchronous USB.

But if we can't guarantee that then we need a different approach. What latency can we accommodate? Do we change the DAC's timing, or resample the audio? Do we perform corrections all the time, or larger corrections occasionally? Do we attempt to ascertain the source clock average frequency over several minutes and attempt to emulate it, changing our PLL's time constant over time? What if the source suddenly changes its frequency by a non-negligible amount? How do we cope with that in such a scheme? etc. Various schemes will respond to jitter in the source/cable differently, with systems that appear marginally 'worse' on some measures actually behaving better in other circumstances.

Only by knowing what the DAC is trying to do can we hope to understand what the measurement means. Measurements alone can't tell us what the system will do in the case of a source whose clock has non-ideal characteristics - a clever scheme that works with a good source may fall over very badly with one whose frequency is off by more than a certain amount.

Basically, measurements don't really tell us anything definitive or lift the lid on what the DAC is doing. My take on it is that knowing how the system works, combined with very low level measurements of eye patterns or what have you would allow us to simulate the performance in software which would be much more useful.

Some DACs have filtering, reclocking and galvanic isolation; some don't, and some have certain features but lack others. Predicting how this can and will influence the measurable output is a tall order.

So you can’t easily dig out a cable’s contribution to the output.

I guess well-engineered digital boxes are less influenced by badly engineered digital cables or cables that are prone to jitter.
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
Some DACs have filtering, reclocking and galvanic isolation; some don't, and some have certain features but lack others. Predicting how this can and will influence the measurable output is a tall order.

So you can’t easily dig out a cable’s contribution to the output.

I guess well-engineered digital boxes are less influenced by badly engineered digital cables or cables that are prone to jitter.
Filtering and so-called reclocking are what I'm talking about. I think people have naive ideas of what's going on and aren't thinking it through. There is no magic system. You have a bit stream being sent at f1 with jitter for various reasons (including the cable), and a DAC that is playing samples at f2 that you may control - at the expense of generating your own jitter. You can't 're-clock' without accommodating the frequency difference - which will drift - or allowing long latency and a buffer that drains or fills until it's empty or full whereupon something drastic is needed. Otherwise, at some level you have to measure the incoming sample frequency and this will never be precisely accurate because of resolution limits and jitter.
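The resolution limit on measuring the incoming rate can be made concrete (illustrative numbers, not from any real receiver): counting whole sample edges over a finite window quantizes the estimate.

```python
# Why the measured incoming rate is never exact: counting source
# sample edges over a 1 s window only sees whole edges, so the
# estimate is quantized. Illustrative numbers only.

fs_source = 44_103.3   # actual incoming rate (~75 ppm above 44,100)
window = 1.0           # measurement window, seconds

edges = int(fs_source * window)   # the fractional 0.3 edge is lost
estimate = edges / window         # 44,103 Hz

resolution_ppm = (1 / window) / fs_source * 1e6
print(f"estimate {estimate:.0f} Hz, "
      f"resolution ~{resolution_ppm:.0f} ppm per {window:.0f} s window")
```

A one-second window only resolves the rate to roughly 23 ppm; a longer window resolves it more finely but reacts more slowly, so the measurement is always a compromise.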

Asynchronous packet-based systems dispense with all that.
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,792
Likes
37,695
I don't know if people think about the effects of trying to sync two digital clocks. In my posted example of jitter, the clocks in the ADC and DAC differed by 75 ppm, and it isn't uncommon for digital devices to have clocks that differ by 200 ppm. Using the 75 ppm example: a 3 minute track at a 44,100 Hz sample rate has nearly 8 million samples. A clock difference of 75 ppm means that when the faster clock is finished after 3 minutes, the slower one will be nearly 600 samples behind, over 13 milliseconds' worth. You have to sync the clocks or buffer, or digital signal transfer never works.
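That arithmetic can be checked directly in a couple of lines of Python:

```python
# Cumulative drift between two clocks differing by 75 ppm over a
# 3 minute track at 44,100 Hz.

fs = 44_100    # samples per second
minutes = 3
ppm = 75

total = fs * minutes * 60    # total samples in the track
behind = total * ppm / 1e6   # samples the slower clock lags by the end
ms = behind / fs * 1000      # that lag expressed as time

print(f"{total:,} samples, ~{behind:.0f} behind, {ms:.1f} ms")
```

The numbers come out at 7,938,000 samples, about 595 samples of lag, and 13.5 ms, matching the figures quoted above.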

Also, two clocks that differ by 75 ppm at a 44,100 Hz sample rate have sample periods that differ by about 1700 picoseconds, and that ignores the fact that both clocks will also jitter about their average rates. So you have to keep adjusting the timing of at least one clock by that amount on every sample, or buffer enough that grosser adjustments suffice, or let a stop-and-start data flow keep the buffer roughly half full. The various ways of pulling a clock off its natural frequency can themselves produce jitter. It really is amazing how well all of this has been worked out, for pennies, in the various devices.

The asynchronous USB connection is in principle a much better way to do things, because letting the clock at the DAC free-run at its natural frequency divorces it from any upstream effects: the DAC can buffer and call for data over USB as needed. A free-running clock is the lowest-jitter that clock will ever be. That is also why using external high-accuracy clocks for DACs is not going to improve things: by necessity you have to disturb a free-running clock. The accuracy then isn't that of the high-accuracy external clock; it is still limited by the accuracy of the internal clock, plus whatever degradation that clock suffers from being synced to an outside one.
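The 1700 picosecond per-sample figure can be verified the same way:

```python
# Per-sample timing offset between two clocks 75 ppm apart at 44,100 Hz.

fs = 44_100
ppm = 75

period_s = 1 / fs                        # sample period, ~22.68 microseconds
delta_ps = period_s * ppm / 1e6 * 1e12   # per-sample offset in picoseconds

print(f"sample period {period_s * 1e6:.2f} us, offset {delta_ps:.0f} ps")
```

That works out to roughly 1701 ps per sample, consistent with the post above.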
 

3beezer

Member
Joined
Sep 22, 2017
Messages
11
Likes
7
Location
Montana
JA at Stereophile measured the eye pattern of the V-Link via 15 ft of TOSLINK plastic optical cable. Unfortunately he didn't do the same for coax, but there is nothing wrong with the optical result. JA reported the AP's calculated jitter of the datastream as 395 picoseconds. Of course, S/PDIF inputs, optical or coax, are usually able to filter datastream jitter to lower levels than whatever is in the datastream.

[attached image: 511MFVLfig1.jpg]
Blumlein 88, do you understand what JA actually measured? I read his review in Stereophile several times, but I must be stupid. I guess that he used a PC or Mac to generate a J-test signal. He connected the V-Link to a USB output of the computer. Then he says that he measured the eye pattern by connecting a 15' plastic cable from the TOSLINK output of the V-Link... to what? I don't see a TOSLINK input on the AP 2722. This eye pattern is so good that it defies belief. The sample rate of the excitation appears to be 96kHz. If TOSLINK works this well at 96kHz, I guess it would work well at 10x this sample rate -- maybe even 100x -- yet conventional wisdom is that optical cables don't support sample rates in excess of 96kHz. I suspect that he connected the optical cable to a TOSLINK-S/PDIF converter and then connected its S/PDIF output to the AP 2722. If so, I wonder whether the converter filtered out the jitter.
 

Fitzcaraldo215

Major Contributor
Joined
Mar 4, 2016
Messages
1,440
Likes
634
I am gathering from the general tone in this thread that the main and title question has been answered. USB, now almost universally in the asynchronous flavor, seems to have won hands down in a rout. Does anyone dispute that?

Since then, almost all the focus here has been on the much less interesting "debate" about S/PDIF coax vs. TOSLINK, two aged technologies that probably see almost no current investment anywhere in further development toward improved versions. That might have been a hot debate once, maybe two or three decades ago. But you never know; perhaps, like vinyl, they are about to make a big comeback. Hey, retro is cool.

But, why would any but a few dinosaurs care about coax vs. Toslink?
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,250
Likes
17,200
Location
Riverview FL
USB, now almost universally in the asynchronous flavor, seems to have won hands down in a rout. Does anyone dispute that?

USB doesn't fit into my antique gear chain.

If going straight from PC to DAC - it's OK.

I go from TV/HDRadio/CD player (transport)/PC to Optical/Coax Switcher to DEQ2496 to miniDSP OpenDRC-DI to DAC.

Only the PC has a USB out, only the DAC has a USB in.

I don't dispute the advantages and progress USB presents, but I "can't" use it.

But, why would any but a few dinosaurs care about coax vs. Toslink?

That's me!

Out of curiosity only, the optical and/or coax connections I use cause me no consternation.
 

Jinjuku

Major Contributor
Forum Donor
Joined
Feb 28, 2016
Messages
1,279
Likes
1,180
1. Latency is really only a concern in realtime environments utilizing A/D/A. Listening to reproduction equipment doesn't meet that criterion.

2. Most manufacturers don't care about legacy users. I know I wouldn't.
 

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,915
Likes
16,746
Location
Monument, CO
Cable boxes, some streamers, my SONOS:Connect, and a lot of pro audio gear do not have USB output.
 

3beezer

Member
Joined
Sep 22, 2017
Messages
11
Likes
7
Location
Montana
I am gathering from the general tone in this thread that the main and title question has been answered. USB, now almost universally in the asynchronous flavor, seems to have won hands down in a rout. Does anyone dispute that?
As I said in a previous post, I know of DACs for which the sound quality is better using their S/PDIF inputs than it is using their USB inputs. It's likely that the gap is closing with newer models, but until someone has demonstrated that it has closed, we should proceed cautiously. Aside from sonic differences, S/PDIF has other advantages: it is simpler and cheaper, it produces less electrical noise, galvanic isolation is easier, and it can be transmitted (or AES3 can, to be precise) down much longer cables (1000 m, in the case of AES-3id). I dislike S/PDIF philosophically as much as any of you, but declaring USB the winner in a rout may be a bit hasty.
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
As I said in a previous post, I know of DACs for which the sound quality is better using their S/PDIF inputs than it is using their USB inputs.
Really?
 
OP
amirm

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,717
Likes
241,524
Location
Seattle Area
I don't see a TOSLINK input on the AP 2722
My model has it, so I am sure his does too. It is very hard to see, though, as they are tiny connectors that are normally covered. I have circled them in this picture:

[attached image: upload_2017-9-23_18-14-27.png]
 
OP
amirm

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,717
Likes
241,524
Location
Seattle Area
But, why would any but a few dinosaurs care about coax vs. Toslink?
Hey, I was just about to test that, and you tell me there is no use. :)

I'd like to see if this is really true or a myth.
 

Sal1950

Grand Contributor
The Chicago Crusher
Forum Donor
Joined
Mar 1, 2016
Messages
14,213
Likes
16,966
Location
Central Fl
As I said in a previous post, I know of DACs for which the sound quality is better using their S/PDIF inputs than it is using their USB inputs.
Can you supply verifiable evidence of that?
Or is it just a "what I reckon" sort of thing?
 

Sal1950

Grand Contributor
The Chicago Crusher
Forum Donor
Joined
Mar 1, 2016
Messages
14,213
Likes
16,966
Location
Central Fl
But, why would any but a few dinosaurs care about coax vs. Toslink?
That's why there's ethernet.

That's me!
Out of curiosity only, the optical and/or coax connections I use cause me no consternation.

Me too, Ray.
But at the end of the day, after all the jabber about jitter this, noise that, ethernet it and the other:
Can any of y'all provide verifiable evidence that any of the various digital delivery systems sounds different from the others when supplying the same data stream and operating within its proper parameters? Lots of interesting technical details to discuss, but is there anything to be gained in the realm of non-delusional High Fidelity?
 
OP
amirm

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,717
Likes
241,524
Location
Seattle Area
The point here wasn't which sounds better, but rather, given a choice, which one is better from a technical point of view. To the extent you have a choice and one is cleaner than the other, it makes sense to use that.
 

Sal1950

Grand Contributor
The Chicago Crusher
Forum Donor
Joined
Mar 1, 2016
Messages
14,213
Likes
16,966
Location
Central Fl
The point here wasn't which sounds better, but rather, given a choice, which one is better from a technical point of view. To the extent you have a choice and one is cleaner than the other, it makes sense to use that.
I know, Ray and I are both doing just a bit of kidding. Progress can only be made through a scientific approach to investigation.
But we do have to keep in mind not to get too carried away and let ourselves run off the tracks chasing ghosts like the audiophools do elsewhere, listening without controls to their power cords, USB cleaners, grounding boxes and all the rest of the damn foolery. They waste time producing thousands of pages of print and electronic media in a supposed effort to further the SOTA and, whatever the motivation, do absolutely nothing to bring us better HiFi.
 