
Battle of S/PDIF vs USB: which is better?

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
Or alternatively, the idea that USB results in better measured performance is disproven for this device.

What it really means is that jitter over decent S/PDIF and USB is low enough that it isn't a limiting factor in the performance of most DACs. Except for notable failures (like the USB input of the Schiit Modi 2 or the SMSL Sanskrit), jitter is effectively a non-issue.

I have a recording interface that can run from a free-running internal clock fed via ASIO, or take an external clock over Toslink, over ADAT, over coax S/PDIF, or over a BNC coax connection. Fed to my Forte ADC, you get identical results with all of those. Even as close as 0.4 Hz from the main tone the results are the same. The only differences between the connections are in the noise levels below 10 Hz.

Now, it is possible the Forte ADC has higher jitter than everything else and is dominating the result. It has pretty low jitter, however, or it would manifest itself in sum and difference tones and other noise, all of which are very, very low with the Forte.

So, to be thorough, checking the J-test on the various inputs is worthwhile. If you don't get sub-par results, the type of digital connection doesn't matter audibly.

Which is a strong case for flexible, light, smooth Toslink cables.
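
For anyone who wants to run the J-test check themselves, here is a minimal sketch of generating a J-test-style stimulus. This is a simplified take on the Dunn test idea, not the exact standard waveform; the function name and levels are illustrative, and numpy/scipy are assumed:

```python
import numpy as np
from scipy.io import wavfile

def jtest_stimulus(fs=48000, seconds=2.0):
    """Simplified J-test-style signal: a square wave at fs/4 with the
    lowest bit toggled at fs/192. Data-correlated jitter on the link
    shows up as discrete sidebands around the fs/4 tone in an FFT of
    the DAC's analog output."""
    t = np.arange(int(fs * seconds))
    amp = 2 ** 14                            # roughly -6 dBFS for 16-bit
    main = np.where(t % 4 < 2, amp, -amp)    # period of 4 samples -> fs/4
    lsb = (t // 96) % 2                      # LSB square wave at fs/192 (250 Hz)
    return (main + lsb).astype(np.int16)

wavfile.write("jtest_48k.wav", 48000, jtest_stimulus())
```

Play that file over each input in turn and FFT the analog output; the connection with the worst jitter shows the tallest sidebands around the main tone.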
 

Jakob1863

Addicted to Fun and Learning
Joined
Jul 21, 2016
Messages
573
Likes
155
Location
Germany
I find it hard to put myself in the place of a person who has never written any real time software or looked at pulses with an oscilloscope or whatever i.e. typical audiophiles and reviewers. As I said before, shuffling bits around in asynchronous links is supremely trivial. Implementing PLLs and adaptive systems less so - but entirely graspable intuitively. It is possible to grasp that a unidirectional link can never be entirely free of topology-induced jitter, while an asynchronous one is. And also that jitter in the non-asynchronous system can be attenuated and/or 're-distributed' at the expense of some latency.

I worry that people are imagining that digital links are rocket science, or have an element of 'art' in them, and can only be evaluated by listening to string quartets and jazz from DAC outputs. The reality is much, much more prosaic and predictable than that. As someone said earlier, it isn't an audio interface. It's a digital interface.

Obviously statements like "it isn't an audio interface. It's a digital interface" and "shuffling bits around in asynchronous links is supremely trivial" are worrisome. :)
If developers don't keep the goal in mind (which is the analog output signal at the end), they tend to overlook specific factors that can compromise signal quality ("hey, it's just audio, we are not talking about rocket science").

In other words, no matter how simple/trivial something is, it's easy to screw it up if you lose sight of what the goal is. Therefore I often emphasize the importance of evaluation at the analog outputs (be it sensory evaluation or measurement gear) instead of theorizing about things that supposedly can't happen.

To be a bit pedantic: a unidirectional link can of course be entirely free of topology-induced jitter; it is a matter of convenience/complexity or latency....
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
To be a bit pedantic: a unidirectional link can of course be entirely free of topology-induced jitter; it is a matter of convenience/complexity or latency....
Ah, you see, that's the sort of thing I'm referring to. You have to think it through. The only way you can avoid introducing jitter is to allow for infinite latency. As described in an earlier post, any other scheme is going to require adjustment (i.e. at least momentary jitter) at some point. Or some Heath Robinson scheme to flush a FIFO in between tracks detected as digital silence or whatever - which may never happen.
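
To put rough numbers on why anything short of infinite latency eventually needs an adjustment, here is a back-of-envelope sketch (the figures are illustrative, not from any particular product):

```python
def hours_until_endstop(buffer_seconds, mismatch_ppm):
    """Time until a FIFO pre-filled to the halfway mark underruns or
    overruns, if the receiver free-runs and never adjusts its clock.
    mismatch_ppm is a fixed source/DAC clock-rate difference."""
    headroom = buffer_seconds / 2.0          # slack on each side of half-full
    drift_per_second = mismatch_ppm * 1e-6   # audio seconds gained/lost per second
    return headroom / drift_per_second / 3600.0

# A generous 1-second FIFO with an 80 ppm mismatch (the order of
# magnitude measured later in this thread) survives under two hours:
print(hours_until_endstop(1.0, 80))          # ~1.7 hours, then something drastic
```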
 

3beezer

Member
Joined
Sep 22, 2017
Messages
11
Likes
7
Location
Montana
If you have a DAC that performs better with S/PDIF, then there is something wrong with its USB implementation.
That's a nice way of restating my point. USB has the theoretical advantage of putting the clock in the right place, but there are other design issues that can undermine this advantage if executed incorrectly. The topological correctness of USB does not automatically assure superior performance. Nor does the topological incorrectness of S/PDIF automatically assure inferior performance -- at least not from the standpoint of perceptible distortion. Benchmark claims that their DAC keeps jitter-induced distortion below -135dB for all digital inputs. As I said, I think that all products are evolving to provide equal performance regardless of the digital interface, just as the Benchmark (and others, I'm sure) do now. I am not the least surprised that Amir found two more DACs for which performance is better using USB. I suspect that it is more common for that to be the case than the reverse. My point was simply that the reverse does seem to occur, though with a frequency that I don't know. Accordingly, it seems prudent in the meantime for people who care to verify, either with objective measurements (as suggested by Blumlein88) or with careful listening tests, whether their DAC is like the few I found for which sound quality is better using S/PDIF.

The inferiority of USB in some implementations could be related to the issue of galvanic isolation. I know that one of the DACs I tested does not provide galvanic isolation on its USB interface. Fitzcaraldo215, do you have any idea how your DAC implements galvanic isolation -- or did you galvanically isolate it using an external device? Vincent has a great illustration of the effectiveness of galvanic isolation (http://thewelltemperedcomputer.com/Intro/SQ/GalvanicIsolation.htm). Does anyone know how Exasound implements galvanic isolation? Galvanic isolation is easy with S/PDIF, although the requisite pulse transformers are somewhat expensive. Because USB is bidirectional and fast, it is difficult to implement galvanic isolation at the input, where it belongs. I assume that manufacturers are introducing galvanic isolation later in the signal chain, but if that is the case then the electrical noise is getting into their box so they have to implement other measures to protect the sensitive analog circuitry.
 

Fitzcaraldo215

Major Contributor
Joined
Mar 4, 2016
Messages
1,440
Likes
632
That's a nice way of restating my point. USB has the theoretical advantage of putting the clock in the right place, but there are other design issues that can undermine this advantage if executed incorrectly. The topological correctness of USB does not automatically assure superior performance. Nor does the topological incorrectness of S/PDIF automatically assure inferior performance -- at least not from the standpoint of perceptible distortion. Benchmark claims that their DAC keeps jitter-induced distortion below -135dB for all digital inputs. As I said, I think that all products are evolving to provide equal performance regardless of the digital interface, just as the Benchmark (and others, I'm sure) do now. I am not the least surprised that Amir found two more DACs for which performance is better using USB. I suspect that it is more common for that to be the case than the reverse. My point was simply that the reverse does seem to occur, though with a frequency that I don't know. Accordingly, it seems prudent in the meantime for people who care to verify, either with objective measurements (as suggested by Blumlein88) or with careful listening tests, whether their DAC is like the few I found for which sound quality is better using S/PDIF.

The inferiority of USB in some implementations could be related to the issue of galvanic isolation. I know that one of the DACs I tested does not provide galvanic isolation on its USB interface. Fitzcaraldo215, do you have any idea how your DAC implements galvanic isolation -- or did you galvanically isolate it using an external device? Vincent has a great illustration of the effectiveness of galvanic isolation (http://thewelltemperedcomputer.com/Intro/SQ/GalvanicIsolation.htm). Does anyone know how Exasound implements galvanic isolation? Galvanic isolation is easy with S/PDIF, although the requisite pulse transformers are somewhat expensive. Because USB is bidirectional and fast, it is difficult to implement galvanic isolation at the input, where it belongs. I assume that manufacturers are introducing galvanic isolation later in the signal chain, but if that is the case then the electrical noise is getting into their box so they have to implement other measures to protect the sensitive analog circuitry.
I have no idea how George Klissarov at Exasound has implemented galvanic isolation in detail. But measurements seem to confirm that it works. All I know is that my E28 sounds terrific, and Kal and others agree anecdotally. It also seems magic USB cables and add-on toys do not improve it, provided one does not succumb to audiophile paranoia and listens in a reasonably unbiased way.

But, I agree, there is no panacea, no magic bullet. The implementation is key with all things. It is not hard to screw up even a very good and seemingly robust technical architecture.
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
Nor does the topological incorrectness of S/PDIF automatically assure inferior performance -- at least not from the standpoint of perceptible distortion. Benchmark claims that their DAC keeps jitter-induced distortion below -135dB for all digital inputs. As I said, I think that all products are evolving to provide equal performance regardless of the digital interface, just as the Benchmark (and others, I'm sure) do now. I am not the least surprised that Amir found two more DACs for which performance is better using USB. I suspect that it is more common for that to be the case than the reverse.
Do such tests measure/specify the source sample rate, the DAC sample rate, and the stability thereof? If not, you cannot conclude anything about the performance of that particular interface from any test - you just have to think about how it works. It would be entirely possible for the source and DAC sample rates to match sufficiently well that no adjustments have to be made by the DAC for several minutes. But try it the next day with a different source or temperature and it could be different. In the asynchronous case, however, there is good reason to conclude that no significant changes will occur from one test to the next.
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,376
Likes
234,506
Location
Seattle Area
A note, guys: it is easy to make generalized statements. As objectivists, we love to do that :). My goal with this forum and personally has been to be able to put hard, concrete data behind any statement we make. The data in this thread, for example, has changed people's views (elsewhere) far more than us just saying USB is as good as S/PDIF.

To that end, I want to make sure we don't come across as dogmatic and suppress the need for additional data-gathering and objective information, no matter how predictable the results may be to us.
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,524
Likes
37,057
Do such tests measure/specify the source sample rate, the DAC sample rate, and the stability thereof? If not, you cannot conclude anything about the performance of that particular interface from any test - you just have to think about how it works. It would be entirely possible for the source and DAC sample rates to match sufficiently well that no adjustments have to be made by the DAC for several minutes. But try it the next day with a different source or temperature and it could be different. In the asynchronous case, however, there is good reason to conclude that no significant changes will occur from one test to the next.

Here are some warm-up results I posted elsewhere:

I did some measurements of gear warming up from cold.

I started both an ADC and a DAC from a dead cold start, ran some measurements immediately, and then again at 35 minutes, 63 minutes, and 114 minutes later.

The first surprise is timing and jitter. While I have no way to measure absolute clock speed, the clock-rate difference between ADC and DAC was 81 ppm cold. At 35 minutes it was 79 ppm, and it stayed at that speed. Using a quarter-sample-rate tone, the sharpness of the tone and the observable jitter sidebands did not change at all, even dead cold compared to 114 minutes later.

The noise floor did drop nearly 5 dB from cold to 114 minutes later. I guess analog circuits care more about warm-up!

The 3rd harmonic of a 1 kHz tone was -97 dB cold, -99.5 dB at 35 minutes, -99.7 dB at 63 minutes, and -99.9 dB after 114 minutes.

The 1 kHz difference tone with maximum-level 18 kHz and 19 kHz tones for testing IMD was -112.5 dB cold, -114.5 dB at 35 minutes, -115.4 dB at 63 minutes, and -115.9 dB at 114 minutes.

So it looks like analog circuitry benefits from some warm-up, at least for two hours, though the differences are not at all likely audible. The big surprise is that timing and jitter change very little and apparently stabilize quickly, in less than 30 minutes.

So my suggestion would be: turn your DACs off. No need to leave them on forever. One less thing to obsess over.

In this case both ADC and DAC were connected via ASIO over USB, so both would have had free-running clocks. I have measured the speed difference between these two pieces of gear a few times, getting as low as 70 ppm and as high as 83 ppm; most often they are about 75 ppm apart. So the speed difference does drift some, just not by a huge amount.
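
For the curious, here is a rough sketch of one way such a clock-rate difference can be estimated (an assumption about method, not necessarily how the numbers above were taken): loop a tone of known nominal frequency from the DAC into the ADC and see where the spectral peak actually lands.

```python
import numpy as np

def clock_offset_ppm(recording, fs_nominal, f_nominal):
    """Estimate relative clock-rate error from a captured tone of known
    nominal frequency: find the FFT peak, refine it with parabolic
    interpolation, and compare against where it should be.
    Assumes the peak is not at the very edge of the spectrum."""
    n = len(recording)
    spectrum = np.abs(np.fft.rfft(recording * np.hanning(n)))
    k = int(np.argmax(spectrum))
    a, b, c = spectrum[k - 1], spectrum[k], spectrum[k + 1]
    delta = 0.5 * (a - c) / (a - 2 * b + c)    # sub-bin peak position
    f_measured = (k + delta) * fs_nominal / n
    return (f_measured / f_nominal - 1.0) * 1e6

# e.g. a nominal 12 kHz (fs/4) tone played out and re-captured:
# clock_offset_ppm(captured, 48000, 12000) -> ADC-vs-DAC offset in ppm
```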
 
Last edited:

Jakob1863

Addicted to Fun and Learning
Joined
Jul 21, 2016
Messages
573
Likes
155
Location
Germany
Ah, you see, that's the sort of thing I'm referring to. You have to think it through.

Wasn´t that my point? :)

The only way you can avoid introducing jitter is to allow for infinite latency. As described in an earlier post, any other scheme is going to require adjustment (i.e. at least momentary jitter) at some point. Or some Heath Robinson scheme to flush a FIFO in between tracks detected as digital silence or whatever - which may never happen.

Infinite latency isn't necessarily required in the case of a connected CD player, as the playing time is bounded. So, as a brute-force approach, a buffer holding the maximum amount of data would do the job.
A bit less "brute force" would be to use the figures from the standards, since high-quality and low-quality sources are specified there. The worst-case drift between the send and replay clocks is then known, and the buffer fill needed to avoid overrun or underrun can be set accordingly. That was the "inconvenient" part: replay might only start after a couple of seconds, and after stopping the source the DAC will still play on for a couple of seconds.

Or add a bit more logic and let the DAC respond to the remote-control commands so that replay stops immediately.
As said before, it depends; but for a synchronous interface like S/PDIF, the topology doesn't in general prohibit jitter performance that is independent of the source.
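
To show how small that "inconvenient" buffer actually is, here is the spec-based sizing as a quick sketch. The +/-200 ppm worst-case combined tolerance is an assumed figure for illustration; the real number depends on which clock-accuracy grade of the standard applies:

```python
import math

def prefill_samples(max_minutes, worst_case_ppm, fs=44100):
    """Pre-fill needed so a free-running replay clock neither underruns
    nor overruns over a whole disc; the same amount of free space is
    needed on the other side of the buffer as overrun headroom."""
    playing_seconds = max_minutes * 60
    drift_seconds = playing_seconds * worst_case_ppm * 1e-6
    return math.ceil(drift_seconds * fs)

# An 80-minute CD with an assumed +/-200 ppm combined clock tolerance:
print(prefill_samples(80, 200))   # 42336 samples, i.e. under one second of latency
```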
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
Infinite latency isn't necessarily required in the case of a connected CD player, as the playing time is bounded. So, as a brute-force approach, a buffer holding the maximum amount of data would do the job.
A bit less "brute force" would be to use the figures from the standards, since high-quality and low-quality sources are specified there. The worst-case drift between the send and replay clocks is then known, and the buffer fill needed to avoid overrun or underrun can be set accordingly. That was the "inconvenient" part: replay might only start after a couple of seconds, and after stopping the source the DAC will still play on for a couple of seconds.

Or add a bit more logic and let the DAC respond to the remote-control commands so that replay stops immediately.
As said before, it depends; but for a synchronous interface like S/PDIF, the topology doesn't in general prohibit jitter performance that is independent of the source.
As I said, Heath Robinson...
 

Jakob1863

Addicted to Fun and Learning
Joined
Jul 21, 2016
Messages
573
Likes
155
Location
Germany
As I said, Heath Robinson...

Well, we started at "a unidirectional link can _never_ be..." and are now at "well, it can be, but...", so that's progress. :)

I'm a bit surprised that the buffer considerations would qualify for the "Heath Robinson" mark, as I'd say they are nearly trivial, but OK, that is debatable. Nice remark, btw; I didn't know of Heath Robinson before.
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
Well, we started at "a unidirectional link can _never_ be..." and are now at "well, it can be, but...", so that's progress. :)
Ahem... me some pages ago:
You can't 're-clock' without accommodating the frequency difference - which will drift - or allowing long latency and a buffer that drains or fills until it's empty or full, whereupon something drastic is needed.
Of course you can, effectively, abandon all real-time pretensions and simply record 'a track' until it's 'finished' before you begin playing it back! But in the general case, there may not be 'tracks' and the stream may never be 'finished'.

Above all, there is no reason to use S/PDIF except for applications where existing digital signals need to be mixed with low latency, etc. - and these applications preclude the use of Heath Robinson tactics, anyway. For everything else, a bidirectional link can be used.
 

Fitzcaraldo215

Major Contributor
Joined
Mar 4, 2016
Messages
1,440
Likes
632
Ahem... me some pages ago:

Of course you can, effectively, abandon all real-time pretensions and simply record 'a track' until it's 'finished' before you begin playing it back! But in the general case, there may not be 'tracks' and the stream may never be 'finished'.

Above all, there is no reason to use S/PDIF except for applications where existing digital signals need to be mixed with low latency, etc. - and these applications preclude the use of Heath Robinson tactics, anyway. For everything else, a bidirectional link can be used.
I understand that extremely low latency may be critical in some specialized applications, though probably not in normal home playback, especially for music. It happens that I also use asynch USB together with the added latency of room EQ (Dirac) on BD and cable TV playback. Lip synch is not a problem given the automatic lip synch adjustments in the JRiver software I use for playback.
 

Fitzcaraldo215

Major Contributor
Joined
Mar 4, 2016
Messages
1,440
Likes
632
You need low latency for video. It is impossible to delay video to match.
Possibly my resulting latency is still "low enough". I just know that there is some audio latency, but there is no lip synch issue.
 

Jakob1863

Addicted to Fun and Learning
Joined
Jul 21, 2016
Messages
573
Likes
155
Location
Germany
Ahem... me some pages ago:

Of course you can, effectively, abandon all real-time pretensions and simply record 'a track' until it's 'finished' before you begin playing it back! But in the general case, there may not be 'tracks' and the stream may never be 'finished'.

Above all, there is no reason to use S/PDIF except for applications where existing digital signals need to be mixed with low latency, etc. - and these applications preclude the use of Heath Robinson tactics, anyway. For everything else, a bidirectional link can be used.

Obviously I missed that or forgot about it (I was more taken by seeing the Burr-Brown link from the past again); but I don't understand why you asserted that S/PDIF can't achieve "jitter performance independence" during playback, although you provided arguments for the opposite?
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
Obviously I missed that or forgot about it (I was more taken by seeing the Burr-Brown link from the past again); but I don't understand why you asserted that S/PDIF can't achieve "jitter performance independence" during playback, although you provided arguments for the opposite?
I think there are two modes which you are conflating here, both of which can be achieved with S/PDIF:
1. streaming
2. downloading

It is not in dispute that a file can be downloaded in its entirety via any digital link (even RS232!) and then replayed jitter-free.

'Streaming' per se is different: there is no defined start and end point, so you can't know in advance where to insert pauses, or when to start playback. If the DAC's sample rate differs from the source's (by even a fraction of a percent) the DAC must somehow accommodate that difference dynamically => a form of jitter.

What I am branding 'Heath Robinson' is the idea of creating a 'pseudo-stream' comprising:
(a) Whole songs identified by silence or other markers, which allow the system to take liberties with gaps between tracks, etc.
(b) Starting with a particularly large latency and a large buffer, with the assumption that the system won't hit the endstops (either emptying the buffer or filling it totally) over the course of a few hours.
(c) A combination of the two.

I have seen these schemes suggested seriously. They might even achieve what they set out to do - in limited applications and with some inconvenience - but using a bidirectional link these shenanigans are unnecessary.
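
For concreteness, the 'dynamic accommodation' in the streaming case boils down to something like the naive linear-interpolation resampler below. This is a sketch only: real receivers use PLLs or far better ASRC filters, and the ratio would be servoed from the buffer level rather than fixed:

```python
def accommodate(block, ratio):
    """Consume input samples at `ratio` times the output rate via linear
    interpolation. Any change to `ratio` (made to keep the FIFO centred)
    is exactly the momentary jitter / pitch wobble discussed above."""
    out, phase = [], 0.0
    while phase < len(block) - 1:
        i = int(phase)
        frac = phase - i
        out.append((1.0 - frac) * block[i] + frac * block[i + 1])
        phase += ratio
    return out

# A DAC clock running 80 ppm slow must eat input slightly faster than 1:1:
# resampled = accommodate(incoming_block, 1 + 80e-6)
```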
 
Last edited:
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,376
Likes
234,506
Location
Seattle Area
I have seen both of these schemes suggested seriously.
Naim had a DAC that did (a). It ran a fixed sample rate and used the gaps to resync with the source. I am not a fan of such designs, as they put requirements on what one plays.
 