
Battle of S/PDIF vs USB: which is better?

captain paranoia

Active Member
Joined
Feb 9, 2018
Messages
293
Likes
218
192 samples corresponds to the (AES) S/PDIF block rate; every 192 samples the preamble is inverted. The 250 Hz jitter component is evidence of faulty clock extraction.

That sounds a very plausible candidate; using the frame rate would be a reasonable approach to locking the loop, if done properly...
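
For reference, the arithmetic behind that 250 Hz spur (a quick sanity check, assuming a 48 kHz stream, using only the numbers quoted above):

```python
# Quick sanity check of the block-rate arithmetic (assumes a 48 kHz stream).
sample_rate = 48_000        # Hz
frames_per_block = 192      # AES/S/PDIF block length: 192 frames
block_rate = sample_rate / frames_per_block
print(block_rate)           # 250.0 Hz -- matches the observed jitter component
```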
 

Rene

Member
Joined
Feb 11, 2018
Messages
90
Likes
87
That sounds a very plausible candidate; using the frame rate would be a reasonable approach to locking the loop, if done properly...

Extracting clock from the preamble may sound plausible but therein lies the rub. The preambles violate the biphase-mark coding (a transition at every bit-period boundary, plus a transition mid-period for a binary one) by omitting the data transition at the bit edges. Many early attempts at clock extraction failed because of this: they used the bit-period edges to extract the clock and freewheeled through the preambles, adding lots of sample-rate-related jitter in the process. The preamble violation is one of the ways the "AES-EBU standard is flawed" (referring to an earlier mention of a famous AES paper).
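
A minimal sketch of biphase-mark coding may make the violation easier to see; the encoder below is purely illustrative (not taken from any chip or standard implementation). Normal data always produces a transition at the bit-cell boundary, which is exactly the edge the naive extractors relied on and the preambles deliberately omit.

```python
def encode_bmc(bits, level=0):
    """Biphase-mark code: a transition at every bit-cell boundary,
    plus a mid-cell transition for a binary one.
    Returns two half-cell levels per input bit."""
    out = []
    for bit in bits:
        level ^= 1              # mandatory transition at the cell boundary
        first_half = level
        if bit:
            level ^= 1          # extra mid-cell transition encodes a '1'
        out.append((first_half, level))
    return out

# Data bits always toggle at the boundary; the X/Y/Z preambles intentionally
# break this rule, so an extractor locked only to boundary edges must
# freewheel through every preamble.
print(encode_bmc([1, 0, 1, 1]))   # [(1, 0), (1, 1), (0, 1), (0, 1)]
```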
 

Wombat

Master Contributor
Joined
Nov 5, 2017
Messages
6,722
Likes
6,464
Location
Australia
I have the old Creative Extigy external soundcard lying around. It uses USB 1.1. I was wondering if the S/PDIF output is similarly limited on this device? How would it perform for CDs?

Here is an old test for interest's sake: https://www.extremetech.com/computing/73098-creative-sound-blaster-extigy

P.S. These are bringing $50-plus on eBay, so they must be useful for something. o_O
 

Wombat

Master Contributor
Joined
Nov 5, 2017
Messages
6,722
Likes
6,464
Location
Australia
Assuming you can get drivers for it, it should work. The quality of S/PDIF output may be suspect but a good DAC would filter that.

Thanks. I am running Win 7, so XP drivers should be OK. My ancient IBM R40 (XP) could be dragged in if necessary.
Maybe the Topping D30 would do the job? :)
 

Wombat

Master Contributor
Joined
Nov 5, 2017
Messages
6,722
Likes
6,464
Location
Australia
Another question. Is this more than adequate for ripping LPs from a quality TT/phono preamp combo?
 

captain paranoia

Active Member
Joined
Feb 9, 2018
Messages
293
Likes
218
Extracting clock from the preamble may sound plausible but therein lies the rub.

I had a bit of a think about how I might go about recovering a good clock. And it's not trivial...

In theory, AES can operate at any frequency, which means a VCO rather than a pulled xtal. A VCO will have worse phase noise, so let's go for the cleaner pulled xtal. So I thought I'd limit my 'spec' to the common frequencies and their multiples, which probably means two xtals: one for the 44.1k multiples, and one for the 48k multiples. If you went for a 4x bit-rate clock and a 384 kSa/s rate, you'd need a clock of 98.304 MHz...
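
As a sanity check on those numbers, here's the same arithmetic for a few common rates (64 bits per frame is the standard S/PDIF frame length; the 4x multiplier is just the one assumed above):

```python
# Rough check: master clock = 4 x bit clock, bit clock = 64 bits/frame x fs.
BITS_PER_FRAME = 64
MULTIPLIER = 4

for fs in (44_100, 48_000, 96_000, 192_000, 352_800, 384_000):
    mclk = MULTIPLIER * BITS_PER_FRAME * fs
    family = "44.1k" if fs % 44_100 == 0 else "48k"
    print(f"{fs:>7} Sa/s ({family} family): {mclk / 1e6:.4f} MHz")
# 384 kSa/s needs 98.304 MHz; the top of the 44.1k family needs 90.3168 MHz,
# hence the two crystals -- the families share no convenient common clock.
```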

You'd need to determine the basic clock multiple (44.1 or 48), and the rate multiplier, and choose which xtal to lock with, which would set your PLL divider ratios.

Then lock to the bit transitions to get the basic frequency right, and start extracting sub frames, with the preamble indicator.

At that point, I might switch the loop to start using the preambles, rather than the bit transitions, to lock the loop, changing the PLL divider ratios appropriately. Some interesting things might happen as the PLL divide ratio changes the loop gain, so it may need a change to the loop charge pump to compensate.

Since the preambles occur at sub-frame rate, that's still a very high rate to be hitting the PLL loop filter with, which should ensure the feedback is well outside the loop filter bandwidth. That bandwidth wants to be narrow (once acquired): we don't need to track fast changes in the input clock, we're just trying to match the frequency, and we don't want our local oscillator to move about much.
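
To put rough numbers on that (the 100 Hz loop bandwidth below is just an assumed figure for illustration, not a recommendation):

```python
# Illustrative numbers only: the loop bandwidth is an assumption.
fs = 48_000                     # sample rate, Hz
subframe_rate = 2 * fs          # one preamble per subframe, two subframes per frame
loop_bw = 100                   # assumed (narrow) PLL loop bandwidth, Hz

print(subframe_rate)            # 96000 phase-detector updates per second
print(subframe_rate / loop_bw)  # ~1000x the loop bandwidth, so reference
                                # ripple is heavily attenuated by the filter
```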

I haven't designed a PLL for about ten years (and those were VCO synthesizers, not xtal oscillators), so I'm very rusty, but I can see it's not a simple task to get right, and I'm sure I've overlooked some stuff in my idle musings above. I've worked with some optical data comms systems over the years that used clock recovery for the received data stream (including one with a 'bad biphase coding' preamble/sync). But simple data comms receive is relatively tolerant of jitter; you just need to sample somewhere around the middle of the eye to get good bit recovery, and exact sample timing isn't necessary.
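
For comparison, a back-of-envelope look at the timing margin a plain data receiver has when it samples mid-eye (the jitter figure is made up purely for illustration):

```python
# Back-of-envelope timing margin for mid-eye sampling (illustrative values).
fs = 48_000
bit_rate = 64 * fs                 # S/PDIF bit rate at 48 kHz: 3.072 Mb/s
ui = 1 / bit_rate                  # one unit interval, about 325 ns
assumed_jitter_pp = 20e-9          # assumed 20 ns peak-to-peak jitter

margin = ui / 2 - assumed_jitter_pp / 2
print(f"UI = {ui * 1e9:.1f} ns, remaining margin = {margin * 1e9:.1f} ns")
# Bit recovery survives easily, even though jitter of this size would be
# disastrous if it reached the DAC's conversion clock directly.
```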
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
Assuming you can get drivers for it, it should work. The quality of S/PDIF output may be suspect but a good DAC would filter that.

@amirm , what do you mean by «a good DAC would filter that».

I know the marketing lingo on «filter» (you have lots of filter products around), but what are theory and measurements saying?
 

Rene

Member
Joined
Feb 11, 2018
Messages
90
Likes
87
I had a bit of a think about how I might go about recovering a good clock. And it's not trivial...


Thank you for your thoughts on this.

I am only somewhat conversant with the design of digital PLLs, having had only to familiarize myself with the details of data ordering in the AES and, by extension, S/PDIF streams in order to make some intelligent comments when serving on the AES technical committee years back. Being an analog engineer by birth (I grew up with vacuum tubes and Heathkit audio amps), my limited experience with PLLs was back in the days when they were done with 555 timers and op-amp loop filters. Those were the pre-jitter days of audio! I remember using some of the first-generation Crystal Semiconductor AES receivers and being none too excited about the results.

Today there are a number of manufacturers of such parts and they seem to perform well. My feeling, as also expressed elsewhere on this thread, is the best way to handle clock generation for a dac is to place a low jitter oscillator next to the chip and use either an ASRC or USB source for the dac data. The true challenges for PLL design are with A-D convertors, which must often be synchronized to an external system clock.
 

captain paranoia

Active Member
Joined
Feb 9, 2018
Messages
293
Likes
218
My feeling, as also expressed elsewhere on this thread, is the best way to handle clock generation for a dac is to place a low jitter oscillator next to the chip and use either an ASRC or USB source for the dac data.

Oh, I agree entirely. Data transfer controlled by the destination device (aka 'data pull', 'asynchronous operation', etc.) is by far the best way of dealing with the clocking problem; it becomes a matter of a packet-based protocol and FIFO control, leaving us with a 'perfect' local sample clock.

But SPDIF and AES exist, and people use them... I'm just trying to think how it might be done as well as possible...
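
A toy sketch of the 'data pull' idea above, purely illustrative and not modelled on any particular USB audio implementation: the sink drains a FIFO on its own clean clock and simply tells the source how many samples to send next, so the source clock never touches the conversion clock.

```python
from collections import deque

# Toy model of destination-clocked ('data pull') transfer.
fifo = deque()
TARGET_FILL = 512                  # desired FIFO fill, samples

def source_send(n):
    fifo.extend(range(n))          # source pushes whatever it is asked for

def sink_drain(n):
    for _ in range(min(n, len(fifo))):
        fifo.popleft()             # clocked by the local low-jitter oscillator

def next_request():
    """Feedback to the source: ask for enough samples to restore the target fill."""
    return max(0, TARGET_FILL - len(fifo))

source_send(TARGET_FILL)           # prime the buffer
sink_drain(48)                     # e.g. one 1 ms block at 48 kHz
print(next_request())              # 48 -- the source is asked to top it back up
```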
 

DonH56

Master Contributor
Technical Expert
Forum Donor
Joined
Mar 15, 2016
Messages
7,891
Likes
16,696
Location
Monument, CO
Most PLLs include a VCO in the loop; I'm not sure I have seen many using a pulled-crystal circuit, but again my career did not really focus on audio. Clock and data recovery for digital data transfer is nontrivial in my world of long, lossy, noisy channels at rates over 10 Gb/s, although bang-bang designs are still prevalent, so a little noisier but easier to align to a system clock. Last time I designed a multirate PLL (analog, not digital) I used a pretty basic circuit: just swept the divider until the bloody thing locked. Or not. Had to be careful the PFD and VCO did not lock up along the way, one of those obvious things that was a pain to implement. That was for an X-band ADC, though, with a clock that could be 10 to 1000 MHz and was required to be clean to <1 ps, not audio. Audio rates would not seem to be a horrible challenge, but there are always gotchas around...
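
A toy version of the 'sweep the divider until it locks' approach, with made-up audio-rate numbers rather than anything from a real design:

```python
# Toy divider sweep: try integer dividers until the divided oscillator
# lands within a small window of the recovered reference. Values are
# illustrative only.
osc_hz = 98_304_000            # local oscillator (48k-family master clock)
ref_hz = 3_072_000             # recovered S/PDIF bit clock at 48 kHz (64 x fs)
tolerance = 0.01               # 1% 'lock' window for this toy example

for divider in range(1, 64):
    error = abs(osc_hz / divider - ref_hz) / ref_hz
    if error < tolerance:
        print(f"locked with divider = {divider}")   # divider = 32
        break
else:
    print("no divider in range locked")
```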
 
OP
amirm

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,652
Likes
240,793
Location
Seattle Area
Hi there. I took a look. The features are definitely good. Problem is that it costs nearly $400 and you can get DACs with excellent USB interfaces.

That said, I hear a fair bit about Matrix products, so if someone reading this has their products, I'd love to review them! :)
 

gvl

Major Contributor
Joined
Mar 16, 2018
Messages
3,490
Likes
4,078
Location
SoCal
With the popularization of I2S inputs on newer DACs, it would be interesting to see if they add anything useful, apart perhaps from adding native DSD support through an external USB interface. The claim is that a dedicated DDC such as the Matrix X-SPDIF2 has better XOs than what is typically found in DACs, which helps to further reduce jitter. The concern is that the longer I2S connection due to cabling, and the LVDS circuitry typically used on HDMI-type connectors, may actually introduce additional jitter. There are some additional details to ponder, such as whether the I2S master clock is used inside the DAC, and whether there is any additional async processing or a FIFO buffer with reclocking downstream of the I2S input.
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
With the popularization of I2S inputs on newer DACs....
So people are now using an interface designed only to be used between chips on a PCB as the physical interface from the outside world? All because they think it gives better quality bits?

What should be the reaction to such an idea? To laugh at it then ignore it, or set up a series of highly controlled listening tests? To many people it's the latter because they think that's science. But in fact this is a man-made system and doesn't need to be observed like the natural world. Using I2S as the interface to the outside world is just stupid and doesn't need to be tested in order to dismiss it.
 

gvl

Major Contributor
Joined
Mar 16, 2018
Messages
3,490
Likes
4,078
Location
SoCal
So people are now using an interface designed only to be used between chips on a PCB as the physical interface from the outside world? All because they think it gives better quality bits?

What should be the reaction to such an idea? To laugh at it then ignore it, or set up a series of highly controlled listening tests? To many people it's the latter because they think that's science. But in fact this is a man-made system and doesn't need to be observed like the natural world. Using I2S as the interface to the outside world is just stupid and doesn't need to be tested in order to dismiss it.

It is not much more stupid than the ubiquitous SPDIF, and maybe less.
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK


It is not much more stupid than the ubiquitous SPDIF, and maybe less.
S/PDIF is designed to be sent over cables. I2S is not.
 

gvl

Major Contributor
Joined
Mar 16, 2018
Messages
3,490
Likes
4,078
Location
SoCal
S/PDIF is designed to be sent over cables. I2S is not.

We are not talking about feet here. A very short I2S connection over a quality cable may well be better than an equivalent SPDIF link that has to go through demux and clock recovery.
 