
Optical SPDIF to AES XLR conversion - Clock transparency

tamtam (Member, joined Apr 19, 2020)
Hey folks,

I need to convert from Optical SPDIF (Toslink) to AES XLR - stereo.
I was looking at Hosa ODL-312, and noticed that someone on the internet mentioned that he performed a loopback test and it was completely null.
This means the device is bit transparent.
But, if I understand correctly, this check alone is not enough to confirm this device is entirely transparent,
since this test doesn't verify that clock information hasn't been modified.
Is this true? How likely is it that the clock information is not transparent on this device?
Should I consider a more expensive device?
Thanks
 
I'm not really sure what "clock transparency" means.

Firstly, to be clear, you are happy that this is S/PDIF to AES/EBU, not audio on XLR. The way you describe it suggests (perhaps wrongly) that you think it's a DAC, when it's not. S/PDIF and AES/EBU are pretty much identical* serial digital protocols, on different physical connectors.
* some slight protocol differences.

The clock is embedded in the bitstream and is used by the receiving device at the end of the chain. If this box transfers the audio bit-perfectly, then there's no reason why the clock will be incorrect. Perhaps you could quote the reference where this question is raised, so we know more.
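To make "the clock is embedded in the bitstream" a bit more concrete: S/PDIF and AES/EBU carry the audio bits using biphase-mark coding, where every bit cell begins with a transition (those regular edges are the clock) and a 1-bit adds an extra transition in the middle of the cell. Here is a minimal Python sketch of just that coding idea, ignoring preambles, channel status and parity:

```python
# Minimal biphase-mark (BMC) encode/decode sketch.
# Shows how data and clock share one wire in S/PDIF / AES/EBU;
# preambles, channel status, parity etc. are left out.

def bmc_encode(bits, level=0):
    """Return two half-cell line levels per data bit.
    Every cell begins with a transition (the embedded clock);
    a 1-bit adds a second transition in the middle of the cell."""
    line = []
    for b in bits:
        level ^= 1            # cell-start transition -> the clock edge
        line.append(level)
        if b:
            level ^= 1        # mid-cell transition encodes a 1
        line.append(level)
    return line

def bmc_decode(line):
    """Recover the data: a mid-cell transition means 1, none means 0."""
    return [1 if line[i] != line[i + 1] else 0
            for i in range(0, len(line), 2)]

data = [1, 0, 1, 1, 0, 0, 1, 0]
encoded = bmc_encode(data)
assert bmc_decode(encoded) == data
print(encoded)  # [1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0]
```

Because the cell-start edges are always there regardless of the data, the receiver can lock to them and pull the clock back out of the same signal it gets the data from.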
 
I don't think this is a DAC, but before the signal goes out over S/PDIF it is clocked by the source's clock.
To my understanding, when the clock is imprecise it will create jitter in the digital data stream,
which the receiving side will use as is, and in some cases it will apply a clock-correction algorithm.
However, that is not the same as re-clocking the entire stream the way it happens with USB Audio Class 2.
It is more similar to how USB Audio Class 1 works, where the computer's clock is used as the source.
 
The conversion is very simple (both optical to electrical and vice versa).
Of course there is some slight difference in timing in the conversion (both ways), but that is not a problem for a digital connection.
This is jitter, and how the connected device reacts to it (clock synchronization) is another matter.

Data and clock are combined in the signal and can easily be pulled apart again.
The determination of whether there is a 0 or a 1 is not done on the edge of the signal, so whether a bit is read as a 1 or a 0 does not depend on some jitter.
For that reason the conversion is transparent (for both clock and data); it is just half a clock pulse later.
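A small Python sketch of that point, with made-up numbers: if the receiver decides the bit value halfway through each bit cell rather than at the edges, shifting the edges around by a modest amount of jitter does not change the recovered data at all.

```python
import random

random.seed(0)
BIT_PERIOD = 1.0      # nominal bit-cell length (arbitrary units)
JITTER = 0.05         # peak edge jitter, well under half a cell

data = [random.randint(0, 1) for _ in range(1000)]

# Jittered cell boundaries: every nominal edge is displaced a little in time.
edges = [i * BIT_PERIOD + random.uniform(-JITTER, JITTER)
         for i in range(len(data) + 1)]

def line_value(t):
    """Value the line carries at time t: bit i sits between edges i and i+1."""
    for i in range(len(data)):
        if edges[i] <= t < edges[i + 1]:
            return data[i]
    raise ValueError("t is outside the stream")

# Receiver decision: sample halfway through each nominal bit cell.
recovered = [line_value((i + 0.5) * BIT_PERIOD) for i in range(len(data))]
assert recovered == data   # edges jittered, data unchanged
```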
 
I'm not sure I fully understand.
You mentioned that it does introduce jitter, but that the conversion is transparent for both clock and data?
Doesn't the jitter introduced by the conversion make the clock not "transparent"?
I assume that the receiving device reads the timing from input and doesn't re-clock the stream again using its own clock.
 
To my understanding, when the clock is imprecise it will create jitter in the digital data stream,
which the receiving side will use as is, and in some cases it will apply a clock-correction algorithm.
Clock recovery is always done, otherwise it would not work properly. Basically the jitter is low-passed away.

As to your question: the conversion will have a negligible impact on jitter, so don’t worry!

However, that is not the same as re-clocking the entire stream the way it happens with USB Audio Class 2.
That is not exactly what happens with USB Audio Class 2. Generally, the USB DAC is the master of its clock domain, and the PC is slaved to this clock. This happens by the DAC requesting new audio data from the PC with timing controlled by the DAC. It’s not exactly re-clocking.
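To put a rough number on "the jitter is low-passed away": a clock-recovery loop behaves roughly like a first-order low-pass for incoming jitter, so jitter components well above the loop bandwidth are strongly attenuated. The snippet below just evaluates that first-order jitter-transfer magnitude; the 1 kHz loop bandwidth is an assumption picked for the example, not a figure for any particular receiver chip.

```python
import math

def jitter_transfer_db(f_jitter_hz, loop_bw_hz):
    """Magnitude of a first-order low-pass jitter transfer, in dB."""
    mag = 1.0 / math.sqrt(1.0 + (f_jitter_hz / loop_bw_hz) ** 2)
    return 20.0 * math.log10(mag)

LOOP_BW = 1_000.0   # assumed PLL loop bandwidth in Hz (illustrative only)

for f in (100.0, 1_000.0, 10_000.0, 100_000.0):
    print(f"{f:>9.0f} Hz jitter -> {jitter_transfer_db(f, LOOP_BW):6.1f} dB")
# 100 Hz wander passes almost untouched, 100 kHz jitter is knocked down ~40 dB
```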
 
Simply stated.

You don't have to worry about jitter. The receiving DAC will recover and retime the clock to well below audible levels.

Let me ask another question - why do you need to do this? The only reason I can think of is if you have one device that outputs only toslink, and another that accepts only AES.

However, if you are doing it because someone has told you AES is better, then you can just stop now and use Toslink. It isn't.
 
Right now I'm using an analog output, but I can use a digital connection to avoid the extra DA/AD conversion.
I'm skeptical about how much improvement it will bring, but in the worst case I free up one output on my audio interface.

Thank you all for the answers. Appreciate this.
 
Good answers in this thread, but what is your application where changes to the clock signal would be a concern? For playback and most production use cases a little jitter here or there won't affect anything materially.
 
I had a jittery audio interface once, Focusrite Scarlett class, so yes, I want to avoid introducing jitter.
Also, I want the converter to match the quality of my current gear (Prism Sound, PMC);
having said that, if it is fully transparent it doesn't matter which brand it is.
 
I'm not a fan of trying to hear differences on a YouTube video, but I still like this video as an explanation of the so-called jitter problem. I'm pretty sure it has been linked in other threads. The TL;DR is:

Don't worry because:
  • Good DACs get the jitter below an audible level.
  • Poor DACs may on paper exhibit audible effects, but the effects are usually masked by the inherently poor SINAD of such DACs.
There are a few caveats to this, but in the context of a simple converter I think it's good advice.
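As a back-of-the-envelope example of why "below an audible level" is usually easy to reach (an estimate, not a measurement of any particular DAC): for a full-scale sine at frequency f, random sampling-clock jitter with RMS value t_j produces an error roughly 2·π·f·t_j below the signal.

```python
import math

def jitter_error_dbfs(signal_freq_hz, jitter_rms_s):
    """Approximate error level, in dB relative to a full-scale sine, caused
    by random sampling-clock jitter: error ~= 2*pi*f*t_jitter."""
    return 20.0 * math.log10(2.0 * math.pi * signal_freq_hz * jitter_rms_s)

for t_j in (1e-9, 100e-12, 10e-12):          # 1 ns, 100 ps, 10 ps RMS
    print(f"{t_j * 1e12:6.0f} ps jitter on a 10 kHz tone -> "
          f"{jitter_error_dbfs(10_000, t_j):6.1f} dB")
# roughly -84 dB, -104 dB and -124 dB respectively
```

Even a whole nanosecond of random jitter puts the error some 84 dB below a full-scale 10 kHz tone, and real music rarely has full-scale content up there.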
 
You mentioned that it does introduce jitter, but that the conversion is transparent for both clock and data?
Transparent, for digital, means that the input and output data (and in this case also the clock) are exactly the same. There is no buffering or changing of protocols going on.
Simply put, a continuous string of 0s and 1s remains the same and just changes physical format (light vs. voltage in this case).

Doesn't the jitter introduced by the conversion make the clock not "transparent"?
No, the data/clock remains the same. Only the edges of the 0-to-1 and 1-to-0 transitions have a bit of jitter in them.
The data and clock 'values' (it is the same digital signal) are not determined at the edges but when the 1 or 0 has 'settled', so halfway through the bit period.
Both the data and clock values are determined at the dotted line below, so jitter does not influence the data.

[Attached image: Schermafdruk van 2024-10-23 12-15-01.png]


Of course, for this to work a clock inside the receiver has to synchronize with the clock that is embedded in the received signal in order to retrieve the data.
This is done by detecting the edges of the 1 to 0 or 0 to 1 transitions in the signal.
Those edges will always contain some jitter, which may or may not move the (dotted) decision point a bit forward or backward in time.
That depends on how good (stable/adaptive) the 'flywheel' in the PLL is and on how much jitter there is.

Synchronizing the clocks is usually done with a PLL (phase-locked loop), which controls the receiver clock frequency and matches it to that of the received signal.
This adapts an internal clock to the average of the incoming clock and acts somewhat like a flywheel. Small variations don't change the decision point much, so jitter does not influence the decision point (where it is decided whether the incoming data is high or low).

After that is done the jitter of the incoming signal is removed by re-clocking to that of a stable quartz clock of the DAC chip.
The 'jitter' that clock has is what determines the 'jitter' of the DAC output and that is 'disconnected' from the jitter in the signal.

Not all DACs do this equally well, but most modern receivers have no problem unless the incoming data is so degraded that the edge detection starts making errors.

I assume that the receiving device reads the timing from input and doesn't re-clock the stream again using its own clock.
The receiving device synchronizes the internal clock with that of the incoming data/clock by looking at the transitions of the incoming signal.
That received clock is not used to clock the DAC samples. There are intermediate steps in the signal processing that remove the jitter from the incoming stream.
How well that is done is receiver dependent.
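For anyone who likes to see the 'flywheel' as numbers, here is a toy software PLL sketch under very crude assumptions (made-up gains, a purely digital model; a real receiver PLL is an analog or mixed-signal circuit): the receiver keeps its own period estimate and only nudges it by a small fraction of each measured edge error, so fast edge-to-edge jitter barely moves its clock while slow drift is still tracked.

```python
import random
import statistics

random.seed(1)
T = 1.0                 # incoming bit period (arbitrary units)
JITTER = 0.05           # peak jitter on each received edge
GAIN = 0.02             # small loop gain = heavy "flywheel"

# Received edges: a regular clock plus fast random jitter.
edges = [i * T + random.uniform(-JITTER, JITTER) for i in range(5000)]

period_est = 1.02       # deliberately wrong starting frequency estimate
expected = edges[0]
recovered = []          # where the PLL expects each edge (the decision timing)
for e in edges:
    recovered.append(expected)
    err = e - expected                    # phase error seen at this edge
    period_est += GAIN * GAIN * err       # slowly trim the frequency estimate
    expected += period_est + GAIN * err   # and nudge the phase a little

# Compare the wander of the incoming edges with that of the recovered clock
# (first 1000 edges skipped to let the loop lock).
in_dev = [e - i * T for i, e in enumerate(edges)]
out_dev = [r - i * T for i, r in enumerate(recovered)]
print("incoming edge jitter (std):   %.4f" % statistics.pstdev(in_dev[1000:]))
print("recovered clock wander (std): %.4f" % statistics.pstdev(out_dev[1000:]))
# the recovered clock moves far less than the jittery edges it locks to
```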
 
As far as I can see, Prism Sound devices have Toslink inputs. Just use that, no conversion necessary.
 
Yes, the Lyra 1 has Toslink in/out, but the speakers (a pair of PMC 6 monitors) have only a switchable XLR analog/digital (AES) input and an XLR digital out.
 
The receiving device synchronizes the internal clock with that of the incoming data/clock by looking at the transitions of the incoming signal.
That received clock is not used to clock the DAC samples. There are intermediate steps in the signal processing that remove the jitter from the incoming stream.
If I may add: we might better call it asynchronous resampling instead of re-clocking. The process changes the values of the output samples; it's not just a FIFO. Only some DACs have the async resampler built in (mostly ESS DACs). The incoming jitter cannot be eliminated completely, although it can be to a large extent, depending on the async resampling method.
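For illustration, here is a crude sketch of the asynchronous-resampling idea, using plain linear interpolation and a fixed, known rate ratio (real ASRCs use long polyphase filters and estimate the ratio continuously): the output runs on its own clock, and every output sample value is recomputed from neighbouring input samples rather than copied.

```python
import math

def asrc_linear(samples, ratio):
    """Toy asynchronous resampler: produce output samples at `ratio` times
    the input rate using linear interpolation. Only meant to show that the
    output sample values are recomputed, not copied."""
    out = []
    pos = 0.0                    # read position, in input samples
    step = 1.0 / ratio           # input samples to advance per output sample
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append((1.0 - frac) * samples[i] + frac * samples[i + 1])
        pos += step
    return out

# A 1 kHz sine "recorded" against a source clock that runs 100 ppm fast.
src_rate = 48_000 * 1.0001
sine = [math.sin(2 * math.pi * 1000 * n / src_rate) for n in range(480)]

resampled = asrc_linear(sine, ratio=48_000 / src_rate)
print(len(sine), "->", len(resampled), "samples on the output clock")
```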
 
If I may add: we might better call it asynchronous resampling instead of re-clocking.
Only on devices using ASRC.

DACs that use a PLL to generate a synchronised clock don't need to resample. TBH, though, I've no idea which DACs use which system.

And you're correct that jitter is not completely eliminated: all clocks have some jitter. But on pretty much all DACs it is reduced to levels well below audibility.
 
Correct. I apologize for quoting the wrong part; my post applied to:

After that is done the jitter of the incoming signal is removed by re-clocking to that of a stable quartz clock of the DAC chip.
The 'jitter' that clock has is what determines the 'jitter' of the DAC output and that is 'disconnected' from the jitter in the signal.
 
That is not exactly what happens with USB Audio Class 2. Generally, the USB DAC is the master of its clock domain, and the PC is slaved to this clock. This happens by the DAC requesting new audio data from the PC with timing controlled by the DAC. It’s not exactly re-clocking.
This has nothing to do with USB Audio Class 1 or 2.
USB audio puts the bus in isochronous mode.
The synchronization can be done in three ways:

Synchronous

The clock driving the DAC is directly derived from the 1 kHz frame rate.
This mode was used by the early USB audio devices.
They were limited to 48 kHz and pretty jittery.

Adaptive

In this mode the timing is generated by a separate clock.
A control circuit (sample rate guesser) measures the average rate of the data coming over the bus and adjusts the clock to match that.
Since the clock is not directly derived from a bus signal it is far less sensitive to bus jitter than synchronous mode, but what is going on the bus still can affect it.
It’s still generated by a PLL that takes its control from the circuits that see the jitter on the bus.

Asynchronous

In this mode an external clock is used to clock the data out of the buffer, and a feedback stream is set up to tell the host how much data to send.
A control circuit monitors the status of the buffer and tells the host to increase the amount of data if the buffer is getting too empty or to decrease if it’s getting too full.
Since the readout clock is not dependent on anything going on with the bus, it can be fed directly from a low jitter oscillator, no PLL need apply.
This mode can be made to be very insensitive to bus jitter.
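A toy model of that feedback loop, with made-up numbers and a simple proportional rule (not taken from the USB audio spec), is shown below: the DAC drains its buffer at its own fixed rate, periodically reports how many samples per interval it wants, and the host adjusts what it sends so the buffer stays near its target fill.

```python
# Toy model of UAC2 asynchronous feedback: the DAC consumes samples on its
# own clock and tells the host how much to send so the buffer neither runs
# dry nor overflows. All numbers are purely illustrative.

DAC_RATE = 48_012.0        # samples/s the DAC clock actually consumes
NOMINAL = 48_000.0         # samples/s the host would send with no feedback
TARGET_FILL = 512.0        # desired buffer occupancy (samples)

buffer_fill = TARGET_FILL
request = NOMINAL / 1000.0   # samples per 1 ms interval the DAC asks for

for ms in range(1, 5001):
    buffer_fill += request              # host delivers what was requested
    buffer_fill -= DAC_RATE / 1000.0    # DAC drains at its own clock rate
    if ms % 10 == 0:                    # feedback message every few intervals
        # ask for a bit more or less depending on distance from the target
        request = NOMINAL / 1000.0 + (TARGET_FILL - buffer_fill) / 100.0

print(f"buffer fill after 5 s: {buffer_fill:.1f} samples")  # near TARGET_FILL
print(f"host is asked for {request:.3f} samples/ms")        # ~48.012 = DAC rate
```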

Today, UAC2 with asynchronous synchronization is common.

Bit more detail: https://www.thewelltemperedcomputer.com/KB/USB.html
 