
Wiim Pro distortion on spdif input

antcollinet

Master Contributor
Forum Donor
Joined
Sep 4, 2021
Messages
7,678
Likes
12,936
Location
UK/Cheshire
OK - not sure what you are saying here - you stated "then reclocked":

Plus, in a previous post, you linked a post that states, on the one hand, that the SPDIF (when you couple the Mini to the Pro) is reclocked to the Pro's clock (so it must be using ASRC or similar?). But at the same time you say the data is not altered.
 

onlyoneme

Major Contributor
Joined
Jul 5, 2022
Messages
1,117
Likes
624
Location
Poland
OK - not sure what you are saying here - you stated "then reclocked":

Plus, in a previous post, you linked a post that states, on the one hand, that the SPDIF (when you couple the Mini to the Pro) is reclocked to the Pro's clock (so it must be using ASRC or similar?). But at the same time you say the data is not altered.
It's reclocked by the Pro, which means the Pro doesn't use the clock recovered from the SPDIF input when passing data to the SPDIF output. I can see that when capturing with an ASRC DAC.
And, on the other hand, the audio data passed on is exactly the same as the original. I can see that when capturing with an RME ADI-2 or with just a simple SPDIF-USB converter.
 

antcollinet

Master Contributor
Forum Donor
Joined
Sep 4, 2021
Messages
7,678
Likes
12,936
Location
UK/Cheshire
And, on the other hand, the audio data passed on is exactly the same as the original. I can see that when capturing with an RME ADI-2 or with just a simple SPDIF-USB converter.

What? The reclocked data from the Pro is the same as the original? Or the data out of the Mini into the Pro?

The second one I would expect if the Mini is set to bit perfect. The first doesn't seem possible.
 

onlyoneme

Major Contributor
Joined
Jul 5, 2022
Messages
1,117
Likes
624
Location
Poland
What? The reclocked data from the Pro is the same as the original? Or the data out of the Mini into the Pro?

The second one I would expect if the Mini is set to bit perfect. The first doesn't seem possible.
The audio data captured on the Pro's SPDIF out is the same as the audio data received by the Mini and passed to the Pro over SPDIF.
 

onlyoneme

Major Contributor
Joined
Jul 5, 2022
Messages
1,117
Likes
624
Location
Poland
First please explain exactly what you mean by "reclocking" in your case.

If you capture an incoming SPDIF stream via USB asynchronous (which is the case for the RME soundcard), there are not two clock domains involved. The stream preserves the pace of the original incoming SPDIF clock.


I am afraid I do not understand your post. How can a DAC receive SPDIF and send it to a PC? DAC means digital-to-analog converter; it has nothing to do with SPDIF.

If by DAC you call an "audio device with digital input and output", then the part "which locks to the incoming clock" is important - again no two clock domains, only the incoming clock.

I have repeatedly emphasized the "two independently running clocks" scenario. That is when the DAC part of the chain has its own clock and is not clocked by a clock recovered from the incoming SPDIF stream. Since the WiiM device does not require the SPDIF feed as a master, it must have some other clock for the DAC part (either independent in the DAC or generated by the A113X SoC). For the DAC part to be clocked by the SPDIF clock, it would have to include a clock switch. Which it may or may not have, I do not know.
When I mentioned "reclocking" I meant a situation where a device locks to the incoming clock to retrieve the incoming audio data, but uses an internal clock when the audio data is later passed through SPDIF out. This is what happens when the Mini is chained with the Pro, according to what I've seen. I do not mean a situation where an ASRC device resamples audio data to its own internal clock without locking to the recovered one.
I mentioned "DACs" as example devices but without touching DA conversion, so fully in the digital domain.
 

phofman

Addicted to Fun and Learning
Joined
Apr 13, 2021
Messages
502
Likes
325
When I mentioned "reclocking" I meant a situation where a device locks to the incoming clock to retrieve the incoming audio data, but uses an internal clock when the audio data is later passed through SPDIF out. This is what happens when the Mini is chained with the Pro, according to what I've seen.
If the two clocks are really independent - the incoming SPDIF clock and the device's internal one (how did you determine their independence?) - then please explain how the outgoing clock can pass unchanged samples delivered by the incoming clock for unlimited time, when eventually any two clocks will deviate by more than any limited size of FIFO?

IF the two clocks are really independent and you determined that samples are unaltered, then the only possible outcome will be a buffer under/overflow in some finite time. That time may range from seconds (strongly deviating clocks, short FIFO) to any finite time. The OP complained about an hour, well within the expected range of FIFO limits.

Did your test capture run for the expected uninterrupted session time (IMO at least several hours)? Even if it did, those results would only apply to your specific pair of clocks. Any other pair of clocks could yield vastly different deviation times.

That is why professional solutions use either one master clock for the whole chain, or use ASRC.
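The time-to-failure argument above is easy to quantify. A rough back-of-the-envelope sketch (the function name and the 20 ms / 5 ppm figures are illustrative assumptions, not WiiM's actual values):

```python
def seconds_until_fifo_incident(fifo_headroom_ms: float, clock_diff_ppm: float) -> float:
    """Time until a FIFO with the given headroom under/overflows when the
    producer and consumer clocks differ by clock_diff_ppm parts per million."""
    # Each second, the buffer gains or loses this many milliseconds of audio.
    drift_ms_per_s = clock_diff_ppm / 1e6 * 1000.0
    return fifo_headroom_ms / drift_ms_per_s

# 20 ms of headroom and clocks 5 ppm apart: about 67 minutes until the first
# buffer incident, which is in the ballpark of the OP's "about an hour" report.
print(seconds_until_fifo_incident(20.0, 5.0) / 60.0)
```

With tighter clocks or a larger FIFO the incident just moves further out; it never disappears, which is the point of the post above.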
 
Last edited:

onlyoneme

Major Contributor
Joined
Jul 5, 2022
Messages
1,117
Likes
624
Location
Poland
If the two clocks are really independent - the incoming SPDIF clock and the device's internal one (how did you determine their independence?) - then please explain how the outgoing clock can pass unchanged samples delivered by the incoming clock for unlimited time, when eventually any two clocks will deviate by more than any limited size of FIFO?

IF the two clocks are really independent and you determined that samples are unaltered, then the only possible outcome will be a buffer under/overflow in some finite time. That time may range from seconds (strongly deviating clocks, short FIFO) to any finite time. The OP complained about an hour, well within the expected range of FIFO limits.

Did your test capture run for the expected uninterrupted session time (IMO at least several hours)? Even if it did, those results would only apply to your specific pair of clocks. Any other pair of clocks could yield vastly different deviation times.
I'm not trying to explain this. I've mentioned what I saw during my tests and my opinion on how it "seems to be" to me. You can see that exactly in my post.
If my conclusion is incorrect I would gladly welcome any explanation of what I've observed.
I've seen that tracks played on the Pro and on the Mini, and captured later on the ASRC device, have different lengths. Thus my conclusion that the clocks differ. A track played on the Mini connected to the Pro has the same captured length as the track played directly on the Pro. Thus my conclusion that reclocking took place.
On the other hand, I was able to capture bit-perfect content for all the scenarios - Mini alone, Pro alone, Mini to Pro - when using a device which locks to the recovered clock.

If there is another and better explanation than reclocking, I'm perfectly fine with that.
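The length comparison in the post above boils down to a simple ratio. A hedged sketch of the arithmetic (the 300-second track and the resulting 20 ppm figure are made-up numbers for illustration, not the actual measurements):

```python
def clock_offset_ppm(nominal_s: float, captured_s: float) -> float:
    """Clock offset, in parts per million, implied by a captured track
    length versus its nominal length on the capture device's timebase."""
    return (captured_s - nominal_s) / nominal_s * 1e6

# A nominally 300.000 s track that captures as 300.006 s implies the source
# clock runs about 20 ppm fast relative to the capture clock.
print(clock_offset_ppm(300.0, 300.006))
```

Two sources showing the same captured length (here, the Pro alone and the Mini chained through the Pro) therefore share the same output clock, which is the "reclocking" conclusion drawn above.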
 

phofman

Addicted to Fun and Learning
Joined
Apr 13, 2021
Messages
502
Likes
325
I do not dispute your results; they are what you measured.

It's perfectly possible (in fact very likely) that WiiM devices do not clock their output with the SPDIF input, as that requires extra clock-switching circuitry.

Also, it's perfectly possible (in fact very likely) that WiiM devices do not use any adaptive resampling. ASRC in software is a complex task which is difficult to get right, especially if low latencies are targeted. Look at the complexity of CamillaDSP's relative pitch detection source code, and there are still ASRC issues to be sorted out.

If so, then my 2 cents guess at the beginning of the thread would most likely apply - the SPDIF "distortion" after some long time of fine playback is caused by buffer incidents because the streamer uses two separate clock domains which are not merged by proper adaptive resampling. A common issue which many SW and HW devices and audio chains have.
 


antcollinet

Master Contributor
Forum Donor
Joined
Sep 4, 2021
Messages
7,678
Likes
12,936
Location
UK/Cheshire
The audio data captured on the Pro's SPDIF out is the same as the audio data received by the Mini and passed to the Pro over SPDIF.
Huh?

I must be seriously misunderstanding what you are saying.

The Mini receives a file and passes it to the Pro via SPDIF. The Pro resamples it, as demonstrated by the different number of samples in your comparison, and sends that resampled data out via SPDIF.

And somehow the resampled output from the Pro, with its extra samples and different values for those samples, is identical to the input to the Mini.

If that understanding of what you are saying is correct - sorry: it doesn't work that way. Both those things can't be happening (resampling/changing the number of samples AND the data being the same as the input).


OR are you saying that there isn't a different number of samples, just a different amount of time to output the same samples (i.e. a different spacing between the samples based on a different clock rate)?

In which case that can happen, but it is not resampling; it is just different clocks into and out of the buffer - and will result in buffer overrun or underrun, and either dropped samples or dropouts as the data runs out.
 
Last edited:

onlyoneme

Major Contributor
Joined
Jul 5, 2022
Messages
1,117
Likes
624
Location
Poland
Huh?

I must be seriously misunderstanding what you are saying.

The Mini receives a file and passes it to the Pro via SPDIF. The Pro resamples it, as demonstrated by the different number of samples in your comparison, and sends that resampled data out via SPDIF.

And somehow the resampled output from the Pro, with its extra samples and different values for those samples, is identical to the input to the Mini.

If that understanding of what you are saying is correct - sorry: it doesn't work that way. Both those things can't be happening (resampling/changing the number of samples AND the data being the same as the input).


OR are you saying that there isn't a different number of samples, just a different amount of time to output the same samples (i.e. a different spacing between the samples based on a different clock rate)?

In which case that can happen, but it is not resampling; it is just different clocks into and out of the buffer - and will result in buffer overrun or underrun, and either dropped samples or dropouts as the data runs out.
You must be really misunderstanding indeed.
The Pro does not resample anything; the ASRC capture device does. In this scenario I can see that the Pro's clock and the Mini's differ. And I can see that the Pro's clock is being used for the Mini's SPDIF stream.
The bit-perfect scenario relates to capturing with the RME, which locks to the recovered clock. I've never suggested that the Pro or the Mini resamples anything.
 

morillon

Major Contributor
Joined
Apr 19, 2022
Messages
1,380
Likes
279
Good - maybe thanks to its latest firmware:
- the output in these cases via the Pro will have reasonable jitter, even if a Toslink source is not ideal
- it seems rather bit perfect, in to out

So this looks pretty good - the essentials for a "second function" (a very affordable, general-public machine).
;-)
 
Last edited:

onlyoneme

Major Contributor
Joined
Jul 5, 2022
Messages
1,117
Likes
624
Location
Poland
Good - maybe thanks to its latest firmware:
- the output in these cases via the Pro will have reasonable jitter, even if a Toslink source is not ideal
- it seems rather bit perfect, in to out

So this looks pretty good - the essentials for a "second function" (a very affordable, general-public machine).
;-)
If @phofman is right, and I guess he is indeed, bit-perfect operation ends when the buffer is full in the scenario I had - when the Mini's clock runs faster than the Pro's.
I'll try later to measure the situation when the incoming clock is slower, with buffer underrun.
 

morillon

Major Contributor
Joined
Apr 19, 2022
Messages
1,380
Likes
279
The question is also whether the people who had problems no longer have them with this new firmware, or whether it improves things enough for everyday use.
(Not really knowing what was implemented, let's see the result - which is what interests us in the first place.)

;-)
 
Last edited:

antcollinet

Master Contributor
Forum Donor
Joined
Sep 4, 2021
Messages
7,678
Likes
12,936
Location
UK/Cheshire
You must be really misunderstanding indeed.
The Pro does not resample anything; the ASRC capture device does. In this scenario I can see that the Pro's clock and the Mini's differ. And I can see that the Pro's clock is being used for the Mini's SPDIF stream.
The bit-perfect scenario relates to capturing with the RME, which locks to the recovered clock. I've never suggested that the Pro or the Mini resamples anything.
OK - that is me misinterpreting your word "reclocked" as resampled. Sorry for the confusion.


Buffer under-run/over-run is still going to be a problem. Unless the Pro is using a huge PLL buffer that runs at a lower speed for a while until it half fills (causing the difference in measured length you saw).
 

onlyoneme

Major Contributor
Joined
Jul 5, 2022
Messages
1,117
Likes
624
Location
Poland
If my calculations are correct, for a 0.0007% Pro clock deviation and a 0.0019% Mini clock deviation, a 44 ms buffer would be enough for a 1-hour transmission at a 96 kHz sample rate.

@phofman, do you know how the situation where the incoming clock is slower than the internal one is usually handled when time sync must also be maintained?
For TVs and sync with the picture, for example?
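For what it's worth, those figures check out. A quick sketch of the arithmetic (assuming the two deviations have the same sign, so only the 0.0012% relative difference matters; the function name is just for illustration):

```python
def required_buffer_ms(duration_s: float, dev_a: float, dev_b: float) -> float:
    """Buffer needed to absorb the drift between two clocks whose fractional
    deviations from nominal are dev_a and dev_b, over duration_s seconds."""
    return duration_s * abs(dev_a - dev_b) * 1000.0

pro_dev = 0.0007 / 100   # 0.0007% expressed as a fraction
mini_dev = 0.0019 / 100  # 0.0019% expressed as a fraction
ms = required_buffer_ms(3600, pro_dev, mini_dev)
print(ms)                 # about 43.2 ms of drift per hour, so 44 ms suffices
print(ms / 1000 * 96000)  # the same drift expressed in samples at 96 kHz
```

If the deviations had opposite signs, the relative difference would be 0.0026% instead and the required buffer roughly doubles, which is worth keeping in mind when generalizing from one pair of units.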
 
Last edited:

onlyoneme

Major Contributor
Joined
Jul 5, 2022
Messages
1,117
Likes
624
Location
Poland
I would assume that latency is inevitable here, and that it will include the processing latency and the prefetch buffer size. That means the underrun issue can occur even earlier if the total latency has been decreased while the processing latency stays the same - am I right?
 

phofman

Addicted to Fun and Learning
Joined
Apr 13, 2021
Messages
502
Likes
325
The smaller the buffer, the smaller the latency and
  • the higher the chances of buffer issues for single-clock + PC processing
  • or the shorter the intervals (100% certainty) between buffer issues for two clock domains without adaptive resampling.

A typical chain would be:

alsa capture device -> capture buffer (CB) -> DSP (one or more threads) -> playback buffer (PB) -> alsa playback device

My 2 cents: the WiiM devices have the same architecture, as it's pretty much the only logical setup in Linux.

It's important to keep in mind that the alsa device sets the pace of the client processing, be it capture or playback. The kernel driver wakes up the user-space process when fresh samples are available for reading by the userspace (capture), or when the output buffer has gained enough room for new samples to be written by the userspace (playback).

For analog input, the capture device (ADC) and playback device (DAC or SPDIF out) are clocked by the same clock. Low-cost devices use clock signals generated by internal PLLs of the SoC (i.e. the SoC I2S interfaces run in master mode); more expensive devices have external clock circuits, running the SoC I2S interfaces in slave mode.

If I were to design a low-cost device with that linkplay A98 module which has two I2S interfaces, I would probably do:

  • ADC as slave -> I2S_A input (alsa capture device A) as master
  • I2S_A (alsa playback device A) output as master -> DAC slave/SPDIF_OUT as slave
  • SPDIF_IN as master (SPDIF stream always carries master clock) -> I2S_B input (alsa capture device B) as slave
This setup does not need any external clock, no clock switching.

Now for analog input -> DSP -> output there is only one clock involved (the I2S master clock generated by the SoC), both alsa capture and playback devices A (clocked by the same clock) produce/consume samples at the same rate, buffers CB and PB are happy and can be kept quite small.

When switching from the ADC analog input to SPDIF the DSP must start capturing from the SPDIF capture device B, clocked by the SPDIF clock entering I2S_B. In this case the two alsa devices will not run at the same speed, the SPDIF_IN capture device will provide data at a different rate than the playback device will consume, and buffers CB and PB will eventually start having issues.

Typically the processing chains are pulled, i.e. it's the playback device which sets the pace of the whole DSP. If so, then:
  • if playback is faster than capture, CB will eventually underflow (missing capture samples)
  • if playback is slower than capture, CB will eventually overflow (dropped capture samples)
To avoid that, e.g. CamillaDSP puts an adaptive resampler between CB and the DSP thread, which consumes samples at rate of the capture device B, and produces samples at rate of the playback device A. Of course determining the correct current resampling ratio is crucial and not simple, especially if the overall latency is to be small, i.e. buffers must be kept small and the extra room in CB for compensating the rate inequality is thus small.

Typically CB and PB work together, because the DSP has a separate thread, and delays on capture (the DSP waiting for new samples to process) will delay delivery of DSP'd samples to playback too.

Due to the chunked processing these computer-based chains are quite difficult to tune right and to run reliably at very small latencies. Therefore HW DSP with dedicated HW which processes samples continuously (not in chunks) is much more robust and capable of smaller latencies.
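The pulled-chain failure modes described above can be illustrated with a toy calculation (hypothetical rates and buffer sizes, not measured WiiM values):

```python
def first_incident_s(capture_hz: float, playback_hz: float,
                     fifo_frames: int, start_fill: int) -> float:
    """Seconds until the capture buffer CB under- or overflows when playback
    pulls frames at playback_hz while capture fills it at capture_hz."""
    net = capture_hz - playback_hz   # frames gained (+) or lost (-) per second
    if net > 0:                      # capture faster than playback -> overflow
        return (fifo_frames - start_fill) / net
    if net < 0:                      # playback faster than capture -> underflow
        return start_fill / -net
    return float("inf")             # single clock domain: no incident, ever

# Hypothetical: SPDIF source delivering 48000.5 frames/s vs an internal clock
# consuming 48000 frames/s, with a 4096-frame CB started half full:
print(first_incident_s(48000.5, 48000.0, 4096, 2048))  # overflow, ~68 minutes in
```

The single-clock case returning infinity is exactly why the analog-input path with both alsa devices on one SoC clock never has this problem, while the SPDIF-input path does.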
 
Last edited: