
Converting WAV to FLAC: which FLAC level is sufficient?

Audio over USB is an isochronous transfer. There is nothing in the protocol that allows the receiving end to correct the data stream, nor the source to retransmit erroneous data. However, CRC checking is permitted (but not required), and errors can be, and often are, flagged, which can be an important diagnostic tool.
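For reference, a minimal sketch of the CRC-16 used on USB data packets (polynomial x^16 + x^15 + x^2 + 1, processed LSB-first, initial value 0xFFFF, result inverted); a receiver can use a mismatch to flag a corrupted isochronous packet, but not to request a resend:

```python
def crc16_usb(payload: bytes) -> int:
    """CRC-16 over a USB data packet payload (poly 0x8005, bits LSB-first)."""
    crc = 0xFFFF
    for byte in payload:
        crc ^= byte
        for _ in range(8):
            # 0xA001 is the bit-reversed form of the 0x8005 polynomial
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc ^ 0xFFFF  # complemented remainder, as sent on the wire

# Sanity check against the commonly catalogued CRC-16/USB test vector
assert crc16_usb(b"123456789") == 0xB4C8
```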
Besides, what we are talking about here is not packet errors per se, but what happens to jittery data when it gets to the USB receiver/DAC and how that system handles it. In essence it becomes an analog problem rather than an exclusively digital one.
This argument also applies to any other digital interface that requires transmission and reception and processing back into bits (i.e. all of them).
For the DAC output to be impaired in some sense all that is required is that the jitter profile of the recovered data be unacceptable, and all digital data has jitter.
As has been stated and I paraphrase rather flippantly, "bit perfect does not mean perfect bits".
A well engineered interface/DAC will take a bit stream with substantial timing imperfections, even if it is bit-perfect, and deliver an adequately near perfect stream that remains bit-perfect, reducing the timing imperfections in the process. A poorly engineered or outmoded design will not be able to achieve the same level of perfection, even if the output remains bit-perfect and free of errors.

I recently ran a null test of a DA/AD interface with and without a CPU stress test. The null RMS difference between the two (one unstressed and one stressed) was below -100 dB and consistent to within 1 dB over the entire 2-minute music track. Whatever jitter or noise was introduced by the CPU activity was well below the resolution of 44.1 kHz/16-bit audio. Just one sample, admittedly, but similar tests are easy for others to repeat.
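For anyone who wants to repeat it, here is a minimal sketch of the nulling arithmetic, assuming two already sample-aligned mono WAV captures of the same track (the file names are placeholders):

```python
import numpy as np
import soundfile as sf  # any WAV reader will do

a, fs = sf.read("loopback_idle.wav")      # capture with the CPU idle
b, _ = sf.read("loopback_stressed.wav")   # capture with the CPU stress test running

n = min(len(a), len(b))                   # assumes the captures are sample-aligned
diff = a[:n] - b[:n]                      # the null residual

rms_db = 20 * np.log10(np.sqrt(np.mean(diff ** 2)) + 1e-12)
print(f"Null residual: {rms_db:.1f} dBFS")  # the theoretical 16-bit noise floor is about -96 dB
```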
 
For the DAC output to be impaired in some sense all that is required is that the jitter profile of the recovered data be unacceptable, and all digital data has jitter.
Sorry, but that makes no sense at all. Digital data is just a sequence of values. A transmission of digital data may have jitter; the data itself does not. Consequently, the recovered data also cannot have a jitter profile, and what it doesn't have cannot influence the recipient (the DAC chip). Transmission jitter in excess of specified tolerances can of course result in erroneous data recovery. In this case, some bits will have the wrong value. The bits as such still do not have jitter any more than they have a colour.

Where non-excessive transmission jitter might conceivably affect the DAC output is if it somehow, perhaps through some intermodulation effect, gets translated into noise in the audible range that then couples into the analogue side of the DAC. Note, however, that I have yet to see evidence of this actually occurring to an audible extent.

The above notwithstanding, the idea that transmission jitter has audible effects still falls apart. USB is a packet based system operating at a fixed link speed of 480 Mbps. When playing audio, the link is idle apart from a short (less than 5 μs) burst every 125 μs. If transmission jitter is to have any effect, it will have to exert it during these bursts. Completely aside from jitter, the bursty nature of USB transmission does have an effect. During link activity, the power consumption in the USB receiver circuits increases, and the voltage on the data lines can cause stray currents outside the USB receiver. These effects can, whether through power supply ripple or ground noise, result in a readily measurable 8 kHz tone plus harmonics on the analogue output. Maybe link jitter can somehow influence this 8 kHz noise, though it seems doubtful. Even if it does, the effect is surely far lower in magnitude than the packet noise itself. Thus, if a particular DAC exhibits packet noise below whatever threshold is deemed acceptable, then USB link jitter will not have any influence on the sound quality either.
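As a toy numerical illustration of why that activity shows up as an 8 kHz tone plus harmonics, here is a sketch assuming the coupled disturbance is simply a 5 µs rectangular pulse repeating every 125 µs (the coupling level in a real DAC is of course far smaller and device-dependent):

```python
import numpy as np

fs = 1_000_000                                # 1 MHz analysis rate, 1 second of "activity"
t = np.arange(fs) / fs
burst = ((t % 125e-6) < 5e-6).astype(float)   # 5 µs of bus activity every 125 µs
burst -= burst.mean()                         # keep only the AC part that could couple through

spectrum = np.abs(np.fft.rfft(burst)) / len(burst)
freqs = np.fft.rfftfreq(len(burst), 1 / fs)
for k in range(1, 6):                         # first few harmonics of the 8 kHz packet rate
    idx = np.argmin(np.abs(freqs - 8000 * k))
    level = 20 * np.log10(2 * spectrum[idx] + 1e-12)
    print(f"{8 * k:>2} kHz: {level:6.1f} dB relative to the pulse amplitude")
```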
 
Nobody disputes that the information is the same, hence the mystery, at least on my systems. Logic and theory dictate that they should sound the same.
My explanation is that your ODAC (Objective DAC) only supports the USB Audio Class (1) protocol and is therefore obsolete!
The protocol is based on USB 1.1 and only supports isochronous data frames that may therefore suffer unrecoverable bit errors (thanks @wynpalmer for putting me on the right track here). Although the receiver can detect this via CRC, it cannot request retries.
Link: XMOS Ltd. posted an extensive explanation of the original USB audio protocol.

Another article explains how USB implementations might indeed suffer from jitter or other clock misalignment errors.
Link: Select your USB audio MCU with care - Scary stories from the test bench
When properly implemented, jitter is not a problem as the USB-audio receiver adjusts the data frame sizes according to the observed clock difference, and reports this back to the sender.

The bad news is that you may need to buy a new DAC to solve your problems, but the good news is you're not crazy! ;)

Modern devices will surely implement USB Audio Class 2.0 which is fully asynchronous and therefore jitter-free on the source side.
 
I recently ran a null test of a DA/AD interface with and without a CPU stress test. The null RMS difference between the two (one unstressed and one stressed) was below -100 dB and consistent to within 1 dB over the entire 2-minute music track. Whatever jitter or noise was introduced by the CPU activity was well below the resolution of 44.1 kHz/16-bit audio. Just one sample, admittedly, but similar tests are easy for others to repeat.
That conforms to my own tests/experiences using my RME ADI-2 PRO FS.
Sorry, but that makes no sense at all. Digital data is just a sequence of values. A transmission of digital data may have jitter; the data itself does not. Consequently, the recovered data also cannot have a jitter profile, and what it doesn't have cannot influence the recipient (the DAC chip). Transmission jitter in excess of specified tolerances can of course result in erroneous data recovery. In this case, some bits will have the wrong value. The bits as such still do not have jitter any more than they have a colour.

Where non-excessive transmission jitter might conceivably affect the DAC output is if it somehow, perhaps through some intermodulation effect, gets translated into noise in the audible range that then couples into the analogue side of the DAC. Note, however, that I have yet to see evidence of this actually occurring to an audible extent.

The above notwithstanding, the idea that transmission jitter has audible effects still falls apart. USB is a packet based system operating at a fixed link speed of 480 Mbps. When playing audio, the link is idle apart from a short (less than 5 μs) burst every 125 μs. If transmission jitter is to have any effect, it will have to exert it during these bursts. Completely aside from jitter, the bursty nature of USB transmission does have an effect. During link activity, the power consumption in the USB receiver circuits increases, and the voltage on the data lines can cause stray currents outside the USB receiver. These effects can, whether through power supply ripple or ground noise, result in a readily measurable 8 kHz tone plus harmonics on the analogue output. Maybe link jitter can somehow influence this 8 kHz noise, though it seems doubtful. Even if it does, the effect is surely far lower in magnitude than the packet noise itself. Thus, if a particular DAC exhibits packet noise below whatever threshold is deemed acceptable, then USB link jitter will not have any influence on the sound quality either.

Thank you, much appreciated. Let me say that I have experienced no audible degradation as a function of using the USB connection except when there are verifiable packet errors.
I feel that we are writing at cross purposes here, so my apologies and I'll try to be more precise.
Specifically, regarding the USB interface: yes, I am aware of the bursty nature and the 8 kHz rate.
I do, however, dispute that 8 kHz is the only frequency content on the 5 V USB supply due to PC activity. My own experience contradicts that, for whatever that is worth.
I do have a sort of question/comment for you.
The USB receiver must take the information from the packet/frame and transfer it one sample at a time to the DAC for processing at the correct rate. Surely this means that the input data has to be transferred into a data pipe using a clock, and then transferred into the DAC also using a clock.
Let's assume that the DAC produces an output which is linearly dependent on the data value, and also is dependent in some fashion on the duration of the value.
This means that the DAC converts a series of samples whose magnitude is the data but whose duration is set by the varying period of the clock; that variation is the "jitter". Effectively at that point you move from the data domain to the analog domain. The dependence on the clock may be something as simple as the duration of a zero-order hold. When the output is low-pass filtered, the variation in the duration of the zero-order hold appears as noise on the output waveform.
It's this idea which is at the heart of the crystal paper and which, potentially, creates a way for noise on a digital interface, whether due to variations in the data edges (if the clock is generated from the data) or in the clock itself (if the clock is generated from the host), to become noise that is audible at the output of a DAC.
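To put rough numbers on the zero-order-hold argument above, here is a minimal sketch using the standard first-order jitter model (error ≈ signal slope × timing error) rather than an explicit ZOH simulation; the test-tone frequency and jitter figures are illustrative assumptions, not measurements of any particular device:

```python
import numpy as np

def jitter_snr_db(signal_freq_hz: float, jitter_rms_s: float) -> float:
    """SNR limit of a full-scale sine converted with white sample-clock jitter."""
    return -20 * np.log10(2 * np.pi * signal_freq_hz * jitter_rms_s)

# A 10 kHz tone (near worst case in the audio band) with 1 ns RMS of clock jitter:
print(jitter_snr_db(10_000, 1e-9))    # ~84 dB, i.e. worse than 16-bit resolution
# The same tone with the ~10 ps RMS a decent local crystal oscillator achieves:
print(jitter_snr_db(10_000, 10e-12))  # ~124 dB, far below the 16-bit noise floor
```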
 
I recently ran a null test of a DA/AD interface with and without a CPU stress test. The null RMS difference between the two (one unstressed and one stressed) was below -100 dB and consistent to within 1 dB over the entire 2-minute music track. Whatever jitter or noise was introduced by the CPU activity was well below the resolution of 44.1 kHz/16-bit audio. Just one sample, admittedly, but similar tests are easy for others to repeat.
I've done the same test a few times. I also included moving the DAC further away from and as close as possible to the computer, in case some of the noise was radiated through the air. Same results both ways. The remaining difference between the two conditions was below -100 dB and about the same as running the test twice consecutively while changing nothing.
 
That conforms to my own tests/experiences using my RME ADI-2 PRO FS.


Thank you, much appreciated. Let me say that I have experienced no audible degradation as a function of using the USB connection except when there are verifiable packet errors.
I feel that we are writing at cross purposes here, so my apologies and I'll try to be more precise.
Specifically, regarding the USB interface: yes, I am aware of the bursty nature and the 8 kHz rate.
I do, however, dispute that 8 kHz is the only frequency content on the 5 V USB supply due to PC activity. My own experience contradicts that, for whatever that is worth.
I do have a sort of question/comment for you.
The USB receiver must take the information from the packet/frame and transfer it one sample at a time to the DAC for processing at the correct rate. Surely this means that the input data has to be transferred into a data pipe using a clock, and then transferred into the DAC also using a clock.
Let's assume that the DAC produces an output which is linearly dependent on the data value, and also is dependent in some fashion on the duration of the value.
This means that the DAC converts a series of samples whose magnitude is the data but whose duration is set by the varying period of the clock; that variation is the "jitter". Effectively at that point you move from the data domain to the analog domain. The dependence on the clock may be something as simple as the duration of a zero-order hold. When the output is low-pass filtered, the variation in the duration of the zero-order hold appears as noise on the output waveform.
It's this idea which is at the heart of the crystal paper and which, potentially, creates a way for noise on a digital interface, whether due to variations in the data edges (if the clock is generated from the data) or in the clock itself (if the clock is generated from the host), to become noise that is audible at the output of a DAC.

Maybe I misunderstand you, but most USB DACs obtain the data and then have their own clock for the DAC output. The clock isn't from the host, and it isn't from the data.

I have seen one explanation for how ASIO connections over USB can be bungled, still work and have lousy jitter, but that wasn't related to the PC or noise over the USB.
 
Did you compare WAV vs. FLAC in a memory playback configuration, if you have doubts about a possible interaction between processor load and audio performance?

As I wrote, a positive ABX of FLAC level 0 vs WAV would not happen, but it would with greater compression such as FLAC level 4. The audio information is the same in all cases, hence the mystery.

Hard disk playback, from a Western Digital Red NAS drive (it is worth spending a little more and having greater reliability, I think).
 
@Eirikur

This is the first time I have read a plausible explanation, thank you very much. I will read your links tomorrow.

However, I still believe that the audio information before being processed by the CPU is different from what comes out of it; from there on it does not change. But it is something I am unable to prove or verify.

If my hypothesis is true, it would not matter whether it were USB 1.1 or USB 2.0.

I can skip the DAC and send up to 24/192 via WiFi to my main audio system, which does have a DAC that complies with USB 2.0 and can play DSD too. I will try it tomorrow.
 
Maybe I misunderstand you, but most USB DACs obtain the data and then have their own clock for the DAC output. The clock isn't from the host, and it isn't from the data.

I have seen one explanation for how ASIO connections over USB can be bungled, still work and have lousy jitter, but that wasn't related to the PC or noise over the USB.
Boy this is turning into a can of worms...
Yes, I know that most DACs generate their own clock and reclock the input data using that clock before presenting it to the DAC. That's how they are able to get really low phase noise (jitter).
The clock has to have some interdependence with the host/interface, as otherwise there would eventually be failures in the data pipe: underflow or overflow.
In general terms, either the data transfer rate discrepancy has to be fed back to the host, which then deals with the problem by transferring more or less data per frame on average, or the clock is constructed in one way or another from the data, either by detecting an approaching overflow/underflow condition and incrementally changing the frequency, or by actually extracting the clock from the data transitions themselves or a derivative of them.
In any case, anything to do with the data interface that causes the jitter of the retiming/data transfer clock to degrade is a bad thing, but I can't pin down a mechanism that is explicitly associated with the USB port to explain it, besides some obscure noise coupling through the supply perhaps.
OK. This was a bad idea.
Objectivists win again :)
 
My explanation is that your ODAC (Objective DAC) only supports the USB Audio Class (1) protocol and is therefore obsolete!
The protocol is based on USB 1.1 and only supports isochronous data frames that may therefore suffer unrecoverable bit errors (thanks @wynpalmer for putting me on the right track here). Although the receiver can detect this via CRC, it cannot request retries.
Link: XMOS Ltd. posted an extensive explanation of the original USB audio protocol.

Another article explains how USB implementations might indeed suffer from jitter or other clock misalignment errors.
Link: Select your USB audio MCU with care - Scary stories from the test bench
When properly implemented, jitter is not a problem as the USB-audio receiver adjusts the data frame sizes according to the observed clock difference, and reports this back to the sender.

The bad news is that you may need to buy a new DAC to solve your problems, but the good news is you're not crazy! ;)

Modern devices will surely implement USB Audio Class 2.0 which is fully asynchronous and therefore jitter-free on the source side.
Bit errors result in the USB transfer no longer being bit-perfect, and in my experience errors are in general quite audible, not a subtle effect.
 
Let me say that I have experienced no audible degradation as a function of using the USB connection except when there are verifiable packet errors.
Same here. The only time I've experienced any degradation at all using USB was when I crafted a really poor cable just to see what would happen.

I do, however, dispute that 8 kHz is the only frequency content on the 5 V USB supply due to PC activity. My own experience contradicts that, for whatever that is worth.
The 5 V USB power (Vbus) can of course have all manner of noise on it. That has nothing to do with jitter, though.

The USB receiver must take the information from the packet/frame and transfer it one sample at a time to the DAC for processing at the correct rate. Surely this means that the input data has to be transferred into a data pipe using a clock, and then transferred into the DAC also using a clock.
Let's assume that the DAC produces an output which is linearly dependent on the data value, and also is dependent in some fashion on the duration of the value.
This means that the DAC converts a series of samples whose magnitude is the data but whose duration is set by the varying period of the clock; that variation is the "jitter". Effectively at that point you move from the data domain to the analog domain. The dependence on the clock may be something as simple as the duration of a zero-order hold. When the output is low-pass filtered, the variation in the duration of the zero-order hold appears as noise on the output waveform.
I'm going to assume we're talking about DACs with asynchronous USB interfaces here. Anything else is pretty much non-existent. I'm also going to assume a typical sigma-delta DAC chip with internal oversampling.

Whenever a USB packet is received, the sample data it contains is placed into a FIFO buffer. Separately, and continuously, samples are read from the other end of the FIFO buffer and transferred to the DAC chip. I'm assuming everything is working properly and the FIFO buffer never over- or underflows.
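A minimal sketch of that decoupling, assuming a plain deque as the FIFO and ignoring the start-up priming and underflow handling a real device needs:

```python
from collections import deque

fifo = deque()

def on_usb_packet(samples):
    # Called in bursts, once per 125 µs microframe, with e.g. 5 or 6 samples at 44.1 kHz.
    fifo.extend(samples)

def on_i2s_sample_request():
    # Called steadily, paced by the DAC's local master clock, one sample at a time.
    # A real device would mute or repeat a sample on underflow instead of raising.
    return fifo.popleft()
```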

The input to the DAC chip is generally an I2S link. This is a serial interface with three signals: LR (or word) clock, bit clock, and data. The data values are sampled on the rising edge of the bit clock while the LR clock edges indicate the boundaries of the sample values as well as which channel (left or right) they belong to. Within the DAC chip, this bit stream enters something equivalent to a serial-in/parallel-out shift register with the output latched on the rising LR clock edge. However, and this is important, the parallel samples emerging from this circuit do not go directly to the D/A conversion stage.

After deserialisation the audio samples enter the digital interpolation stage which, as the name implies, interpolates the data producing a sample rate typically 8x higher than the input. This is then further oversampled using zero-order hold to the rate of the sigma-delta modulator. The modulator output is what enters the D/A conversion stage, and here the timing is important.
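A rough sketch of that two-stage oversampling, with assumed ratios (48 kHz input, 8x interpolation, then a 64x zero-order hold up to a 24.576 MHz modulator rate); the exact ratios vary from chip to chip:

```python
import numpy as np
from scipy.signal import resample_poly

fs_in = 48_000
n = 480                                              # 10 ms of signal keeps the arrays small
x = np.sin(2 * np.pi * 1000 * np.arange(n) / fs_in)  # 1 kHz test tone at the input rate

x8 = resample_poly(x, up=8, down=1)                  # interpolation stage: 8x -> 384 kHz
x_mod = np.repeat(x8, 64)                            # zero-order hold: 64x -> 24.576 MHz
# x_mod is what the sigma-delta modulator quantises; its timing is governed by the
# local master clock, not by the timing of the I2S (or USB) input.
```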

The digital interpolation and sigma-delta modulator are operated from a separate master (or system) clock input, often 24.576 MHz when the audio sample rate is 48 kHz or a multiple thereof. Although the I2S input is typically (ESS being the notable exception) required to be synchronous with the master clock, there is enough internal buffering that some wavering is tolerated. For example, TI/Burr-Brown DACs only require that the LR clock remain within ±6 bit clock periods of the ideal.

In a well designed DAC, the master clock is located close to the DAC chip with ample power supply decoupling, perhaps even a dedicated regulator. This makes it difficult for activity at the USB receiver to cause jitter at the critical point, i.e. the D/A conversion stage. Even if the readout from the receiver FIFO buffer has some jitter, the data is reclocked again within the DAC chip.

Now someone might point out that switching noise in the I2S input buffer will be tied to whatever jitter is present on the bit clock. This is true. However, this input stage is tiny compared to all the other digital circuitry making up the interpolation filter and modulator. It stands to reason that any jitter-correlated noise will be swamped by switching noise from the rest of the chip, which as discussed is operated from a clean clock.

Of all the things that can adversely affect the output quality of a DAC, jitter on the USB link really should be very far down the list. If there is a problem, it is almost certainly caused by something else.

It's this idea which is at the heart of the crystal paper and which, potentially, creates a way for noise on a digital interface, whether due to variations in the data edges (if the clock is generated from the data) or in the clock itself (if the clock is generated from the host), to become noise that is audible at the output of a DAC.
In an asynchronous USB DAC, the clock is not derived from the host in any way whatsoever. The USB receiver does recover a clock signal from the NRZI data stream, but only for the purpose of extracting the data. This clock is not used for anything else.

In general terms, either the data transfer rate discrepancy has to be fed back to the host, which then deals with the problem by transferring more or less data per frame on average
That is exactly how it works. The receiver compares the received packet rate with an 8 kHz reference derived from the local clock and informs the host how many (fractional) samples per packet it wants. This average rate is then maintained by rounding up or down as needed for each packet. The desired rate is continuously recalculated to account for drifting clocks.
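A minimal sketch of that host-side packetisation, assuming (my numbers) 44.1 kHz audio over high-speed USB, i.e. a requested average of 5.5125 samples per 125 µs microframe, with a simple fractional accumulator doing the per-packet rounding:

```python
def packet_sizes(samples_per_packet: float, n_packets: int):
    """Yield integer packet sizes whose average tracks the requested fractional rate."""
    acc = 0.0
    for _ in range(n_packets):
        acc += samples_per_packet
        n = int(acc)        # whole samples to send in this packet
        acc -= n            # carry the fractional remainder forward
        yield n

sizes = list(packet_sizes(44_100 / 8_000, 80))  # 10 ms worth of microframes
print(sizes[:16])                               # a mix of 5s and 6s ...
print(sum(sizes) / len(sizes))                  # ... averaging close to the requested 5.5125
```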
 
Same here. The only time I've experienced any degradation at all using USB was when I crafted a really poor cable just to see what would happen.


The 5 V USB power (Vbus) can of course have all manner of noise on it. That has nothing to do with jitter, though.


I'm going to assume we're talking about DACs with asynchronous USB interfaces here. Anything else is pretty much non-existent. I'm also going to assume a typical sigma-delta DAC chip with internal oversampling.

Whenever a USB packet is received, the sample data it contains is placed into a FIFO buffer. Separately, and continuously, samples are read from the other end of the FIFO buffer and transferred to the DAC chip. I'm assuming everything is working properly and the FIFO buffer never over- or underflows.

The input to the DAC chip is generally an I2S link. This is a serial interface with three signals: LR (or word) clock, bit clock, and data. The data values are sampled on the rising edge of the bit clock while the LR clock edges indicate the boundaries of the sample values as well as which channel (left or right) they belong to. Within the DAC chip, this bit stream enters something equivalent to a serial-in/parallel-out shift register with the output latched on the rising LR clock edge. However, and this is important, the parallel samples emerging from this circuit do not go directly to the D/A conversion stage.

After deserialisation the audio samples enter the digital interpolation stage which, as the name implies, interpolates the data producing a sample rate typically 8x higher than the input. This is then further oversampled using zero-order hold to the rate of the sigma-delta modulator. The modulator output is what enters the D/A conversion stage, and here the timing is important.

The digital interpolation and sigma-delta modulator are operated from a separate master (or system) clock input, often 24.576 MHz when the audio sample rate is 48 kHz or a multiple thereof. Although the I2S input is typically (ESS being the notable exception) required to be synchronous with the master clock, there is enough internal buffering that some wavering is tolerated. For example, TI/Burr-Brown DACs only require that the LR clock remain within ±6 bit clock periods of the ideal.

In a well designed DAC, the master clock is located close to the DAC chip with ample power supply decoupling, perhaps even a dedicated regulator. This makes it difficult for activity at the USB receiver to cause jitter at the critical point, i.e. the D/A conversion stage. Even if the readout from the receiver FIFO buffer has some jitter, the data is reclocked again within the DAC chip.

Now someone might point out that switching noise in the I2S input buffer will be tied to whatever jitter is present on the bit clock. This is true. However, this input stage is tiny compared to all the other digital circuitry making up the interpolation filter and modulator. It stands to reason that any jitter-correlated noise will be swamped by switching noise from the rest of the chip, which as discussed is operated from a clean clock.

Of all the things that can adversely affect the output quality of a DAC, jitter on the USB link really should be very far down the list. If there is a problem, it is almost certainly caused by something else.


In an asynchronous USB DAC, the clock is not derived from the host in any way whatsoever. The USB receiver does recover a clock signal from the NRZI data stream, but only for the purpose of extracting the data. This clock is not used for anything else.


That is exactly how it works. The receiver compares the received packet rate with an 8 kHz reference derived from the local clock and informs the host how many (fractional) samples per packet it wants. This average rate is then maintained by rounding up or down as needed for each packet. The desired rate is continuously recalculated to account for drifting clocks.
So, modern DACs are indeed immune to these effects, as was originally surmised. Thanks very much for the explanation.
 
My explanation is that your ODAC (Objective DAC) only supports the USB Audio Class (1) protocol and is therefore obsolete!
The protocol is based on USB 1.1 and only supports isochronous data frames that may therefore suffer unrecoverable bit errors (thanks @wynpalmer for putting me on the right track here). Although the receiver can detect this via CRC, it cannot request retries.

On a quick read I have not found what you say in the linked PDF. I understood that retries could happen even in USB 1.1, hence the use of no audiophile cable (the problem is in the ODAC's power over USB, which I think can only cope with a limited amount of noise on the 5 V DC bus, hence the 150 kHz ferrite test, which proved successful, but that is not for this thread).
 
[PDF] https://www.xmos.com/download/Fundamentals-of-USB-Audio(1.0).pdf

[Attached screenshot from the PDF: Fundamentals-of-USB-Audio-CRC-no-resend.png]


It is a generic comment; it does not specify whether it applies to USB 1.1 or another version.
 
For the purposes of this discussion, USB 1.1 and USB 2.0 work exactly the same way. The only relevant difference is the higher data rate of USB 2.0. The latency of USB 2.0 can be lower due to the packet rate being 8x higher. The transaction types and their ability to resend corrupted packets are exactly the same.
 
For the purposes of this discussion, USB 1.1 and USB 2.0 work exactly the same way. The only relevant difference is the higher data rate of USB 2.0. The latency of USB 2.0 can be lower due to the packet rate being 8x higher. The transaction types and their ability to resend corrupted packets are exactly the same.
Yes, that is also my understanding. The USB protocols (all up to and including USB 3.0) do not include any ability to resend in isochronous mode, but the presence of a CRC-detected error can be reported to the host and can be used to analyze link statistics. It's not a requirement and as far as I'm aware not all DACs do it, but at least one DAC does...
 
But modern DACs, the ODAC included, are asynchronous, so there is a CRC and data resending if necessary.
The DAC sample clock may be asynchronous, but I believe that the interface remains isochronous as that mode is what is used to guarantee throughput.
As was described before by mansr, modern USB DACs use isochronous mode with asynchronous sample clock management.
This means that there is no data resend.
 