
Let's develop an ASR inter-sample test procedure for DACs!

Those are already irreversibly messed up. So garbage in, what do you expect to get out?

If you want consistent reconstruction of this music across different DAC implementations, just digitally attenuate it, say by 3 or 6 dB, and the intersample-over reconstruction problem is eliminated. It is not hard.
To call music that doesn't strictly adhere to a specification 'garbage' is incredibly disrespectful to the artists and the labels... especially when that specification didn't always exist.

I don't think it's a very good solution to put the onus on the end user to find the right level of attenuation to enjoy their music properly... if they can do it at all. What if your CD player doesn't have a volume control and is digitally connected to a DAC?
 
@pkane

Any valid PCM data should be decoded by a decoder. Simple as that.

The PCM format does not account for further oversampling issues: it's up to whoever oversamples (for example, manufacturers of oversampling DACs) to account for any issues; for that matter, any DSP implementer should account for issues they may introduce. This is just basic DSP practice, isn't it?
Also, mastering engineers face a gigantic variety of DACs that listeners will play music on, including non-oversampling ones. Are you saying that MEs are supposed to know each DAC implementation, know exactly what oversampling ratio and, most importantly, what oversampling filter it uses -- all of which give different peak readings? That sounds neither reasonable nor rational. Plus, there's what was just said: what about the gazillion pieces of music already out there?

All in all, manufacturers of oversampling DACs need to account for the issues introduced by oversampling a perfectly valid PCM signal. But since inter-sample values can theoretically be 'infinite', oversampling DACs cannot provide 'infinite' headroom. So there should be a practical headroom value that accounts for most material. Benchmark chose -3dB. Others advocate for -6dB. The Rolls-Royce of digital consoles, the Oxford R3, had 14dB of headroom! Because the designer was well-educated enough, and didn't put the blame on music makers and mastering engineers.
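For what it's worth, the 'theoretically infinite' part is easy to demonstrate numerically. Here's a small sketch of my own (not from Benchmark or the R3 designers; the sign(sinc) construction is the textbook worst case):

```python
import numpy as np

# Every sample is +/-1 (so the stream itself never exceeds 0 dBFS), with
# the signs chosen as sign(sinc(n - 0.5)) so that all the reconstruction
# filter's sinc lobes add up constructively at t = 0.5. The ideal
# reconstructed value there is sum(|sinc(n - 0.5)|), which grows roughly
# like log(N): no fixed headroom covers every mathematically valid file.
for n_samp in (10, 100, 1000, 10000):
    n = np.arange(n_samp) - n_samp // 2
    true_peak = np.sum(np.abs(np.sinc(n - 0.5)))  # np.sinc(t) = sin(pi*t)/(pi*t)
    print(f"{n_samp:6d} samples -> {20 * np.log10(true_peak):+5.1f} dBTP")
```

Of course, real music never looks like this; it just shows why any chosen headroom figure is a practical trade-off rather than a guarantee.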

Valid PCM data that represents a waveform that exceeds the limits specified in the format is not valid. PCM is not the end-goal, the actual waveform reproduction is. If the waveform being sampled by PCM exceeds the maximum allowed, then the PCM encoding is invalid, even if the individual samples in the PCM file are within the 0dBFS limit.
 
If the waveform being sampled by PCM exceeds the maximum allowed, then the PCM encoding is invalid.
Can you point me to that part of the PCM specs? Are you speaking about the ADC stage? If yes, then having a clean ADC stage is a non-issue, and it is not what’s being discussed here.
 
I think that "deliver files with sufficient headroom" isn't the answer. The answer is to specifically check for ISOs in production and scale/compress appropriately to ensure the waveform (not the samples at 44.1k) doesn't exceed 0dBFS. There's no guessing needed for the necessary headroom in this case.

So what's "an accurate test"? Again, just oversample the file, keep the samples floating point, and check for any exceeding the +/-1.0 level. Easily done, and I believe already built into some software, such as SoX for example. If there are no plugins that do this already, it's easily fixed (if I thought it was important, I could probably write one in a day or two :) )
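That check really is only a few lines. Here's a sketch of mine in Python (using scipy rather than SoX; the 4x ratio, the guard trimming and the fs/4 test tone are arbitrary choices of this illustration):

```python
import numpy as np
from scipy.signal import resample_poly

def true_peak_dbtp(x, ratio=4):
    """Estimate true peak by oversampling with a polyphase FIR and taking
    the max absolute value; edges are trimmed to skip filter transients."""
    up = resample_poly(np.asarray(x, dtype=float), ratio, 1)
    guard = 50 * ratio                      # drop start-up/tail ringing
    return 20 * np.log10(np.max(np.abs(up[guard:-guard])))

# Classic +3 dBTP case: a full-scale fs/4 sine sampled at 45 degrees of
# phase. Every sample is exactly +/-1 (0 dBFS), yet the underlying
# waveform peaks at sqrt(2), i.e. about +3.01 dBTP.
tone = np.tile([1.0, 1.0, -1.0, -1.0], 512)
print(f"sample peak: {20 * np.log10(np.max(np.abs(tone))):.2f} dBFS")
print(f"true peak:   {true_peak_dbtp(tone):+.2f} dBTP")
```

A real meter (e.g. per ITU-R BS.1770) adds details like minimum oversampling ratios and filter specs, but the principle is exactly this.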
My job isn't done if I just leave enough headroom. I also look specifically for true-peak (or oversampled, if you prefer) values. I do everything I can to ensure that new music and remasters are properly decoded, and I'm not the only one. True-peak meters, although imperfect, are already widely used today. This still doesn't solve the problem of all the music that's already been released, at least not without user intervention, educated guesses or blindly reducing levels. Again, what if your CD player doesn't have a volume control and is digitally connected to a DAC?

One other funny thing is that the use of true-peak limiters can sometimes lead to even more overshoot.
 
To call music that doesn't strictly adhere to a specification 'garbage' is incredibly disrespectful to the artists and the labels... especially when that specification didn't always exist.

I don't think it's a very good solution to put the onus on the end user to find the right level of attenuation to enjoy their music properly... if they can do it at all. What if your CD player doesn't have a volume control and is digitally connected to a DAC?
So your proposed solution to their "problem" is to buy new DACs? :facepalm:
 
The PCM format does not account for further oversampling issues: it's up to whoever oversamples (for example, manufacturers of oversampling DACs) to account for any issues; for that matter, any DSP implementer should account for issues they may introduce. This is just basic DSP practice, isn't it?
That's true. Now the question is, how do we check if all this is done properly? IMHO this is the problem posed by the OP, but the discussion is getting off topic.
 
So your proposed solution to their "problem" is to buy new DACs? :facepalm:
No. If the solution to this problem was simply to "buy a new DAC", then... which one? :D
How do you test if the new one will handle ISPs better than the other?
That's why I'm proposing to raise awareness by creating a test so that, like jitter, it becomes a non-issue even for cheap DACs.

That's true. Now the question is, how do we check if all this is done properly? IMHO this is the problem posed by the OP, but the discussion is getting off topic.
Thank you. I certainly didn't expect to get so many answers stating it's a non-issue, although I do understand where they come from. That's why I'm trying to answer every counter-argument.

So far @Sokel and @KSTR have posted the most helpful and informed responses. Many thanks for that. I like the idea of a 0dBFS test signal with +6dBTP values, ramping up in level to properly locate the clipping point.

@amirm, (or anyone else with an Audio Precision analyser, really) would it be possible to use such a signal in the standard test procedure? Or can you only use 'internal' tests from the AP? I'm not very familiar with this equipment.
 
Valid PCM data that represents a waveform that exceeds the limits specified in the format is not valid. PCM is not the end-goal, the actual waveform reproduction is. If the waveform being sampled by PCM exceeds the maximum allowed, then the PCM encoding is invalid, even if the individual samples in the PCM file are within the 0dBFS limit.
Allowed by whom?

BTW:

The Paradox of the Digital OVER

 
Valid PCM data that represents a waveform that exceeds the limits specified in the format is not valid. PCM is not the end-goal, the actual waveform reproduction is. If the waveform being sampled by PCM exceeds the maximum allowed, then the PCM encoding is invalid, even if the individual samples in the PCM file are within the 0dBFS limit.
That's true, but it's only a useful criterion for a direct and unprocessed mic (or other analog source) feed. The moment processing is involved, or the source is digital (synthesizer etc.), IS-overs are likely to happen.
@amirm, (or anyone else with an Audio Precision analyser, really) would it be possible to use such a signal in the standard test procedure? Or can you only use 'internal' tests from the AP? I'm not very familiar with this equipment.
APx can play any .wav file. The bigger problem is the detector algorithm which would need to be very specific, basically doing a "shape" comparison (null test) of the waveforms where IS-overs will be present against a recording done at a safe level (like well below -6dBFS for a +6dBFS test signal).
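To make the detector idea concrete, here's a toy simulation of such a null test. Everything here is hypothetical: `dac_sim` is a stand-in hard-clipping model, not any real DAC or the APx detector, and the 12 dB safe level is an arbitrary choice:

```python
import numpy as np
from scipy.signal import resample_poly

def dac_sim(x, headroom_db=0.0):
    """Toy DAC model: 4x oversampling FIR followed by hard clipping at
    full scale (plus whatever headroom the design reserves)."""
    up = resample_poly(np.asarray(x, dtype=float), 4, 1)
    limit = 10 ** (headroom_db / 20)
    return np.clip(up, -limit, limit)

# Test signal with ~+3 dBTP inter-sample overs: a full-scale fs/4 tone
# sampled at 45 degrees of phase (all samples exactly +/-1).
sig = np.tile([1.0, 1.0, -1.0, -1.0], 512)

hot = dac_sim(sig)                       # played at 0 dBFS: IS-overs clip
safe = dac_sim(sig * 10 ** (-12 / 20))   # played 12 dB down: no clipping
null = hot - safe * 10 ** (12 / 20)      # scale the safe capture up, subtract
resid_db = 20 * np.log10(np.sqrt(np.mean(null ** 2))
                         / np.sqrt(np.mean(hot ** 2)))
print(f"null residual: {resid_db:.1f} dB")  # large residual = clipping happened
```

With a real device you'd capture both playbacks through the analyzer and time-align them before subtracting, but the pass/fail logic is the same: a deep null means the DAC reconstructed the overs cleanly.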
 
User @ayane posted a music file which contains +6dBFS IS-overs. After a quick 4x upsampling in Adobe Audition, the highest peak looks like this:
[attachment: upsampled waveform around the highest peak]


This is a signal very close to fs/2 with a sort of "break" sample inserted (basically causing a 180-degree phase flip in that near-Nyquist frequency), readily observed in the original sample stream, and with the effect already visible in Audition's upsampled graphical representation:
[attachment: the original sample stream around the phase flip]
 
I'd like to see more testing on this, as I think DACs may not respond exactly as one thinks. Every ESS DAC I've tested (MOTU M4, Okto dac8 pro, MOTU Ultralite Mk5) doesn't show clipping when fed a +3 dBFS 11.025 kHz tone at 44.1 kHz, but rather shows an elevated noise floor. When digital attenuation is applied via the DAC volume control, this goes away.

[attachment: MOTU Ultralite Mk5, +3 dBFS 11.025 kHz tone at 44.1 kHz, 0 dB vs -3 dB volume control]


Michael
 
I guess there are two ways for us listeners to see this problem.

One way is to see it romantically and say that it shouldn't exist (like poverty, stupidity, etc.), or pretend it doesn't exist since we (supposedly) can't hear it. The other way is to find a way so it does not manifest (the visuals are strong; once you know a song causes something like this, it's hard to unsee).

The same goes for DAC designers: some deal with it, some don't.

My take on this is that, as long as there is such music out there and as long as we don't have a say in the recording, this is one of the instances where it must be dealt with.
Turning a blind eye to something that causes such mayhem (even if inaudible; other instances of such horrible graphs, even when inaudible, are faced with horror here) is not dealing with the raw truth.
 
My take on this is that, as long as there is such music out there and as long as we don't have a say in the recording, this is one of the instances where it must be dealt with.
Turning a blind eye to something that causes such mayhem (even if inaudible; other instances of such horrible graphs, even when inaudible, are faced with horror here) is not dealing with the raw truth.

Same - I don't get the, "oh, we should just ignore it because it doesn't happen very often" or "we can't fix the upstream so we just have to accept it downstream" viewpoints. I'd like to know which DACs handle this appropriately and which don't - just like those that properly reject jitter, have lower noise / higher DR, etc.
 
Benchmark shows a test using an AP analyzer comparing their older DAC1 to the DAC2, along with some samples of music with intersample overs.

 
The problem is: how do we deal with already released music that's not going to be remastered any time soon?
When you use a streaming service, don't disable volume normalization just because you read on the internet that it sounds bad.
 
[attachment: screenshot of the track]
This song came out in '94, and I liked it very much then. Now that I've just found out it has 3543 IS peaks above 0dBFS, does someone really expect me to start disliking it? Or to stop ignoring those IS overs? What's all the fuss about?
IS overs so what? Who cares? I don't...
 
When you use a streaming service, don't disable volume normalization just because you read on the internet that it sounds bad.
No one in this thread has ever mentioned that. Let's stay on topic please.

This song came out in '94, and I liked it very much then. Now that I've just found out it has 3543 IS peaks above 0dBFS, does someone really expect me to start disliking it? Or to stop ignoring those IS overs? What's all the fuss about?
IS overs so what? Who cares? I don't...
No one is telling you to stop liking or to stop listening to that song. But wouldn't you like to know if your current (or future) playback system is capable of handling it properly, without any more distortion than what's already baked in? :)

I'd like to see more testing on this, as I think DACs may not respond exactly as one thinks. Every ESS DAC I've tested (MOTU M4, Okto dac8 pro, MOTU Ultralite Mk5) doesn't show clipping when fed a +3 dBFS 11.025 kHz tone at 44.1 kHz, but rather shows an elevated noise floor. When digital attenuation is applied via the DAC volume control, this goes away.

[attachment: MOTU Ultralite Mk5, +3 dBFS 11.025 kHz tone at 44.1 kHz, 0 dB vs -3 dB volume control]
Thanks for sharing this, really interesting! Exactly the kind of informative analysis that would be great to see more often!

Same - I don't get the, "oh, we should just ignore it because it doesn't happen very often" or "we can't fix the upstream so we just have to accept it downstream" viewpoints. I'd like to know which DACs handle this appropriately and which don't - just like those that properly reject jitter, have lower noise / higher DR, etc.
Thanks, I have to admit I'm a little surprised at the apparent reluctance to measure and document how ISPs behave with different devices. Especially since we do have enough data to show that it's far from predictable.
 
No one is telling you to stop liking or to stop listening to that song. But wouldn't you like to know if your current (or future) playback system is capable to handle it properly, without any more distortion than what's already baked in? :)
Let's say I would, but what's that gonna change? I mean the knowing itself.
I used to not know that there was oxygen or nitrogen in the air, so the knowledge of their mutual prevalence did not make me stop breathing. Or to change my breathing in some way...
 
Let's say I would, but what's that gonna change? I mean the knowing itself.
I used to not know that there was oxygen or nitrogen in the air, so the knowledge of their mutual prevalence did not make me stop breathing. Or to change my breathing in some way...
What could change is better awareness of the issue for all users and audio enthusiasts.

If a DAC has a digital volume control, it would be nice to have a line in a review that says "be careful, you can get extra distortion from ISPs past -3dB on the volume control".
If a DAC has an analogue volume control, it would be nice to have a line in a review saying "be careful, you can get extra distortion from ISPs. Be sure to reduce the volume in your media player by x dB to experience it cleanly".
With a CD player, it would be nice to have a line in a review that says "be careful, the output can clip significantly on many commercial releases, with no way to prevent this" or "this unit handles ISPs very well".
 
What could change is better awareness of the issue for all users and audio enthusiasts.

If a DAC has a digital volume control, it would be nice to have a line in a review that says "be careful, you can get extra distortion from ISPs past -3dB on the volume control".
If a DAC has an analogue volume control, it would be nice to have a line in a review saying "be careful, you can get extra distortion from ISPs. Be sure to reduce the volume in your media player by x dB to experience it cleanly".
With a CD player, it would be nice to have a line in a review that says "be careful, the output can clip significantly on many commercial releases, with no way to prevent this" or "this unit handles ISPs very well".

"Better awareness of the issue" is irrelevant.
I proved it - I liked that song while not knowing it was distorting, and I like it the same knowing it's distorting. Meanwhile, I lowered the playback level (on the PC) by 3 dB so it wouldn't additionally distort my Topping D10 Balanced, and now I like it less because it plays softer, which is why I had to crank up the amp in return... which is exactly the fuss I was talking about.
 