I think the case has been made. The evidence is here. It’s a real issue.
If anyone wants to argue otherwise, then please provide evidence to the contrary. After all, that is what science is all about.
My humble take is that we should measure for inter-sample over clipping and let customers know about the performance of their devices.
It would also be great to measure whether lowering a particular DAC’s digital volume can indeed prevent clipping, since AFAIU that is not always the case.
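For anyone who wants to play with this, here is a minimal numpy sketch (my own illustration, not ASR’s actual test procedure) of the classic worst case: an fs/4 sine with a 45° phase offset, whose samples all sit exactly at 0 dBFS while the reconstructed waveform between them peaks about 3 dB above full scale. Estimating the true peak by oversampling is exactly what a “measure for inter-sample overs” test would do:

```python
import numpy as np

fs = 44100
n = np.arange(4096)
# Classic worst case: an fs/4 (11.025 kHz) sine sampled with a 45-degree phase
# offset. Every sample lands at exactly +/-1.0 (0 dBFS), yet the continuous
# waveform between the samples peaks ~3 dB above full scale.
x = np.sqrt(2) * np.sin(2 * np.pi * (fs / 4) * n / fs + np.pi / 4)

sample_peak_db = 20 * np.log10(np.max(np.abs(x)))   # ~0.00 dBFS

# Estimate the true peak with 8x oversampling via FFT zero-padding
# (ideal band-limited interpolation).
factor = 8
X = np.fft.rfft(x)
X_up = np.zeros(len(x) * factor // 2 + 1, dtype=complex)
X_up[: len(X)] = X
x_up = np.fft.irfft(X_up, n=len(x) * factor) * factor

true_peak_db = 20 * np.log10(np.max(np.abs(x_up)))  # ~ +3.01 dBFS
print(f"sample peak {sample_peak_db:+.2f} dBFS, true peak {true_peak_db:+.2f} dBFS")
```

A DAC whose oversampling filter has no headroom above 0 dBFS has to clip (or otherwise mangle) that reconstructed +3 dB peak.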
I respectfully disagree with @amirm’s take that measuring clipping would incentivize manufacturers to lower the dynamic range of their products. If the argument is that ASR has an influence on product development, then I believe this issue can be solved by presenting the data in a balanced manner. We could simply write in a review that “this product does not have oversampling headroom, so it preserves dynamic range at the expense of potential digital clipping from inter-sample overs”, or that “this product has oversampling headroom, which sacrifices some dynamic range in order to prevent most digital clipping from inter-sample overs”. That way the reader is aware of the benefits and downsides of either approach.
Ideally, it would be great if the customer could *choose* between the increased dynamic range and the clipping protection. Something like a “-3 dB pad” option/button before the oversampling stage in DACs and SRCs could put an end to most concerns.
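To sketch what such a pad would do (again, just my own back-of-the-envelope illustration, not a claim about any shipping DAC), here is the same worst-case fs/4 tone with and without a -3 dB attenuation applied before the interpolation stage:

```python
import numpy as np

def true_peak_dbfs(x, factor=8):
    """Estimate true peak via FFT zero-padding (ideal band-limited interpolation)."""
    X = np.fft.rfft(x)
    X_up = np.zeros(len(x) * factor // 2 + 1, dtype=complex)
    X_up[: len(X)] = X
    x_up = np.fft.irfft(X_up, n=len(x) * factor) * factor
    return 20 * np.log10(np.max(np.abs(x_up)))

n = np.arange(4096)
# fs/4 sine, 45-degree phase: 0 dBFS sample peaks, ~+3 dB true peak.
x = np.sqrt(2) * np.sin(np.pi * n / 2 + np.pi / 4)

pad = 10 ** (-3.0 / 20)        # hypothetical "-3 dB pad" before oversampling
p_raw = true_peak_dbfs(x)      # ~ +3.01 dBFS: clips a unity-headroom interpolator
p_pad = true_peak_dbfs(pad * x)  # ~ +0.01 dBFS: almost, but not quite, enough
print(f"no pad {p_raw:+.2f} dBFS, with pad {p_pad:+.2f} dBFS")
```

Note that even for this textbook case the worst-case over is +3.01 dB, so a round -3 dB pad leaves a hair of residual overshoot, and pathological signals can exceed +3 dB, which illustrates why attenuation does not always fully prevent clipping.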
About the audibility issue: it is probably true that inter-sample-over clipping won’t be detected by the majority of listeners (I’m pretty sure I couldn’t tell). However, the same could be said of digital clipping at the mastering stage. Why is it that if I produce a song with true peaks at +0.1 dBFS my master is suddenly “illegal”, yet if my DAC produces the same clipping it’s a non-issue? To me, this argument falls apart in spectacular fashion.
If digital clipping is to be avoided, then that’s that. End of story.
Of course, as we’ve seen in this thread, clipping is not the only problem caused by inter-sample overs, and much worse things can happen. Again, we’re talking about effects that may not be easily audible, but personally I still care about signal preservation. I want my converter to take a “what comes in, comes out” approach to the audio I’m feeding it.