
Are inter-sample overs the new audiophile bogeyman?

If you can’t hear a difference between 0dB and -4dB, that doesn’t mean that your equipment won’t produce inter-sample overs.
I recognise that - but my point wasn't that my kit doesn't produce them - my point was - whether it does or not - I've never heard anything that I think *can* be ISO in real-world listening. Hence my question:

Has anyone else? If so, with what music (including what master) at what time?

So far, as far as I've seen, only one person has stated they heard them noticeably from listening to music - but that was a device with overflow/wraparound errors. One other has said they can hear them when listening for them - but even then, difficult to detect.

A few others have stated conditions under which they believe they will be audible - without specifically stating that they have heard them.

So my conclusion - at least from the posts here - is that they are not a real world issue as far as audibility is concerned.

However, I accept @Sokel's point, that if we are chasing general noise and distortion below levels of (say) -80dB, and deciding not to recommend DACs that don't achieve -100dB, and dumping them in the orange section of the chart - then it makes no sense to ignore ISOs.
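The mechanism itself is easy to demonstrate numerically. Here is a minimal numpy sketch (my own illustration, not from any measurement in this thread) of the textbook fs/4 worst case: every stored sample sits exactly at 0 dBFS, yet the band-limited waveform between the samples peaks about 3 dB higher.

```python
import numpy as np

def oversample(x, factor):
    # Band-limited interpolation via zero-padding in the frequency domain;
    # exact for periodic signals whose energy sits on FFT bins.
    n = len(x)
    X = np.fft.rfft(x)
    padded = np.zeros(n * factor // 2 + 1, dtype=complex)
    padded[: len(X)] = X
    return np.fft.irfft(padded, n * factor) * factor

# A sine at fs/4 sampled exactly at its +/-1 points: every stored sample
# is at digital full scale, yet the underlying waveform reaches sqrt(2)
# between the samples.
x = np.tile([1.0, 1.0, -1.0, -1.0], 16)
y = oversample(x, 8)

sample_peak_db = 20 * np.log10(np.max(np.abs(x)))   # 0.0 dBFS
true_peak_db = 20 * np.log10(np.max(np.abs(y)))     # ~ +3.01 dBFS
```

A DAC's interpolation filter performs essentially the same reconstruction, which is why a filter path with no headroom above 0 dBFS can clip on such content even though no individual sample is "over".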
 
I guess a lot of people here do EQ pre-DAC and will either not hear them, or during EQ experimentation will hear general clipping and reduce the gain.
 
I guess a lot of people here do EQ pre-DAC and will either not hear them, or during EQ experimentation will hear general clipping and reduce the gain.
Yes, many applications also offer functions to prevent clipping, and even suggest a pre-gain you can set in a menu (negative 3 dB or so), before the music is even sent to any DAC or speaker. In the case of my actives, if the software is sound (pun intended), ISOs could possibly happen at 100% volume, but then you get other problems... like deafness and a fire... :)

So a digital volume saves the day.

Also, the general non-audiophile public, who use volume-normalised apps like Spotify and cast to a small Bluetooth speaker that certainly has digital volume, never see any ISOs, ever.

An audiophile insisting on "bit perfect" as a desirable feature (it's not) may actually get them :)
 
Also, the general non-audiophile public, who use volume-normalised apps like Spotify and cast to a small Bluetooth speaker that certainly has digital volume, never see any ISOs, ever.
However, devices using absolute volume may have issues. When absolute volume is enabled, the Bluetooth audio stream is always transmitted at full scale, and the volume is a command (0-127) controlling the internal volume of the Bluetooth device. As I mentioned earlier, lossy compression can produce samples greater than 0 dBFS, and a well-designed decoder should decode lossy audio into 32-bit float PCM. But many decoders decode into 16-bit int, which can cause clipping.

But your premise is Spotify. After loudness normalization there are generally no issues, because the volume has already been reduced. What one needs to be cautious about are apps without loudness normalization.
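The decoder difference can be sketched in a few lines. The sample values below are made up for illustration, not taken from any particular codec; the point is only that a float pipeline carries overs through to a later gain stage, while an integer pipeline must clamp them at conversion time.

```python
import numpy as np

# Hypothetical frame from a lossy decoder; values above 1.0 are legal in
# float but have no representation in 16-bit integer PCM.
decoded = np.array([0.31, 1.08, -1.15, 0.92], dtype=np.float32)

# Float pipeline: a loudness-normalisation gain (e.g. -5 dB) is applied
# before any integer conversion, so the overs survive intact.
normalised = decoded * 10 ** (-5 / 20)

# Int16 pipeline: conversion must clamp, so the overs become hard clipping
# baked into the stream before it ever reaches the DAC.
clipped = np.clip(decoded, -1.0, 1.0)
int16_pcm = np.round(clipped * 32767).astype(np.int16)
```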
 
Reading through all of this makes me appreciate the ADI-2/4 PRO SE even more, since it actually shows when sample peaks or true peaks hit the ceiling. It’s not that ISOs matter much in real listening, but it does show that RME understands how to measure this stuff properly and builds the headroom to handle it. It feels more like a sign of solid engineering than something that has any real impact on music.
 
I recognise that - but my point wasn't that my kit doesn't produce them - my point was - whether it does or not - I've never heard anything that I think *can* be ISO in real-world listening. Hence my question:

Has anyone else? If so, with what music (including what master) at what time?

I hear them a lot! It depends on the content though. I listen to a lot of metal, which is often mastered at or near 0 dB with a lot of ISO clipping, but it also shows up in many modern remasters made during the loudness war. CDs produced before the mid 90s are not much of an issue.

I find it very uncomfortable to listen to without reducing the DAC's output in some way, either via the DAC's volume control or via ReplayGain tags in the content. It's most noticeable in the hi-hat cymbals, but also in high notes from other instruments and in vocal sibilance.

Whether you can hear it or not depends on your hearing and your audio system.
 
A classic.
;)
They're also easily measurable, e.g.:

[attached measurement screenshot]
 
I hear them a lot! It depends on the content though. I listen to a lot of metal, which is often mastered at or near 0 dB with a lot of ISO clipping, but it also shows up in many modern remasters made during the loudness war. CDs produced before the mid 90s are not much of an issue.

I find it very uncomfortable to listen to without reducing the DAC's output in some way, either via the DAC's volume control or via ReplayGain tags in the content. It's most noticeable in the hi-hat cymbals, but also in high notes from other instruments and in vocal sibilance.

Whether you can hear it or not depends on your hearing and your audio system.
Are you hearing intersample overs or clipping?
 
Are you hearing intersample overs or clipping?
Clipping distortion as a result of intersample overs. It's most obvious in the higher frequencies of low-DR/heavily compressed music when played on revealing headphones or speakers, such as HD660S headphones and Audio Physic speakers. I can still hear it on the Focal Chorus 706V speakers and JBL 305P monitors though, depending on the content. It's less obvious when using a roll-off filter in the DAC, as the attenuation of the higher frequencies helps mask its audibility.

It's different from "baked in" clipping in the recording, as it's easily resolved by reducing the gain in the digital source or the DAC.
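To put a number on "reducing the gain": for the common fs/4 worst case the inter-sample peak reaches sqrt(2), i.e. about +3.01 dBFS, so a touch more than 3 dB of digital attenuation keeps the reconstructed waveform below full scale for that case. This is a rule of thumb rather than a hard bound (lossy-encoded or pathological material can exceed +3 dB), and the helper below is my own sketch, not anything from this thread:

```python
import math

# Worst-case inter-sample peak of a full-scale sine at fs/4, sampled at
# its +/-1 points: sqrt(2) in linear terms.
worst_case_db = 20 * math.log10(math.sqrt(2))   # ~ +3.01 dBFS

def headroom_ok(attenuation_db, peak_db=worst_case_db):
    """True if this much digital attenuation keeps the true peak below 0 dBFS."""
    return peak_db - attenuation_db < 0.0

headroom_ok(3.0)   # False: -3 dB falls just short for this worst case
headroom_ok(3.5)   # True
```

This is why pre-gain suggestions in DSP software tend to be around -3 dB to -3.5 dB rather than some tiny nudge.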
 
I know what they are - I know how they happen.

I've never knowingly heard distortion coming from intersample overs. When I first got my MiniDSP flex, I set the input gain to -3dB to avoid any risk. I recently set it back to 0dB and:

1 - Have not noticed any difference
2 - Have not ever heard anything I could attribute to intersample overs.

Is it not the case that only badly mastered music (basically clipping in any case) will cause them? And even then, if it only happens for an extremely brief period in the music, it is going to be almost impossible to detect audibly in any case.

Yet there seems to be all sorts of FUD talked about it in conjunction with DACs. We even have a whole thread discussing how to test DACs for it. It feels to me like it is the "new jitter": something people can hang their hat of audible differences on, without it being an actual problem in reality.

What am I missing?

Has anyone else heard the effect of inter-sample overs - if so, which track, at what time? And what does it sound like?
Well, we do need a new audiophile bogeyman to take the place of transient intermodulation distortion I guess.

The issue is more germane to mastering, though many, if not most, mastering engineers in rock and pop deliberately clip and crush the audio peaks.
 
Clipping distortion as a result of intersample overs. It's most obvious in the higher frequencies of low-DR/heavily compressed music when played on revealing headphones or speakers, such as HD660S headphones and Audio Physic speakers. I can still hear it on the Focal Chorus 706V speakers and JBL 305P monitors though, depending on the content. It's less obvious when using a roll-off filter in the DAC, as the attenuation of the higher frequencies helps mask its audibility.

It's different from "baked in" clipping in the recording, as it's easily resolved by reducing the gain in the digital source or the DAC.
How do you know you're not hearing clipping due to hard limiting in the mastering process? Which albums do you notice this with?
 
How do you know you're not hearing clipping due to hard limiting in the mastering process? Which albums do you notice this with?
It's different from "baked in" clipping in the recording, as it's easily resolved by reducing the gain in the digital source or the DAC.
 
Examples of albums?
Almost any rock/metal album released from around the mid 90s onwards, due to the typically low-DR, loud mastering. Many modern remasters (mid 90s to 2010s) of classic rock albums exhibit it, although that's been improving in recent years (since about the 2010s). I'd assume most other music genres too. Basically, if it's a relatively modern, low-DR, digitally compressed recording with peaks at or near 0 dB, then it's likely to have issues with ISOs. For me, it's most noticeable on recordings with forward hi-hat cymbals and vocal sibilance.

An example of a well-known album that comes to mind to use as a test is Dire Straits - Brothers in Arms. The original 1985 master is low DR without ISOs. The 1996 master has lots of ISOs. The 2005 20th anniversary remaster (CD layer of the SACD) is even worse. Then the recent 40th anniversary remaster has none. To test/demonstrate the effect, compare listening to the hi-hat cymbals on Money For Nothing from around 2:30 on the original or 40th anniversary remaster with the 1996 or 2005 versions. Make sure it's played at 0 dB gain in the source/DAC. Also ensure that the DAC doesn't use ASRC for volume control, and that the DAC doesn't already provide its own internal headroom to mitigate ISOs. The 1985 and 2025 releases are your baseline of what the hi-hat cymbals should sound like. On the 1996 and 2005 remasters they should sound what I'd describe as "mushy", harsher, and stifled, with less air compared to the 1985 and 2025 versions. Then reduce the digital gain by ~3 dB and listen to the 1996/2005 remasters again (obviously volume-match at the amplifier). The hi-hat cymbals should sound a lot clearer, more precise, and with more "air".

This is not the best example of the ISO issue, as its audibility is relatively subtle, and of course you could simply choose to play the better-mastered versions and not worry about it. However, we don't have that option for the vast majority of albums released since roughly the mid 90s. We can't do much about the dynamic compression, but at least we can do this.

I'll post better examples to test with as they come to mind.
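In the meantime, a quick way to screen material yourself is to oversample a buffer of samples and count points above full scale. A rough numpy sketch in the spirit of BS.1770 true-peak metering (a screening tool, not a compliant meter; the FFT interpolation assumes a roughly periodic buffer):

```python
import numpy as np

def count_intersample_overs(x, factor=4):
    """Count oversampled points above digital full scale. 4x is the
    oversampling factor ITU-R BS.1770 uses for true-peak metering."""
    n = len(x)
    X = np.fft.rfft(x)
    padded = np.zeros(n * factor // 2 + 1, dtype=complex)
    padded[: len(X)] = X
    y = np.fft.irfft(padded, n * factor) * factor
    return int(np.sum(np.abs(y) > 1.0))

hot = np.tile([1.0, 1.0, -1.0, -1.0], 16)   # 0 dBFS samples, ~ +3 dB true peak
quiet = 0.7 * hot                           # ~ -3.1 dB of pre-gain applied

count_intersample_overs(hot)    # > 0: interpolated waveform exceeds full scale
count_intersample_overs(quiet)  # 0: the attenuation removed the overs
```

Running this over a decoded track, block by block, flags the same condition the gain reduction above is meant to avoid.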
 
Almost any rock/metal album released from around the mid 90s onwards, due to the typically low-DR, loud mastering. Many modern remasters (mid 90s to 2010s) of classic rock albums exhibit it, although that's been improving in recent years (since about the 2010s). I'd assume most other music genres too. Basically, if it's a relatively modern, low-DR, digitally compressed recording with peaks at or near 0 dB, then it's likely to have issues with ISOs. For me, it's most noticeable on recordings with forward hi-hat cymbals and vocal sibilance.

An example of a well-known album that comes to mind to use as a test is Dire Straits - Brothers in Arms. The original 1985 master is low DR without ISOs. The 1996 master has lots of ISOs. The 2005 20th anniversary remaster (CD layer of the SACD) is even worse. Then the recent 40th anniversary remaster has none. To test/demonstrate the effect, compare listening to the hi-hat cymbals on Money For Nothing from around 2:30 on the original or 40th anniversary remaster with the 1996 or 2005 versions. Make sure it's played at 0 dB gain in the source/DAC. Also ensure that the DAC doesn't use ASRC for volume control, and that the DAC doesn't already provide its own internal headroom to mitigate ISOs. The 1985 and 2025 releases are your baseline of what the hi-hat cymbals should sound like. On the 1996 and 2005 remasters they should sound what I'd describe as "mushy", harsher, and stifled, with less air compared to the 1985 and 2025 versions. Then reduce the digital gain by ~3 dB and listen to the 1996/2005 remasters again (obviously volume-match at the amplifier). The hi-hat cymbals should sound a lot clearer, more precise, and with more "air".

This is not the best example of the ISO issue, as its audibility is relatively subtle, and of course you could simply choose to play the better-mastered versions and not worry about it. However, we don't have that option for the vast majority of albums released since roughly the mid 90s. We can't do much about the dynamic compression, but at least we can do this.

I'll post better examples to test with as they come to mind.
I can stream the lossless version from Apple Music with my RME ADI 2/4 Pro SE DAC, and it will clearly show any OVR from the source. I have seen this a few times. Red Hot Chili Peppers has some hot tracks. I do not see this with the Dire Straits album versions you mentioned. When this occurs and I notice it visually on the meter, I can often find a different version without the clipping, and I replace the offending album with the cleaner version.
 
Please note that the level meters on the ADI series do not show ISOs. We also never claimed them to do that. They do show overs in the traditional way (either single or multiple digital full scale samples). Of course, as an owner of an ADI-2/4 you can use DigiCheck's level meters, where ISOs can be easily verified as levels above 0 dBFS.
 
Please note that the level meters on the ADI series do not show ISOs. We also never claimed them to do that. They do show overs in the traditional way (either single or multiple digital full scale samples). Of course, as an owner of an ADI-2/4 you can use DigiCheck's level meters, where ISOs can be easily verified as levels above 0 dBFS.
Yes, and thanks.
 
I've never consciously heard it. I believe it's real, but I'm not sure it matters. Compared to the artifacts of extreme compression and limiting, it's a non-issue.
 