
Suggestion for reviews: Extend the frequency range for 'THD+N vs Freq.' test

BeerBear

The current 'THD+N Ratio vs Frequency' test in ASR reviews covers 20Hz-20kHz.
I think it would be useful to extend that range, especially into the lower, subsonic region. (The APx555 can go down to 0.5Hz.)

Why? Because an audio device should handle the full spectrum gracefully, without a significant increase in noise and distortion (spoiler: that's not always the case).
And there's legit audio content out there that includes strong subsonic frequencies. It's not very common, but it's not extremely rare either.


Some background:
A few years ago someone noticed that the Apple USB-C dongle produces audible artifacts when playing strong low frequency content. I then proposed a subsonic test, because I worried that it might be a more widespread issue. As it turns out, my worries were justified, as it looks like there's an entire range of devices that suffer from this problem.
But I no longer think a separate subsonic test is necessary (edit: although it could still be helpful in some cases). Updating the current 'THD+N vs Frequency' test to include the lower frequencies would be enough.
Here's an example for one of those Cirrus Logic DACs (source):
[image: THD+N vs. frequency graph for a Cirrus Logic DAC]

Notice the big spike under 9Hz. It confirms my own audible testing with the Apple dongle.
It would be nice to be able to quickly spot problems like these.
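For what it's worth, the number such an extended sweep produces can be sketched in a few lines. This is a rough FFT-based THD+N estimate (my own toy code, not how the APx555 or ASR's setup actually works); the point is that a long enough capture makes even a 5Hz fundamental resolvable:

```python
import numpy as np

def thd_n_db(signal, fs, f0, bins=3):
    """THD+N in dB: everything in the spectrum except the fundamental, vs. the fundamental."""
    windowed = signal * np.hanning(len(signal))
    power = np.abs(np.fft.rfft(windowed)) ** 2
    k0 = int(round(f0 * len(signal) / fs))       # fundamental bin
    fund = power[k0 - bins:k0 + bins + 1].sum()  # fundamental plus its window skirt
    rest = power[1:].sum() - fund                # skip DC, keep noise + all harmonics
    return 10 * np.log10(rest / fund)

fs = 48_000
t = np.arange(20 * fs) / fs                      # 20 s capture: resolves even a 5 Hz tone
clean = np.sin(2 * np.pi * 5.0 * t)
dirty = clean + 0.01 * np.sin(2 * np.pi * 10.0 * t)   # add a 1% (-40 dB) 2nd harmonic

print(thd_n_db(clean, fs, 5.0), thd_n_db(dirty, fs, 5.0))   # dirty should land near -40 dB
```

The 1% harmonic is synthetic, just to show the estimator reads the expected -40dB at a 5Hz fundamental.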
 
- Add a 20kHz Curve to the Power vs. Distortion Chart -

o It would be valuable to include 20kHz in the THD+N vs. frequency curves. Poor results at 20kHz often indicate other issues.

o Some Class D power amplifiers have much poorer results at 20kHz, so this frequency can provide a differentiator between units. It can also show progress as Class D units continue to mature.

o Stereophile even found significantly higher distortion in an Emotiva non-Class D power amplifier at 20kHz. Current ASR measurements would not have found this issue.

o Distortion at 20kHz provides insight into the feedback structure and tradeoffs made in Class AB power amplifiers.
 
Poor results at 20kHz often indicate other issues.
Such as?

Distortion products above 10kHz are out of the audible range and, aside from some concern about intermodulation distortion, which has never been shown to be an audible problem AFAIK, generally aren't a concern. You also often see elevated THD at high frequencies if the tested bandwidth is large, because you're capturing inaudible noise shaping or switching noise.
 
I'll just say I think subsonic reproduction is infinitely more important than ultrasonic, and although it's not very prevalent in most recordings, it's totally plausible that excess distortion under 20hz would cause audible problems. I think this is a pretty reasonable proposal.
 
although it's not very prevalent in most recordings, it's totally plausible that excess distortion under 20hz would cause audible problems. I think this is a pretty reasonable proposal.
I think it has to be more than just "excess" distortion though. I don't think even 100% distortion is much of an issue when we are talking infrasonics. It basically has to be broken at those frequencies. A simple check by playing back a file with test tones (or a tone generator) to make sure it doesn't make audible clicks or pops would be sufficient I think.
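The listening check could literally be a generated file. Here's a rough Python sketch (the file name, level, and frequencies are just my picks, not a proposed standard):

```python
import math
import struct
import wave

def write_tone_file(path, freqs, fs=48_000, seconds=3.0, level=0.5):
    """Write a mono 16-bit WAV with one sine tone per entry in freqs, back to back."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)          # 16-bit PCM
        w.setframerate(fs)
        for f in freqs:
            frames = b"".join(
                struct.pack("<h", int(level * 32767 * math.sin(2 * math.pi * f * n / fs)))
                for n in range(int(seconds * fs))
            )
            w.writeframes(frames)

write_tone_file("infrasonic_check.wav", [5, 8, 12])   # tones below 20 Hz
```

If the device is well behaved, playback should be essentially silent; any clicks or pops mean trouble.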
 
I don't think even 100% distortion is much of an issue when we are talking infrasonics.
It might be, 100% distortion at 10hz gets us a potentially very audible harmonic at 20hz and maybe 30hz. Also, if infrasonic performance is really wacky in some other way, it can affect transients of any description if they're not filtered in the recording. I am not saying infrasonics are a very pressing issue most of the time, but they're not a fake problem we can easily ignore like ultrasonic noise or something.

I would even say it's an underrated problem because people are just very accustomed to lots of THD in the sub-bass range. I never heard perfectly clean ~20hz or infrasound until I got a pair of Audeze LCD-XCs... what I considered "clean bass" for many years really wasn't.
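To put a number on the 10hz example: hard-clip a 10hz tone and look at where the distortion energy lands. With symmetric clipping the products are odd harmonics, so the first one sits at 30hz, inside the audible band (asymmetric distortion would also put energy at 20hz). A toy sketch:

```python
import numpy as np

fs = 1000
t = np.arange(10 * fs) / fs                     # 10 s capture: 0.1 Hz resolution
tone = np.sin(2 * np.pi * 10.0 * t)
clipped = np.clip(1.5 * tone, -1.0, 1.0)        # gross symmetric clipping

spectrum = np.abs(np.fft.rfft(clipped * np.hanning(len(clipped))))
freqs = np.fft.rfftfreq(len(clipped), 1 / fs)
h1 = spectrum[np.argmin(np.abs(freqs - 10.0))]  # fundamental
h2 = spectrum[np.argmin(np.abs(freqs - 20.0))]  # 2nd harmonic: ~absent for a symmetric clip
h3 = spectrum[np.argmin(np.abs(freqs - 30.0))]  # 3rd harmonic: well inside the audible band
print(h3 / h1)                                  # → roughly 0.15, i.e. ~15% third harmonic
```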

A simple check by playing back a file with test tones (or a tone generator) to make sure it doesn't make audible clicks or pops would be sufficient I think.

Arguably yes, if you pump ~12hz into a device and hear nothing coming out... nothing to worry about? But since we're mostly talking about graphs and not ears-on testing, I don't see any harm in starting THD sweeps at 5 or 10hz for electronics instead of 20hz.
 
What we've seen with the Cirrus Logic infrasonic and low audible frequency behaviours is not really obvious with THD and even IMD measurements, because DRE is switched in and out with content level. In many ways it's deliberately or accidentally designed to be invisible to THD testing. Perhaps there needs to be some synthetic, dynamic "gotcha" signal tests.

At high frequencies, only children can hear distortion of a 10 kHz fundamental, so distortion of a 20 kHz fundamental tells us nothing about audibility.
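One possible shape for such a "gotcha" signal: a low-frequency tone whose level steps up and down, so that any level-dependent gain switching like DRE is forced to act mid-signal. All the parameters below are guesses on my part, not any standard test:

```python
import numpy as np

def stepped_level_tone(f0=8.0, fs=48_000, step_seconds=1.0,
                       levels_db=(-60, -20, -60, -5, -60)):
    """One low-frequency tone played at alternating levels; the jumps at the
    seams are what should force a DRE-style gain switch to act mid-signal."""
    n = int(step_seconds * fs)
    chunks = []
    for i, db in enumerate(levels_db):
        t = (np.arange(n) + i * n) / fs        # continuous time -> continuous phase
        chunks.append(10 ** (db / 20) * np.sin(2 * np.pi * f0 * t))
    return np.concatenate(chunks)

sig = stepped_level_tone()
```

Capturing the output of this and looking for artifacts around the level transitions would be one way to make the misbehavior show up on a graph.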
 
As far as the DRE artifacts are concerned, I would propose the 32-tone multitone level sweep test and the CMaj test used here.

These tests should really be standard in reviews going forward, so that we can avoid situations like the FiiO KA11 review, where the dongle got a "Great (golfing panther)" score even though it really shouldn't have, given what we now know.

Continuing with the usual 0dBFS 1kHz sine and multitone puts us in the territory of "lying by omission" and that's really not okay.

That's not to say that the CS dongles are bad or unlistenable, but presenting them as SoTA DACs when they're clearly not is another story.
 
I don't think we need to add tests designed to catch a specific issue from a specific DAC chip (that was implemented incorrectly by the OEMs). It's not "lying by omission" to use standard tests that characterize performance adequately for the vast majority of devices.
 
- Add a 20kHz Curve to the Power vs. Distortion Chart -
At what bandwidth? At up to 39.99999 kHz, it would still not capture any harmonics of that frequency. If you assume to capture third harmonic, bandwidth would have to go up to 60 kHz. There, you would be capturing noise shaping and ultrasonic noise that would severely punish class D amplifiers.

For the above reasons, I have carefully set up the test to use a 45 kHz bandwidth, which gives us plenty of trend data for treble frequencies. It took a lot of experimentation to arrive at this test.
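The bandwidth trade-off comes down to simple arithmetic: which harmonics of a given fundamental land inside the analyzer bandwidth at all. A trivial check:

```python
def harmonics_captured(f0_hz, bandwidth_hz, max_order=5):
    """Harmonic orders of f0 that fall within the measurement bandwidth."""
    return [n for n in range(2, max_order + 1) if n * f0_hz <= bandwidth_hz]

print(harmonics_captured(20_000, 45_000))   # → [2]: only the 40 kHz 2nd harmonic fits
print(harmonics_captured(20_000, 60_000))   # → [2, 3]: 3rd harmonic needs 60 kHz
print(harmonics_captured(6_000, 45_000))    # → [2, 3, 4, 5]
```

So a 45 kHz bandwidth captures only the 2nd harmonic of a 20 kHz fundamental, while opening up to 60 kHz (to catch the 3rd) also pulls in the ultrasonic noise floor of Class D amplifiers.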
 
I think it would be useful to extend that range, especially into the lower, subsonic, region. (The APx555 can go down to 0.5Hz.)
No one is asking these amplifiers to power 0.5 Hz. Or even 20 Hz.
Notice the big spike under 9Hz. It confirms my own audible testing with the Apple dongle.
You are showing -80 dB at such low frequencies. What exactly did you hear?
 
It would be nice to be able to quickly spot problems like these.
You already are seeing a rising trend in low frequencies -- something that I almost always note in reviews.
 
Stereophile even found significantly higher distortion in an Emotiva non-Class D power amplifier at 20kHz. Current ASR measurements would not have found this issue.
If you mean this:

[image: 817Emotivafig06.jpg (Stereophile Emotiva distortion measurement)]


I don't see any test condition, i.e. bandwidth.

Here is my review of an Emotiva amp clearly showing rail switching issues:

[graph: ASR Emotiva amplifier measurement showing rail switching artifacts]
 
I don't think we need to add tests designed to catch a specific issue from a specific DAC chip (that was implemented incorrectly by the OEMs). It's not "lying by omission" to use standard tests that characterize performance adequately for the vast majority of devices.

What? I don't know if you've noticed, but these chips are everywhere and every single one of them is implemented incorrectly (some more than others). Only a couple of products have received patches that aim to fix performance.

No one listens to 1kHz sine waves. We listen to real music and audio content. And the sound defects of these chips only show up when music-like content that varies in amplitude is played (which means: all the time).

Tests should be designed to be representative of real use cases. Otherwise, what's the point?

I really wasn't expecting anyone to disagree...
 
What we've seen with the Cirrus Logic infrasonic and low audible frequency behaviours is not really obvious with THD and even IMD measurements, because DRE is switched in and out with content level. In many ways it's deliberately or accidentally designed to be invisible to THD testing. Perhaps there needs to be some synthetic, dynamic "gotcha" signal tests.
Right. I've now retested the Apple dongle with 1-8Hz sine tones, and it's possible to find pockets where the artifacts are inaudible or much quieter. This is definitely not typical distortion, which behaves predictably and always increases with level.
That said, those pockets are very few, and if you run a full sweep down there (at -15dB or higher) you're guaranteed to spot the bad behavior at multiple frequencies. You might not catch every bad DAC/amp this way, but it's a start.
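For anyone who wants to repeat this, the sweep itself can be generated in a few lines of numpy (a logarithmic chirp at -15dBFS; the length and endpoints here are just my choices):

```python
import numpy as np

fs = 48_000
seconds = 60.0                                  # slow sweep: plenty of cycles per band
t = np.arange(int(seconds * fs)) / fs
f0, f1 = 1.0, 20.0                              # sweep endpoints in Hz
k = np.log(f1 / f0)
# Log-sweep phase: instantaneous frequency is f0 * (f1/f0)**(t/seconds)
phase = 2 * np.pi * f0 * seconds / k * (np.exp(k * t / seconds) - 1.0)
sweep = 10 ** (-15 / 20) * np.sin(phase)        # -15 dBFS peak level
```

Sweeping rather than playing spot tones should walk straight through any frequency pockets where the artifacts go quiet.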


I don't think we need to add tests designed to catch a specific issue from a specific DAC chip (that was implemented incorrectly by the OEMs). It's not "lying by omission" to use standard tests that characterize performance adequately for the vast majority of devices.
We don't necessarily need to add another test, but we can at least improve the existing ones.
Testing THD+N on a 20-20kHz range might be 'standard', but it's not a good standard if it lets flawed devices pass through. I don't know why you wouldn't want to see a test of the full range that a device supports.


No one is asking these amplifiers to power 0.5 Hz. Or even 20 Hz.
Legit infrasonic content exists, more frequently in movies than in music. It's not very common, but it's out there. And some people use these dongles with external amps...

You are showing -80 dB at such low frequencies. What exactly did you hear?
I posted more on that, along with some recordings, in the Apple dongle thread (see the posts that follow after that one).
In short: I heard crackling/popping artifacts. They're very quiet, but audible without playing at extreme levels.
The graph is from @nick_l44.1 and I don't know if he tested the Apple dongle or something else. But I notice similar behavior (crackling with sine tones of 8Hz and lower), so I assume it's the same issue.
 
The graph is from @nick_l44.1 and I don't know if he tested the Apple dongle or something else.
It was a Meizu HiFi DAC (CS43131). Practically the same result was obtained for a dongle with a bare CS43198. I also tested the Apple dongle. Although its sampling rate range is limited, it has a similar high noise hump above 50 kHz and the same clipping with tones around 8 Hz, so a similar graph can be obtained for it as well.
 
I'll just say I think subsonic reproduction is infinitely more important than ultrasonic, and although it's not very prevalent in most recordings, it's totally plausible that excess distortion under 20hz would cause audible problems. I think this is a pretty reasonable proposal.
Don't many tests show that even 30% distortion at 20Hz can't really be heard?

Personally I regard even 20Hz as distortion, let alone frequencies below that. It's just rumble, really. Sure there are like a dozen organs on the planet or so that can produce some sub 20Hz content, but that's about as meaningful for musical content as the cannons in 1812. No one ever said "The sound of the Flak 88 in Private Ryan was the cleanest to the original I ever heard" because nobody knows or cares other than a convincing and rich "BOOM".

As to ultrasonics, I don't have to cater to my cats' hearing range with my system. :-)
 