
CEA-2010 Subwoofer Testing Master Thread

I highly recommend reading the standard to ensure you do the measurements correctly. The newest version is CTA-2010-C, available for instance here: https://www.sigbergaudio.no/pages/loudspeaker-and-subwoofer-measurement-standards

That peak SPL value from your graph is the right number, provided the measurement was done correctly.
I am conducting an outdoor 2 m test. Does the Max Pass SPL need to have 3 dB subtracted from the value in my chart? Some people say to subtract 3 dB from the data, but starobinCEA2010 does not mention it.
 
You don't have to subtract 3 dB from the max SPL you get in your REW measurements.
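For anyone wanting to sanity-check the numbers, here is a minimal Python sketch of the two conversions that tend to come up in these threads: the inverse-square distance correction (e.g. rescaling a 2 m reading to 1 m) and the ~3 dB peak-to-RMS difference for a sine, which is likely where the "subtract 3 dB" advice comes from. The function names are mine, purely for illustration, not anything defined in the standard.

```python
import math

def distance_correction_db(d_measured_m: float, d_reference_m: float) -> float:
    """dB to add when rescaling an SPL reading from one distance to another,
    assuming free-field inverse-square (point-source) behaviour."""
    return 20 * math.log10(d_measured_m / d_reference_m)

def sine_peak_over_rms_db() -> float:
    """A pure sine's peak level sits 20*log10(sqrt(2)) ~= 3.01 dB above its RMS level."""
    return 20 * math.log10(math.sqrt(2))

# A 2 m reading rescaled to 1 m gains ~6.02 dB; quoting RMS instead of peak
# would knock ~3.01 dB off the peak figure.
print(f"2 m -> 1 m: +{distance_correction_db(2.0, 1.0):.2f} dB")   # ~6.02
print(f"peak vs RMS (sine): {sine_peak_over_rms_db():.2f} dB")     # ~3.01
```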
 
I have been trying to understand why subwoofer distortion standards allow such high distortion; reading through them, they never say where these numbers came from. The distortion pass/fail threshold at 20 Hz is 100%, which seems incredibly high to me. I know there is research showing we are less sensitive to distortion at lower frequencies, but with many speakers now achieving less than 0.5% distortion at 100 Hz and above, it makes no sense to me to allow 100% distortion as "passing".

I think they have this spec backward at the very low frequencies (allowing higher distortion at lower frequencies) because of Fletcher-Munson. If you assume 100% distortion on a 20 Hz fundamental, the distortion product at 40 Hz is at the same SPL as the fundamental, and according to Fletcher-Munson that 40 Hz tone will be perceived as roughly 20 dB louder (see the sketch below). How is that "Hi-Fi" and how are these standards useful to a consumer? I suspect manufacturers like these standards because they can advertise output levels reached at 100% distortion using tests that meet "CEA Standards", but I think this is very misleading. In my experience, if you can't produce clean low bass (maybe 5 to 10% THD below 40 Hz), you are better off rolling it off; this standard encourages the opposite. I wonder if that is one of the reasons why listener satisfaction with subwoofers is so low.
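To put numbers on the thresholds: a THD percentage maps to the level of the total harmonic content relative to the fundamental via 20*log10(THD/100). A minimal Python sketch (the function name is mine, just for illustration):

```python
import math

def thd_percent_to_db(thd_percent: float) -> float:
    """Level of the total harmonic content relative to the fundamental, in dB."""
    return 20 * math.log10(thd_percent / 100.0)

for pct in (100, 30, 10, 5, 1, 0.5):
    print(f"{pct:>5}% THD -> harmonics at {thd_percent_to_db(pct):6.1f} dB re fundamental")

# 100% THD puts the harmonic content at 0 dB relative to the fundamental:
# a 20 Hz tone with pure second-harmonic distortion radiates 40 Hz at the
# same SPL, which is the scenario described above. The 5-10% range suggested
# for clean low bass corresponds to harmonics 20-26 dB below the fundamental.
```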
 
When distortion reaches or exceeds 100% in the bass, it is massively audible in my sweep tests; it sounds like the woofer is falling apart. So I definitely agree with you that the industry has had a heavy hand in this spec, producing bigger numbers at the expense of fidelity. In car audio they have "SPL" and "SQ" competitions; the former is purely about how loud, regardless of fidelity. This spec is not that bad, but it is definitely not optimized for sound quality.

It also ignores audibility concerns, as harmonics extend into the 1 kHz region and beyond, where our hearing is much more sensitive (the 50th harmonic of a 20 Hz fundamental already lands at 1 kHz).
 
How is that "Hi-Fi" and how are these standards useful to a consumer?

This method of measuring was groundbreaking when it was developed 20 years ago, and it helped level the playing field: for the first time there was an objective way to rank subwoofers.

But it is far from perfect, and you can see that even the CEA agrees. In CEA RP22 they explicitly invoke CEA-2034 for assessing a speaker's fitness for listening rooms, but they do NOT utilize CEA-2010 (and are working on something else to be used instead).

I wonder if that is one of the reasons why listener satisfaction with sub woofers is so low.
I think that would be a tiny sliver of the pie. More likely: a fundamental lack of understanding of crossovers, phase, time alignment, and room modes.
 