I took this from the SMSL C200 review - 48kHz sample rate, 90kHz measurement bandwidth (consistent with other DAC measurements)
I've been wondering what, exactly, this chart would look like if it were done at a 48kHz sample rate and 20kHz measurement bandwidth. And the same 48kHz sample rate / 20kHz measurement bandwidth, but A-weighted.
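To illustrate what I mean, here's a rough sketch (mine, not from the review) of how the same spectrum produces different THD+N numbers depending on integration bandwidth and weighting. The test signal is completely made up - a 1kHz tone with a -100dB 3rd harmonic over a flat noise floor - not the C200's actual output, and the exact numbers don't matter, only that the bandwidth and weighting choices move the result:

```python
import numpy as np

# Hypothetical signal: 1kHz fundamental, -100dB 3rd harmonic, broadband noise.
fs = 192_000               # analyze at a high rate so a 90kHz bandwidth exists
n = fs                     # 1-second capture -> 1Hz FFT bins, tones land on bins
t = np.arange(n) / fs

rng = np.random.default_rng(0)
sig = np.sin(2 * np.pi * 1000 * t)                        # fundamental
sig += 10 ** (-100 / 20) * np.sin(2 * np.pi * 3000 * t)   # 3rd harmonic, -100dB
sig += 10 ** (-110 / 20) * rng.standard_normal(n)         # flat noise floor

win = np.hanning(n)
power = np.abs(np.fft.rfft(sig * win)) ** 2
freqs = np.fft.rfftfreq(n, 1 / fs)

def a_weighting(f):
    """IEC 61672 A-weighting magnitude response, normalized to ~1.0 at 1kHz."""
    f = np.maximum(f, 1e-6)
    f2 = f ** 2
    ra = (12194 ** 2 * f2 ** 2) / (
        (f2 + 20.6 ** 2)
        * np.sqrt((f2 + 107.7 ** 2) * (f2 + 737.9 ** 2))
        * (f2 + 12194 ** 2)
    )
    return ra / 0.7943     # Ra(1kHz) ~= 0.7943, so this makes 1kHz unity gain

def thd_n_db(bandwidth_hz, a_weight=False):
    """THD+N: all power except the fundamental, integrated up to bandwidth_hz."""
    p = power * a_weighting(freqs) ** 2 if a_weight else power
    in_band = freqs <= bandwidth_hz
    fund = (freqs > 990) & (freqs < 1010)  # crude exclusion around the 1kHz tone
    return 10 * np.log10(p[in_band & ~fund].sum() / p[fund].sum())

print(f"90kHz bandwidth:   {thd_n_db(90_000):6.1f} dB")
print(f"20kHz bandwidth:   {thd_n_db(20_000):6.1f} dB")
print(f"20kHz, A-weighted: {thd_n_db(20_000, a_weight=True):6.1f} dB")
```

The point of the sketch: everything above 20kHz (where filter 5's THD+N explodes in the review chart) simply drops out of the 20kHz-bandwidth number, and A-weighting discounts the top octave further - which is exactly why I suspect the audible difference between filters is smaller than the 90kHz-bandwidth chart suggests.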
Of course it's good to know how much distortion is actually present, but I think these graphs overstate the effect of the distortion added by choosing a filter other than 1 or 4 (in the C200's case). I don't know for sure, though, which is why I'm asking.
The C200 with filters 1 & 4 is perfectly flat at -107dB THD+N from 20Hz to 10kHz, while the light blue filter 5 trace is flat at -101.5dB THD+N up to 300Hz, rises to -97dB THD+N by 1.3kHz, and remains worse than -100dB THD+N until 16kHz. From 16kHz to 20kHz, THD+N rises from -97dB to a horrible -40dB...
Normally in my system, I would be able to tell the difference between a DAC that measures flat at -107dB THD+N from 20Hz to 20kHz and one that measures -97dB THD+N over the same range (at a 48kHz sample rate and 20kHz measurement bandwidth). I have a feeling, though, that telling filters 1 & 4 apart from filter 5 would be a lot harder.
Edit: to the people who inevitably have to say "you can't hear the difference between..." (in this case, -97dB and -107dB THD+N): my post isn't about that. But I'll say this: maybe *you* can't hear the difference. And just because you can't do something doesn't mean I (or anyone else) can't do it either! Listening is an ability and a skill which you need to develop and maintain. The difference to me between DACs 10dB apart (with otherwise similar characteristics) isn't night and day, and on some tracks I wouldn't be able to tell the difference... but given my library of music, my setup, and 10 minutes? I can hear the difference!