Promit
Active Member
- Joined
- Apr 1, 2020
- Messages
- 197
- Likes
- 523
This is prompted by another bad-faith discussion which I will not deign to link, but I think the core subject matter is worth discussing on its own. One of the big fads of the last decade or more has been "high resolution audio", built around the idea that hifi audio needs more bandwidth and bit depth than the 16/44.1 format of Red Book CD. This started with the push for 24/96, and often goes as far as 24/192 source formats. This whole idea was notably backed by musician Neil Young with the Pono music service and digital music player, and somewhat famously rebutted by Xiph.org here: https://web.archive.org/web/20200310055211/https://people.xiph.org/~xiphmont/demo/neil-young.html
The Xiph article notes something quite crucial: audio systems are not necessarily designed to reproduce ultrasonics properly, and supplying them may have consequences, most notably spraying intermodulation distortion (IMD) back into the audible spectrum. The funny consequence here is of course that high resolution audio may indeed be clearly, provably audible compared to "standard" resolution - just for all the wrong reasons. IMD isn't usually desirable or pleasant to listen to, but in the world of audiofool nonsense, "different" plus "expensive" often equals "better".
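To make that mechanism concrete, here's a quick numpy sketch. The tone frequencies and the second-order coefficient are arbitrary illustration values, not measurements of any real device: two purely ultrasonic tones pushed through a weakly nonlinear stage produce a difference tone squarely in the audible band, even though neither input tone is audible on its own.

```python
import numpy as np

fs = 192_000                       # sample rate high enough to carry ultrasonics
t = np.arange(fs) / fs             # one second of signal
f1, f2 = 30_000, 33_000            # both tones well above the audible band
x = 0.5 * np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)

# A mildly nonlinear stage (hypothetical amp/driver): y = x + a*x^2.
# The x^2 term generates sum and difference products, including f2 - f1.
y = x + 0.1 * x**2

spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)

# Strongest component in the audible band (excluding DC):
audible = (freqs > 20) & (freqs < 20_000)
peak = freqs[audible][np.argmax(spectrum[audible])]
print(f"strongest audible component: {peak:.0f} Hz")   # difference tone at 3000 Hz
```

The input spectrum is empty below 20 kHz, yet the output has a clear 3 kHz component - exactly the "high res is audible for the wrong reasons" scenario.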
That brings me to the point of this discussion: what are the consequences of including ultrasonics in the playback chain, from the source through DAC, amp, and speakers? Is it ever beneficial to do so? How often is it actively harmful? Is the shape and steepness of the bandlimiting filter consequential in practice? Does it matter if you have something like the Adam A7X monitors, which claim a reference response out to 50 kHz? Is it a mistake to have my Windows audio output configured to 24/96? I feel like there's very little technically informed discussion and testing of what ultrasonics mean in practice and whether high resolution audio is just a fashionable nothing or an active distortion generator.