
Denon DN-200BR Professional Bluetooth Receiver Review

Rate this balanced Bluetooth receiver

  • 1. Poor (headless panther)

    Votes: 28 25.9%
  • 2. Not terrible (postman panther)

    Votes: 65 60.2%
  • 3. Fine (happy panther)

    Votes: 11 10.2%
  • 4. Great (golfing panther)

    Votes: 4 3.7%

  • Total voters
    108

dumbsuck

Member
Joined
Sep 7, 2022
Messages
20
Likes
17
AptX is better than SBC, and here's why: people do not listen to pure sine waves (at least most people don't). There are two problems with SBC.

The first is that vendors are free to implement SBC however they like, and no vendor has ever implemented SBC at anything better than the "High quality" profile, which maxes out at 328/345 kbps (44.1/48 kHz). Even at that bitrate, SBC is barely subjectively indistinguishable from lossless for most people (and in my own listening experience, its use of Joint Stereo collapses the stereo image in some cases even at this bitrate). Vendors often go even lower than the "High quality" profile on their devices: for example, I used to own the Creative Zen Air TWS IEMs, and their BT profile allowed a maximum of only about 220 kbps for SBC. That's lower than even the "Middle quality" SBC profile! Needless to say, it sounded horrible; the sound was closer to mono than to stereo and full of noise resembling clipping.

The second problem, in my opinion, is that a device using SBC is free to lower the bitrate during playback if the signal quality gets poor, and devices usually do not restore the original bitrate unless you pause playback for a while. I personally prefer a few clicks in the playback to losing sound quality.
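Where do those SBC figures come from? Here's a minimal Python sketch of the A2DP frame-length formula for joint stereo, using the spec's recommended "High quality" parameters (my own back-of-envelope, so double-check it against the spec):

Code:
import math

def sbc_bitrate(sample_rate, bitpool, subbands=8, blocks=16):
    """Bitrate in bps for joint-stereo SBC, per the A2DP frame-length formula."""
    channels = 2
    frame_bytes = (4                                  # frame header
                   + (4 * subbands * channels) // 8   # scale factors
                   + math.ceil((subbands + blocks * bitpool) / 8))  # join bits + audio payload
    return 8 * frame_bytes * sample_rate / (subbands * blocks)

print(sbc_bitrate(44100, 53))  # ~328000 bps: recommended "High quality" bitpool 53 at 44.1 kHz
print(sbc_bitrate(48000, 51))  # 345000 bps: recommended "High quality" bitpool 51 at 48 kHz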

That's why plain old aptX is my favorite codec. It sounds very good to my ears (tested with the Sennheiser Momentum 3 and Momentum True Wireless 3). It also supports only a constant bitrate, with no bitrate lowering due to bad signal quality: 352 kbps at 44.1 kHz, 384 kbps at 48 kHz. Vendors are not allowed to change that. And these bitrates are perfectly attainable outside, with the headphones on your head and the phone in your pocket, even in, for example, shopping centers, where there is a ton of interference from Wi-Fi and other people's phones. AptX HD should be better in theory, but it's just aptX at a higher bitrate, and I am afraid (and suspect this is the reason almost no device implements it) that its 500+ kbps is just not suitable for outdoor use: those bitrates are not attainable when the signal has to traverse your body (from your pocket to the headphones on your head) in areas with radio interference (city centers, shopping malls, etc.).
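The fixed aptX numbers fall straight out of its 4:1 compression ratio: 4 coded bits per sample per channel for 16-bit input, and 6 for aptX HD's 24-bit input. A quick sanity check in Python (my own arithmetic, not from Qualcomm's docs):

Code:
# aptX: 16-bit samples compressed 4:1 -> 4 coded bits per sample per channel
for rate in (44100, 48000):
    print(rate, rate * 4 * 2 / 1000, "kbps")  # 352.8 and 384.0 kbps
# aptX HD: 24-bit samples at the same 4:1 ratio -> 6 coded bits per sample
print(48000 * 6 * 2 / 1000, "kbps")           # 576.0 kbps, hence the "500+" figure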
Edit: I might also add that aptX uses a simple predictor that was tuned on real music sometime in the '80s; it was not tuned to "predict" sine waves. That is likely why aptX does not do too well when tested with sine waves.

That is also the reason I think LDAC is bad. LDAC is only good at 990 and 660 kbps; its 330 kbps profile is just bad. That means that for desktop usage, where the BT transmitter and receiver sit a meter apart with no obstacles between them, you will likely be able to sustain 990 kbps and it will sound good. But then you might as well just use a cable. In practical scenarios, like being outside with the phone in your pocket and the headphones on your head, with radio interference from nearby Wi-Fi and other people's phones, it is very likely LDAC will fall back to its 330 kbps profile, i.e. bad sound quality. LDAC, just like SBC, adjusts its bitrate depending on the signal quality, jumping between the 990/660/330 kbps profiles. But I suspect that even at 330 kbps it still sounds subjectively good: most people who have devices that support LDAC keep praising the codec, even though they are likely listening to 330 kbps LDAC...
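To illustrate the fallback behavior I mean, here is a toy Python model (my own illustration of the pattern I observe, NOT Sony's actual rate-control algorithm): the encoder steps down a profile whenever the link degrades, but a good link alone never brings the higher profile back:

Code:
# Toy model of stepped bitrate fallback (illustration only, not Sony's algorithm)
LDAC_PROFILES = [990, 660, 330]  # kbps, best first

def next_profile(current, link_ok):
    i = LDAC_PROFILES.index(current)
    if not link_ok:  # drop a profile as soon as the link degrades
        return LDAC_PROFILES[min(i + 1, len(LDAC_PROFILES) - 1)]
    return current   # a good link alone never restores the higher profile

profile = 990
for link_ok in (True, False, True, True, False, True):
    profile = next_profile(profile, link_ok)
    print(profile)   # 990, 660, 660, 660, 330, 330 -> stuck at 330 kbps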

And to properly judge AAC, Amir should really try it with an Apple device. Android is notorious for its poor BT AAC implementation and by default will never use AAC if any codec other than SBC is available (such as aptX or LDAC). Apple devices support only SBC and AAC, and default to AAC.

Sources: 1 2 3 4 5

Edit 2: I might also add an interesting thing I found during my extensive Bluetooth audio codec investigations: Windows 11 uses AAC by default. To my ears, Microsoft's implementation of BT AAC is worse than Apple's and about the same as Android's. There is also a nice app called Bluetooth Tweaker that will tell you which codecs and bitrates your Bluetooth audio device supports and what Windows defaults to. It's not free, though; it costs $8.
 
Last edited:

PeteL

Major Contributor
Joined
Jun 1, 2020
Messages
3,303
Likes
3,846
AptX HD should be better in theory, but it's just aptX at a higher bitrate, and I am afraid (and suspect this is the reason almost no device implements it) that its 500+ kbps is just not suitable for outdoor use: those bitrates are not attainable when the signal has to traverse your body (from your pocket to the headphones on your head) in areas with radio interference (city centers, shopping malls, etc.).
AptX HD's main claim is that it maintains 24-bit encoding and decoding. I am not fully sure that's so smart for a lossy codec. If you know you will have to severely compress the data, isn't converting to 16-bit audio the first thing you'd want to do, given there is still no real proven audible benefit to 24-bit playback?
 
Last edited:

dumbsuck

Member
Joined
Sep 7, 2022
Messages
20
Likes
17
AptX HD's main claim is that it maintains 24-bit encoding and decoding. I am not fully sure that's so smart for a lossy codec. If you know you will have to severely compress the data, isn't converting to 16-bit audio the first thing you'd want to do, given there is still no real proven audible benefit to 24-bit playback?

By "supporting" 24 bits, I guess the codec's authors mean that aptX HD achieves a higher dynamic range than 16-bit LPCM: about 102 dB for frequencies below 5 kHz, which is still nowhere near "true" 24-bit LPCM's dynamic range of 144 dB across the entire frequency spectrum. Above 5 kHz, the dynamic range decreases with rising frequency, reaching 74 dB for aptX and 86 dB for aptX HD above 8 kHz (according to SoundGuys' testing). I agree that going over 16 bits with LPCM is a waste when it comes to listening to music (24 and more bits do have their place in recording/production). LDAC is an even worse offender in this sense: it wastes precious Bluetooth bandwidth encoding ultrasonic frequencies (> 20 kHz), which is utter nonsense. But I guess they had to sacrifice rationality to get the Hi-Res Audio certification for their codec...
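To put rough numbers on that: ideal LPCM dynamic range is about 6.02 dB per bit (20·log10 of 2^bits), so aptX HD's ~102 dB works out to roughly 17-bit performance. My own back-of-envelope in Python:

Code:
from math import log10

# Ideal LPCM dynamic range: 20 * log10(2 ** bits), i.e. ~6.02 dB per bit
for bits in (16, 24):
    print(bits, "bit:", round(20 * log10(2 ** bits), 1), "dB")  # 96.3 / 144.5 dB
print("aptX HD effective bits:", round(102 / 6.02, 1))          # ~16.9 bits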
 

Kimbrough Xu

Active Member
Joined
May 18, 2022
Messages
229
Likes
100
The fact that it has a detachable antenna cable and even a... detachable Bluetooth pairing-button connector... tells me this is meant to be stuffed somewhere dark while the interface and the antenna go somewhere that can see the light of day.

A pretty specific target market.

One example: speaker test shelves, where there are lots of buttons? Or basically anywhere that needs a physical button for Bluetooth pairing... hotels and exhibition halls, maybe?
I can't quite picture the situation!
 