
Denon DN-200BR Professional Bluetooth Receiver Review

Rate this balanced Bluetooth receiver

  • 1. Poor (headless panther): 28 votes (25.9%)
  • 2. Not terrible (postman panther): 65 votes (60.2%)
  • 3. Fine (happy panther): 11 votes (10.2%)
  • 4. Great (golfing panther): 4 votes (3.7%)
  • Total voters: 108
OP
amirm (Founder/Admin)
4V is a consumer consideration to be able to connect a DAC directly to a power amp.
What? Audio interfaces routinely output more than 4 volts (14 dBu). Are they consumer too? Here are the specs for the RME ADI-2 DAC:
  • Output level switchable +19 dBu, +13 dBu, +7 dBu, +1 dBu @ 0 dBFS
Balanced interconnects are used for very long lengths where you absolutely need the higher voltages to counter voltage drops and sources of interference. In this example, Topping with its 3.3 volt output managed to produce a whopping 15 dB better performance.

This box has an adjustable volume control anyway. So if it output 4 volts, you could dial it down to whatever you say a mixing board requires. But if it doesn't go up to 4 volts, there is no way to boost it without incurring a noise penalty.

These low voltages are vestiges of old times when sources didn't output much and amplifiers had to have high gain. This is the wrong gain structure for best performance. We need to move up and make at least 4 volts the standard for balanced out so that we can dial down the gain in the amplifier. We are leaving performance on the table without doing this.
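For reference, the arithmetic behind these level figures is simple. Here is a minimal Python sketch (illustrative helper names of my own, taking 0 dBu as 0.7746 V RMS) converting between RMS volts and dBu:

```python
import math

DBU_REF_V = 0.7746  # volts RMS at 0 dBu (1 mW into 600 ohms)

def volts_to_dbu(v_rms: float) -> float:
    """Convert an RMS voltage to dBu."""
    return 20 * math.log10(v_rms / DBU_REF_V)

def dbu_to_volts(dbu: float) -> float:
    """Convert a dBu level back to RMS volts."""
    return DBU_REF_V * 10 ** (dbu / 20)

# 2 V (CD unbalanced), 3.3 V (the Topping), 4 V (proposed balanced standard)
for v in (2.0, 3.3, 4.0):
    print(f"{v:.1f} V RMS = {volts_to_dbu(v):+.1f} dBu")
print(f"+4 dBu (pro nominal) = {dbu_to_volts(4):.2f} V RMS")
```

4 V works out to about +14.3 dBu, which is why "4 volts" and the 14 dBu figure above go hand in hand, while the pro nominal +4 dBu is only about 1.23 V RMS.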
 
OP
amirm (Founder/Admin)
I have been reading about SBC and AAC and other BT codecs for a while. I can not say I understand how all of them work yet, but the more I read about them, the more I start doubting whether a 1 kHz single tone FFT is a good way of evaluating codec performance.
It *should* be the case. That is, a pure sine wave takes very little data to encode and transmit. Bluetooth has a bandwidth of roughly 1 Mbit/s. Even if we derate that to, say, 0.7 Mbit/s, that is still a huge amount of bandwidth relative to what it takes to code a single sine wave. That is roughly 6X the bandwidth of 128 kbps MP3/AAC! Clearly folks were asleep at the helm, allowing codecs/implementations that produce these horrid results to get out. Way early in the development of the WMA audio codec I ran a test like this and found clear flaws that my team fixed.

My reason for picking these tests is that I am testing *hardware*. A BT adapter is a DAC, so we use DAC-like measurements to characterize it. Per the above, a simple 1 kHz tone should have been no barrier as far as the codec is concerned. But sadly it is.

What we should have seen is all the codecs producing the same result, i.e. superior to what the hardware can do, and then we would say, "well, a 1 kHz tone is too simple for a lossy codec so we need listening tests." But we are seeing the reverse of this for all the codecs other than LDAC. The codec is clearly distorting the audio before the DAC in the adapter gets a chance to output the bitstream.
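For anyone who wants to poke at this themselves, here is a bare-bones sketch of the measurement idea (my own illustration, not the exact test setup used in the review): generate a 1 kHz tone, play it through the adapter, capture the analog output, and look at the FFT. The capture step is left out here, so the script only analyzes the clean source tone:

```python
import numpy as np

FS = 48_000   # sample rate, Hz
F0 = 1_000    # test tone frequency, Hz
N = FS        # one second of samples

def make_tone(level_dbfs: float = 0.0) -> np.ndarray:
    """1 kHz sine at the given level relative to full scale."""
    amp = 10 ** (level_dbfs / 20)
    t = np.arange(N) / FS
    return amp * np.sin(2 * np.pi * F0 * t)

def spectrum_db(x: np.ndarray) -> np.ndarray:
    """Windowed FFT magnitude, scaled so a full-scale sine reads 0 dB."""
    win = np.hanning(len(x))
    mag = np.abs(np.fft.rfft(x * win)) / (win.sum() / 2)
    return 20 * np.log10(np.maximum(mag, 1e-12))

# In a real test, `captured` would be the tone recorded back from the
# Bluetooth receiver's analog output; here it is just the clean source tone.
captured = make_tone()
spec = spectrum_db(captured)
freqs = np.fft.rfftfreq(N, 1 / FS)
peak = spec.argmax()
print(f"peak: {freqs[peak]:.0f} Hz at {spec[peak]:.1f} dBFS")
```

Everything in such a plot other than the 1 kHz peak is noise and distortion added somewhere in the chain, which is exactly what separates the codecs in the measurements above.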
 
OP
amirm (Founder/Admin)
Frankly, your assertion for sound quality "SBC > AAC > aptX LL > aptX > aptX HD > LDAC" is correct.
It is not. Certainly objectively it is not. In my not-so-rigorous listening tests, I found AAC to be horrible and no match for SBC. Mind you, AAC is a superb lossy codec when used for offline encoding. In BT, it seems they are using a low-complexity, low-latency version which severely butchers the fidelity. I just encoded a pure 1 kHz tone into AAC at just 128 kbps and here are the results:

[FFT of the 1 kHz tone encoded offline to AAC at 128 kbps]


Look at how clean it is. Compare it to what it looks like in the BT implementation (which I have confirmed to be the same on both Android and iOS):
[FFT of the same tone through the Bluetooth AAC implementation]


It is just an embarrassment.

I have not tested aptX HD so, putting that aside, as far as I am concerned there are only two codecs you want to use: SBC and LDAC. The former only if the latter is not there.
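If you want to repeat the offline-encode part of this comparison, one way (a sketch assuming ffmpeg is installed and on PATH; the exact encoder used for the plot above isn't stated) is to synthesize the tone, encode it with ffmpeg's built-in AAC encoder at 128 kbps, and decode it back to WAV for FFT analysis:

```python
import subprocess

# Synthesize a 1 kHz tone and encode it to AAC at 128 kbps.
subprocess.run(
    ["ffmpeg", "-y", "-f", "lavfi",
     "-i", "sine=frequency=1000:sample_rate=48000:duration=10",
     "-c:a", "aac", "-b:a", "128k", "tone_aac_128k.m4a"],
    check=True,
)

# Decode back to WAV so the result can be FFT-analyzed (e.g. with the earlier sketch).
subprocess.run(
    ["ffmpeg", "-y", "-i", "tone_aac_128k.m4a", "tone_aac_128k.wav"],
    check=True,
)
```

The exact numbers will depend on which AAC encoder build is used, so treat this as a ballpark reproduction rather than the specific test shown above.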
 

sarumbear (Master Contributor)
If by receiver you mean the antenna, it is likely fine indeed, but the fact remains that it's a BT 2.1 device. There have been improvements in BT chips over the last 10 years. I wonder how old this product is.
By receiver I meant the Bluetooth radio receiver, which is what this device is, nothing else. It is analogous to a DAB tuner.

An antenna is part of a receiver.

 
OP
amirm (Founder/Admin)
Wonder why Amir says it only advertises the SBC codec. This spec sheet clearly says SBC, MP3, aptX and AAC.
I can only go by what my phone shows. With the Denon, the advanced developer options grayed out all codecs other than SBC, with a note saying the grayed ones are not supported by the receiver. When I switched to the Topping, the other options were available to select, as you can see from the testing. I am sure whatever BT chip/module they are using supports all of those codecs. But whether they have tested it is another matter.
 
OP
amirm (Founder/Admin)
One note about SBC: it was the first codec adopted by the BT Forum. They put out a call for proposals for a codec that was a) free to license and b) of extremely low complexity (low CPU usage). We had a codec superior to it but could not meet the very low complexity bar. So SBC (from Philips) won by default. After all, nobody else wanted to give away their technology for free. SBC is inefficient, so it could not remotely compete with the likes of AAC in offline usage. But since the BT bandwidth is there, it works. Net, net, we could have had better than SBC but the shortsighted RFP put out there didn't allow it.
 

sarumbear (Master Contributor)
Balanced interconnects are used for very long lengths where you absolutely need the higher voltages to counter voltage drops and sources of interference.
Balanced connections have been around for almost a century, and +4 dBu has been and still is producing everything you hear.

In this example, Topping with its 3.3 volt output managed to produce a whopping 15 dB better performance.

We are leaving performance on the table without doing this.
But isn’t that about device inner workings? Or available chips? What other reason is there to alter the existing “old” standards?
 

dasdoing (Major Contributor)
I can only go by what my phone shows. With the Denon, the advanced developer options grayed out all codecs other than SBC, with a note saying the grayed ones are not supported by the receiver. When I switched to the Topping, the other options were available to select, as you can see from the testing. I am sure whatever BT chip/module they are using supports all of those codecs. But whether they have tested it is another matter.

A review on Amazon:

[attached screenshot of the Amazon review]
 

PeteL (Major Contributor)
What? Audio interfaces routinely output more than 4 volts (14 dBu). Are they consumer too? Here are the specs for the RME ADI-2 DAC:
  • Output level switchable +19 dBu, +13 dBu, +7 dBu, +1 dBu @ 0 dBFS
Balanced interconnects are used for very long lengths where you absolutely need the higher voltages to counter voltage drops and sources of interference. In this example, Topping with its 3.3 volt output managed to produce a whopping 15 dB better performance.

This box has an adjustable volume control anyway. So if it output 4 volts, you could dial it down to whatever you say a mixing board requires. But if it doesn't go up to 4 volts, there is no way to boost it without incurring a noise penalty.

These low voltages are vestiges of old times when sources didn't output much and amplifiers had to have high gain. This is the wrong gain structure for best performance. We need to move up and make at least 4 volts the standard for balanced out so that we can dial down the gain in the amplifier. We are leaving performance on the table without doing this.
Absolutely, but audio interfaces are not pure sources; they are routinely connected directly to amplified devices like powered speakers. They also serve the purpose of the mixer itself, so they need that flexibility, that headroom. Indeed, used as a pure source, you will need compensation. The RME ADI-2 DAC is RME's foray into the consumer world since it has no inputs, but you are correct, most pro interfaces will also have that flexibility. Now yes, you are correct that if you use the volume control you can recalibrate; I said exactly the same, no argument there. But that's often the idea with pro devices for the commercial install market: you want it idiot-proof for the end customer. You don't want to receive a support call because someone cranked the volume and the next person doesn't understand why it's distorted.
 
OP
amirm (Founder/Admin)
But isn’t that about device inner workings? Or available chips? What other reason is there to alter the existing “old” standards?
It is called the CD format: a 40+ year old format which proposed 2 volts unbalanced, 4 volts balanced. This device didn't exist in old times. It is a modern device and needs to comply with what is specified now for digital sources.
 

sarumbear (Master Contributor)
Net, net, we could have had better than SBC but the shortsighted RFP put out there didn't allow it.
Had you offered it free too?
 
OP
amirm (Founder/Admin)
Had you offered it free too?
Well, we didn't offer it because we could not meet the CPU requirements. But yes, we were fully willing to offer it for free. We also had a severe deadline, so we couldn't optimize it further.
 

IAtaman (Major Contributor)
It *should* be the case. That is, a pure sine wave takes very little data to encode and transmit. Bluetooth has a bandwidth of roughly 1 Mbit/s. Even if we derate that to, say, 0.7 Mbit/s, that is still a huge amount of bandwidth relative to what it takes to code a single sine wave. That is roughly 6X the bandwidth of 128 kbps MP3/AAC! Clearly folks were asleep at the helm, allowing codecs/implementations that produce these horrid results to get out. Way early in the development of the WMA audio codec I ran a test like this and found clear flaws that my team fixed.
That is also my point. SBC seems to be allocating a different number of bits to different subbands based on how "loud" they are. When an easy-to-decode single tone is used for testing, the capability of SBC might be overestimated, as more tones would decrease the number of bits available to the different subbands, especially the quieter ones.

From https://www.net.in.tum.de/fileadmin/bibtex/publications/papers/hoene_sbc2009.pdf

SBC supports two different algorithms for calculating how many bits should be allocated to each subband. The two modes are called SNR and LOUDNESS. The SNR mode is simple and calculates the number of bits needed using (log2 scale factor) − 1. The LOUDNESS mode calculates the bits needed similarly to the SNR mode, but it uses a weighting based on subband positions and the sampling rate. More bits are allocated to the lowest band whereas the highest bands require a lower number of bits. Also, subbands with a medium loudness get more bits at the cost of quiet bands.
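Taking that description at face value, the SNR-mode rule is just bits ≈ log2(scale factor) − 1 per subband. A toy sketch of that rule (my reading of the quoted text only, ignoring the fixed bit pool the real codec then distributes):

```python
import math

def snr_bitneed(scale_factors):
    """Toy per-subband 'bit need' in SNR mode: log2(scale_factor) - 1,
    per the quoted description (the real codec then fits these needs
    into whatever bit pool the chosen bitrate allows)."""
    return [max(0, int(math.log2(sf)) - 1) if sf > 0 else 0
            for sf in scale_factors]

# A loud low subband (the 1 kHz tone) and progressively quieter upper subbands:
print(snr_bitneed([32768, 256, 16, 4, 1, 0, 0, 0]))
# -> [14, 7, 3, 1, 0, 0, 0, 0]: the loud band soaks up most of the bits
```

Which is consistent with the point above: a lone tone leaves plenty of bits for its own subband, so a single-tone test can flatter the codec relative to busier material.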
 

Ken Tajalli (Major Contributor)
It is not. Certainly objectively it is not. In my not-so-rigorous listening tests, I found AAC to be horrible and no match for SBC. Mind you, AAC is a superb lossy codec when used for offline encoding. In BT, it seems they are using a low-complexity, low-latency version which severely butchers the fidelity. I just encoded a pure 1 kHz tone into AAC at just 128 kbps and here are the results:

[FFT of the 1 kHz tone encoded offline to AAC at 128 kbps]

Look at how clean it is. Compare it to what it looks like in the BT implementation (which I have confirmed to be the same on both Android and iOS):

[FFT of the same tone through the Bluetooth AAC implementation]

It is just an embarrassment.

I have not tested aptX HD so, putting that aside, as far as I am concerned there are only two codecs you want to use: SBC and LDAC. The former only if the latter is not there.
Thank you for that, interesting.
Have you got one for SBC?
Back to AAC, I see artefacts starting at -80 dB on BT. That's not too bad for BT and 128 kbps. If we get one for SBC, then we can compare. To my ears, AAC had subdued high frequencies but was cleaner everywhere else.
 

PeteL (Major Contributor)
Balanced interconnects are used for very long lengths where you absolutely need the higher voltages to counter voltage drops and sources of interference.
An SM58 outputs -55 dBV when excited with a 94 dB SPL signal. I have connected it with 100-foot cables without issues.
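As a rough check of those numbers (plain arithmetic, nothing more):

```python
import math

mic_v = 10 ** (-55 / 20)        # -55 dBV -> volts RMS
print(f"{mic_v * 1000:.2f} mV RMS")                          # ~1.78 mV
print(f"{20 * math.log10(4.0 / mic_v):.1f} dB below 4 V")    # ~67 dB below a 4 V line signal
```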
 
OP
amirm (Founder/Admin)
An SM58 outputs -55 dBV when excited with a 94 dB SPL signal. I have connected it with 100-foot cables without issues.
A microphone? One that picks up a good bit of noise from what it is capturing? So no, that is not a critical application, putting aside that we don't know what "issues" you have objectively tested for.
 

PeteL (Major Contributor)
A microphone? One that picks up a good bit of noise from what it is capturing? So no, that is not a critical application, putting aside that we don't know what "issues" you have objectively tested for.
OK, I think that for many, the sound of their voice is a more "critical" application than a BT receiver in a commercial space. I am just saying that audio signals have been carried at far lower levels through balanced connections over long distances forever. Good luck hearing the degradation over 100 ft, and that's just average; many times twice that is needed.
 

PeteL (Major Contributor)
By receiver I meant the Bluetooth radio receiver, which is what this device is, nothing else. It is analogous to a DAB tuner.

An antenna is part of a receiver.

OK, the radio receiver is a chip by Qualcomm or others, in this case an old, borderline obsolete one. BT 2.1 is old, we're at 5, but indeed good care must be taken with the antenna transmission line path, and that can separate the good from the bad.
 

sarumbear (Master Contributor)
OK, the radio receiver is a chip by Qualcomm or others, in this case an old, borderline obsolete one. BT 2.1 is old, we're at 5, but indeed good care must be taken with the antenna transmission line path, and that can separate the good from the bad.
Professional devices are not updated often, as consistency is a virtue in that market.

There's nothing that you can gain from higher BT versions in this device's use scenario. BT 3 introduced side channels and UWB, which is not suitable for this job. BT 4 is for BLE, which again doesn't matter here. Finally, BT 5 mainly caters to IoT.
 