
Does what we hear correspond to what we measure?

Which one do you prefer?

  • N° 1: 5 votes (31.3%)
  • N° 2: 11 votes (68.8%)
  • Total voters: 16
Status
Not open for further replies.
The correct approach would have been to put away the smoke and mirrors and then test the Bluetooth device.
It seems to me that immediately saying that recording No. 1 came from a Bluetooth receiver would have irreparably biased the listening. For me, the best tests are when you don’t know you are taking a test... then the responses have a chance of not being biased.
 
Perhaps try something like this:
"Today I want to test the audibility of a Bluetooth device.
I have prepared two files, one with treatment A, one with treatment B. The files are level matched, with no artifacts except for the potential impact of the BT device I wish to test.
Please tell me if you can hear a difference. If so, please provide the validation from your ABX tool.
Please tell me if anything is amiss with the files since the hardest part is producing them without leaving artifacts not related to BT."

"If we can determine a difference, I may try to generate a controlled study of preference since that may be interesting in answering that persistent question."

I really don't want to put words in your mouth here. You write very well. But I hope you get my drift.
 
Honestly, I don't get why this thread has elicited such polarized responses.

Do you REALLY think this "test" between two pretty darn acceptable media files (if possibly somewhat mismatched) addresses the original premise? The only thing it seems to answer is "through my pretty decent system, I could be happy with either, and I could not care less which one is, um, 'better'".
I imagined this comparison because for me, an audio reproduction system should distort the signal as little as possible, which in absolute terms is not easy, if only because of the speakers.
Knowing that a Bluetooth receiver uses a compression protocol to transmit the data, I thought the result might be audible if I recorded the output signal.
Certainly, the method is not scientific but rather empirical. I also had in mind that very often we judge a piece of equipment by listening to the audio signal it produces. I did not think that it would lead to such discussions.
 
I agree with you on this. I think your analogy is correct in this regard.


Once again, I understand and appreciate your intention and goal here.

And in a spirit of trying to find common ground, I will also say that I have zero problem believing that Bluetooth transmission can create a sound that folks view as inferior to a similar sound without Bluetooth in the signal chain. Bluetooth audio compression is lossy, and some of the BT codecs are pretty bad (although some are pretty good and might be audibly transparent). I will take lossless over lossy any day, because while lossy compression might be inaudible, lossless compression is guaranteed to be inaudible so I never have to worry about it.

So I'm not out to trash your entire experiment here - and I don't think I'm alone in that. The problem, though, is that "the technical measurements of the Bluetooth receiver might partially explain the feelings and the observation" is pretty much all we can say. Was the signal chain within your Android phone a factor? Was the Bluetooth codec that your phone and the receiver "agreed" on the main issue? Did the Bluetooth signal have to operate in a lower-fidelity mode because of a dodgy wireless connection between your phone and the receiver? Was there something in the hardware design or software of the BT receiver that was the main cause? Why does sample #1 show evidence of dynamic compression (which has nothing to do with the lossy data-size compression of BT transmission)? Why does it look like there are clipped peaks that have subsequently been attenuated in volume? Why is there a channel imbalance in one sample that's nearly 3x the magnitude of the channel imbalance in the other? That has nothing to do with BT, so it must be something else.

All we can really know here is that Bluetooth might have degraded the sonic fidelity to an audible degree. But this is nothing we didn't already know about Bluetooth beforehand - and your test is actually less informative than that, because even if we assume that people's reported listening impressions would stand up to a proper blind ABX test, we still wouldn't know if Bluetooth was the reason, or one of multiple factors, or not really a factor at all.

So with all respect, we aren't quite "sharing a moment of discussion while learning things." We are just sharing a moment of discussion - because there's no way to really know what if anything we can learn from this. The hypothesis that BT is a key factor here is a reasonable one - but it is not proven, or even established to be probable or highly likely, by your test. I'm sorry, and nothing personal, but your test is just far more limited in what it can tell us than you seem to think.
Thank you for all your explanations. Maybe you yourself aren't learning anything, but personally, your arguments are teaching me how to conduct my future tests better.
 
But the title of this thread, visible at the top of every page in large bold letters just above the poll, isn't, "Do these two tracks sound the same or different to you, and what are the differences you hear?" No, the title is, "Does what we hear correspond to what we measure?" And that question can't be answered with this kind of test or setup, for reasons that have been covered exhaustively.
Beyond listening, the participants submitted the two recordings to various analyses using specific programs. This produced visual results (or measurements) showing that the level of No. 1 is indeed higher, that there is indeed compression in No. 1, and that the balance between right and left is off. With these elements, can we then answer the question I ask in the title?

But we can end the discussion now if it helps keep people from quarreling.
 
I imagined this comparison because for me, an audio reproduction system should distort the signal as little as possible, which in absolute terms is not easy, if only because of the speakers.
Speakers are irrelevant as everything is in the electrical domain.

Knowing that a Bluetooth receiver uses a compression protocol to transmit the data, I thought the result might be audible if I recorded the output signal.
If you want to know that, it would have been best to test only for that.
That means level matched.
As it stands, you have more than just the protocol: you also have level differences and channel imbalance.

You were basically testing more than one difference at the same time and asked for preference.

For instance, when testing for preference between a 320 kbps MP3 and the original WAV, it is not uncommon for people to prefer the MP3, even though objectively the WAV is the better signal.

Certainly, the method is not scientific but rather empirical. I also had in mind that very often we judge a piece of equipment by listening to the audio signal it produces. I did not think that it would lead to such discussions.
I understand the thought behind it, but for ASR folks the test (preference only) was not an interesting one, as you skipped the 'level matched' part.

Throw a die 8 times... you might end up with 2x odd and 6x even (the stats part).

What is interesting is that the 'majority' of voters did not prefer the clearly louder version, which is usually the one shown to be preferred.
The number of respondents is a bit low.

What if one of the responders used BT headphones and both files were BT'ed?
 
Knowing that a Bluetooth receiver uses a compression protocol to transmit the data, I thought the result might be audible if I recorded the output signal.
You have had this explained to you before: there are (at least) TWO types of compression in audio.

DYNAMIC RANGE COMPRESSORS and limiters existed decades before digital audio. Compressors make quieter parts louder whilst keeping the loud bits the same level. This is a form of distortion. But it can be useful where the original dynamic range of music is considered too great for consumption.

Large digital files can be "compressed" to make them smaller. We all know about ZIP, which losslessly compresses a big spreadsheet to a smaller file whilst NOT changing the data. In audio, FLAC works this way on a big WAV file.
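To make the "lossless" point concrete, here is a minimal Python sketch using the standard-library zlib module (DEFLATE, the same family of algorithm ZIP uses). The byte string here is just placeholder data, not real audio:

```python
import zlib

# Any byte stream works; repetitive data (like a spreadsheet or PCM audio) shrinks well.
data = b"some repetitive PCM-like audio bytes " * 200

packed = zlib.compress(data)
restored = zlib.decompress(packed)

print(len(packed) < len(data))   # the compressed copy is smaller: True
print(restored == data)          # yet it decompresses bit-identically: True
```

FLAC applies the same principle to audio, with prediction tricks specialized for waveforms, which is why a FLAC file decodes back to the exact original samples.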

In audio and video there is also LOSSY COMPRESSION, where information that is considered "redundant" is stripped out. MPEG is a common family of lossy compressors for video and audio. An MP3-compressed version of a WAV has thrown away a lot of sound content that research has shown is not needed by our brains to reconstruct the music. Many Bluetooth codecs employ this lossy form of compression.

BUT, a lossy audio codec should not normally compress the dynamic range the way a dynamic range compressor does.

One of your files has dynamic range compression which implies something is wrong with your experimental setup. Dynamic range compression is definitely and reliably audible (unless extremely subtle).
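The distinction can be made concrete with a toy downward compressor (a pure-Python sketch; the threshold and ratio are arbitrary illustration values, not any real device's algorithm):

```python
def compress_peaks(samples, threshold=0.5, ratio=4.0):
    """Toy downward compressor: attenuate anything above `threshold` by `ratio`,
    then apply makeup gain so the loudest peak returns to its original level."""
    def squash(x):
        mag = abs(x)
        if mag <= threshold:
            return x
        return (threshold + (mag - threshold) / ratio) * (1 if x >= 0 else -1)

    squashed = [squash(x) for x in samples]
    makeup = max(abs(x) for x in samples) / max(abs(x) for x in squashed)
    return [x * makeup for x in squashed]

quiet, loud = 0.1, 1.0
out = compress_peaks([quiet, loud])
# after makeup gain the quiet sample rises (~0.16) while the loud peak stays at 1.0
```

After makeup gain, the quiet sample sits closer in level to the loud one, which is exactly the "quieter parts louder" effect described above; a lossless or lossy file compressor changes nothing about this level relationship.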
 
Throw a die 8 times... you might end up with 2x odd and 6x even (the stats part).
Can someone who expresses a preference reasonably be considered a roll of the dice?
 
One of your files has dynamic range compression which implies something is wrong with your experimental setup. Dynamic range compression is definitely and reliably audible (unless extremely subtle).
Indeed, it is difficult to say what the source of this compression is, but one fact remains: it is audible and visible in the measurements.
 
For what it is worth, as of now, with 15 votes and 10 preferring one of the tracks, there is:

A 30+% chance that a result that extreme, or more so, could occur by random blind chance.
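For anyone who wants to check the arithmetic, the figure follows from a two-sided binomial test under the null hypothesis that each vote is a fair coin flip (a sketch, assuming no "no preference" option):

```python
from math import comb

def two_sided_p(n_votes, k_majority):
    """P(a majority at least this lopsided | each vote is a fair coin flip)."""
    tail = sum(comb(n_votes, i) for i in range(k_majority, n_votes + 1))
    return 2 * tail / 2**n_votes  # tails are symmetric for p = 0.5

print(round(two_sided_p(15, 10), 3))  # 0.302, i.e. the "30+%" quoted above
```

So a 10-to-5 split is nowhere near the conventional 5% significance threshold; it would take a far more lopsided result (or many more voters) to rule out chance.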
 
Can someone who expresses a preference reasonably be considered a roll of the dice?
In the case where the poll doesn't have a "don't know" or "don't care" option, then yes. Because if someone wants to participate they are forced to select a preference even if they don't have one. In this case it is effectively a roll of the dice.
 
Indeed, it is difficult to say what the source of this compression is, but one fact remains: it is audible and visible in the measurements.
Absolutely - dynamic range compression should be audible. Otherwise there's not much point doing it!

Lossy file compression is designed to fool our brains into believing it hasn't happened (unless the lossy compression ratio is strong). Many people cannot hear a difference between a 1.5 Mbit/s WAV and a 0.3 Mbit/s MP3.

You introduced more than one thing into your experimental setup between file 1 and file 2. You should recall from school experiments that this is bad practice. Something is wrong with your experimental setup. You are not testing the audibility of the Bluetooth link, but something else.
 
My wife came up from the basement as I was playing the files. She was listening intently while feeding our subterranean dwellers and claimed that the first file was clearly reprocessed by Bluetooth 5.1 or an earlier version of lesser fidelity. I sent her back down with a shovel and bucket of feed to get a minimum of 9 opinions from the more large eared troglodytes. They were all very impressed by the attention that this exercise has been given.
 
You introduced more than one thing into your experimental setup between file 1 and file 2. You should recall from school experiments that this is bad practice. Something is wrong with your experimental setup. You are not testing the audibility of the Bluetooth link, but something else.
Yes, there is indeed the 'mobile phone' element that is added, because it is the one emitting the signal; to test a Bluetooth receiver, you do need a Bluetooth source. That said, when I look at the connection settings for the receiver on my phone, I only have two choices, aptX-HD and LDAC. Perhaps in this case I could compare the two compression modes to try to determine whether they introduce any differences.
 
Track 1: more rumbling bass, higher volume. That's how I experienced it BEFORE I read the thread. Unfortunately I haven't done a blind test between the files.
The lack of blind testing could lead to me imagining hearing differences. I don't rule that out.

So what I can read is that version 1 is compressed music (different recordings?) that also went via Bluetooth. Sounds like a reasonable explanation. Damn, now I regret not doing a proper blind test. :oops:

Yes, there is indeed the 'mobile phone' element that is added, because it is the one emitting the signal; to test a Bluetooth receiver, you do need a Bluetooth source. That said, when I look at the connection settings for the receiver on my phone, I only have two choices, aptX-HD and LDAC. Perhaps in this case I could compare the two compression modes to try to determine whether they introduce any differences.
So file 1 was recorded via a mobile phone? The software mixers in some operating systems can worsen the sound quality. If I remember correctly, this is especially true for Android.

LDAC would be interesting to blind test. :) I used LDAC plus Spotify all summer. I don't think I would have heard any difference if I had used a cable and Spotify with my SMSL DO100PRO DAC (the same DAC has an LDAC receiver).

Thanks for creating this test, @Vintage02. It was interesting. :)
 
LDAC would be interesting to blind test. :) I used LDAC plus Spotify all summer. I don't think I would have heard any difference if I had used a cable and Spotify with my SMSL DO100PRO DAC (the same DAC has an LDAC receiver).

Thanks for creating this test, @Vintage02. It was interesting. :)
Thank you for taking the time to test.

Obviously, we must learn from those who know, but I like to believe that every exchange is constructive, even if only through the confrontation of points of view.
 
My wife came up from the basement as I was playing the files. She was listening intently while feeding our subterranean dwellers and claimed that the first file was clearly reprocessed by Bluetooth 5.1 or an earlier version of lesser fidelity. I sent her back down with a shovel and bucket of feed to get a minimum of 9 opinions from the more large eared troglodytes. They were all very impressed by the attention that this exercise has been given.

You clearly have a very progressive marriage arrangement.

Because invariably, in similar scenarios, dear wife is listening from the kitchen as she extols the virtues of what she hears from the next room.
 
Yes, there is indeed the 'mobile phone' element that is added because it is the one emitting the signal. However, to test a Bluetooth receiver, you do need a Bluetooth source
Android phones have a poor reputation for sound out of the box. There are lots of things you can do to fix this.

You were testing your Android phone AND the Bluetooth codecs at the same time. First, you need to ensure your Android setup is bit-perfect where possible.
 
... ensure your android setup is bit-perfect where possible.

But isn't that impossible with any BT codec? LDAC is limited to 990 kbps; aptX's highest tier is 500 kbps or so, I seem to recall. While the most demanding 16/44.1 FLAC files in my collection seem to top out around the 1000 kbps mark, with inevitable transport overheads even LDAC can't keep up with lossless FLAC. And if you aim for 20/48 "bit perfection" you are utterly doomed in that expectation.

That said, I am immensely pleased with my smartphone- and BT-powered setup. When I use it, I typically just listen to Spotify at 320k (even though I could do lossless, it seems wasteful). I also have FLAC files on my smartphone, but again, I don't even bother to check whether I can hear a difference; 320k Ogg Vorbis is pretty darn solid.
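The back-of-envelope arithmetic behind the "bit-perfect over BT is impossible" point is just raw PCM bitrate math (the 990 kbps LDAC ceiling is the figure quoted above):

```python
def pcm_kbps(sample_rate_hz, bit_depth, channels=2):
    """Raw PCM bitrate in kbit/s, before any container or transport overhead."""
    return sample_rate_hz * bit_depth * channels / 1000

print(pcm_kbps(44_100, 16))  # 1411.2 -> CD audio already exceeds LDAC's 990 kbps cap
print(pcm_kbps(48_000, 20))  # 1920.0 -> a 20/48 stream is even further out of reach
```

So even at its maximum rate, LDAC must discard data to carry a CD-quality stream; whether that loss is audible is a separate question from whether it is bit-perfect.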
 