
Sony Xperia 5 IV Audio Review

Rate this smartphone audio

  • 1. Poor (headless panther)

    Votes: 32 22.9%
  • 2. Not terrible (postman panther)

    Votes: 68 48.6%
  • 3. Fine (happy panther)

    Votes: 29 20.7%
  • 4. Great (golfing panther)

    Votes: 11 7.9%

  • Total voters
    140

A Surfer

Major Contributor
Joined
Jul 1, 2019
Messages
1,125
Likes
1,230
That's not really true; all wireless formats have problems with signal quality, not to mention that most wireless headphones are not that great in terms of sound quality.
They are great for listening on the go (noise isolation, headset function, mobility), but not for hi-fi use.
Not true at all. I have years of personal experience that is completely contrary to your assertion. There is no technical reason that properly implemented Bluetooth sacrifices any sound quality for normal music enjoyment.
 

Oso Polar

Member
Joined
Aug 7, 2021
Messages
99
Likes
154
There is no technical reason that properly implemented Bluetooth sacrifices any sound quality for normal music enjoyment.
And now you are just trolling. There are in fact plenty of technical reasons why Bluetooth objectively sacrifices sound quality, starting from the simple facts that a) it utilizes lossy compression and b) "properly implemented" Bluetooth headphones are as rare as hen's teeth. Sure, it is possible to enjoy music even on AM radio; it all depends on the definition of "normal music enjoyment".

Yes, under ideal conditions with high-bitrate LDAC, quality can be very good (probably many people will not be able to distinguish it from a good wired connection). Now, start moving or increase the distance between the Bluetooth transmitter and headphones a bit, and things will fall apart very fast.
 

A Surfer

Major Contributor
Joined
Jul 1, 2019
Messages
1,125
Likes
1,230
And now you are just trolling. There are in fact plenty of technical reasons why Bluetooth objectively sacrifices sound quality, starting from the simple facts that a) it utilizes lossy compression and b) "properly implemented" Bluetooth headphones are as rare as hen's teeth. Sure, it is possible to enjoy music even on AM radio; it all depends on the definition of "normal music enjoyment".

Yes, under ideal conditions with high-bitrate LDAC, quality can be very good (probably many people will not be able to distinguish it from a good wired connection). Now, start moving or increase the distance between the Bluetooth transmitter and headphones a bit, and things will fall apart very fast.
Back off there; you have no grounds to accuse me of trolling. You don't have to agree with me, and that is fine, but for you to take it to the next level with no provocation is significantly over the top.
 

A Surfer

Major Contributor
Joined
Jul 1, 2019
Messages
1,125
Likes
1,230
And for the record, I don't know of more than a handful of valid blind listening tests where people can tell the difference. In the few that I have ever read about, the listeners essentially trained themselves to detect very subtle differences and worked at discriminating them. If you're just listening to music normally, Bluetooth is, for all practical purposes, capable of audible transparency.

I conducted a multiple-trial, blind listening test at a Head-Fi meet I hosted. With very good equipment and multiple trials with 7 different young subjects, not one of them could do better than around 50% (guessing) at telling a 320 kbps MP3 from the lossless master. People assume discrimination abilities but never really test them.
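To put rough numbers on why ~50% correct looks like guessing, here is a minimal sketch of the binomial math (the trial count per subject is an assumption for illustration, not the actual protocol from that meet):

```python
from math import comb

def p_value(correct: int, trials: int, p_guess: float = 0.5) -> float:
    """One-sided binomial tail: chance of scoring at least `correct`
    out of `trials` forced-choice trials by pure guessing."""
    return sum(comb(trials, k) * p_guess**k * (1 - p_guess)**(trials - k)
               for k in range(correct, trials + 1))

# Assumed example: 10 A/B trials per subject.
for correct in (5, 7, 9, 10):
    print(f"{correct}/10 correct -> p = {p_value(correct, 10):.3f}")
# 5/10 (about 50%) gives p ~ 0.62, i.e. fully consistent with guessing;
# a listener needs roughly 9/10 (p ~ 0.01) before the result means much.
```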
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,368
Likes
234,386
Location
Seattle Area
I conducted a multiple-trial, blind listening test at a Head-Fi meet I hosted. With very good equipment and multiple trials with 7 different young subjects, not one of them could do better than around 50% (guessing) at telling a 320 kbps MP3 from the lossless master. People assume discrimination abilities but never really test them.
320 kbps MP3 has far better quality than the Bluetooth codecs other than LDAC. SBC, for example, was selected because it was free and didn't take many CPU cycles. There was no requirement from the Bluetooth forum for any type of fidelity. We (Microsoft) wanted to bid a higher-fidelity codec, but ours took more CPU cycles so we gave up. And codecs such as aptX are even worse. See: https://www.audiosciencereview.com/...c3-review-bluetooth-receiver-bt-codecs.23740/

[attached measurement graph]


The AAC profile used there is even worse:

[attached measurement graph]


One of the challenges for BT, and what makes it hard to produce quality audio, is having to operate with very low latency/buffering. Audio has sharply varying complexity from moment to moment. Classic lossy codecs such as MP3 and AAC use a memory buffer to smooth out the peaks and wind up with a much lower average rate. You can't do much of this if you require the data to get to the other side with less than, say, half a second of latency.
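As a back-of-the-envelope illustration of that latency budget (all numbers below are assumptions for the sketch, not figures for any particular codec):

```python
# Rough sketch: how little "peak smoothing" a low-latency codec can do.
FRAME_MS = 10            # assumed codec frame duration
LATENCY_BUDGET_MS = 150  # assumed end-to-end audio latency target

frames_of_lookahead = LATENCY_BUDGET_MS // FRAME_MS
print(f"Frames available for buffering: {frames_of_lookahead}")

# A file-based MP3/AAC encoder can borrow bits across the whole track,
# so a complex passage can briefly demand far more than the average rate.
avg_kbps, peak_kbps = 256, 512   # assumed average vs. short-term demand
extra_kbit = (peak_kbps - avg_kbps) * FRAME_MS / 1000
print(f"One complex {FRAME_MS} ms frame needs {extra_kbit:.2f} kbit of extra "
      f"budget, which must be absorbed within only {frames_of_lookahead} "
      f"frames instead of across the whole track.")
```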

So while it is true that most audiophiles have a poor ability to detect non-linearities, BT codec impairments (again, other than LDAC) are too large to claim that hardly anyone can hear them.
 

A Surfer

Major Contributor
Joined
Jul 1, 2019
Messages
1,125
Likes
1,230
320 kbps MP3 has far better quality than the Bluetooth codecs other than LDAC. SBC, for example, was selected because it was free and didn't take many CPU cycles. There was no requirement from the Bluetooth forum for any type of fidelity. We (Microsoft) wanted to bid a higher-fidelity codec, but ours took more CPU cycles so we gave up. And codecs such as aptX are even worse. See: https://www.audiosciencereview.com/...c3-review-bluetooth-receiver-bt-codecs.23740/

[attached measurement graph]

The AAC profile used there is even worse:

[attached measurement graph]

One of the challenges for BT, and what makes it hard to produce quality audio, is having to operate with very low latency/buffering. Audio has sharply varying complexity from moment to moment. Classic lossy codecs such as MP3 and AAC use a memory buffer to smooth out the peaks and wind up with a much lower average rate. You can't do much of this if you require the data to get to the other side with less than, say, half a second of latency.

So while it is true that most audiophiles have a poor ability to detect non-linearities, BT codec impairments (again, other than LDAC) are too large to claim that hardly anyone can hear them.
Not sure what to think. I had my Gustard X16 going into my NAD M3 with some very nice Monitor Audio PL200s supported by two nice subs. My M3 allowed me to send everything below 40 Hz to the subs.

Not that this is a great system, but it is certainly adequate. I only played well-recorded, dynamic material, used Bluetooth exclusively from my phone to the X16, and I never once noticed anything, listening both critically and for pleasure.

I still think that most people, maybe even the vast majority of users, would not actually be able to reliably detect a difference even with AAC. I would love to see the results of proper testing, though. I would be prepared to eat humble pie.
 

lewdish

Active Member
Joined
May 29, 2021
Messages
247
Likes
178
Not true at all. I have years of personal experience that is completely contrary to your assertion. There is no technical reason that properly implemented Bluetooth sacrifices any sound quality for normal music enjoyment.
Technically it does have a pretty big impact. BT is not a super consistent or secure connection, which is why it's so susceptible to interference and dropouts. You can take a Flipper Zero and just destroy everyone's Bluetooth connections near you in like a 100 ft radius. The same goes for the compression codecs: different codecs all have different levels of compression, encoding, and latency. miniDSP also has a metric comparison showing the difference between AAC and LDAC. I think the average listener probably can't pick up on it, but I've been able to tell the difference via A/B testing pretty accurately. There are noticeable boosts or high-frequency cutoffs depending on the compression you use.

[attached comparison chart]
 

A Surfer

Major Contributor
Joined
Jul 1, 2019
Messages
1,125
Likes
1,230
Technically it does have a pretty big impact. BT is not a super consistent or secure connection, which is why it's so susceptible to interference and dropouts. You can take a Flipper Zero and just destroy everyone's Bluetooth connections near you in like a 100 ft radius. The same goes for the compression codecs: different codecs all have different levels of compression, encoding, and latency. miniDSP also has a metric comparison showing the difference between AAC and LDAC. I think the average listener probably can't pick up on it, but I've been able to tell the difference via A/B testing pretty accurately. There are noticeable boosts or high-frequency cutoffs depending on the compression you use.

[attached comparison chart]
You've tested blind, with multiple trials (at least 10), and discriminated the difference at least 90% of the time? If not, then while I believe that you believe you can reliably tell the difference, respectfully, you haven't effectively tested your belief.
 

ThatGuyYouKnow

Active Member
Joined
Jan 31, 2023
Messages
140
Likes
133
Back off there; you have no grounds to accuse me of trolling. You don't have to agree with me, and that is fine, but for you to take it to the next level with no provocation is significantly over the top.
I don't think you are trolling, and you are at least being respectful, which is sort of refreshing given all the off-topic garbage I see elsewhere on this forum. From my perspective, though, I have never heard what I consider acceptable Bluetooth audio. Getting those ideal conditions could plausibly be out of reach for many of us.
 

lewdish

Active Member
Joined
May 29, 2021
Messages
247
Likes
178
You've tested blind, with multiple trials (at least 10), and discriminated the difference at least 90% of the time? If not, then while I believe that you believe you can reliably tell the difference, respectfully, you haven't effectively tested your belief.
I mean, I only had my one IEM, and I was testing on the train, to be honest, which would make it even harder to discern. I was just testing a Bluetooth dongle that I ended up returning because I find myself never really using Bluetooth anyway. I do think I could tell you which was LDAC, aptX, and SBC at least 75% of the time, as long as I knew the song/track well enough. I haven't tested again since, because I don't own the dongle anymore. But I distinctly remember being able to discern the difference in noise floor between them as well, depending on which codec you picked. It was a decent FiiO Bluetooth adapter too, and keep in mind I was on a train ride, so that in itself is pretty impressive that I could make it out. If you asked me to A/B test it again, I'd probably need some trial and error, but I think I'd be pretty good at telling you which was the higher-bitrate stream. It would have to be a lower-bitrate codec against 990 kbps LDAC, though; it was significantly less discernible between the aptX codecs and AAC/SBC.

Ironically, I also don't care about the quality of the stream, since my default listening platforms are YouTube Music and SoundCloud anyway; both are notoriously compressed, so maybe I was discerning something that was already fairly lossy on top of the BT compression.

Though based just on sheer measurements, if you look at the SINAD, 70 dB vs. 100+ dB should already be pretty apparent for a lot of people. If you've ever owned a badly measuring DAC, it's pretty easy to discern that. I've been wondering for a long time now: why don't we have a standard for "bad" here on ASR, if we know that 80+ dB SINAD is a decent starting point for good/passable? I do own a DAC that I think will absolutely measure like trash, and it's definitely discernible: things sound slightly warmer, fuzzier, and distorted, and there are a lot of pops and ticks. I think that would give people a general idea of the range where audibly bad starts.
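For context on those SINAD numbers, the standard dB-to-percentage conversion (just math, not a measurement of any particular device) looks like this:

```python
def sinad_db_to_thdn_percent(sinad_db: float) -> float:
    """Convert a SINAD figure in dB to the equivalent THD+N level in percent."""
    return 100 * 10 ** (-sinad_db / 20)

for sinad in (70, 80, 100, 120):
    print(f"{sinad:>3} dB SINAD -> {sinad_db_to_thdn_percent(sinad):.4f}% THD+N")
# 70 dB is about 0.03% distortion+noise, 100 dB is about 0.001%;
# whether that gap is audible on real music is a separate question.
```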
 
Last edited:

jcebedo11

Member
Joined
Jul 16, 2020
Messages
63
Likes
49
Hello Amir,

Is it possible to measure the microphone recording quality of this phone? I have the Samsung S10e, and I have also owned many Samsungs in the past and found them to be superior to many other phones I have tried. I have not tried Sony. This is important for me because I record singing and also live music events. I know this is a very niche feature, but it's something I use quite often. I have tried the Asus Zenfone 9 and Xiaomi 12; both have inferior microphones. My Samsung S10e now has a green screen on the OLED, so I need a replacement phone.
 

staticV3

Master Contributor
Joined
Aug 29, 2019
Messages
7,535
Likes
12,002
Hello Amir,

Is it possible to measure the microphone recording quality of this phone? I have the Samsung S10e, and I have also owned many Samsungs in the past and found them to be superior to many other phones I have tried. I have not tried Sony. This is important for me because I record singing and also live music events. I know this is a very niche feature, but it's something I use quite often. I have tried the Asus Zenfone 9 and Xiaomi 12; both have inferior microphones. My Samsung S10e now has a green screen on the OLED, so I need a replacement phone.
DxOMark tests the recording quality of smartphones:

https://www.dxomark.com/smartphones/custom-ranking#/?criteriasChecked=0&criteriaScores={}&criteriaSubScores={%22audiorecording%22:%221%22}

Their methodology and scoring system are unfortunately pretty obscure, but perhaps you can still get some value out of them.
 
Last edited: