
Review and Measurements of Chromecast Audio Digital Output

I don't know. Subjectively, it doesn't feel quite right to use 256 kbps AAC if you want good hi-fi. But is it audible? For me, honestly, I don't know.
If a good AAC encoder is used on the lossless files, I wouldn't expect any important difference between that and 320 kbps Ogg; they are both good encoders.
Aren't there blind tests you can do to see if that's the case?
Do your own; it's very illuminating. Foobar has a module to do it, and there's plenty of other software out there as well.
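For anyone curious how such a blind test is scored: an ABX comparator (foobar2000's included) typically reports the binomial probability of guessing at least that many trials correctly by chance. A minimal Python sketch of that calculation (the function name and thresholds here are mine, not Foobar's):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of getting at least `correct` right out of
    `trials` ABX trials by pure guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# 12/16 correct is commonly taken as evidence of an audible difference:
print(round(abx_p_value(12, 16), 3))  # 0.038, i.e. < 5% chance it was guessing
# 8/16 is indistinguishable from coin-flipping:
print(round(abx_p_value(8, 16), 3))   # 0.598
```

The usual convention is to decide the number of trials in advance and only then look at the p-value, otherwise you can stop on a lucky streak.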
 
There have been a bunch, some of them quite scientific. The usual conclusion is that the difference is only audible in a few rare (pathological) cases (keys jingling etc.), and even then only if you really know what to listen for. I believe we still have JJ Johnston (the "father of AAC") here on this forum.
Back when DAB was first introduced in the UK, I recall articles in HiFi News describing the sound of the FM technology we were used to (a slight added 'bloom' in the lower midrange that's there regardless of the tuner), and also, if my memory holds, how the BBC decided on the bit rate. 192 kbps or so was acceptable, but the programme material had to be spotless, as the codec apparently couldn't distinguish signal from distortion. 256 kbps was judged the best overall, but I'm not sure the BBC went over to that (I've never been a DAB fan, and in any case DAB+ apparently sorts it out, though I understand it isn't backwards compatible).

I've been surprised many times at the high perceived quality of Spotify as well as many recent YouTube uploads, the latter (and maybe the former as well) apparently 160 kbps, believe it or not. I'm not directly aware of bits missing or excess compression (on one recent album I did hear compression, only to find it on the actual CD when I subsequently bought it!).

It's the music that counts after all, and I think I've now given up on SINAD-chasing and Klippel addiction; with the latter, some speakers that deviate from 'true flat' are deliberately voiced that way, for whatever good or less good reason. My lugs aren't good enough now anyway, which I've had to come to terms with, despite hankering after past rigs I've had the pleasure of owning.

Back to topic. My own CCA was used for years from its analogue output through an Amazon Basics 3.5 mm-to-RCA cable, and I never found it wanting, streaming a variety of programme sources (internet radio, the aforementioned YouTube, as well as FLACs and WAVs via VLC player). It now plays through the SU1, the DS2 having moved to the second rig (no sonic change here), and again I don't find the SU1 subjectively different, although I've gravitated to it as it's so good with 'flutter reverb', for want of a better term, and pretty good with a 3-D perspective, at least in the smaller-scale rig it plays into. Whether owners of huge speakers in equally huge rooms would feel the same, I'm honestly, and these days obviously, not sure :)
 
Back when DAB was first introduced in the UK, I recall articles in HiFi News describing the sound of the FM technology we were used to (a slight added 'bloom' in the lower midrange that's there regardless of the tuner), and also, if my memory holds, how the BBC decided on the bit rate. 192 kbps or so was acceptable, but the programme material had to be spotless, as the codec apparently couldn't distinguish signal from distortion. 256 kbps was judged the best overall, but I'm not sure the BBC went over to that (I've never been a DAB fan, and in any case DAB+ apparently sorts it out, though I understand it isn't backwards compatible).
Kind of funny, considering BBC analogue FM has used 14-bit @ 32 kHz NICAM on the distribution chain to the transmitters ever since the 1980s.
 
256 kbps was judged the best overall, but I'm not sure the BBC went over to that
They did for R3; for the other stations they started at 192 and worked their way down over the years as they squeezed more stations into the multiplexes. This was using the less efficient MP2 codec, and it was unlistenable at 128 kbps on R1, which was really annoying, as I had an automated way of time-shifting John Peel onto my MP3 player to listen to while commuting.
 
If a good AAC encoder is used on the lossless files, I wouldn't expect any important difference between that and 320 kbps Ogg; they are both good encoders.

Do your own; it's very illuminating. Foobar has a module to do it, and there's plenty of other software out there as well.
Thanks for the tip. I'm probably just suffering from HiFi neurosis. :oops: :D
 
There have been a bunch, some of them quite scientific. The usual conclusion is that the difference is only audible in a few rare (pathological) cases (keys jingling etc.), and even then only if you really know what to listen for. I believe we still have JJ Johnston (the "father of AAC") here on this forum.
It seems so; see Soniclife's post and my reply to the post above. :)
 
Kind of funny, considering BBC analogue FM has used 14-bit @ 32 kHz NICAM on the distribution chain to the transmitters ever since the 1980s.
I *think* it was 13 bit, with a noise floor in the lower 70s... Low enough for most things, even radio 3 concerts.

Makes me wonder hugely why so much compression was put on commercial digital recordings when it wasn't necessary. Hopefully, and with fingers crossed, those days may be largely over...
 
I *think* it was 13 bit, with a noise floor in the lower 70s... Low enough for most things, even radio 3 concerts.
Hmm, the BBC might have used one bit for some special purpose, as the NICAM standard is 14 bits (compressed to 10 bits using non-linear coding, so small signals keep 14-bit resolution but large-amplitude signals only 10 bits; supposedly the lower S/N gets masked by the loud signal).
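The 14-to-10-bit scheme described above is a form of block companding. Here's a toy Python sketch of the idea (a simplification of my own, not the actual NICAM-728 scale-factor coding, which works on fixed 1 ms blocks with defined coding ranges): pick one right-shift per block, just large enough that the loudest sample fits in 10 signed bits, so quiet blocks pass through losslessly while loud blocks lose only their least significant bits:

```python
def compand_block(samples14):
    """Reduce a block of signed 14-bit samples (-8192..8191) to 10-bit
    words plus one shared scale factor (the shift), NICAM-style."""
    peak = max((abs(s) for s in samples14), default=0)
    shift = 0
    while (peak >> shift) > 511:   # 10-bit signed range is -512..511
        shift += 1                 # at most 4 for 14-bit input
    return shift, [s >> shift for s in samples14]

def expand_block(shift, samples10):
    """Decoder side: restore the original magnitude (the dropped LSBs
    on loud blocks are gone for good, raising the noise floor there)."""
    return [s << shift for s in samples10]

quiet = [100, -37, 250]    # all |s| < 512: transmitted losslessly (shift 0)
loud = [8000, -4096, 512]  # needs shift 4: bottom 4 bits are dropped

assert expand_block(*compand_block(quiet)) == quiet
print(compand_block(loud))  # (4, [500, -256, 32])
```

This is exactly the masking argument in the quoted post: the quantisation noise added by the dropped bits only appears when the block also contains a loud signal to hide it.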
 