Hi, basic question, I hope. I haven't kept up with this for several years.
I have Spotify Connect with up to 320kbps streaming. Streamers and receivers today often have 384kHz DACs, but the optical, coax or RCA connections max out at 192kHz.
Is there any point, at the moment, to buy hardware with DACs over 192kHz? Is 320/384kHz a hoax if you can't get the signal to the endpoint?
Will something like WiiM Ultra receiving 320kbps Spotify audio and delivering to an amp with 384kHz DACs via 192kHz optical be any better than Spotify going to 192kHz DACs and connections?
I'm just getting back into this. At the moment, I just want to hear Metallica's cymbals via Spotify with minimal compression distortion.
Thank you
For a quick rundown:
"320kbps" refers to the compression rate of a file or stream. Typically uncompressed data for digital audio is 1411kbps (44100 x 16 x 2 / 1024 - 44100 samples per second, 16 bits per sample, two channels for stereo audio, plus a bit of extra metadata = 1,444,864 bits per second or 1411kbps). Compression brings that down to somewhere between 32kbps and 320kbps for lossy audio, typically.
Hz or kHz refers to the sample rate of audio - how many times per second the analog waveform is sampled. CD-quality audio is 44.1kHz, which can capture frequencies up to half the sample rate (the Nyquist limit) - about 22.05kHz, just past the ~20kHz upper limit of human hearing. Most TV/movie audio is 48kHz. Recording studios typically use 96kHz, not because it sounds better, but because studios do a lot of processing on the audio and capturing more data gives them more headroom.
Bit depth - 16-bit, 24-bit, or 32-bit - refers to how precisely the amplitude of each sample is measured when the ADC samples the waveform. 16 bits gives you 96dB of dynamic range, 24-bit gives you about 144dB in theory (real-world converters manage closer to 120dB), and 32-bit float gives you effectively unlimited headroom.
CD-quality audio is 16-bit; studios typically use 24- or 32-bit audio for similar reasons to using higher sample rates. In theory, higher sample rates and larger bit depths let the DAC reconstruct a more accurate analog waveform, and some people claim to be able to tell the difference between "standard definition" 16/44 digital audio and "high-definition" 24/96. In practice, however, 16/44 is accurate enough to perfectly reconstruct the audible content of the original analog signal, and most tests have shown that if the masters are the same, it's almost impossible to differentiate between 16/44 and 24/96 (or other high sample rate) audio.
A 320kbps Spotify stream is 16/44 and in most cases it's very hard to tell a lossy-compressed 320kbps stream from uncompressed CD-quality audio, but if you have experience in audio engineering and know what kinds of artifacts to listen for, it's possible to hear. If you're just listening for enjoyment though and not trying to prove how good your ears are, you almost certainly won't notice a difference.
No matter how you connect your WiiM to your amp, the WiiM is going to decode the compressed stream into an uncompressed 1411kbps PCM signal. It'll then send that signal to your amp over a digital connection (HDMI, optical, coaxial), or it will convert it to analog itself and send it via RCA.