You seem to have confused latency with jitter. Latency in itself is never an issue for music playback: the music we listen to was generally recorded at least a few months earlier, if not years or decades, so a few extra milliseconds of delay make no difference. A ridiculously high latency would be noticeable only as a delay when playback is first initiated.

What matters when streaming is that the receiver has a buffer large enough to absorb variations in latency, i.e. jitter. Because network conditions change randomly, there will be occasional latency spikes during which the data stream is interrupted for 100 ms or so, sometimes more. To avoid glitches in playback when this happens, the receiver must hold a second or more of audio data in its buffer. The minimum buffer size is determined by the worst-case variation in latency (and the bandwidth utilisation), not by the absolute latency.
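To make the sizing rule concrete, here is a minimal sketch (my own illustration, not taken from any particular player) that derives a jitter-buffer size from an assumed worst-case stall and converts it to bytes of PCM audio; the stall duration and safety factor are hypothetical parameters you would tune from observed network behaviour:

```python
def min_buffer_ms(worst_case_stall_ms: float, safety_factor: float = 2.0) -> float:
    """Audio (in ms) the buffer must hold to ride out a stall without a glitch.

    The safety factor leaves headroom for a spike slightly worse than the
    worst one observed so far.
    """
    return worst_case_stall_ms * safety_factor

def buffer_bytes(buffer_ms: float, sample_rate: int = 44_100,
                 channels: int = 2, bytes_per_sample: int = 2) -> int:
    """Convert a buffer duration to bytes of uncompressed PCM audio."""
    return int(sample_rate * channels * bytes_per_sample * buffer_ms / 1000)

if __name__ == "__main__":
    ms = min_buffer_ms(500)                  # assume 500 ms worst-case stall
    print(f"{ms:.0f} ms ≈ {buffer_bytes(ms)} bytes of CD-quality PCM")
```

With a 500 ms worst-case stall and a 2x margin, this asks for one second of audio, which lines up with the "second or more" figure above; for compressed streams the byte count shrinks with the codec bitrate, but the duration requirement stays the same.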