Digital data is not as noise-sensitive as an analog audio signal. A one's a one and a zero's a zero, we don't care about what's in between, and the digital signal swings between the two states are large. There's also error correction available to digital signals that isn't available to analog ones. Of course, analog audio signals are slow-changing compared to digital ones, so transmitting analog is not as tricky as sending something with 5 ns edges down a cable. They both have their own problems, all of which can be successfully dealt with.
Off topic, but one of my favourite areas. In the real world there isn't any such thing as a digital signal. One of the first lessons any digital designer learns is that everything is analog. Nice crisp waveforms representing ones and zeros are great for representation and explaining principles, but if you hook a scope up to your signals and they look like that, all it tells you is that you can go a lot faster and are not trying hard enough.

Eventually everything returns to Shannon. The information-carrying capacity of a channel is defined by its signal-to-noise ratio and its bandwidth (strictly, the integral over the bandwidth of log2(1 + S/N)). This is true no matter what signal you run down the line.

What actually gets sent down the wire looks very different to the simple idea of ones and zeros. That went out with serial lines decades ago. Now quadrature encoding is pretty standard, and the actual line driver and receiver systems are near science fiction. Getting a gigabit signal to transit a connector that was designed for nothing more than voice is a triumph. Every gigabit PHY contains adaptive active line driver logic and effectively a miniature TDR. What they are capable of doing to get the signal through is incredible. 10Gb systems defy the imagination. They work because they wring every last drop of capability out of the channel. They don't do this by sending nice sharp-edged bits down the wire; they do it by treating the system as a total mess of an analog system, characterising it to the nth degree, and pushing the edge of what Shannon says they can achieve by pulling every analog-level trick possible to find corners they can cram a few more bits into.
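To make the Shannon point concrete, here's a minimal sketch of the Shannon-Hartley limit for a flat channel. The numbers are illustrative only: a roughly 3.1 kHz voice-grade telephone channel with an assumed 30 dB SNR.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit for a flat channel: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative figures: ~3.1 kHz voice-grade bandwidth, 30 dB SNR.
bw_hz = 3100.0
snr_db = 30.0
snr_linear = 10 ** (snr_db / 10)  # 30 dB -> ratio of 1000

print(f"{shannon_capacity(bw_hz, snr_linear) / 1000:.1f} kbit/s")
```

That works out to around 31 kbit/s, which is why late-era dial-up modems topped out roughly where they did: they were already pressed up against the channel's Shannon limit, not against clever engineering.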
But of course the protocols include error correction, and TCP is robust against packet loss. So if you don't have a real-time communication problem you can relax a bit and let that take care of things. But if you have a real-time problem, one where latency really matters, you need to be on top of it.
Latency is the big problem with any digital audio system. Domestic users hardly ever see this. People moan if the lip sync is a bit off. That is about it. You simply cannot perform audio recording if there is perceptible latency. The wheels fall off. This places a very hard and implacable boundary on what a digital system can achieve. One that no amount of fancy computational capacity can solve.
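To see why the boundary is so hard, consider where the latency comes from: every buffer in the signal path adds buffer_size / sample_rate of delay, and a round trip through an interface accumulates several of them. A quick sketch (sample rate and buffer sizes are just typical illustrative values):

```python
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """One-way latency contributed by a single audio buffer, in milliseconds."""
    return 1000.0 * buffer_samples / sample_rate_hz

# Illustrative settings: 48 kHz sample rate, common buffer sizes.
for n in (64, 256, 1024):
    print(f"{n:5d} samples -> {buffer_latency_ms(n, 48000):6.2f} ms per buffer")
```

A 1024-sample buffer at 48 kHz is already over 21 ms per pass, and a monitoring round trip crosses at least an input and an output buffer. No amount of CPU power removes that delay; you can only shrink the buffers until the machine can no longer keep up.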
People may have heard stories about a live sound engineer taking revenge on a band that had annoyed him. Gary Larson even drew a famous cartoon about it.
There are various stories. One involves pitch-shifting the foldback. But there's a much, much easier trick: just delay the foldback slightly. The band will instantly suck.
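The trick itself is trivial to implement: a delay line is nothing more than a FIFO buffer in the monitor path. A toy sketch (not any real console's implementation, just the idea):

```python
from collections import deque

def delay_line(samples, delay):
    """Shift a mono signal later by `delay` samples; zeros fill the gap."""
    buf = deque([0.0] * delay)  # pre-filled with silence
    out = []
    for s in samples:
        buf.append(s)
        out.append(buf.popleft())
    return out

print(delay_line([1.0, 2.0, 3.0, 4.0], 2))  # [0.0, 0.0, 1.0, 2.0]
```

At 48 kHz, a delay of a couple of thousand samples is around 50 ms, which is well past what a performer can play through comfortably.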