It only matters when a DIGITAL signal is being converted to an ANALOG one (and vice versa, to a slightly lesser extent). You were talking in your post about "approximate" ones and zeroes traveling across the network and being affected by jitter. This is not true: there is no issue in transmitting or storing DIGITAL information, unless of course your computer is broken.
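To illustrate the point with a toy sketch (Python, all voltages and noise levels are made-up numbers, not any real transceiver spec): the receiver only has to decide whether each voltage sits above or below a threshold, so the "approximate" analog levels on the wire still come back out as exact bits.

```python
# Rough sketch of why noisy voltages on a wire still yield exact bits.
# Thresholds and noise amplitude below are assumed, illustrative values.
import random

def transmit(bits, noise=0.3):
    """Model each bit as a noisy voltage: ~0 V for 0, ~3.3 V for 1."""
    return [(3.3 if b else 0.0) + random.uniform(-noise, noise) for b in bits]

def receive(voltages, threshold=1.65):
    """The receiver only asks 'above or below threshold?', so the noise vanishes."""
    return [1 if v > threshold else 0 for v in voltages]

original = [random.randint(0, 1) for _ in range(10_000)]
recovered = receive(transmit(original))
print("bit errors:", sum(a != b for a, b in zip(original, recovered)))  # 0
```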
The difference between the S/PDIF (as well as HDMI) and USB protocols is not in their intrinsic error rates; it actually arises at the receiving end, i.e. at the MCU that converts the digital stream into an analog signal, or into another digital protocol (I2S) to route it to a codec. This involves synchronizing two clocks (master and slave, sorry for the politically incorrect terminology), which is a tricky operation and can potentially create jitter. Once again, there is no jitter within the digital domain, only when the signal crosses into the analog domain.
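If it helps, here is a minimal numerical sketch (Python, purely illustrative figures, not any real DAC's code) of how jitter at that digital-to-analog boundary turns into a noise floor even though every sample VALUE is bit-perfect:

```python
# Minimal sketch: bit-perfect sample values converted at slightly wrong
# instants produce noise. Sample rate, tone, and jitter figure are assumptions.
import numpy as np

fs = 48_000          # sample rate, Hz
f_tone = 1_000       # test tone, Hz
n = fs               # one second of samples
jitter_rms = 1e-9    # 1 ns RMS clock jitter (assumed figure)

ideal_t = np.arange(n) / fs
jittered_t = ideal_t + np.random.normal(0.0, jitter_rms, n)

# Same sample values, but the conversion instants wobble with the clock:
ideal_out = np.sin(2 * np.pi * f_tone * ideal_t)
jittered_out = np.sin(2 * np.pi * f_tone * jittered_t)

error = jittered_out - ideal_out
snr_db = 10 * np.log10(np.mean(ideal_out**2) / np.mean(error**2))
print(f"Jitter-induced noise floor: ~{snr_db:.0f} dB below the tone")
```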
USB audio is easier to implement because of asynchronous mode, which reclocks the incoming stream with the slave's own clock instead of extracting the master's clock embedded in the signal itself and then reclocking to the slave's clock. In modern electronics, however, both types of protocols (embedded clock and asynchronous serial) are completely solved designs, a done deal. That said, there are still tons of OEMs that simply do not care about audio and use cheap MCUs unsuitable for digital audio. These are quite commonly found in computer motherboards and various embedded devices, including mobile phones. To be fair, such devices usually do not claim to have any HiFi audio capabilities.
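For the curious, here is a toy sketch of the asynchronous idea (Python, not the actual USB Audio Class implementation; the buffer size, feedback scheme, and clock offset are all assumptions): the device consumes samples at the rate of its own clock and periodically tells the host to send a bit more or a bit less, so the host ends up tracking the device's clock rather than the device trying to recover a clock from the stream.

```python
# Toy model of asynchronous-mode feedback: the device plays from its own
# local clock and nudges the host's delivery rate to keep its buffer steady.
from collections import deque

NOMINAL_PER_FRAME = 48   # 48 kHz audio / 1 kHz frame rate
TARGET_FILL = 480        # desired buffer depth in samples (assumed)

buffer = deque()

def device_feedback() -> int:
    """Device side: request one sample more or fewer next frame,
    depending on whether the local buffer is draining or overfilling."""
    error = TARGET_FILL - len(buffer)
    return NOMINAL_PER_FRAME + (1 if error > 0 else (-1 if error < 0 else 0))

def host_send(n_samples: int) -> None:
    """Host side: honor the device's request; sample values are untouched."""
    buffer.extend([0.0] * n_samples)

def device_play(n_samples: int) -> None:
    """Device side: the DAC consumes samples at the rate of its OWN clock."""
    for _ in range(min(n_samples, len(buffer))):
        buffer.popleft()

# One simulated second where the device clock runs slightly fast
# (one extra sample consumed every 10th frame, exaggerated for illustration):
for frame in range(1000):
    host_send(device_feedback())
    device_play(49 if frame % 10 == 0 else 48)

print(f"Buffer stayed near target: {len(buffer)} / {TARGET_FILL} samples")
```

The real mechanism reports a fractional rate over a feedback endpoint rather than whole samples, but the principle is the same: the slave's local crystal sets the pace, so there is no embedded clock to recover and far less opportunity for jitter.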