Although I’m not a network specialist, my understanding is that fiber optics by definition can’t carry common-mode electrical noise, whereas copper Ethernet can.
Assuming the twisted pairs in the Ethernet cable were not doing their job, how would common-mode analogue noise at the physical layer affect the digitally encoded audio data delivered to the application layer? (See the OSI model.) The worst that can happen is that a packet is corrupted, dropped by the receiver, and re-transmitted. Re-transmitting a packet takes milliseconds, but streamers buffer several seconds of audio, so thousands of packets* can be re-transmitted with zero impact on the audio data sent to the DAC.
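To put rough numbers on that, here's a minimal back-of-the-envelope sketch. The bit rate, frame payload, buffer depth, and retransmission time are all my own assumed figures, not measurements; swap in your own values:

```python
# Rough arithmetic: how much packet loss a streamer's buffer can absorb.
# All figures below are assumptions for illustration, not measurements.

BIT_RATE = 2 * 24 * 192_000      # stereo 24-bit/192 kHz PCM ≈ 9.2 Mbit/s
PAYLOAD_BITS = 1_460 * 8         # typical TCP payload of one Ethernet frame
BUFFER_SECONDS = 5.0             # assumed streamer buffer depth
RETRANSMIT_MS = 10.0             # assumed time for one TCP retransmission

packets_in_buffer = BIT_RATE * BUFFER_SECONDS / PAYLOAD_BITS
retransmits_hidden = BUFFER_SECONDS * 1_000 / RETRANSMIT_MS

print(f"Packets of audio held in the buffer: {packets_in_buffer:,.0f}")
print(f"Back-to-back retransmits the buffer can hide: {retransmits_hidden:,.0f}")
```

With those assumptions the buffer holds roughly 4,000 packets' worth of audio, and even hundreds of consecutive retransmissions complete long before the buffer runs dry.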
*If you're getting that much packet loss, it usually points to a cabling fault rather than a noise problem. That said, I have seen a pantograph on top of a train induce enough noise in server systems to make them reboot; the solution was to build a partial Faraday cage on the side of the building facing the train station, where trains switched from third-rail to overhead power.
Here's a fun paper on common-mode noise rejection (CMNR) in Ethernet, published by Intel: https://www.ieee802.org/3/bq/public/nov14/cibula_3bq_02a_1114.pdf