
Wi-Fi vs. Ethernet for streaming quality

Indydan

Yesterday I put to the test a long-standing assertion in audiophile circles: Ethernet is better than Wi-Fi. I have seen countless threads on Ethernet cables and switches, and I have gone down this rabbit hole myself. Common audiophile wisdom holds that a physical Ethernet connection provides better sound quality than Wi-Fi. Yeah, right…

I disconnected my switch last night and went Wi-Fi. I am getting 700 Mbps download speeds over Wi-Fi. That is more than enough for hi-res music and 4K HDR streaming.

I perceive no quality difference when listening to music or when watching 4K HDR movies.

I no longer have the clutter of an Ethernet switch and cables. Another hi-fi myth down the toilet!
 
Down the toilet maybe for your system. In reality it is far from disproved for the many folks who have poor Wi-Fi reception in some areas, interference from other nearby Wi-Fi networks, a congested network, and so on.
I agree that Ethernet is more reliable. But if one has strong Wi-Fi with no dropouts, there is no loss in quality.
 
Audio is at most in the low single-digit Mbit/s range: typically around ~1 Mbit/s if losslessly compressed, or ~0.256 Mbit/s if lossy.

Most home internet lines are capable of hundreds of Mbit/s now, with gigabit increasingly available in the West. Both Wi-Fi and Ethernet can easily do hundreds of Mbit/s, up into the Gbit/s range.

Audio just doesn’t use much bandwidth in the context of internet and home networking: both Wi-Fi and Ethernet should be able to comfortably handle 100x the required bandwidth. If the reliability of your Wi-Fi is an issue, that is its own problem, unrelated to what you’re streaming. Either your music packets will be delivered or they won’t; this isn’t like analogue, where they’ll somehow be degraded by the medium that carried them to you.
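The arithmetic here is easy to sanity-check in a few lines of Python. This is a rough sketch with illustrative numbers, not measurements; the 700 Mbps figure is simply the Wi-Fi speed reported in the opening post.

```python
def stream_bitrate_mbps(sample_rate_hz: int, bit_depth: int, channels: int) -> float:
    """Uncompressed PCM bitrate in Mbit/s (worst case; lossless codecs do better)."""
    return sample_rate_hz * bit_depth * channels / 1e6

cd = stream_bitrate_mbps(44_100, 16, 2)      # CD quality: ~1.41 Mbit/s
hires = stream_bitrate_mbps(192_000, 24, 2)  # 24-bit/192 kHz: ~9.22 Mbit/s

link_mbps = 700  # Wi-Fi download speed from the opening post
print(f"CD PCM:     {cd:.2f} Mbit/s  (headroom: {link_mbps / cd:.0f}x)")
print(f"24/192 PCM: {hires:.2f} Mbit/s  (headroom: {link_mbps / hires:.0f}x)")
```

Even uncompressed hi-res PCM leaves roughly 75x headroom on that link, and lossless compression only widens the gap.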
 
Down the toilet maybe for your system. In reality it is far from disproved for the many folks who have poor Wi-Fi reception in some areas, interference from other nearby Wi-Fi networks, a congested network, and so on.
Incorrect. Anyone who knows how data and networking actually work knows that this is nonsense. Either the bits get delivered or they don't. If the Wi-Fi is unreliable enough and the bits don't get delivered in time, then you have a problem. It's not subtle changes to the sound, however; it's dropouts or the stream simply stopping.
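The "delivered or not" point can be shown in a tiny sketch: digital payloads that survive transport are byte-identical, which a hash comparison proves. The audio chunk below is a made-up stand-in, not real stream data.

```python
# Bits either arrive intact or they don't: data that a reliable transport
# (e.g. TCP) hands to the player is byte-identical to what was sent. There
# is no in-between state where identical bytes sound subtly different.
import hashlib

original = bytes(range(256)) * 1024   # stand-in for an audio chunk (256 KiB)
delivered = bytes(original)           # what the transport hands the player

assert hashlib.sha256(delivered).digest() == hashlib.sha256(original).digest()
print("bit-perfect")  # any corruption would fail the hash check instead
```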
 
I agree that Ethernet is more reliable. But if one has strong Wi-Fi with no dropouts, there is no loss in quality.
Except that is not really what you said in your first post, since you failed to touch on the issue of reliability under suboptimal conditions.

Audio quality under normative operating conditions is identical for both ethernet and Wi-Fi. That has been the "Common Wisdom" for quite some time.

And for many people, the reliability of both will be sufficiently high so that there is no practical difference. For example, my Wi-Fi access point and my wireless audio devices are in the same room, tuned to a Wi-Fi channel that is not in use by my neighbors. Reliability of Wi-Fi there is perfectly fine for my needs.

But reliability (and thus quality) of audio over Wi-Fi can be affected by a lot of different factors, especially for those living in higher-density areas such as apartments. So whether Wi-Fi is an equivalent option for any particular person depends on their situation.
 
Incorrect. Anyone who knows how data and networking actually work knows that this is nonsense. Either the bits get delivered or they don't. If the Wi-Fi is unreliable enough and the bits don't get delivered in time, then you have a problem. It's not subtle changes to the sound, however; it's dropouts or the stream simply stopping.
I'm curious why you said "Incorrect". It seems you agree with @gwing that audio quality can be bad for people with Wi-Fi reception issues.
 
I'm curious why you said "Incorrect". It seems you agree with @gwing that audio quality can be bad for people with Wi-Fi reception issues.
I guess it's possible I misunderstood them. But the OP was specifically referring to audiophile claims that sound quality was affected by Ethernet versus Wi-Fi, and "sound quality" in that context does not generally refer to the obvious problems caused by bits not arriving on time, but rather to audiophile myths and bugbears like PRAT or musicality or whatever. So my interpretation is that they were saying Wi-Fi can in fact have a negative impact in terms of those audiophile myths.
 
Wi-Fi has become very good, but a cable is dirt cheap, very rarely needs an upgrade to reach most speeds, and never, ever needs a new password, gets disturbed, or gets tangled with, except if you have rodents.
When, for example, I bought my first Logitech G703 mouse with Lightspeed technology (a Logitech term), that was the first time I felt a wireless mouse was fast enough to fully retire my wired one, even for gaming. So we're getting good at this.
 
I guess it's possible I misunderstood them. But the OP was specifically referring to audiophile claims that sound quality was affected by Ethernet versus Wi-Fi, and "sound quality" in that context does not generally refer to the obvious problems caused by bits not arriving on time, but rather to audiophile myths and bugbears like PRAT or musicality or whatever. So my interpretation is that they were saying Wi-Fi can in fact have a negative impact in terms of those audiophile myths.
And it is possible I misunderstood @gwing (so they can jump in and speak for themselves :cool:), but I think they were noting that audio over Wi-Fi may be affected by issues such as poor Wi-Fi reception, interference from other nearby Wi-Fi systems, congested networks, etc.

My background in networking technology leads me to agree with @gwing that such issues are real-world possibilities and not just theoretical. And that when present, they might (or might not) cause reliability issues with audio transmitted over Wi-Fi for a particular person.

Edit: I also agree with @Digital_Thor. Wi-Fi has gotten really good over the past few generations. Especially with increased capabilities for adaptive channel management.
 
People who don't have workable Wi-Fi should definitely use wires. But I can't recall the last time I used Ethernet. Probably before Wi-Fi went to 5 GHz, which was ages ago.

I do remember what an Ethernet port looks like, as I needed to plug in my two-year-old, dirt-cheap, telecom-provided router, which does an incredibly good job despite all the prejudice. I honestly can't tell the difference between the 30 Mbps and 700 Mbps plans in my two setups; 30 is definitely the better value for my use case.
 
Audio is at most in the low single-digit Mbit/s range: typically around ~1 Mbit/s if losslessly compressed, or ~0.256 Mbit/s if lossy.

Most home internet lines are capable of hundreds of Mbit/s now, with gigabit increasingly available in the West. Both Wi-Fi and Ethernet can easily do hundreds of Mbit/s, up into the Gbit/s range.

Audio just doesn’t use much bandwidth in the context of internet and home networking: both Wi-Fi and Ethernet should be able to comfortably handle 100x the required bandwidth. If the reliability of your Wi-Fi is an issue, that is its own problem, unrelated to what you’re streaming. Either your music packets will be delivered or they won’t; this isn’t like analogue, where they’ll somehow be degraded by the medium that carried them to you.
I remember in the late 2000s, in the air force (me as a "cellar child", a.k.a. technician), we were once tasked with upgrading the network wiring in the medical staff building. It was, gasp, 10 Mbit over BNC coax. We put in new wire and sockets for 100 Mbit.

Even that rusty old network from the '90s would've been perfectly fine for audio. That's how little bandwidth is required.
 
I remember in the late 2000s, in the air force (me as a "cellar child", a.k.a. technician), we were once tasked with upgrading the network wiring in the medical staff building. It was, gasp, 10 Mbit over BNC coax. We put in new wire and sockets for 100 Mbit.
You are officially old. And I'm even older, since I go back to the 1980s, installing and managing a network of Xerox workstations with the yellow coax cable and vampire taps for Ethernet.

As others noted, there is no reason not to use Wi-Fi for the absolute vast majority of people... including me, even though I nearly always opt for Ethernet out of personal preference.
 
My experience is very "digital"! It is either 1 or 0. If for whatever reason the signal gets weak, there is no sound. If it is strong enough for proper reception, the sound is the same regardless of whether it is 10 dBm higher or lower. It is not a matter of feeling; Node tells you how strong your signal is.
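For anyone curious what "10 dBm more or less" means in absolute terms, here is a quick sketch. The -70 dBm threshold below is a common rule-of-thumb for a stable streaming link, not a spec from any particular device.

```python
def dbm_to_mw(dbm: float) -> float:
    """Convert a received signal level in dBm to milliwatts (logarithmic scale)."""
    return 10 ** (dbm / 10)

# A 10 dB difference is a 10x power difference, yet both example levels are
# well above the roughly -70 dBm a streamer typically wants for a solid link
# (that threshold is a rule-of-thumb assumption, not a measured figure).
for level_dbm in (-50, -60):
    print(f"{level_dbm} dBm = {dbm_to_mw(level_dbm):.1e} mW")
```

The scale is logarithmic, which is why the receiver either has enough signal to decode the stream or it doesn't; there is no "slightly degraded bits" region in between.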
 
You are officially old. And I'm even older since I go back to the 1980's installing and managing a network of Xerox workstations with the yellow coax cable and vampire taps for ethernet.
Well, that's flattering and kind of insulting at the same time, if you get my meaning. I guess I'm therefore officially welcomed into "the old club". Not bad, having held out as "still young" until the ripe age of 45. :D

Side note telling my age: back in the mid-2000s my unit still used old-school technology like tape-backup machines for digital telephone/data exchanges and directional radio transmission systems. It was all Cold War technology at heart and was decommissioned shortly after. Still, it was a lot of fun handling real-time air-defense data transmission on text-prompt terminals running on 386 computers. At that time a fifth of the whole German air-defense network (my battalion) was still running on seven off-the-shelf Windows NT and 2000 computers, really mere Pentium PCs, one of which exclusively handled seasonal bird-migration tracking and secrecy-safe, filtered data transmission to the relevant civilian environmental authorities. Imagine that, national security depending on that... Haha!
 
As others noted, there is no reason not to use Wi-Fi for the absolute vast majority of people...
I would have to disagree with that a bit. RF is a limited resource. The airtime should be conserved for things that actually need it, not for devices that don't generally move and could easily be wired. The 5 GHz band is already a congested mess in any populated area; the only saving grace for indoor use is that walls are pretty effective at attenuating it. When I'm trying to do outdoor PtP or PtMP links, 5 GHz has become a nightmare due to the number of devices screaming out in that band on 80 MHz or even 160 MHz channels these days. It's pretty much impossible to find a clear channel in any remotely populated area.
 
It is getting to the point that devices may get a faster connection via Wi-Fi than via their wired socket. Typical wired sockets only go to 1 Gbps, which Wi-Fi 6 can exceed.

(If a device can have a wired connection I still prefer that, for the robustness reasons already mentioned, and also to conserve Wi-Fi bandwidth for devices that don't have the option of a wired connection.)
 
It is getting to the point that devices may get a faster connection via Wi-Fi than via their wired socket. Typical wired sockets only go to 1 Gbps, which Wi-Fi 6 can exceed.
In theory. In practice, often not, unless you're in pretty close proximity with clear line of sight to the access point, have the requisite antenna diversity, and have a decently clear channel of sufficient width. Plus, any halfway competent cat5e install can easily support 2.5 Gb Ethernet. Even 5 Gb is a good bet as long as the cable run isn't too long.
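A back-of-envelope comparison makes the gap between nominal and usable rates concrete. The Wi-Fi PHY rate below is the nominal figure for a 2x2 client on an 80 MHz channel; the efficiency factors are rough assumptions on my part (Wi-Fi MAC overhead, distance, and contention vary wildly), not measurements.

```python
# Nominal link rate vs. a rough usable-throughput estimate.
# Efficiency factors are illustrative assumptions: wired Ethernet delivers
# close to its nominal rate, while real-world Wi-Fi throughput is commonly
# a large fraction below the advertised PHY rate.
links = {
    "Gigabit Ethernet":                 (1000, 0.94),
    "2.5G Ethernet over cat5e":         (2500, 0.94),
    "Wi-Fi 6, 2x2 @ 80 MHz (PHY 1201)": (1201, 0.50),
}
for name, (nominal_mbps, efficiency) in links.items():
    print(f"{name}: ~{nominal_mbps * efficiency:.0f} Mbit/s usable")
```

Under those assumptions, even plain gigabit Ethernet comfortably beats what most clients actually pull over a Wi-Fi 6 link at a realistic distance.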
 