RedCometZero
Saw this linked on reddit.
Poster on reddit said:
"This guy is claiming that using a linear power supply (which produces less noisy power than a switching mode power supply) on your NAS will lead to better picture quality on the device you are streaming to, and presumably better audio quality.
This is totally insane, the type of power-supply will not cause datastream errors (ethernet is galvanically isolated to prevent this), and if somehow it could change the 1s and 0s transmitted by your NAS and decoded by your player onto your display, the result would be large glitches, not "ooh this one is slightly darker and less detailed."
This is the rough equivalent of saying that your blu ray looks better because you filled your car up with premium gasoline before you went to pick it up."
The reddit guy seems right to me, but Archimago did measure a very low level of 60 Hz mains noise getting past the galvanic isolation of Ethernet. Even so, I still don't see how that could cause any of the alleged benefits in the Polk Audio post.
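If anyone wants to sanity-check the "the bits don't change" part, it's easy in software: hash the file on the NAS and hash what arrives at the player end. A minimal sketch in Python, assuming the NAS share is mounted locally (the file paths here are made up):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so big rips don't eat all your RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the master copy on the NAS share vs. the same
# file pulled across the network to the player/HTPC.
src = sha256_of("/mnt/nas/movie.mkv")
dst = sha256_of("/tmp/movie_as_received.mkv")
print("bit-identical" if src == dst else "transfer corrupted!")
```

If the hashes match, the player received exactly the bytes the NAS stored, whatever PSU was feeding it.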
Taking a picture of a screen is a really bad/unscientific way to show a difference like this: conditions are almost impossible to keep identical, and the camera sensor is going to have different noise every time it takes a picture. There's no assurance from the author that he controlled for time-of-day lighting, and then there are things like the LG OLED's brightness limiting and burn-in protection (un-defeatable, AFAIK) that could account for the difference between the pictures. You'd probably need a bitstream-comparing device to really confirm whether there's a difference.
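Short of dedicated hardware, a software stand-in for a bitstream-comparing device could be ffmpeg's framemd5 muxer: decode the same title as streamed under each PSU and diff the per-frame hashes. If every frame hash matches, the pixels handed to the display pipeline are identical, and any difference in the photos happened downstream of the data. A rough sketch, again with hypothetical file names and assuming ffmpeg is installed:

```python
import subprocess

def frame_hashes(path):
    """Decode a video and return ffmpeg's per-frame MD5 lines (framemd5 muxer)."""
    out = subprocess.run(
        ["ffmpeg", "-nostdin", "-loglevel", "error",
         "-i", path, "-an", "-f", "framemd5", "-"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Drop the '#' header comments; keep one hash line per decoded frame.
    return [line for line in out.splitlines() if not line.startswith("#")]

# Hypothetical captures of the same title streamed under each power supply.
a = frame_hashes("capture_smps.mkv")
b = frame_hashes("capture_lps.mkv")
print("decoded frames identical" if a == b else "frames differ")
```

You'd want the same ffmpeg build for both runs, since the hashes are of decoded frames and only comparable decoder-for-decoder.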
What do others think? It seems pretty far-fetched that a NAS's PSU could affect audio/video quality.