Typically, if you are using the digital output of a given device, there is zero cause for concern when it comes to actual sound quality. Sample rate conversion is maybe worth a slight caveat, but it's generally done in a proper / transparent way.
Digital playback sometimes has outright failures that cause very obvious distortion (e.g. the sample rate is mismatched and NOT converted), but most of the time you can just say "bits are bits" - a digital signal that isn't undergoing sample rate conversion will typically be 100.0% intact all the way from the internet to your DAC, no matter how many digital devices sit in between.
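Here's a quick sketch of what "bits are bits" means in practice. The device names are just placeholders for hops in the chain; each one simply receives bytes and forwards the same bytes, and a hash at the end proves nothing changed:

```python
import hashlib

# Stand-in for ~1 MB of audio data. Any bytes will do - that's the point.
original = bytes(range(256)) * 4096

# Each hypothetical "device" just buffers and re-emits the identical bytes.
stream = original
for device in ["wifi_router", "switch", "streamer", "usb_interface"]:
    stream = bytes(stream)  # receive + retransmit = a copy, nothing more

# If any hop altered even one bit, the hashes would differ.
identical = hashlib.sha256(stream).digest() == hashlib.sha256(original).digest()
print("bit-perfect:", identical)  # prints "bit-perfect: True"
```

No device in that loop can make the data "warmer" or "smoother" - it either delivers the same bytes or it visibly fails.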
Well, because of the above, I do think they're all full of it; placebo has taken hold of them.
DACs do have varying levels of output quality, although most >$200 models on today's market are audibly transparent.
Streamers, if they are not also acting as a DAC or converting bit depth / sample rate, just download audio from someplace and spit it out through a port on the back. This is no different than a phone, WiFi router, computer, TV, or anything else. Bits are bits, whether those bits comprise your banking transactions, a Netflix video, this forum, or an audio stream.
Taking it one step upstream, the existence of "audiophile" Ethernet cables and switches doesn't prove anything about digital audio. It proves that people are gullible and don't know how IP data transmission works.
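For anyone curious how the network actually guards the data: IP-family protocols carry checksums, and TCP retransmits anything that arrives damaged. A minimal sketch of the classic Internet checksum (the RFC 1071 ones'-complement sum used by IP/TCP/UDP headers) shows why a flaky cable produces a detected error and a retransmission, not a subtly "different sound":

```python
def internet_checksum(data: bytes) -> int:
    """RFC 1071 ones'-complement sum over 16-bit words."""
    if len(data) % 2:
        data += b"\x00"  # pad to an even length
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold carry back in
    return ~total & 0xFFFF

packet = b"PCM audio payload: not golden, just bytes"  # made-up payload
good = internet_checksum(packet)

# Flip a single bit, as a genuinely bad cable might:
corrupted = bytearray(packet)
corrupted[5] ^= 0x01

print("corruption detected:", internet_checksum(bytes(corrupted)) != good)
# prints "corruption detected: True"
```

The receiver discards a packet whose checksum doesn't verify and TCP resends it, so the bytes that reach the streamer are either exactly what the server sent or they don't arrive at all. There is no in-between state for a fancy switch to improve.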
The idea that a streamer (again, excluding the DAC section) could even HAVE a sound signature, let alone a better one because it's expensive, is incorrect. This is no different than asserting your Nvidia Shield will somehow have a warmer color temperature on the HDMI video output than a Roku or Apple TV. If that sounds stupid, then so does "expensive streamers sound smoother" or whatever.