I may not quite understand your question, so I'll try to answer what I think you asked. Feel free to tell me if I'm wrong. The way the movie file is compressed, transmitted, and then uncompressed for viewing can be lossless or lossy depending on the codec used, just like MP3 vs FLAC. Most streaming services use a lossy codec to compress and stream videos to you over the internet to save money and bandwidth. It will likely be the same codec whether it's streaming to your smart TV or your Nvidia Shield, unless the streaming app has a feature that lets you prefer one codec over another when the client device supports it.
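If it helps, here's a toy Python sketch of the lossless-vs-lossy difference. It uses zlib as a stand-in for a lossless codec and simple bit-truncation as a fake "lossy" step — these are just illustrations, not real audio or video codecs:

```python
import zlib

# Stand-in for raw audio/video data (not a real media stream).
samples = bytes(range(256)) * 64

# Lossless, like FLAC: decompressing gives back the exact original bytes.
packed = zlib.compress(samples)
assert zlib.decompress(packed) == samples

# Lossy, like MP3 or a streaming video codec: throw away detail
# (here, the low 4 bits of each byte) to shrink the data. The
# discarded detail can never be recovered on playback.
quantized = bytes(b & 0b11110000 for b in samples)
assert quantized != samples
```

The point is just that once a service streams you a lossy-encoded file, the lost detail is gone no matter what device decodes it.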
Yes, that would be an issue all in itself, and my main concern. There's no additional processing?
I mean, do you have an HT setup at home, or an AVR that it is hooked up to? When you stream to your AVR, do any codecs pop up? Or what does it typically say?
I'm just curious as to why it supports all these different codecs. Does it work with a NAS or hard drive that has ripped movies on it? I see the USB ports — is that the only time those codecs actually come into play?
Because, summing up what you're saying, it basically means that if it's streamed it's going to be crap and there's no way to upgrade it. My BDP also has streaming services. I've never tried those, but I'm guessing the results will be the same if what you're saying is accurate, and I don't doubt it.
I guess the only way out of this pickle is getting that foolishly expensive movie server that rich people have, but I don't really want to spend like 2-3 grand. Forgot the name of that company. *Kaleidescape. $7,000 >_<
Edit: I think I will get the Zidoo Z1000 Pro, rent movies from Redbox on BD, rip them, and have it upscale, since from what I understand the audio output on 4K and BD releases is exactly the same.
Seems like the upscaling on that unit is better than the Shield's.