There will be 8K TVs from every manufacturer within a couple of years. 4K/UHD has already been commoditized, so they have no choice but to jump on the "8K" bandwagon.
A gamer's perspective.
PC gamers are stuck with relatively low resolutions like 1080p or 1440p if we want to run games at high or ultra settings. 2160p is not realistic; it is sometimes possible with two high-end GPUs, if the game actually supports SLI. I would not bother with upscaling. I think the consoles often upscale to 2160p and then claim they have "4K gaming," but the game is actually rendered at a lower resolution.
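For scale, 2160p is four times the pixel work of 1080p. A quick back-of-envelope (raw pixel counts only; actual GPU load is not perfectly linear in pixels, so treat these as rough ratios):

```python
# Back-of-envelope: raw pixel throughput at common gaming resolutions.
# GPU cost is not perfectly linear in pixel count; these are rough ratios.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixels the GPU must render each second at a given resolution and frame rate."""
    return width * height * fps

base = pixels_per_second(1920, 1080, 60)
for name, (w, h) in RESOLUTIONS.items():
    rate = pixels_per_second(w, h, 60)
    print(f"{name} @ 60 fps: {rate / 1e6:7.1f} Mpix/s ({rate / base:.1f}x 1080p60)")
```

That prints 1.0x, 1.8x, and 4.0x the 1080p60 pixel rate, which is why a card that is comfortable at 1080p falls over at 2160p.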
PC gamers are obsessed with refresh rates. 60 Hz looks plenty smooth to me, but many people are buying 144 Hz or 240 Hz monitors. If you go by the minimum or consistent frame rate, these are not realistic either. The consoles run games at terrible frame rates like 20-30 fps in order to make up for their outdated hardware. These frame rates look terrible for gaming, but I guess they are OK for recorded video. Still, with that awful fucking shaky-cam footage, 24 fps is just not enough, and it looks like shit. You can see a lot of stuttering in recorded video if the camera moves too fast. I think 48 fps in recorded video would be a great thing to have, as the sketch below suggests.
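To put a rough number on that stutter: during a pan, the whole image jumps a fixed distance between frames, and doubling the frame rate halves the jump. The pan speed and field of view below are numbers I made up for illustration, not anything measured:

```python
# Rough judder arithmetic: how far the image jumps between frames during a pan.
# The pan speed and field of view are made-up illustrative numbers.

def jump_per_frame(pan_deg_per_s: float, fps: float,
                   frame_width_px: int = 3840, hfov_deg: float = 60.0) -> float:
    """On-screen displacement per frame, in pixels, for a horizontal camera pan."""
    px_per_degree = frame_width_px / hfov_deg
    return pan_deg_per_s / fps * px_per_degree

for fps in (24, 48, 60):
    print(f"{fps:2d} fps: {jump_per_frame(30.0, fps):5.1f} px jump between frames")
```

With those assumptions, a 30-degree-per-second pan means the image lurches 80 pixels per frame at 24 fps, versus 40 pixels at 48 fps.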
Personally, I am perfectly happy with 1080p/60 for gaming.
Most movies I only want to watch once or twice and then never see again, so I do not even own a Blu-ray drive.
Not sure if I agree with your comments on PC gaming. Sure, there are games that won't hit 60 fps no matter your settings, but I've been getting 60 fps quite happily in a lot of games (including more modern ones) at 4K with a single 1070. In a good few of those I can go a fair bit higher, too; a cheaper 1060 would probably do alright as well. Then again, I don't play a lot of shooters (or other very heavy 3D games), so maybe things would be different there.
I'm looking forward to more displays supporting 8K/60 Hz properly. Source material wouldn't be an issue, as I'd just have my computer output the high resolution with whatever it's doing. I would love to have that added workspace for programming and the like! (Although I'd probably end up with a monitor at an in-between resolution; 8K gets to the point where it's very hard to use without scaling.)
Damn 100 Mbps. I just realized you have to have a pretty fast disc spinner to manage that.

Yes, for real 8K streaming, high-speed Internet is required. Right now it is preferable to have 30 Mbps for 4K, with a minimum of 25 Mbps (Netflix 4K). Amazon is cheap, very cheap.
I don't know what the minimum requirement would be for 8K... twice that? 60 Mbps? More?
• https://www.lightreading.com/video/...bandwidth-for-4k-here-comes-8k!/d/d-id/737330
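For what it's worth, here is a crude way to bracket the guess, scaling from the 25 Mbps Netflix 4K figure above. The 0.75 exponent is just my assumption about codec efficiency gains, not a published number:

```python
# Guessing at an 8K streaming floor. 8K has 4x the pixels of 4K, but codecs do
# not need 4x the bits; an exponent below 1.0 is a common rule of thumb. The
# 0.75 here is an illustrative assumption, not a published figure.

def scaled_bitrate(base_mbps: float, pixel_ratio: float, exponent: float) -> float:
    """Scale a known-good bitrate by a pixel-count ratio raised to an exponent."""
    return base_mbps * pixel_ratio ** exponent

base_4k_mbps = 25.0                             # Netflix 4K minimum, per above
pixel_ratio = (7680 * 4320) / (3840 * 2160)     # = 4.0

print(f"naive linear scaling: {scaled_bitrate(base_4k_mbps, pixel_ratio, 1.0):.0f} Mbps")
print(f"sub-linear (^0.75):   {scaled_bitrate(base_4k_mbps, pixel_ratio, 0.75):.0f} Mbps")
```

That brackets it between roughly 70 and 100 Mbps, so "twice, 60 Mbps" may be on the optimistic side.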
On physical 4K Blu-ray discs, some films use three layers (BD-100), and the bit rate peaks over 100 Mbps on some of them, like this one, among others:
The only one in the world encoded at 60 fps.
_____
If they ever make 8K Blu-ray discs and disc players, would the discs need five or six layers?
Rainbow discs?
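A quick sanity check on that layer count, assuming a layer stays around the ~33.4 GB of a BD-100 layer. The 8K average bitrates and the two-hour runtime are my guesses:

```python
# Back-of-envelope on 8K disc layers, reusing the ~33.4 GB per layer of a
# BD-100. The 8K average bitrates and the two-hour runtime are assumptions.

import math

LAYER_GB = 33.4   # approximate capacity of one UHD Blu-ray layer

def layers_needed(avg_mbps: float, hours: float = 2.0) -> tuple[float, int]:
    """Total size in GB for a film at this average bitrate, and layers to hold it."""
    total_gb = avg_mbps * hours * 3600 / 8 / 1000   # Mbit/s over the runtime -> GB
    return total_gb, math.ceil(total_gb / LAYER_GB)

for avg_mbps in (120, 150, 200):   # hypothetical 8K average bitrates
    gb, layers = layers_needed(avg_mbps)
    print(f"{avg_mbps} Mbps average: {gb:5.0f} GB -> {layers} layers")
```

Under those assumptions a two-hour film lands at roughly 108-180 GB, i.e. four to six layers, so five or six is a plausible guess.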
What is the method and content (test procedure) behind that table? If the test content behind the table isn’t the same as viewing TV material, its relevance may be low.
My point is, a TV picture isn’t the same as a drawing or letters on a board. To make a TV picture, algorithms are employed for processing.
What I wondered about is whether higher resolution and a faster chip, plus a pinch of AI magic, could make for a more convincing TV experience in practice. Could those ingredients lead to less visible artifacts?
The analysis is based on actual acuity/resolution of our eyes. Here is a post I wrote back in 2010 on another forum on the topic.
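Here is a minimal sketch of that kind of acuity calculation, using the usual one-arcminute figure for 20/20 vision and a 65-inch 16:9 panel as an example (my illustrative numbers, not the exact table):

```python
# The usual acuity arithmetic: 20/20 vision resolves roughly one arcminute, so
# a pixel subtending less than that is invisible. This finds the farthest
# distance at which pixels on a 16:9 screen can still be resolved.

import math

ARCMIN_RAD = math.radians(1 / 60)   # one arcminute in radians

def max_resolvable_distance_ft(diagonal_in: float, h_pixels: int) -> float:
    """Distance beyond which individual pixels blend together, in feet."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 panel width
    pixel_pitch_in = width_in / h_pixels
    return pixel_pitch_in / math.tan(ARCMIN_RAD) / 12

for name, h_pixels in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
    d = max_resolvable_distance_ft(65.0, h_pixels)
    print(f'65" {name}: pixels blend together beyond ~{d:.1f} ft')
```

On a 65-inch set that works out to roughly 8.5 ft for 1080p, 4.2 ft for 4K, and 2.1 ft for 8K; past those distances the extra pixels are invisible.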
Compression artifacts are the major visible problem now, especially since we have gone to online streaming with much more restrictive bandwidth caps.
Pixel resolution only matters if the image is static and you sit very close.
I wonder then, will «smart» algorithms have more room for making their magic if they have more pixels to play with?

The smarts actually use very low resolution proxies. The image is downsampled and then compared to a library of known image types to decide what to do with it. The extra pixels don't help with that, but they greatly increase the computational power needed to generate them.
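As a toy illustration of that proxy idea (the library, the sizes, and the matching rule here are all hypothetical): shrink the frame to a tiny proxy, compare it against known low-res patterns, and let the nearest match drive the processing decision.

```python
# Toy version of the "low resolution proxy" idea described above: shrink the
# frame, compare it to a small library of known patterns, and pick the closest
# match to decide how to process it. Everything here is hypothetical.

import numpy as np

def downsample(frame: np.ndarray, size: int = 8) -> np.ndarray:
    """Crude box downsample to a size x size proxy (assumes divisible dims)."""
    h, w = frame.shape
    return frame.reshape(size, h // size, size, w // size).mean(axis=(1, 3))

def classify(frame: np.ndarray, library: dict[str, np.ndarray]) -> str:
    """Return the library entry whose proxy is nearest to this frame's proxy."""
    proxy = downsample(frame)
    return min(library, key=lambda k: np.sum((library[k] - proxy) ** 2))

# Hypothetical proxy library: flat sky vs. high-detail texture.
library = {
    "sky":     np.zeros((8, 8)),
    "texture": np.random.default_rng(0).random((8, 8)),
}
frame = np.zeros((2160, 3840))    # a flat 4K frame
print(classify(frame, library))   # -> "sky"; note the proxy is only 8x8
```

The point is that the decision is made on an 8x8 proxy no matter whether the source frame is 4K or 8K, so the extra input pixels buy nothing for the "smart" part.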
Hello.
I think this needs to be put into two different categories: live, and posted single-camera.
For live, especially sports, my guess is that most of us on this board, at our advanced age, will never see anything above 4K UHD HDR.
I am saying this based on first-hand experience, not just spewing stuff.
I am at the forefront of live sports television. With that being said, there is so much I could go into that it hurts my head, but at the end of the day the best you can hope for now, and for the near future, is 1080p HDR.
The Olympics will be done this way in this country, and in 4K HDR outside this country.
For people who ask why not 4K here: I can tell you, do a side-by-side and you cannot tell the difference. This is live, not posted, and the two are not the same. I could ramble on, but at the end of the day, the way it is now is the way it will be for the foreseeable future.
Cheers.