
HI-RES, 8K & MAGIC

NorthSky

Major Contributor
Expensive:
https://www.bestbuy.com/site/samsun...rt-8k-uhd-tv-with-hdr/6295150.p?skuId=6295150

More affordable but much smaller too:
https://www.dell.com/en-ca/shop/acc...&acd=1230881379347390&VEN3=114804042980503031

But we are now in the 8K zone.
One of my brothers emailed me the other day with a link (in French) regarding Sharp 8K TVs being the first ones to be released back east...Canada, in Quebec?
Let me see if I can relocate that email with that link ...

Here, in English, is a different source, but he was right; Sharp is leading the 8K way, all across America:
https://www.pcworld.com/article/298...l-the-worlds-first-8k-tv-from-next-month.html

They should be in stores soon now.

Extra: https://asia.nikkei.com/Business/Bu...nd-LG-follow-Sharp-with-early-leaps-to-8K-TVs
 

bennetng

Major Contributor
Lossless video is not practical with today's technology. It generally needs well over 10 Gbps of bitrate; no mechanical hard drive can handle that speed, to say nothing of optical media or network streaming.

https://en.wikipedia.org/wiki/HDMI#Refresh_frequency_limits_for_standard_video

For example, the 1080p/60 fps YouTube demo I posted is only 4.2 Mbps, and a typical 1080p video on a standard Blu-ray disc might be around 20 Mbps, not even close to the lowest bitrate in the table.
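To put some rough numbers behind that, here is a quick back-of-the-envelope sketch (my own figures, assuming 8-bit 4:2:0 video; a real lossless codec would roughly halve the raw rate, but it still stays far above anything a spinning disc or a stream can deliver):

```python
def raw_bitrate_mbps(width, height, fps, bits_per_pixel=12):
    """Uncompressed bitrate in Mbps; 8-bit 4:2:0 averages 12 bits per pixel."""
    return width * height * fps * bits_per_pixel / 1e6

for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "8K":    (7680, 4320)}.items():
    raw = raw_bitrate_mbps(w, h, 60)
    print(f"{name} @ 60 fps, uncompressed: {raw:,.0f} Mbps ({raw / 1000:.1f} Gbps)")

print("YouTube 1080p60 demo:  ~4.2 Mbps")
print("Typical 1080p Blu-ray: ~20 Mbps")
```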

In fact, video upscaling happens even when the video resolution is the same as the monitor resolution. Almost all videos are distributed in the 4:2:0 format, which means the color (chroma) channels are stored at half of the video's resolution before being sent to a lossy encoder. The chroma then needs to be upscaled during playback.
https://en.wikipedia.org/wiki/Chroma_subsampling
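A minimal sketch of what 4:2:0 means in practice (my own toy example using NumPy; real renderers such as madVR use far better resampling filters than the nearest-neighbour repeat shown here):

```python
import numpy as np

h, w = 1080, 1920
y  = np.zeros((h, w), dtype=np.uint8)            # luma at full resolution
cb = np.zeros((h // 2, w // 2), dtype=np.uint8)  # chroma planes at half
cr = np.zeros((h // 2, w // 2), dtype=np.uint8)  # resolution in each axis

# Crude chroma upscaling at playback time (nearest-neighbour repeat).
cb_up = cb.repeat(2, axis=0).repeat(2, axis=1)
cr_up = cr.repeat(2, axis=0).repeat(2, axis=1)

stored = y.size + cb.size + cr.size   # samples actually stored in 4:2:0
full   = 3 * y.size                   # samples a full 4:4:4 frame would need
print(f"4:2:0 keeps only {stored / full:.0%} of the luma+chroma samples,"
      " before the lossy encoder even starts")
```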

This screenshot shows the chroma upscaling quality of different video renderers; yes, it is zoomed in and in a single color to make the differences more noticeable.
[image: madvr_1196.jpg]


Also, some videos are encoded in interlaced formats, but all display panels today are progressive (non-interlaced). The deinterlacing process is also an upsampling process between alternate lines, and it may involve motion compensation between different frames if the video renderer is an advanced one.
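As a toy illustration of that "upsampling between alternate lines", here is a bare-bones "bob" deinterlacer (my own sketch; real renderers use edge-adaptive and motion-compensated methods instead of plain linear interpolation):

```python
import numpy as np

def bob_deinterlace_top(field):
    """Expand a top field (the even lines of a frame) to a full frame by
    linearly interpolating the missing odd lines."""
    h, w = field.shape
    frame = np.empty((h * 2, w), dtype=field.dtype)
    frame[0::2] = field                           # keep the lines we have
    frame[1:-1:2] = (field[:-1] + field[1:]) / 2  # interpolate the rest
    frame[-1] = field[-1]                         # bottom edge: just repeat
    return frame

field = np.random.rand(540, 1920).astype(np.float32)  # one 1080i field
print(bob_deinterlace_top(field).shape)  # -> (1080, 1920)
```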

None of this has anything to do with 8K resolution itself; it is all about the quality of the algorithms (encoding/decoding/rendering/upscaling, etc.).

Those video codec developers have a sense of humour: for example, the x264/x265 encoders have a preset called "placebo", which pushes the encoder's perceptual optimizations as far as they will go at the expense of extremely slow encoding, in order to make the video look (or measure, in metrics like PSNR and SSIM) as good as possible at a given bitrate.
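For what it's worth, the preset is just an encoder switch. A hypothetical invocation (assuming an ffmpeg build with libx264 on the PATH; the file names are placeholders) would look like this, and in practice "veryslow" already gets nearly all of the benefit:

```python
import subprocess

subprocess.run([
    "ffmpeg", "-i", "input.mkv",   # placeholder source file
    "-c:v", "libx264",
    "-preset", "placebo",          # maximum effort, extremely slow
    "-crf", "18",                  # quality target; lower = better/larger
    "output.mkv",
], check=True)
```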

For something like "VSR", visit the Doom9 forum; there are some experts and developers there.
http://forum.doom9.org/
 

NorthSky

Major Contributor
Yes, for real 8K streaming, high-speed Internet is required. Right now it is preferable to have 30 Mbps for 4K, with a minimum of 25 Mbps (Netflix 4K). Amazon is cheap, very cheap.

I don't know what the minimum requirement would be for 8K... twice that, 60 Mbps, or more?
https://www.lightreading.com/video/...bandwidth-for-4k-here-comes-8k!/d/d-id/737330
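A naive scaling sketch (my own arithmetic, not from the article): if bitrate scaled linearly with pixel count, 8K would need about four times the 4K figure, although in practice bitrate does not scale linearly with resolution and newer codecs (HEVC, AV1) are more efficient, which pulls the real number down:

```python
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
netflix_4k_mbps = 25

ratio = pixels_8k / pixels_4k
print(f"8K has {ratio:.0f}x the pixels of 4K")
print(f"Naive linear scaling of Netflix's 25 Mbps: ~{netflix_4k_mbps * ratio:.0f} Mbps for 8K")
```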

On physical 4K Blu-ray discs, some films use three layers (BD-100), and the bitrate peaks above 100 Mbps on some of them, like this one, among others:

[image: Billy-Lynns-Long-Halftime-Walk-4K-Ultra-HD.jpg]


The only one in the world encoded at 60 fps.
_____

If they ever make 8K Blu-ray discs and players, would the discs need five or six layers?
Rainbow discs?
 

XpanD

Active Member
Forum Donor
A gamer's perspective.

PC gamers are stuck with relatively low resolutions like 1080p or 1440p if we want to run games at high or ultra settings. 2160p is not realistic; it is sometimes possible with two high-end GPUs, if the game actually supports SLI. I would not bother with upsampling. I think the consoles often upsample to 2160p and then claim "4K gaming," but the game is actually rendered at a lower resolution.

PC gamers are obsessed with refresh rates. 60 Hz looks plenty smooth to me, but many people are buying 144 Hz or 240 Hz monitors. If you go by the minimum or consistent frame rate, these are not realistic either. The consoles run games at terrible frame rates like 20-30 fps in order to make up for the outdated hardware. These frame rates look terrible for gaming, but they are OK for recorded video, I guess. Still, if you have that awful fucking shaky-cam video, 24 fps is just not enough, and it looks like shit. You can see a lot of stuttering in recorded video if the camera moves too fast. I think 48 fps in recorded video would be a great thing to have.

Personally, I am perfectly happy with 1080p/60 for gaming.

Most movies I only want to watch once or twice and then never see them again, so I do not even own a blu-ray drive.

Not sure if I agree with your comments on PC gaming. Sure, there are games that won't get 60fps no matter your settings, but I've been getting 60fps quite happily on a lot of games (including more modern ones) at 4k, with a single 1070. In a good few of those I can get a fair bit higher, too -- a cheaper 1060 would probably do alright as well, there. Then again, I don't play a lot of shooters (or other very heavy 3D games), so maybe things would be different there.

I'm looking forward to more displays supporting 8k/60Hz proper. Source material wouldn't be an issue, as I'd just have my computer output the high res with whatever it's doing. Would love to have that added workspace for programming and the like! (although I'd probably end up at a monitor with an in-between resolution, 8k does get to the point where it's very hard to squeeze in without scaling)
 

Grave

Senior Member
Not sure if I agree with your comments on PC gaming. Sure, there are games that won't get 60fps no matter your settings, but I've been getting 60fps quite happily on a lot of games (including more modern ones) at 4k, with a single 1070. In a good few of those I can get a fair bit higher, too -- a cheaper 1060 would probably do alright as well, there. Then again, I don't play a lot of shooters (or other very heavy 3D games), so maybe things would be different there.

I'm looking forward to more displays supporting 8k/60Hz proper. Source material wouldn't be an issue, as I'd just have my computer output the high res with whatever it's doing. Would love to have that added workspace for programming and the like! (although I'd probably end up at a monitor with an in-between resolution, 8k does get to the point where it's very hard to squeeze in without scaling)

This is not possible. I am talking about minimum frame rates, not averages.
 

XpanD

Active Member
Forum Donor
Wanting high minimums and very high quality settings is a particularly torturous combination, though. I wouldn't necessarily call that "being stuck on 1080p/1440p", considering a lot of stuff (or at least the stuff I've tried) does seem to run fine at 4k with the settings dropped a little and still looks great. Not sure what the exact minimums would look like there (Steam's overlay only shows averages, I'm guessing?), but I can get most games feeling smooth with only some slight tweaking. Might be down purely to opinion at that point, though -- I can imagine you needing extremely fast hardware if you do want to hit those specific targets at 4k.
 

garbulky

Major Contributor
On physical 4K Blu-ray discs, some films use three layers (BD-100), and the bitrate peaks above 100 Mbps on some of them...
Damn, 100 Mbps. I just realized you have to have a pretty fast disc spinner to manage that.
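Rough numbers, assuming the usual 1x Blu-ray read speed of 36 Mbit/s:

```python
peak_mbps = 100    # observed peak video bitrate
bd_1x_mbps = 36    # 1x Blu-ray read speed

print(f"{peak_mbps} Mbit/s is {peak_mbps / 8:.1f} MB/s,")
print(f"about {peak_mbps / bd_1x_mbps:.1f}x Blu-ray speed, so UHD players have to spin well above 1x.")
```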
 
OP
svart-hvitt

Major Contributor
Agree.

This chart shows that 1080p is adequate for most domestic situations:

View attachment 16745

What is the method and content (test procedure) behind that table? If the test content behind the table isn’t the same as viewing TV material, its relevance may be low.

My point is, a TV picture isn’t the same as a drawing or letters on a board. To make a TV picture, algorithms are employed for processing.

What I wondered about, is whether higher resolution and a faster chip plus a pinch of AI magic could make for a more convincing TV experience in practice. Could the aforementioned ingredients lead to less visible artifacts?
 

NorthSky

Major Contributor
Damn 100 mb/s. I just realized you have to have a pretty fast disc spinner to manage that.

I don't know if we will see 8K disc spinners in our lifetime.
Everything, it seems, gets more and more compressed to save space and data, à la MQA.
It's also happening with video "packing": smarter algorithms that pack more into less data without compromising picture quality.

Japan is the place to get our dose from.
 

maverickronin

Major Contributor
Forum Donor
What is the method and content (test procedure) behind that table? If the test content behind the table isn’t the same as viewing TV material, its relevance may be low.

My point is, a TV picture isn’t the same as a drawing or letters on a board. To make a TV picture, algorithms are employed for processing.

What I wondered about, is whether higher resolution and a faster chip plus a pinch of AI magic could make for a more convincing TV experience in practice. Could the aforementioned ingredients lead to less visible artifacts?

It's based on how small a pixel can get before you can't see it any more.

Video artifacts are pretty much all due to encoders, renderers, and frame rate conversion issues.
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
What is the method and content (test procedure) behind that table? If the test content behind the table isn’t the same as viewing TV material, its relevance may be low.
The analysis is based on actual acuity/resolution of our eyes. Here is a post I wrote back in 2010 on another forum on the topic.
------------------

I am sure everyone has seen the various calculators out there for viewing distance relative to the resolution of the display. Those computations are based on the eye having a resolving limit of one "arc minute." Put in plain English, it is said that the eye resolves detail down to 1/60th of a degree.

The above metric came from observing the density of the cones at the fovea. One degree of the visual field projects onto an area about 288 micrometers across, within which there are 120 cones. Assuming you want to resolve a dark line next to a white line, the actual resolution is said to be 120/2 = 60 cycles per degree.
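A worked version of that one-arc-minute rule (my own sketch; the 65-inch screen size is just an example, and as the rest of the post explains, this rule already overstates what the complete optical system resolves):

```python
import math

def pixel_limit_distance_m(diagonal_in, horizontal_px, arcmin=1.0):
    """Distance at which one pixel of a 16:9 panel subtends the given angle."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # panel width
    pixel_m = width_m / horizontal_px
    return pixel_m / math.tan(math.radians(arcmin / 60))

for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f'65" {name}: pixels shrink to 1 arc-minute at ~{pixel_limit_distance_m(65, px):.1f} m')
```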

Unfortunately, the above computation ignores the rest of the optical system, including the lens. As a result, the math overstates the actual acuity of the eye. A better method is to look at experimental data from human subjects. Here, let's steal a slide from Professor Girod's Stanford University class:
[image: MTF_human_eye.gif]


Let's interpret the chart. The vertical axis is the same as any MTF chart in that it shows how much contrast the eye resolves relative to perfection (1.0). The horizontal axis shows increasing "frequency" of a sine wave measured in CPD or cycles per degree. (The different graphs show varying pupil size.)

As you see, the eye response is pretty much shot by the time you get to around 50 cycles per degree with MTF of around 0.2. A black and white patch there would be a smoothed out shade of gray.

The actual response of the eye is even more complex as the instrument acts more as a contrast detector. Its response in that respect is highly optimized around lower frequencies:

[image: csf.gif]


By now you are wondering what this has to do with the topic at hand. Well, if you look at these curves, you more or less see a low-pass filter. The contrast detected by the eye keeps getting reduced as the detail gets finer and finer (i.e. has higher frequency). Since CPD increases with distance (more detail is packed into the same angular space), the act of moving back acts like the coefficient of a low-pass filter!

Armed with the above knowledge, we can improve our simulation fidelity of how you view pixel edges (which have the highest frequency in the image). Let's take Poynton's super blocky image and apply progressive amounts of low pass filtering to it to simulate longer viewing distances and see what happens:

[image: 807565816_NGLRD-O.png]


Kind of revealing, no? As you sit further and further back, the pixel edges start to disappear, giving you a more analog-like feel. Resolution is also lost, of course, as that is what a low-pass filter does.
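If anyone wants to reproduce the demonstration, a minimal sketch with Pillow looks like this ("blocky.png" is a placeholder for any blocky, nearest-neighbour-upscaled test image; larger blur radii stand in for longer viewing distances):

```python
from PIL import Image, ImageFilter

img = Image.open("blocky.png").convert("L")   # placeholder test image

for radius in (0, 1, 2, 4, 8):
    # Larger radius ~ sitting further back: the pixel edges (highest
    # frequencies) smear out first, then finer detail is lost too.
    img.filter(ImageFilter.GaussianBlur(radius)).save(f"distance_sim_r{radius}.png")
```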
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
What I wondered about, is whether higher resolution and a faster chip plus a pinch of AI magic could make for a more convincing TV experience in practice. Could the aforementioned ingredients lead to less visible artifacts?
Compression artifacts are the major visible problem now especially since we have gone to online streaming with much more restrictive bandwidth caps.

Pixel resolution only matters if the image is static and you sit very close.
 
OP
svart-hvitt

Major Contributor
Compression artifacts are the major visible problem now especially since we have gone to online streaming with much more restrictive bandwidth caps.

Pixel resolution only matters if the image is static and you sit very close.

I wonder then, will «smart» algorithms have more room for making their magic if they have more pixels to play with?
 

Krusty09

Active Member
Forum Donor
Hello.

I think this needs to be put into two different categories: live, and posted single-camera.

For live, especially sports, my guess is that most of us on this board, at our advanced age, will never see anything above 4K UHD HDR.

I am saying this based on first-hand experience and not just spewing stuff.

I am at the forefront of live sports television. With that said, there is so much I could go into that it hurts my head, but at the end of the day, the best you can hope for, now and for the near future, is 1080p HDR.

The Olympics will be done this way in this country, and in 4K HDR outside this country.

For people who ask "why not 4K here?", I can tell you that in a side-by-side you cannot tell the difference. This is live, not posted, and the two are not the same. I could ramble on, but at the end of the day, the way it is now is the way it will be for the foreseeable future.

Cheers.
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
I wonder then, will «smart» algorithms have more room for making their magic if they have more pixels to play with?
The smarts actually use very low-resolution proxies. The image is downsampled and then compared to a library of known image types to decide what to do with it. The extra pixels don't help with that, but they do greatly increase the computational power needed to generate them.
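A toy illustration of that idea (my own sketch, not any particular TV's algorithm): the decision is made on a tiny downsampled proxy, so adding more input pixels changes nothing about the decision, it only makes the output stage more expensive:

```python
import numpy as np

def to_proxy(patch, size=8):
    """Average-pool a square patch down to a size x size proxy."""
    h, w = patch.shape
    return patch.reshape(size, h // size, size, w // size).mean(axis=(1, 3))

rng = np.random.default_rng(0)
library = {name: rng.random((8, 8)) for name in ("grass", "sky", "text", "skin")}

patch = rng.random((64, 64))                 # stand-in for a block of the frame
proxy = to_proxy(patch)
best = min(library, key=lambda k: np.sum((library[k] - proxy) ** 2))
print("Closest known image type:", best)
```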
 
OP
svart-hvitt

Major Contributor
The smarts actually use very low-resolution proxies. The image is downsampled and then compared to a library of known image types to decide what to do with it. The extra pixels don't help with that, but they do greatly increase the computational power needed to generate them.

Thanks!

I realize I understand even less about video than audio...

My only practical experience of some importance (to me) till now has been about OLED’s «blackness», which introduces a somewhat different perception. And this experience has nothing to do with resolution or «intelligent» algos.
 

Wombat

Master Contributor
What is the method and content (test procedure) behind that table? If the test content behind the table isn’t the same as viewing TV material, its relevance may be low.

My point is, a TV picture isn’t the same as a drawing or letters on a board. To make a TV picture, algorithms are employed for processing.

What I wondered about, is whether higher resolution and a faster chip plus a pinch of AI magic could make for a more convincing TV experience in practice. Could the aforementioned ingredients lead to less visible artifacts?


Science behind the chart.

Think of 1080p video as similar to RBCD in terms of consumer adequacy.

However, you probably won't find modern tech features in a 1080p TV.
 

Wombat

Master Contributor
For live, especially sports, my guess is that most of us on this board, at our advanced age, will never see anything above 4K UHD HDR. ... At the end of the day, the best you can hope for, now and for the near future, is 1080p HDR.

Agree.
 

NorthSky

Major Contributor
I am at the forefront of live sports television. ... At the end of the day, the best you can hope for, now and for the near future, is 1080p HDR. ... The way it is now is the way it will be for the foreseeable future.

Do you think 4K UHD HDR in front-projection rooms like Dolby theaters is a good advancement in the reproduction of cinema films?
Do you think there is an advantage in shooting films with advanced 4K, 6K, and 8K cameras and projecting them at their respective resolutions, with HDR, both in theaters and at home?
 