
The Case Against OLED

Ken Tajalli

Major Contributor
Forum Donor
Joined
Sep 8, 2021
Messages
2,081
Likes
1,888
Location
London UK
Has anybody mentioned that OLED is not all that bright?
If you play proper HDR on an OLED, it can clip.
I went for QLED!
My 75" QLED is brighter than any OLED I could afford.
I still paid just under £3K for it, and I could have had an OLED, even a better one, for £1K more, but I preferred the QLED.
It also has the lowest picture noise without filters that I could find.
 

maverickronin

Major Contributor
Forum Donor
Joined
Jul 19, 2018
Messages
2,527
Likes
3,311
Location
Midwest, USA
Maybe it's just because I've been away from HT for so long now, but short of using the display outdoors in broad daylight, I fail to see the point in making screens even brighter. They've already been too bright for more than 15 years now. Pretty much every screen I own is set to 10% brightness or less. I don't want a pan up shot to the noonday desert sun to look like the actual sun. That's just a few steps less silly than giving your speakers the dynamic range to more accurately reproduce gunshots and explosions.

What am I missing here?
 

Ken Tajalli

Major Contributor
Forum Donor
Joined
Sep 8, 2021
Messages
2,081
Likes
1,888
Location
London UK
What am I missing here?
HDR.
First came high definition (1080p), then came 4K (2160p); so far, the resolution just kept getting better.
Then came HDR and Dolby Vision, which enhanced the contrast ratio instead.
In short, to display HDR, one needs a display with an extraordinary contrast ratio, and hence high peak brightness.
Only the brightest OLEDs may be up to the task.
Barely!
 

maverickronin

Major Contributor
Forum Donor
Joined
Jul 19, 2018
Messages
2,527
Likes
3,311
Location
Midwest, USA
In short, to display HDR, one needs a display with an extraordinary contrast ratio, and hence high peak brightness.

Like I said. Why would I want that? Screens are already painfully bright for indoor use.

The eye has fairly limited instantaneous dynamic range. I don't want to have to squint and wait for my pupils to readjust every time there's a cut from night to day.

It seems like another pointless standard designed to fuel upgrade cycles.
 

Tim Link

Addicted to Fun and Learning
Forum Donor
Joined
Apr 10, 2020
Messages
776
Likes
661
Location
Eugene, OR
Like I said. Why would I want that? Screens are already painfully bright for indoor use.

The eye has fairly limited instantaneous dynamic range. I don't want to have to squint and wait for my pupils to readjust every time there's a cut from night to day.

It seems like another pointless standard designed to fuel upgrade cycles.
Careful there! With that attitude you're going to end up falling behind while enjoying yourself and saving money.
 

Tim Link

Addicted to Fun and Learning
Forum Donor
Joined
Apr 10, 2020
Messages
776
Likes
661
Location
Eugene, OR
Has anybody mentioned that OLED is not all that bright?
If you play proper HDR on an OLED, it can clip.
I went for QLED!
My 75" QLED is brighter than any OLED I could afford.
I still paid just under £3K for it, and I could have had an OLED, even a better one, for £1K more, but I preferred the QLED.
It also has the lowest picture noise without filters that I could find.
Yes, I have mentioned that. A lot of people are satisfied with their current brightness; I'm one who prefers more. I can see better when it's brighter, and I'd rather have dark scenes look brighter than reality than have bright scenes look a lot darker. I don't enjoy trying to see in the dark. That's why I sleep mostly at night.
 

Krusty09

Active Member
Forum Donor
Joined
Jul 3, 2018
Messages
264
Likes
173
Yes, I have mentioned that. A lot of people are satisfied with their current brightness; I'm one who prefers more. I can see better when it's brighter, and I'd rather have dark scenes look brighter than reality than have bright scenes look a lot darker. I don't enjoy trying to see in the dark. That's why I sleep mostly at night.
Hello.

Not sure what you mean by this statement:

(If you play proper HDR on an OLED, it can clip.
I went for QLED!)

What is proper HDR?
 

Deleted member 48726

Guest

Interesting to see how dim and bland plasma TVs seem now, yet they were SOTA in their time.
Haven't watched it. Are they calibrated? What do you watch it on?

Also, that's OLED. I don't think anyone claims plasma looks better than OLED. I would take my old plasma over most LED sets, though.
 

Tim Link

Addicted to Fun and Learning
Forum Donor
Joined
Apr 10, 2020
Messages
776
Likes
661
Location
Eugene, OR
Hello.

Not sure what you mean by this statement:

(If you play proper HDR on an OLED, it can clip.
I went for QLED!)

What is proper HDR?
I meant that I have mentioned that OLED is not that bright. As for the part about proper HDR clipping on an OLED, that shouldn't happen, because the TVs do their own tone mapping to re-map content that is too bright for the TV to display properly.

What bothers me about OLED is that they can end up dynamically re-tone-mapping brightness in different ways depending on how much of the screen is getting bright. If a lot of the screen is supposed to be bright, the OLED may only be able to reach less than 300 nits; if a smaller portion is supposed to be bright only briefly, it may allow a temporary highlight to reach 1200 nits. This varying brightness based on the characteristics of the scene is not part of the filmmaker's intent, and it makes it hard for the eye to adjust to the TV's capabilities, because they're a constantly moving target. To avoid that, an OLED needs to limit any point on the screen to about 220 nits or less, because that's what the entire screen can do sustained, and 220 nits isn't enough for true HDR no matter how black the blacks can get. That said, the dynamic tone-mapping trickery can make a lot of scenes get that HDR look. It's not as bad as I'm making out here, but some of us are bothered by it.
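To make that moving target concrete, here's a toy model of the window-size behavior. The 1200-nit and 220-nit endpoints are the figures above; the falloff shape is just illustrative and not any real panel's brightness-limiter curve:

Python:
# Illustrative model of the brightness limiting described above: peak
# brightness falls as more of the panel lights up. The 1200/220-nit
# endpoints come from this post; the power-law falloff is made up.
def oled_peak_nits(window_fraction: float,
                   small_window_peak: float = 1200.0,
                   full_screen_peak: float = 220.0) -> float:
    falloff = (1.0 - window_fraction) ** 4
    return full_screen_peak + (small_window_peak - full_screen_peak) * falloff

for window in (0.02, 0.10, 0.25, 0.50, 1.00):
    print(f"{window:4.0%} of screen lit -> ~{oled_peak_nits(window):4.0f} nits")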
Proper HDR is content whose brightness information goes beyond SDR. This involves encoding the video or pictures with 10- or 12-bit color rather than the 8-bit color that standard dynamic range uses. The extra bits allow brighter parts of the scene to get brighter without blowing out, so detail and color are maintained in both the bright and dark areas of the picture, while smooth color gradients are also maintained. You can push an SDR video into the HDR dynamic range on an HDR TV, but the penalty is often color banding: there aren't enough colors available to smoothly fill out the transitions.
Standard dynamic range, I think, is graded with the intent of 100 nits peak brightness and roughly a 300:1 to 600:1 contrast ratio. At that brightness and maximum contrast there should be no color banding visible to the human eye. We often find it more satisfying to push the brightness and contrast higher than that, but that can lead to color banding becoming obvious. HDR facilitates smooth gradients up to 4,000 nits, maybe 10,000 nits, and much higher contrast ratios. Some types of LCD display can do 6000:1 or more when viewed on axis; with mini LED local-dimming backlights they can reach much higher than that, although there are some blooming artifacts, the severity of which depends on how many dimming zones there are and how effectively they are utilized. OLEDs can reach theoretically infinite contrast when viewed in a perfectly dark room with no reflective surfaces, but our eyes have a lower limit of what they can detect, so anything below that is not meaningful. You have to go brighter, too, for the HDR to have visual impact.
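To put rough numbers on the banding point, here's a sketch using the published SMPTE ST 2084 (PQ) constants; the code levels I compare are only illustrative:

Python:
# Why 8-bit signals band in HDR: luminance steps between adjacent code
# values on the SMPTE ST 2084 (PQ) EOTF. The constants are the published
# values; the chosen comparison levels are only for illustration.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Map a normalized PQ code value (0..1) to luminance in nits."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def step_in_nits(bits: int, code: int) -> float:
    """Luminance jump between adjacent code values at a given bit depth."""
    levels = 2 ** bits - 1
    return pq_to_nits((code + 1) / levels) - pq_to_nits(code / levels)

for bits in (8, 10, 12):
    mid = (2 ** bits - 1) // 2  # mid-scale code value, roughly 92 nits on PQ
    print(f"{bits:>2}-bit: step around mid-scale ~ "
          f"{step_in_nits(bits, mid):.2f} nits")
# The 8-bit steps are several nits wide, which is where visible banding
# comes from; the 10- and 12-bit steps are far smaller.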
 

Krusty09

Active Member
Forum Donor
Joined
Jul 3, 2018
Messages
264
Likes
173
Hey.

So what you're saying is that if you feed an OLED like the HX310 with HDR content (and there are a few different kinds) that is properly exposed under 109, where the clip point is on the camera sensor for HDR, the OLED monitor is going to clip, i.e. color shift, i.e. change the tone mapping? I have never seen this happen with a properly exposed camera. I have seen cameras clip, and if there is a knee set below the clip point then yes, I have seen color shift, but that is the camera and not the monitor.
 

holdingpants01

Addicted to Fun and Learning
Forum Donor
Joined
Mar 18, 2023
Messages
669
Likes
1,040
It depends on the settings:
Sure, if you reduce the brightness and color saturation on the OLED you can come close with the lights turned off, but that was mentioned in the video I posted as well.
 

Ken Tajalli

Major Contributor
Forum Donor
Joined
Sep 8, 2021
Messages
2,081
Likes
1,888
Location
London UK
Like I said. Why would I want that? Screens are already painfully bright for indoor use.
And my wife is perfectly happy listening to music through her phone; she finds hi-fi speakers too big and loud!
The eye has fairly limited instantaneous dynamic range. I don't want to have to squint and wait for my pupils to readjust every time there's a cut from night to day.
It seems like another pointless standard designed to fuel upgrade cycles.
Considering that HDR material has less contrast-ratio capability than many films made in the '50s, and that home cinemas are a reality these days, your view on the subject is, at best, a personal one.
 

Ken Tajalli

Major Contributor
Forum Donor
Joined
Sep 8, 2021
Messages
2,081
Likes
1,888
Location
London UK
Not sure what you mean by this statement:
(If you play proper HDR on an OLED, it can clip.
I went for QLED!)
What is proper HDR?
"Proper" was a bad choice of word on my part. Not all HDR material is the same; some has a larger contrast ratio, because of the source material used.
I have a hobby of enhancing old movies using AI. My recent project has been Lawrence of Arabia, and I got the official Blu-ray HDR release to use as source material. Even the official release has a great contrast ratio compared to many other titles I have seen, and after my enhancements it can get even larger!
In bright scenes (Sahara sun), the whites can clip on my screen even though I have it at max brightness, while darker night scenes can clip to black.
I am sure the white clipping would be even worse on an OLED, if the screen tries to show the darker scenes correctly.
To draw a parallel with hi-fi, think of it as dynamic range: if your DAC has only 8-bit resolution and you try to play a symphony with 12 bits of dynamic range, then either the loud peaks clip or the quiet passages disappear.
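To put numbers on that analogy, the usual rule of thumb is about 6 dB of dynamic range per bit; a quick sketch (the 8/12/16-bit choices are just for illustration):

Python:
# Rule of thumb behind the analogy: each bit buys 20*log10(2), about
# 6.02 dB, of dynamic range.
import math

def dynamic_range_db(bits: int) -> float:
    return 20 * math.log10(2 ** bits)

for bits in (8, 12, 16):
    print(f"{bits:>2}-bit: ~{dynamic_range_db(bits):.0f} dB")
# An 8-bit DAC gives ~48 dB; 12-bit material spans ~72 dB, so either the
# loud peaks clip or the quietest ~24 dB vanish into the noise floor.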
 

Tim Link

Addicted to Fun and Learning
Forum Donor
Joined
Apr 10, 2020
Messages
776
Likes
661
Location
Eugene, OR
I think this is an unusual use of the word clip. Professional HDR monitors clip. They follow the EOTF curve ruthlessly and don't re-tone map anything, or so I'm told. That way the content creator can see very accurately what is happening; anything that goes beyond the monitor's abilities is super obvious. They count on the end user's TV to soften those transitions appropriately.

A consumer OLED or mini LED will use curves to flatten the image in various ways specifically so that it does not clip. I guess that's called soft clipping in audio, so you could call any tone mapping from a larger color space into a smaller one a sort of soft clipping. It can all be avoided if you set your color-grading white point to the limits of whatever TV you plan to use, and since you're making your own re-mastered content as a hobby, you can do that. Now that I'm thinking about it, color grading during mastering involves a sort of pre-tone-mapping; the consumer's TV then re-tone-maps. If you really want to be sure people are going to see what you intended, you need to set your max white point to whatever any viewer's TV can handle without having to process the image. With OLED, that's not going to be very bright.

I have a fairly bright mini LED. For it to be consistent, the max level is 650 nits, because that's how bright the whole screen can be sustained at once. That's still not bright enough to be true HDR. It can handle over 1000 nits for smaller areas of the screen, so if the content is created to only allow small highlights to reach that level, then my TV should be able to show material graded to 1000 nits correctly, and so can an OLED, so long as those highlights don't stay in one place on the screen for too long and larger areas of the screen don't get above a few hundred nits. Sustained whole-screen brightness is only about 220 nits for the latest OLED, about a third as bright as my mini LED.

It puts some handcuffs on the content creator, with the OLED handcuffs being tighter. If you know your viewer is going to be watching on an OLED, you might opt for a lot of darker scenes with small highlights, because OLEDs are great with those. If you know they're using a mini LED, you might have fewer black areas on the screen and focus more on brightly lit sunny outdoor scenes, because mini LEDs do a great job on those but reveal blooming and light bleed in very dark areas.
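For anyone curious what that soft clipping looks like in code, here's a minimal sketch; the knee position and rolloff shape are made up for illustration and aren't any manufacturer's actual curve (the 650-nit peak is my TV's figure from above):

Python:
# Illustrative soft-knee tone mapper: track the source one-to-one up to a
# knee, then roll off smoothly toward the display's peak rather than
# hard-clipping. Knee point and rolloff shape are invented for this sketch.
def tone_map(nits_in: float, display_peak: float = 650.0,
             knee: float = 0.75) -> float:
    knee_nits = knee * display_peak
    if nits_in <= knee_nits:
        return nits_in                      # pass through unchanged
    headroom = display_peak - knee_nits     # room left above the knee
    excess = nits_in - knee_nits
    # Compress everything above the knee into the remaining headroom,
    # approaching (but never reaching) the panel's peak.
    return knee_nits + headroom * (1 - 1 / (1 + excess / headroom))

for level in (100, 400, 650, 1000, 4000):
    print(f"{level:>5} nits in -> {tone_map(level):6.1f} nits out")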
 

Ken Tajalli

Major Contributor
Forum Donor
Joined
Sep 8, 2021
Messages
2,081
Likes
1,888
Location
London UK
I think this is an unusual use of the word clip. Professional HDR monitors clip. They follow the EOTF curve ruthlessly and don't re-tone map anything, or so I'm told.
[Attached chart: the monitor's measured luminance tracking against the PQ EOTF target]

This is what I mean. This monitor follows the EOTF pretty well until it reaches its maximum brightness, then it clips (for lack of a better word).
So anything coded for 400 cd/m² and above would show as a white patch; this monitor cannot display any detail variation in super-bright scenes.
Equally, because it is not an OLED, it cannot go to zero either, so it will always have some leakage below 1.5 cd/m² or so.
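In code terms, the clip is just a hard ceiling; a minimal sketch using my panel's approximate limits from the chart:

Python:
# Hard clipping as described above: the display tracks the target EOTF
# within its physical range, and everything outside collapses. The 400
# and 1.5 cd/m² figures are the approximate limits from my chart.
def displayed_nits(target_nits: float, panel_max: float = 400.0,
                   panel_min: float = 1.5) -> float:
    return min(max(target_nits, panel_min), panel_max)

# A 500-nit highlight and a 4000-nit sun render identically:
print(displayed_nits(500.0), displayed_nits(4000.0))  # 400.0 400.0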
 

Tim Link

Addicted to Fun and Learning
Forum Donor
Joined
Apr 10, 2020
Messages
776
Likes
661
Location
Eugene, OR
[Attached chart: the monitor's measured luminance tracking against the PQ EOTF target]

This is what I mean. This monitor follows the EOTF pretty well until it reaches its maximum brightness, then it clips (for lack of a better word).
So anything coded for 400 cd/m² and above would show as a white patch; this monitor cannot display any detail variation in super-bright scenes.
Equally, because it is not an OLED, it cannot go to zero either, so it will always have some leakage below 1.5 cd/m² or so.
Thanks. Yes, that's a great example. This TV follows the curve closely for a while and then softly curves over. A professional monitor would have a very sharp and abrupt kink where the brightness quits going up; this TV will still look natural as it blows out highlights. You can play with the settings so that it starts to knee over sooner, preserving detail and color in more of the highlights at the expense of the image looking duller and darker overall.

Some blowout of the highlights is always going to happen, even with a 10,000-nit TV that can do zero blacks. The cameras can only do so much, so that curving over has to start somewhere, unless you very carefully control the scene lighting and camera exposure so that nothing is ever under- or over-exposed, like they used to do in movies in the old days.

The scale on that chart is interesting, and shows how we perceive brightness: 10,000 nits doesn't appear 40 times brighter to us than 244 nits, but it does require 40 times the electrical power to produce! So a 10,000-nit TV would look great, and would need to be very energy efficient to be practical. I've read the reviews from a trade show where Sony demonstrated an experimental 10,000-nit-capable TV. Nobody complained that it was too bright; everybody seemed to agree it looked great and that a 10,000-nit peak was a very desirable thing.
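A rough sketch of that brightness-versus-power point, assuming the common cube-root approximation for perceived brightness (the exact exponent for large fields is debatable, so treat the numbers as ballpark):

Python:
# Luminance (and roughly power) scales linearly, but perceived brightness
# grows much more slowly. Stevens' power law with a ~1/3 exponent is a
# rough approximation; the exact exponent is an assumption here.
low, high = 244.0, 10000.0
print(f"Luminance ratio:            {high / low:.0f}x")
print(f"Perceived brightness ratio: ~{(high / low) ** (1 / 3):.1f}x")
# ~41x the light (and power) for only ~3.5x the apparent brightness.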
 

Ken Tajalli

Major Contributor
Forum Donor
Joined
Sep 8, 2021
Messages
2,081
Likes
1,888
Location
London UK
Thanks. Yes, that's a great example. This TV follows the curve closely for a while and then softly curves over. A professional monitor would have a very sharp and abrupt kink where the brightness quits going up; this TV will still look natural as it blows out highlights. You can play with the settings so that it starts to knee over sooner, preserving detail and color in more of the highlights at the expense of the image looking duller and darker overall. Some blowout of the highlights is always going to happen, even with a 10,000-nit TV that can do zero blacks. The cameras can only do so much, so that curving over has to start somewhere, unless you very carefully control the scene lighting and camera exposure so that nothing is ever under- or over-exposed, like they used to do in movies in the old days. The scale on that chart is interesting, and shows how we perceive brightness: 10,000 nits doesn't appear 40 times brighter to us than 244 nits, but it does require 40 times the electrical power to produce! So a 10,000-nit TV would look great, and would need to be very energy efficient to be practical. I've read the reviews from a trade show where Sony demonstrated an experimental 10,000-nit-capable TV. Nobody complained that it was too bright; everybody seemed to agree it looked great and that a 10,000-nit peak was a very desirable thing.
That graph is for my PC monitor, a 27" Dell. I chose it because it was cheap(ish) and has 4K HDR capability. Colours are pretty accurate too.
I do a bit of video work, so I needed something.
 

Ken Tajalli

Major Contributor
Forum Donor
Joined
Sep 8, 2021
Messages
2,081
Likes
1,888
Location
London UK
TCL inkjet printed RGB OLED by 2025?
And Samsung's QD-OLED, which is supposed to fix OLED's brightness efficiently using quantum-dot technology.
This could really be a game changer.

 