
I don't like how HDR is handled

ThatM1key

Major Contributor
Joined
Mar 27, 2020
Messages
1,140
Likes
985
Location
USA
When 4K came along, it was amazing. Then HDR came, and it was a nice treat with 4K. I sometimes wish we didn't get HDR or at least wish it was better implemented.

Whether it's a 4K screen or 4K media, HDR is usually shoved along with it. Most of the 4K TVs in my household cannot handle HDR due to a lack of brightness. Even in "Filmmaker" modes, colors are washed out and the picture is too dark. These same TVs look pretty good with SDR. You can convert HDR to SDR, but it's a coin flip whether your physical or software player can even convert it right. Heck, some players don't let you turn off HDR, and most TVs don't offer the option either; ironically, they usually have SDR-to-HDR modes.
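(For the curious: here's roughly what a correct conversion has to do, as a minimal Python/NumPy sketch. The PQ constants are from SMPTE ST 2084, which HDR10 uses, but the roll-off curve, nit targets, and function names are just illustrative, and a real player would also have to convert the BT.2020 gamut down to BT.709, which is skipped here.)

```python
# Minimal HDR10 (PQ) -> SDR tone-map sketch. PQ constants per SMPTE
# ST 2084; the extended-Reinhard roll-off is illustrative, not what any
# particular player ships. Gamut conversion (BT.2020 -> BT.709) omitted.
import numpy as np

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(v):
    """Decode a normalized PQ signal (0..1) to absolute luminance in nits."""
    vp = np.power(v, 1 / M2)
    return 10000 * np.power(np.maximum(vp - C1, 0) / (C2 - C3 * vp), 1 / M1)

def nits_to_sdr(nits, sdr_white=100.0, hdr_peak=1000.0, gamma=2.4):
    """Roll luminance off into a gamma-encoded SDR signal (extended Reinhard)."""
    x = nits / sdr_white            # luminance in units of SDR reference white
    w = hdr_peak / sdr_white        # input level that should land at signal 1.0
    y = x * (1 + x / (w * w)) / (1 + x)
    return np.power(np.clip(y, 0, 1), 1 / gamma)

codes = np.array([0.25, 0.50, 0.75])  # sample PQ code values
nits = pq_to_nits(codes)              # roughly 5, 92, and 980 nits
print(nits_to_sdr(nits))              # ~0.28, ~0.74, ~1.0
```

Every knob in there (where SDR white sits, the shape of the roll-off, what peak to assume) is a choice each player makes differently, which is exactly why the results are a coin flip.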

I have a 4K monitor that can do HDR, and luckily colors remain the same when you use it, at least with fullscreen HDR media. HDR support on Windows is not that great in general, though. My personal TV can display HDR well, even Dolby Vision. I bought it for its out-of-the-box, no-calibration-necessary performance, although it would perform better with a calibration, like ELAC speakers.

Generally with TVs, it literally is quality or quantity. It's easy to buy a big 4K TV dirt cheap that has great SDR performance but sucks at HDR. At the other end of the scale, you can find a quality TV that reproduces HDR well, but it requires more research, is usually smaller, and costs the same as those big 4K TVs. Personally, I would still choose the quality TV, since I would get great HDR performance and never have to worry about flipping HDR off or relying on flaky HDR-to-SDR solutions. The last time I saw TVs accepting signals they can't natively reproduce was when 2000s 720p TVs accepted 1080p signals and downscaled them.
 
IMO, HDR is a really great advancement, but there are definitely issues with mapping 1K-nit mastered content to displays not capable of producing that much light. Some TVs do it better than others, but the true solution is displays that actually can. The 2023/24 generation of mini-LED TVs is definitely there: the top of the line can reproduce 1K-nit HDR, and some can even handle the 4K-nit masters that are few but out there.
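To make the mapping problem concrete, here's a toy sketch (Python/NumPy; the knee fraction and curve shape are made-up illustrative choices, not any manufacturer's actual algorithm) of what a display has to do with a 1,000-nit master: track it faithfully up to a knee, then squeeze everything above the knee into whatever headroom the panel has left.

```python
# Toy display-side tone map: pass the master through 1:1 below a knee,
# then roll remaining highlights off asymptotically toward the panel's
# peak. The knee fraction and roll-off shape are illustrative only.
import numpy as np

def display_tonemap(master_nits, panel_peak, knee_frac=0.7):
    knee = knee_frac * panel_peak
    headroom = panel_peak - knee
    excess = np.maximum(master_nits - knee, 0.0)
    return np.minimum(master_nits, knee) + headroom * excess / (excess + headroom)

scene = np.array([50.0, 200.0, 600.0, 1000.0])  # nits in a 1000-nit master
for peak in (300, 600, 1000):                   # dim LCD, mid-range, flagship
    print(peak, display_tonemap(scene, peak).round(0))
```

On the 300-nit panel, the 600-nit and 1,000-nit highlights land around 283 and 291 nits, nearly indistinguishable, which is exactly the washed-out HDR described above; the 1,000-nit panel only has to touch the very top.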

Not sure where OLED stands here, as I gave up on OLED due to size/price limitations. TCL's 2024 C855 (EU model) with 2k nits would be a great display to start with. The X955 (EU model) with 5k nits would be the flagship, priced accordingly. However, I don't recall a top-of-the-line 98" TV like the 98X955 ever being priced below EUR 5k. Not cheap, but it will get cheaper over time.

Another thing to consider is that Hollywood is remastering its content to 4K nits, slowly but surely, so in the longer run that should be considered as well. Many of the 4K 1K-nit remasters did not even come with Atmos. The endgame is to offer these in 4K at 4K nits and with Atmos, so we can buy them again in yet another incarnation of the same thing. The Hollywood business model.
 
Half of the 4K movies were processed by effects shops that couldn't do 4K intermediates, so they worked at 2K and upscaled the result back to 4K when they were done. HDMI ARC that breaks down too often... HDR10(+) vs. Dolby Vision vs. hybrid log-gamma. A total debacle everywhere you look with TVs.

It's hardly surprising that half the screens out there don't implement HDR very well, but it does look awesome when it works properly.
 
I find that the site Rtings.com has a wealth of information on how to set up your TV for various content and viewing situations, especially if the set is less than five years old. Find your specific model, or one with a close model number, and read the review, which will tell you what settings to use for each situation.
 
I followed the display industry technically for years. Your planned obsolescence theory is likely. HDR is part of an end-to-end color and dynamic range management pipeline. As you point out, studios can tweak it at very low cost and re-release, as long as they have the bandwidth to get it to the viewer.

The display industry has test standards and equipment analogous to Audio Precision, so at least I'm not aware of snake-oil display makers! Rtings, as mentioned, would be the consumer display test database. Their sid.org is our aes.org.

The display industry is massive, and it is always advancing. The price of entry for a manufacturer is high: you need a fab with very large substrates, or you have to be TI owning the DMD market. The industry has theatrical, gaming, and medical segments. There will always be a newer and more expensive technology. It is a brutally competitive market; look at Sharp, which tried to lead on technology. (Many of the Sharp Labs people went to Dolby.)

Manufacturers make their R&D back on the high end products and the volume is in the moderate to low end.

You could look at laser projection, ambient-light-rejecting projection screens, and controlling your viewing-room lighting. Another thing just coming in is a little swing-down calibrator for auto-calibration. At the ultra high end, very heavy and power-hungry, micro-LED is making great advances in pixel pitch (the distance between pixels). The best theatrical systems are laser projection. Down the road, we will see expanded gamut and a whole new display race.
 
As a colleague who's an expert in the field once reminded me - done properly, HDR would result in you needing sunglasses when watching summer outdoor scenes or content recorded in a desert!
 
I found this to be an interesting video on HDR.

A few years ago I bought an HDR TV and some software so I could see the pictures from my Nikon D7200 in HDR. To do that I had to turn them into HDR movies and become an amateur HDR colorist. One thing I found is that it's tough to make some scenes look good, especially bright ones. At about 29:00 the video explains what I was seeing: the brighter you let the highlights get, the harder it is to make bright scenes look bright. I found I had to push the overall brightness up a lot to keep scenes from looking strangely dark and underexposed. So much for the idea that HDR should look mostly the same as SDR but with brighter highlights; that only works for darker scenes featuring nice highlights.

The lesson I learned is that even with a 1600 nit TV, for daylight scenes we still have to compress the real brightness levels of the scene into a much smaller dynamic range, just like we do with SDR, only not quite as much. To make that look good for sunny settings we're forced to use the brightness capabilities of the display generously, which means overall brightness has to get closer to peak brightness on daylight scenes. Someday if TVs can hit 10,000 nit peaks we might be able to get away with much less dynamic compression and curves most of the time, so grading would be easier in that respect. But watching this stuff might be too intense. Going from indoor settings to outdoor settings might result in very realistic changes in brightness that could be eye searing while watching in a darkened room. Some people might like that, maybe even myself sometimes. But do we really need that?
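The headroom arithmetic makes the bind explicit; here's a quick back-of-envelope in Python (the diffuse-white placements are just illustrative numbers):

```python
# Highlight headroom left on a 1600-nit display as a function of where
# "paper white" (diffuse white) is placed. One stop = a factor of two.
import math

PEAK = 1600.0
for diffuse_white in (100, 200, 400):
    stops = math.log2(PEAK / diffuse_white)
    print(f"diffuse white at {diffuse_white:3d} nits -> "
          f"{stops:.0f} stops of highlight headroom")
```

A sunlit scene can easily hold five or more stops between diffuse white and the speculars, so even pushing paper white up to 400 nits leaves only about two stops: the compression never really goes away, just as described above.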

This all has me questioning how much of this is a benefit, and when lots of dynamic range becomes too much of a good thing. My father spent his career as a photographer and illustrator in advertising. Before he died I explained to him what I was doing with displaying pictures in 10-bit color on an HDR display. He did not think that was a good idea, because we can get perceptually good images without that much range. I figured he just didn't understand because he hadn't seen it. Now I'm thinking more like him. I'm more aware that visual media is representational art, with some of the artistic effect being in the tone curves used to compress each scene to look good. Reducing the dynamics from the real world is a huge benefit, not a curse. I watch movies in SDR and they seem to have no problem transitioning from daylight scenes to night scenes without the weirdness I'm seeing in HDR, unless I resort to letting the daylight scenes get up to eye-searing levels in a darkened room. I also understand now why I like HDR TVs that don't follow the EOTF slope but rise above it: HDR with bright highlights looks weird if it's not bright enough overall when it needs to be, and a lot of content seems to benefit from TVs that boost.
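For what I mean by rising above the EOTF, a crude sketch (Python; the 1.5x gain and 1,000-nit panel are arbitrary illustrative numbers, not any particular TV's behavior):

```python
# Faithful EOTF tracking vs. a mid-tone boost ("riding above the EOTF").
# Input is already-decoded mastered luminance in nits.
import numpy as np

def tracked(master_nits, panel_peak=1000.0):
    # Reproduce the mastered level exactly, clipping at the panel's peak.
    return np.minimum(master_nits, panel_peak)

def boosted(master_nits, gain=1.5, panel_peak=1000.0):
    # Lift everything ~0.6 stops: measurably "wrong", often subjectively better.
    return np.minimum(master_nits * gain, panel_peak)

scene = np.array([5.0, 92.0, 390.0])  # a dim-ish graded scene, in nits
print(tracked(scene), boosted(scene))
```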

In the computer display world, I see some HDR monitors have an HDR 400 mode, which limits peak brightness to 400 nits. Full-screen brightness is getting close to 300 nits on OLED, so that's a reasonable difference and should allow all scenes to look perceptually bright enough without searing your eyeballs, while also providing beautiful contrast when viewing in a not-too-bright room.
 
Flagship mini LEDs have had 5k-nit brightness capabilities for more than a year. A new generation is coming that will be brighter, but I'm not really sure whether that will be needed or even distinguishable.

I don't think max brightness is the worst issue. It is the low-nit mastered scenes that look terrible on any display. I guess the absence of light is difficult to reproduce on displays that are meant to produce light.
 
Can you give me an example to watch of low-nit mastered material that looks bad on brighter displays? I've been watching some Blu-ray discs of 1970s and '80s Roger Moore Bond movies and they are very enjoyable to my eyes. Not sure if they are mastered at low nits, or if they got re-graded for Blu-ray.
I could imagine that 100-nit material accurately displayed at 100 nits on a TV capable of being much brighter would look disappointingly dim. 100 nits was never optimal; I don't set up my TV to do that. I think it's interesting that in the video they mentioned the first color TVs could produce a wide gamut but couldn't get very bright. People cared more about brightness than color range, so they chose different phosphors with less color capacity but better brightness.
 
The worst torture scene is the 1-nit mastered scene from House of the Dragon, at the beach with the dragon. There are many others, and they generally involve a lack of light and contrast by design, the creators' intention.

These look terrible on mini-LED and OLED alike, and it has nothing to do with the display's brightness or contrast. Perhaps the creators are aware of this fact but don't care. For one, I wish they staged such scenes to look better to the audience.
 
The worst torture scene is the 1-nit mastered scene from House of the Dragon
I see you are nit-picking! :D For the full eye-gouge, try watching it streamed from Now TV (UK). We couldn't see anything at all for half of the fateful GoT S8 E3, "The Long Night". I don't think it mattered how many bits there were in the colour channels for that; the compression block edges and stepped grey tones might as well have been Ceefax. It looked great off Blu-ray...

I find it all the funnier, now that photography has been brought up, that HDR on a still camera can mean dynamic range compression in the resulting image: it refers to exposure bracketing and compositing an image from the best of the highs and lows.

I watched the SchubinCafe video linked above, and the point about human light adaptation is an interesting one, not to mention reflection pollution. I definitely suffer from reflection issues of an afternoon unless I close the curtains, and that's whether or not there's HDR content in play. Dark movies in the afternoon are... not much fun. But then if you get too dark you observe bloom and light leakage... This brightness stuff is a pain!
 
The worst torture scene is the 1-nit mastered scene from House of the Dragon, at the beach with the dragon.
I've never watched that series. I watched some trailers on YouTube last night, and those look pretty bad. They seem to suffer from data compression and the associated blocking and banding, so that may be part of what's displeasing. But they also look dark and hazy overall, which I guess is to create the mood they're after.
I saw some dragon scene, and I thought of a line in a book I heard about on NPR where it says "Help! Dragon!" It's actually supposed to be read: "Help Dragon!" The dragon is asking for help because some guy in a metal suit with sharp pokey things is trying to kill him.
 
Flagship mini LEDs have had 5k-nit brightness capabilities for more than a year.
Max brightness capability being higher leads to all sorts of interesting and potentially gorgeous presentation possibilities, and also allows for excellent viewing in bright rooms. I'd like to experiment with a 5k-nit display, but it's too expensive for me currently. They also use a lot of energy if they're large, with current LED tech still losing 60% of the energy to heat. I did an experiment a few years ago where I printed the same picture twice on transparency film and then stacked the two prints (a pain to get aligned!). Using bright white clouds as the backlighting created an amazing look; a picture of flowers in the sunlight looked amazingly realistic.
It'll be interesting to see if VR or AR goggles can reach high enough resolution, brightness and dynamic range to make realistic looking scenes. The energy requirements would be considerably lower. The Apple Vision Pro is said to be HDR capable, but I can't find a number on how bright it actually gets.
 
I got a 77" LG C4 during the Black Friday deals last week. So far I'm extremely impressed with this display, and it has seamlessly switched HDR modes from the Google TV Streamer and Xbox. I have watched reviews of this set that suggest disabling Filmmaker mode and maxing out the brightness. I have complete control of the ambient lighting in the room, so I haven't felt the need, but is this something that would increase the HDR effect? My initial feeling about HDR with my old display (an inexpensive HDR10-capable LED) was that it was too often gimmicky and distracting. On this set it makes much more sense, but some content, especially via Apple TV, has the 'soap opera effect' I've experienced on other sets that did frame interpolation or something similar. Is there a way to minimize this? This is my first OLED and I haven't really had much time to experiment. I have to say though, if inexpensive displays are going to get significantly better than this one, I'm excited for the future, lol.
 
I have watched reviews of this set that suggest disabling Filmmaker mode and maxing out the brightness... is this something that would increase the HDR effect?
I don't have a big OLED TV, but I have played with those modes that max out the brightness on my mini-LED, and I prefer to leave it in Filmmaker mode. You should try it and see what you think. I'd say yes, it can emphasize HDR effects, although not necessarily in a natural way; it can be very impressive, but in an over-the-top way. I'm guessing I'd be even more inclined on an OLED than on my TV not to emphasize peak brightness, because your OLED can reach much higher peak brightness in small areas than it can maintain over large areas of the screen. If a scene comes on that requires a lot of the screen to be nearly as bright as the highlights, the TV automatically dims it down, and that looks weird to me. However, not everyone agrees that it looks bad, and it reduces the eye-searing effect you'd get otherwise. A lot of people tell me it rarely happens with typical HDR movies when the TV is in Filmmaker mode, so it's a non-issue; they're in the 'HDR brightness is only for small highlights' camp.

As for the soap opera effect, I have to turn off, or at least turn down, motion smoothing effects in the TV settings to prevent that. It's funny that I can even see it on cartoons and video games. I'm not sure why it looks so wrong to me even though the motion is noticeably smoother.
 
The worst torture scene is the 1-nit mastered scene from House of the Dragon, at the beach with the dragon.

We all loved it at home. It looked glorious on OLED with Dolby Vision in a dark room. The dark room part is very important.

I do agree you need a very specific setup to properly display this kind of super dark content though. We had no trouble watching the GOT series finale but so many people complained about this on social media.
 
As for the soap opera effect, I have to turn off, or at least turn down, motion smoothing effects in the TV settings to prevent that.
I agree. I find it horribly distracting even if it subjectively improves the image quality. It appears to me that the image has less grain, or maybe it's dark detail. I know very little modern content is shot on film, and I don't know if my brain is adding this effect, but the soap opera effect seems wildly artificial and lacking in verisimilitude, to the extent that I find it difficult to concentrate on the scene. Paradoxically, I enjoy the effect when watching sports, but it seems less pronounced on this set than on a much older Vizio LCD I have that makes football players look like they are being rendered in CGI. Perhaps it's knowing the content is real and requires no suspension of disbelief.
 
Perhaps it's knowing the content is real and requires no suspension of disbelief.
The first time I saw the soap opera effect was on a TV in a store that was showing a scene from Return of the Jedi. Later I saw a scene from one of the Pirates of the Caribbean movies. In both cases, at first I thought I was seeing behind-the-scenes outtakes or something. The look of actors wearing makeup on a set with CGI going on in the background was so overwhelming that it completely destroyed the theatrical effect.
 
Not much for me now. HDR and Dolby Vision are both rubbish, the 4K discs are rubbish, and the Atmos near-field mixes are junk: garbage out into the system. Near-field mixing degrades the audio. LaserDisc was the only format to offer theatrical mixes, and often decent transfers with good picture and colour. I haven't bought a 4K Atmos disc in many months now; I only buy LaserDisc for great THX cinema. I'm not wasting my great JBL THX professional cinema on rubbish 4K Atmos anymore.
 