Have you seen videos or pictures that have dark sections, and there’s “banding” where there’s a noticeable difference between something black and something very black? Like a sharp border where it’s obvious the conversion process from the camera to your screen didn’t fully capture a gradient of darkness?
That’s due to the pipeline not having enough distinct brightness levels to represent darker areas smoothly compared to very bright ones. It’s not enough to have an HDR display; the whole chain before it (camera, encoding, playback) has to support it as well. When it’s done right, not only does the banding go away, but finer details in darker areas can pop out and join the rest of the scene.
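A minimal sketch of why the banding shows up in dark areas (illustrative numbers only, not any specific video standard): quantize a smooth dark gradient at 8-bit and 10-bit depth and count how many distinct steps survive.

```python
def quantize(value, bits):
    """Map a 0.0-1.0 brightness value to the nearest integer code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels)

# A smooth gradient covering only the darkest 5% of the brightness range,
# sampled at 1001 points.
gradient = [i / 1000 * 0.05 for i in range(1001)]

codes_8bit = {quantize(v, 8) for v in gradient}
codes_10bit = {quantize(v, 10) for v in gradient}

print(len(codes_8bit))   # -> 14: only 14 distinct steps, so visible bands
print(len(codes_10bit))  # -> 52: four times as many steps, far smoother
```

With only 14 codes available for that whole dark region, neighbouring shades collapse onto the same value and you see hard borders instead of a gradient; two extra bits multiply the number of steps by four.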
In a nutshell, HDR increases the range of brightness values (luminance, to be specific) that can be sent to a display. This allows content both to be brighter and to display colours more accurately, as there are far more brightness levels that can be depicted. That means content can look more lifelike, or have more “pop” by having certain elements be brighter than others. There’s more too, and it’s up to the game/movie/device as to what it should do with all this extra information it can send to the display. This is especially noticeable on an OLED or QD-OLED display, since they can individually dim or brighten every pixel. “Nits” in this context refers to the brightness of the display: 1000 nits is far brighter than most conventional displays (which usually peak in the 300-500 range).
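To make the extended brightness range concrete, here is a hedged sketch of the SMPTE ST 2084 “PQ” transfer function used by HDR10 and Dolby Vision. It decodes a normalized signal value (0.0–1.0) into an absolute luminance in nits, spanning 0 all the way up to 10,000 nits:

```python
# Constants from SMPTE ST 2084 (the PQ transfer function).
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """Decode a normalized PQ signal (0.0-1.0) to absolute luminance in nits."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_to_nits(1.0)))  # -> 10000 nits at full signal
print(round(pq_to_nits(0.0)))  # -> 0 nits
```

The curve is deliberately non-linear: a large share of the signal codes is spent on the dark end, where our eyes are most sensitive, which is exactly where SDR’s coarser encoding produces banding.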
If you have a theater nearby that offers Dolby Vision films you can try out a version of HDR. They use laser projectors so the blacks can really be pure black. When the screen goes dark just before the movie the entire theater will be pitch black except for emergency lighting. It’s glorious.
I’ve been meaning to ask, what is HDR, why do I need it, and (new question) what do lice eggs have to do with it?