It isn’t; it’s just that marketing is really bad at explaining what HDR is actually about.
HDR means each color channel that used 8 bits can now use 10 bits, sometimes more. That means going from 256 shades per channel to 1,024, which allows much finer gradation of shades in the same picture and avoids the color banding problem.
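To make the 256-vs-1,024-steps point concrete, here is a rough sketch of how banding falls out of low bit depth, assuming ImageMagick’s convert tool is available (the filenames are arbitrary, and 16-bit PNG stands in for “10 bits or more” since PNG has no 10-bit mode):

```bash
# A slow ramp over a narrow dark range: 8 bits only allows 33 distinct values
# between #000000 and #202020, so the 200-pixel-tall gradient collapses into
# visible steps. At 16 bits per channel the same ramp has thousands of levels.
convert -size 1024x200 "gradient:#000000-#202020" -depth 8  ramp_8bit.png
convert -size 1024x200 "gradient:#000000-#202020" -depth 16 ramp_16bit.png
```

Opened side by side in an image viewer, the 8-bit file should show exactly the kind of banding described here, while the 16-bit one stays smooth.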
That’s just 10-bit color, which is a real thing and does reduce banding, but it’s only one part of the various HDR specs. HDR also includes a significantly larger color space, as well as varying methods (depending on which spec you’re using) of mapping the mastered video to the capabilities of the end user’s display. In addition, to hit the brightness and contrast range HDR calls for, displays need some level of local dimming, adjusting the brightness of different parts of the image either through backlight zones in LCD technologies or per pixel in self-emissive technologies like OLED.
Thank you.
I assume HDR has to be explicitly encoded into images (and moving images) to be true HDR, otherwise it’s just upsampled? If that’s the case, I’m also assuming most media out there is not encoded with HDR, and if that’s correct, does it really make a difference? I’m assuming upsampling means inferring new values, probably with Gaussian filtering, dithering, or some other method.
Somewhat related: my current screens support 4K, but when I watched a 4K video at 60 fps side by side on one screen at 4K resolution and another at 1080p, I couldn’t see a difference. It wouldn’t surprise me if the same were true of HDR, but I might be wrong.
Anti Commercial AI thingy
[CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
Inserted with a keystroke running this script on linux with X11
```bash
#!/usr/bin/env nix-shell
#!nix-shell -i bash --packages xautomation xclip
sleep 0.2
(echo '::: spoiler Anti Commercial AI thingy
[CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
Inserted with a keystroke running this script on linux with X11
```bash'
cat "$0"
echo '```
:::') | xclip -selection clipboard
xte "keydown Control_L" "key V" "keyup Control_L"
```
Yes, from capture (the camera) all the way to distribution, the content has to preserve the HDR bit depth. Some content on YouTube is in HDR (it’s noted in the quality settings along with 1080p, etc.), but the option only shows up if both the content is HDR and the device playing it has HDR capabilities.
Regarding streaming, there is already a lot of HDR content out there, especially newer shows. But stupid DRM has always pushed us to alternative sources when it comes to playback quality on Linux anyway.
If you’re not seeing a difference between 4K and 1080p though, even up close, maybe your media isn’t really 4K. I find the difference quite noticeable.
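If you want to rule that out, a quick way to check what a file actually contains is ffprobe from FFmpeg (a sketch; input.mkv is just a placeholder name):

```bash
# Print resolution, pixel format and colour metadata for the first video stream.
# Genuine 4K HDR10 material typically reports 3840x2160, a 10-bit pixel format
# like yuv420p10le, bt2020 primaries and the smpte2084 (PQ) transfer; plain SDR
# 1080p shows up as 1920x1080, yuv420p and bt709.
ffprobe -v error -select_streams v:0 \
  -show_entries stream=width,height,pix_fmt,color_primaries,color_transfer,color_space \
  -of default=noprint_wrappers=1 input.mkv
```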
Ah, that’s what I thought. Thanks.
I tried with the best-known test video, Big Buck Bunny. Their website is down now and the Internet Archive has it, but I did the test back when it was still up. I also found a few 4K videos on YouTube and elsewhere. Maybe the people I tested it with and I just aren’t sensitive to 4K video on 30-35 inch screens.
So you’re saying you need glasses?
But yes, it does make a difference how much of your field of view is covered. If it’s a small screen and you’re relatively far away, 4K isn’t doing anything. And of course, you need a 4K-capable screen in the first place, which is still not a given for PC monitors, precisely due to their size. For a 21" desktop monitor, it’s simply not necessary. Although I’d argue that less than 4K on a 32" screen that’s about an arm’s length away from you (like on a desktop) is noticeably low-res.
No. Just like some people aren’t sensitive to 3D movies, we aren’t sensitive to 4k 🤷
People aren’t “sensitive” to 3D movies because they lack stereoscopic vision (typical for people who were cross-eyed from birth, for example, even if they had corrective surgery). Or they can see them and simply don’t care about, or don’t like, the effect.
If you’re not “sensitive” to 4K, that would suggest you’re not capable of perceiving fine detail and thus don’t have 20/20 vision. Provided, of course, you were looking at 4K content on a 4K screen at a size and distance where the human eye should generally be capable of distinguishing that detail.
I have never seen banding before; the image seems specifically picked to show the effect. I know it’s common when converting to fewer than 256 colors, e.g. if you turn images into SVGs for some reason, or GIFs (actual GIFs, not video).
Also dithering exists.
Anyway, it’ll surely be standard at some point in the future, but it’s very much a small quality improvement and not something one definitely needs.
Well, of course it is.
Banding is more common in synthetic gradients though, like in games and on webpages. A really easy way to see it is to use a CSS gradient as a web page background.
Fair enough. Dithering would still be an option though. But if it’s not done I agree there can be visible stripes in some cases.
Also I wanted to apologize for the negative wording in my above comment. That was uncalled for, even if I think HDR is totally not worth it at the moment.
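For what it’s worth, here’s a rough way to see what dithering buys on exactly that kind of synthetic gradient, again assuming ImageMagick is installed (filenames arbitrary):

```bash
# Quantise a subtle dark gradient down to 8 colours twice: once with dithering
# disabled (hard steps between the levels) and once with Floyd-Steinberg
# dithering, which trades the bands for fine noise.
convert -size 800x200 "gradient:#101010-#303030" smooth.png
convert smooth.png +dither -colors 8 banded.png
convert smooth.png -dither FloydSteinberg -colors 8 dithered.png
```

Whether the noise is less objectionable than the stripes is a matter of taste, which is more or less the trade-off being discussed here.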
Yeah, they’ve reduced the colour depth to show off the effect without requiring HDR already.
I find it a lot more noticeable in darker images/videos, and places where you’re stuck with a small subset of the total colour depth.