HDR to SDR tonemapping

I still don’t get the whole idea of tone-mapping HDR to SDR. Personally, I was always against even making it available to users, because of what I consider the huge inconsistency in the end results. I agree that whatever you’re looking at will probably look better tone-mapped than not, but without tone mapping it will at least look “bad” consistently, and in the same way for most people.

HDR and SDR are two completely different, and in no way compatible, video formats.
With HDR content, an HDR TV will do its best to map the content’s required luminance and chromaticity levels to the display’s capabilities, to give you the best HDR experience the hardware can produce.
HDR content works in absolute values for luminance. If a pixel is supposed to be 300 nits, that pixel will be displayed at 300 nits, or as close to that number as the display’s limitations allow.
SDR works in a completely different way: it is a relative system, based on gamma. An SDR pixel value only says how bright it should be relative to the display’s peak white, whatever that happens to be.
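
To make the absolute-vs-relative distinction concrete, here’s a small Python sketch. The PQ constants are the real ones from SMPTE ST 2084; the 2.4 gamma and the example peak luminances are just illustrative assumptions.

```python
# Absolute (PQ/HDR) vs relative (gamma/SDR) luminance, side by side.

def pq_to_nits(e):
    """SMPTE ST 2084 (PQ) EOTF: normalized code value in [0, 1] -> nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_to_nits(e, display_peak_nits, gamma=2.4):
    # SDR only defines luminance *relative* to the display's peak white;
    # the signal itself carries no nit value.
    return display_peak_nits * e ** gamma

signal = 0.6
print(pq_to_nits(signal))        # ~243 nits on every HDR display (ideally)
print(sdr_to_nits(signal, 100))  # on a ~100-nit SDR panel
print(sdr_to_nits(signal, 300))  # brighter panel: same code value, 3x the nits
```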

You just can’t map HDR to SDR without giving the tone mapping algorithm the appropriate data about your display’s capabilities. Whatever values come programmed into the kernel/hardware are the values you will have to live with. The only things you can do are tweak the backlight level, and perhaps dial in the color a little to make it look “better”. But you’ll never get anywhere near the results of a modern mid-range display.
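
As a rough illustration of why the tone mapper needs display data, here’s a toy Reinhard-style curve parameterized by the panel’s peak luminance. This is not CoreELEC’s actual algorithm, just a minimal sketch under assumed numbers:

```python
# Toy Reinhard-style tone curve (NOT CoreELEC's actual tone mapper), just
# to show that the output depends on data about the target display.

def tonemap_nits(scene_nits, display_peak_nits):
    # Normalize against the panel's peak so the curve compresses the range
    # the panel can't show; display_peak_nits should come from a real
    # measurement, not a guess baked into kernel/hardware tables.
    x = scene_nits / display_peak_nits
    return display_peak_nits * x / (1.0 + x)  # maps [0, inf) -> [0, peak)

# The same 800-nit HDR highlight lands very differently per panel:
print(tonemap_nits(800.0, 100.0))  # ~88.9 nits on a dim SDR TV
print(tonemap_nits(800.0, 300.0))  # ~218.2 nits on a brighter panel
```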

There is one way to get decent results, and that’s to use a LUT. I’ll say up front that it’s currently not possible, and probably never will be, to create your own LUT on CoreELEC. But even if it were possible, you would need display calibration equipment and special software to measure the capabilities of your display and to generate said LUT.
I did that once with an SDR 4K TV, and the results were actually surprisingly decent. But I can’t see most people dealing with this stuff.
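
For anyone curious what “using a LUT” means mechanically, here’s a hedged sketch of applying a 3D LUT with trilinear interpolation. The 17-point grid size and the identity contents are placeholders; a real LUT is generated from probe measurements of your display.

```python
# Sketch: a 3D LUT maps each (R, G, B) input to a measured output triple,
# interpolating between grid points. Identity data stands in for a real
# calibration LUT here.

import numpy as np

N = 17  # common 3D LUT grid size
grid = np.linspace(0.0, 1.0, N)
# lut[r, g, b] -> (R, G, B); identity placeholder for measured data
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

def apply_lut(rgb, lut):
    """Trilinearly interpolate one RGB triple through a 3D LUT."""
    n = lut.shape[0] - 1
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * n
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n)
    f = pos - i0
    out = np.zeros(3)
    # Blend the 8 surrounding grid points
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (i1[0] if dr else i0[0],
                       i1[1] if dg else i0[1],
                       i1[2] if db else i0[2])
                out += w * lut[idx]
    return out

print(apply_lut((0.25, 0.5, 0.75), lut))  # identity LUT: returns the input
```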
