HDR to SDR tonemapping

Hi all

I see that HDR to SDR tonemapping is supported in kernel 4.9. Does it require the S922X, or will it eventually also work with the S905X2, and maybe with the S905X and S912 once they are supported by the NG images?

It should work on all -ng devices.

Is there any preliminary image of NG for S905X?
Or a buildable somehow-working branch?

-ng only supports newer devices, such as S905X2, S922X and A311D. We are also working on adding support for the S905*3 variants.

But I got the impression that you were working, at low priority, on adding support for the S905X and S912 as well. Or am I mistaken?

It is something that @cdu13a is, or at least was, working on. I’m not sure what the status is on that.
IMHO, I wouldn’t hold my breath for -ng on older SoCs.

Does the S905X3 use a different tone curve for HDR->SDR tone mapping than the S922X?

Yes, on non-HDR TVs the S905X3 does better tonemapping than the S922X. On an older LG TV I saw a clear difference between a VIM3 Pro and an X96 Max (S905X3); the X96 Max plays better.
Some pictures here: Vim3 playback issue on LG TV uf7787

Tonemapping can only do so much when it comes to automatic HDR->SDR conversion.

A manual SDR grade and/or pass used to create a 1080p SDR Blu-ray is always likely to look better on an SDR display than HDR material automatically tone mapped to SDR with no manual intervention.

In other words - a 1080p SDR Blu-ray rip is likely to look nicer than a 2160p HDR UHD Blu-ray rip tone mapped to SDR on an SDR display.

Which is the very experience I was having, hence the question of whether the C4 may have hit a sweet spot that makes the use of 4K HDR more desirable.

I still don’t get the whole idea of tone-mapping HDR to SDR. Personally, I was always against even making it available to users, due to the huge (in my opinion) inconsistency of the end results. I agree that whatever you are looking at will probably look better tone mapped than not, but at least non-tone-mapped material will look “bad” consistently and equally to most people.

HDR and SDR are two completely different, and in no way compatible, video formats.
With HDR content, an HDR TV will do its best to map the content and the required luminosity and chromaticity levels to the display’s capabilities, to provide you with the best HDR experience the hardware can produce.
HDR content works in absolute values for luminosity. If there’s a pixel that’s supposed to be 300 nits, that pixel will be displayed at 300 nits, or as close to that number as the display’s limitations allow.
SDR works in a completely different way: it is a relative system, based on gamma.
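To make the absolute-vs-relative distinction concrete, here is a small sketch. The PQ constants come from the published SMPTE ST 2084 definition; the SDR side is a simplified pure power gamma (ignoring BT.1886’s black-level terms), and the display peak values are just examples:

```python
# Sketch: PQ (HDR10) maps a code value to an absolute luminance in nits,
# while SDR gamma only yields a fraction of whatever peak the display has.

# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """PQ EOTF: normalised signal [0..1] -> absolute luminance in nits."""
    p = signal ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000.0 * y  # PQ is defined up to 10,000 nits

def sdr_to_nits(signal: float, display_peak: float, gamma: float = 2.4) -> float:
    """Simplified SDR: the same signal means different nits on different displays."""
    return display_peak * signal ** gamma

# A PQ code value always means the same luminance...
print(round(pq_to_nits(1.0)))               # 10000 nits, regardless of display
# ...but an SDR code value scales with the display's peak:
print(sdr_to_nits(1.0, display_peak=100))   # 100 nits on a reference monitor
print(sdr_to_nits(1.0, display_peak=350))   # 350 nits on a bright living-room TV
```

This is exactly why a pixel graded at 300 nits in HDR10 means 300 nits everywhere, while "full white" in SDR means whatever the viewer's backlight slider says.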

You just can’t map HDR to SDR without providing the tone mapping algorithm with the appropriate data about your display’s capabilities. Whatever values come programmed into the kernel/hardware are the values you will have to live with. The only things you can do are tweak the backlight level and perhaps dial in the color a little to make it look “better”. But you’ll never get anywhere near the results of a modern mid-range display.

There is one way to get decent results, and that’s to use a LUT. From the start I’ll just say that it’s currently not possible, and probably never will be possible, to create your own LUT on CoreELEC. But even if it were possible, you would need display calibration equipment and special software to measure the capabilities of your display and to generate said LUT.
I did that once with an SDR 4K TV, and the results were actually surprisingly decent. But I can’t see most people dealing with this stuff.
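For illustration only: a LUT at its simplest is a measured table plus interpolation. Real display LUTs are 3D cubes over R, G and B; this 1D version only shows the lookup mechanics, and the table values here are made up:

```python
def apply_lut(value: float, lut: list[float]) -> float:
    """Look up a [0..1] value in a 1D LUT with linear interpolation."""
    pos = value * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# Made-up 5-entry curve that lifts mid-tones, as a calibration LUT might:
example_lut = [0.0, 0.3, 0.55, 0.8, 1.0]
print(apply_lut(0.5, example_lut))   # 0.5 lands exactly on the middle entry: 0.55
```

The point of a measured LUT is that those table entries encode what your particular panel actually does, instead of what the kernel assumes a generic panel does.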


and that is the point, you said:

I agree that whatever you will be looking at will probably look better tone mapped than not, but not-tone-mapped, at least it will look “bad” consistently and equally to most people

Fact is, some people own a SDR-only TV, but have HDR content on their discs and want to view it.
Of course, with HDR2SDR tone mapping the viewing result is not as good as on an HDR display, but it is better than without any tone mapping. And looking “bad” consistently is not something people want…

And the N2 (S922X) has a bad HDR2SDR tone mapping, which creates artifacts and color banding on SDR displays as @borza and others noticed.

So I am considering buying an S905X3 box, which should have better tonemapping.
Buying a new 80" HDR TV as a replacement for my 79" LG SDR TV is a no-go at the moment (wife!)

So the C4 is an option for me, but I miss Bluetooth for my headphones. I know I could use a BT USB dongle, but then I lose one USB port for my discs.

Any recommendations for an S905X3 box with Bluetooth and gigabit LAN supported by CoreELEC?

In my opinion, you need SDR content for an SDR TV, otherwise you just sacrifice picture quality for no reason.


Yes, I agree, that would be the best solution.
But I want to view 4K-HDR content on my 4K-SDR-TV…

Amazing. Fancy just buying an HDR display if you want to watch 4K HDR content… why buy the content without the equipment to view it correctly…

X96 Max+ 4/32

Tone mapping is something that needs to be done, though - and there are various approaches to doing it. Without tone mapping it would be next-to-impossible to produce live TV in HDR and make it available in decent quality for those watching in SDR. (Early HDR productions used a dual-path workflow with two outputs from the broadcast cameras - modern TV cameras can output an SDR and an HDR signal simultaneously - but routing and cutting two simultaneous productions in sync is unsustainable long term.)

There are two distinct approaches to broadcast tone-mapping for HDR->SDR EOTF conversion:

  1. Scene Light - where the HDR signal is converted back to the scene light domain, and then converted as if an SDR camera were shooting the scene. This gives you an SDR-like picture and intercuts well with other standard SDR sources. It’s likely to be the approach used for sport, entertainment and live events.

  2. Display Light - where the HDR signal is converted back to the light levels that a reference HDR display would produce, with a conversion to an SDR signal that would produce similar light levels (obviously, levels in the HDR luminosity range are clipped/rolled off to keep them within the SDR range of the output signal). This simulates the look of the HDR signal in SDR and preserves the subjective artistic ‘look’.
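As a toy numeric illustration of the Display Light idea - not any broadcast standard’s actual curve - a converter might pass low luminances through unchanged and roll off highlights into the SDR range. The knee point and curve shape below are arbitrary:

```python
def display_light_rolloff(nits_in: float, knee: float = 70.0,
                          peak_in: float = 1000.0, peak_out: float = 100.0) -> float:
    """Map HDR display light to SDR display light: linear below the knee,
    then a soft ease-out that compresses [knee..peak_in] into [knee..peak_out]."""
    if nits_in <= knee:
        return nits_in                      # shadows/mid-tones preserved 1:1
    t = min((nits_in - knee) / (peak_in - knee), 1.0)
    return knee + (peak_out - knee) * (1.0 - (1.0 - t) ** 2)

print(display_light_rolloff(50.0))    # 50.0  - below the knee, untouched
print(display_light_rolloff(1000.0))  # 100.0 - HDR peak lands on SDR peak
```

The choices hiding in those two arbitrary parameters - where the knee sits, and how aggressively highlights are compressed - are exactly where different hardware implementations end up looking different.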

Documentary could benefit from either approach depending on the aims of the production.

Of course, when you are talking about SDR, you then need to decide whether you are following the BT.1886 EOTF or a power-law gamma (as was the case for decades in general production).
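The difference matters mostly in the shadows: BT.1886 anchors the curve to the display’s actual black and white luminance, whereas a pure power law ignores black level. A sketch of the EOTF as published in ITU-R BT.1886 (the 100-nit white / 0.1-nit black figures are just example values):

```python
def bt1886_eotf(v: float, white_nits: float = 100.0, black_nits: float = 0.1) -> float:
    """ITU-R BT.1886 EOTF: normalised signal v [0..1] -> luminance in nits."""
    g = 2.4
    lw = white_nits ** (1 / g)
    lb = black_nits ** (1 / g)
    a = (lw - lb) ** g            # gain
    b = lb / (lw - lb)            # black-level lift
    return a * max(v + b, 0.0) ** g

def power_gamma_eotf(v: float, white_nits: float = 100.0, gamma: float = 2.4) -> float:
    """Plain power-law gamma with the same white point, no black-level term."""
    return white_nits * v ** gamma

# Near black the two curves diverge noticeably:
print(round(bt1886_eotf(0.1), 3))       # lifted by the display's black level
print(round(power_gamma_eotf(0.1), 3))  # crushes toward zero
```

Both hit the same white point at full signal; the disagreement is all in how dark the dark end is allowed to get.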

There are also different approaches to consider for the conversion of Rec 2020 gamut colour to Rec 709 gamut colour (this isn’t HDR->SDR - but most HDR productions are shot in Rec 2020, and most SDR displays are Rec 709).
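The core of that gamut conversion is a 3x3 matrix applied to linear (not gamma-encoded) RGB; the coefficients below are the commonly published BT.2020-to-BT.709 values (as in ITU-R BT.2087). Note the negative entries: saturated BT.2020 colours land outside BT.709, and how those out-of-gamut values are handled is exactly where the approaches differ - the hard clip here is the naive option:

```python
# BT.2020 -> BT.709 conversion on *linear* RGB (coefficients per ITU-R BT.2087)
M = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def bt2020_to_bt709(rgb: tuple[float, float, float]) -> list[float]:
    out = [sum(M[row][col] * rgb[col] for col in range(3)) for row in range(3)]
    # Naive handling: hard-clip out-of-gamut results (real converters do better)
    return [min(max(c, 0.0), 1.0) for c in out]

print(bt2020_to_bt709((1.0, 1.0, 1.0)))  # white stays white: ~[1.0, 1.0, 1.0]
print(bt2020_to_bt709((1.0, 0.0, 0.0)))  # pure BT.2020 red: G and B go negative, clip to 0
```

Hard clipping desaturates and hue-shifts the most vivid colours; proper gamut mapping compresses toward the 709 boundary more gracefully, and two boxes using the same tone curve can still look different because of this step alone.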

There is very little discussion in the HTPC community about what gamut conversion and tone mapping approaches are being used… There should be…

Not strictly true - that’s just how PQ HDR works. There are other forms of HDR that don’t use Perceptual Quantisation and thus don’t have a fixed pixel value -> light output mapping.

The HLG system isn’t PQ - and is thus not fixed to absolute luminosity values. Like power law gamma SDR - it’s a scene-referred system, not display-referred.
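For reference, HLG’s scene-referred nature shows up in its OETF (constants from ITU-R BT.2100): it maps relative scene light to a signal - square-root-like in the lower half, logarithmic above - with no nit value anywhere in the formula. The display later applies its own system gamma based on its own peak brightness:

```python
import math

# ITU-R BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                    # 0.28466892
C = 0.5 - A * math.log(4 * A)    # 0.55991073

def hlg_oetf(scene_light: float) -> float:
    """HLG OETF: relative scene light [0..1] -> signal [0..1]. No nits involved."""
    if scene_light <= 1 / 12:
        return math.sqrt(3 * scene_light)
    return A * math.log(12 * scene_light - B) + C

print(hlg_oetf(1 / 12))          # 0.5 - the square-root/log crossover point
print(round(hlg_oetf(1.0), 3))   # ~1.0 - full scene light maps to full signal
```

Contrast this with the PQ EOTF, whose output is an absolute luminance: here both input and output are relative, which is what lets HLG degrade gracefully across displays of very different brightness.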

Yes - and no. SDR has standardised on 100 nits for peak white with a BT.1886 EOTF. Of course, in non-control-room environments this will be very dim, so people drive their HDR TVs into the >100 nit HDR range when watching SDR… and then they complain that PQ HDR is dim…

Yes - this is true - but Scene Light conversion mitigates this, whereas Display Light doesn’t to the same degree.

The BBC covered Prince Harry and Meghan Markle’s wedding in HDR and SDR simultaneously. The BBC One domestic feed of the wedding was an SDR simulcut - taking the SDR CCU outputs from the cameras and cutting in SDR HD. The HDR CCU outputs (which I think were HLG native - rather than S-Log3 - though I may be wrong) were simulcut in UHD HDR.

HOWEVER - the international feed taken by Sky News UHD, and most other rights holders was a Scene Light tone mapped HDR->SDR conversion generating a UHD SDR (Sky don’t do HDR yet) and HD SDR downconversion from the HDR source.

There was very little difference between the SDR simulcut and the HDR->SDR scene light conversion.

Yet whenever I read about tone mapping in the Kodi space, these things are never discussed. Neither is gamut mapping, which is a major part of the process.


Thanks for your explanations, although I didn’t understand everything in detail… :thinking:

I’m aware of HLG, but my rant was about HDR10 --> SDR tonemapping.
HLG doesn’t really look that great either, IMHO. But we don’t have actual HLG channels where I live, so I can only judge by a few demo clips I’ve seen.

In short, my point was that to properly map HDR10 to SDR, you need to get a LUT of your particular display.
Once you have the LUT, you know what the display’s actual capabilities are. Then you can use whatever method for tone mapping to suit your display.
When I made a LUT for my SDR TV, I upped backlight to the maximum, and cranked color up a little bit to get the most out of what it was capable of. The result was actually half decent.
But none of the tonemapping I’ve seen thus far is worth it in my opinion. I mean, for casual viewing, maybe, but why would you bother with HDR content for casual viewing on an SDR display anyway?

Have you seen any of the Planet Earth or Blue Planet series? They were posted in HLG, and then converted to HDR10 for UHD Blu-ray release.

They were broadcast on BBC iPlayer in HLG here and looked great!

If you saw the same material in HLG and HDR10, you would probably struggle to tell the difference on a decent display - they are two different routes to delivering the same quality material - but the HLG system doesn’t prescribe a luminosity for a pixel value (though you can obviously match PQ with it should you wish).

That sounds like a Display Light approach? Have you tried a Scene Light approach?