I can’t say I really have a very deep understanding of how all of this works.
I think it was back around 2016, when I was playing around with it. I used a freeware application (which I can’t remember the name of right now) to generate the LUT and then fed it to madVR on a PC.
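For anyone wondering what "feeding a LUT" to madVR actually involves: a 3D LUT is just a cube of pre-computed output colours, and the player samples it (usually with trilinear interpolation) for every pixel. Here is a minimal sketch of that lookup in Python; the lattice size and the identity LUT are illustrative assumptions, not madVR's internals:

```python
import numpy as np

def identity_lut(n=17):
    """Build an n x n x n x 3 identity 3D LUT (output == input)."""
    g = np.linspace(0.0, 1.0, n)
    r, gr, b = np.meshgrid(g, g, g, indexing="ij")
    return np.stack([r, gr, b], axis=-1)

def apply_lut(rgb, lut):
    """Sample a 3D LUT at one RGB triple with trilinear interpolation."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    f = pos - i0                      # fractional position inside the cell
    out = np.zeros(3)
    # Weighted sum over the 8 corners of the surrounding lattice cell.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (i1[0] if dr else i0[0],
                       i1[1] if dg else i0[1],
                       i1[2] if db else i0[2])
                out += w * lut[idx]
    return out

lut = identity_lut()
print(apply_lut([0.25, 0.5, 0.75], lut))  # identity LUT returns the input
```

A real HDR-to-SDR LUT would store tone-mapped, gamut-mapped colours at each lattice point instead of the identity values; the lookup code stays the same.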
I have seen both Planet Earth II and Blue Planet II, and indeed they both look pretty amazing. I was not aware they were converted into HDR10 from HLG. As I said, I've only seen a few HLG demos, and they didn't look that impressive to me.
Having spent the day watching copious amounts of videos about HDR, it has become clear to me that the HDR standards are rather a mess, and even HDR on an HDR TV can often yield the same washed-out colours that we see with HDR-to-SDR tone mapping.
Whilst HDR increases contrast and detail, it comes at the expense of other elements.
It seems that the very high-end TVs might be able to show HDR to better effect, but at quite a cost.
Ergo the TV industry has, IMO, created yet another successful marketing con, and what we have in CE (and Android) is probably as good as things can realistically get.
I prefer the vivid colours of SDR, and so for the time being, for me, 4K HDR is not worth the effort for what is, overall, a poorer result.
All I can say is that you haven’t watched any well mastered HDR on a decent display. Having seen proper HDR on a broadcast HDR display, and then similar material on my home TV (Sony consumer LCD with FALD) it really does stand up well.
If you are seeing washed out colours in HDR - then something is very wrong with how you are viewing the HDR material, or how it has been generated…
I have read several articles about HDR to SDR conversion, and the only thing I understand is that there is no perfect method and that the mapping is done using algorithms: functions with constants and variables. For an ordinary person like me, it is something beyond me. Can anyone enlighten me a little, in a somewhat simple way, about some things?
What hardware component handles this process (cpu, gpu, other chips)?
Which developers deal with these algorithms (Amlogic, Kodi, CE Team)? I guess the people at Amlogic.
Why does the conversion work better on S905 (X, X2, X3) than on S922x?
Can anything be done to improve the conversion for the S922X, or will it stay that way until there are no more SDR TVs?
These are dilemmas I can live with, but I would like to know.
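To give a concrete feel for those "functions with constants and variables": the core of most HDR-to-SDR conversions is a tone curve that compresses bright highlights while leaving dark and mid tones mostly alone. Below is a hedged sketch using the extended Reinhard operator; the 100-nit SDR reference and 1000-nit mastering peak are illustrative assumptions, and the actual curve used by Amlogic's hardware is not public, so treat this only as the general shape of the technique:

```python
def tonemap_reinhard(nits, peak_nits=1000.0, sdr_nits=100.0):
    """Extended Reinhard tone curve: map HDR luminance in nits to a
    0..1 SDR luminance fraction, rolling off highlights so that
    peak_nits lands exactly at 1.0 (SDR white)."""
    l = nits / sdr_nits                  # luminance relative to SDR white
    l_white = peak_nits / sdr_nits       # input level that must reach 1.0
    return l * (1.0 + l / (l_white ** 2)) / (1.0 + l)

# Dark tones pass through almost unchanged...
print(tonemap_reinhard(10))    # ~0.09, close to the linear value 0.1
# ...while highlights far above SDR white are compressed:
print(tonemap_reinhard(1000))  # 1.0 exactly, by construction
```

The "constants" the articles mention are things like the assumed display peak and the knee of the curve; different vendors pick different values, which is one reason the same file can look different on different boxes.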
Something is either wrong with the files, the way you are playing them, or the set-up of your neighbour's TV.
I’ve spent 30 years working in broadcast TV control rooms - and would never describe the pictures I see on my Sony HDR TV - whether SDR or HDR - as washed out and dark (unless that’s a creative decision taken by the director).
Comparing the HDR and SDR versions of a number of movies I have on both formats - I’ve not seen what you are describing.
I HAVE seen what you are describing when HDR10 material is played incorrectly and treated as Rec 709 - with that you get a very flat picture, with very washed out colours.
Thank you @cdu13a for your kindness and clarifying answers.
In conclusion, could it be said that for playing video files with HDR content on SDR TVs, it is better to use a box with the S905X3 than the S922X, at least for now, on both Android 9.0 and CE 9.2.2 (same kernel)?
Is it possible for some next kernel to make some adjustments to make improvements in this issue?
On the previous CE (9.2.1), until a few weeks ago, as a kind of compromise the playback of HDR files was better, but there were other problems, described very well by @pepeq here: Problem with tonemapping HDR-to-SDR mapping on N2
In my case this is a huge problem, because on my QLED TV (Q6, 2017 model), even though it is HDR compatible, the colours are totally washed out on HDR content in CoreELEC… with HDR2SDR the colours were perfect and my Ambilight was working perfectly… now I have washed-out colours on both my TV and Ambilight…
But this is the way my TV works, my friend… (I know Samsung sucks) but I cannot do anything about it… I can't toggle HDR on and off; it just grabs the signal from the source automatically… and the previous HDR2SDR did the trick of not automatically enabling HDR…
Q6 should support HDR as far as I’m aware. It’s a problem with your configuration, or something in the video chain not supporting HDR that needs to be worked around.
Using HDR2SDR on an HDR TV sounds completely ridiculous to me.
Maybe in the meantime you changed something through the TV settings and that’s why you have washed out colors.
You can check this: "In the 'Colorspace' setting, it is preferable to leave it on 'Auto'. When set to Auto, the color space changes automatically to match the type of content you are watching. Setting the Color Space to 'Custom' will allow calibrating the TV for SDR content. Normally we do not recommend doing this, as the TV is already fairly accurate out of the box. If you set the Color Space to 'Custom', you will have to adjust the settings each time you change from SDR to HDR.
For watching HDR content via an HDMI connection, it is important to set 'HDMI UHD Color' on for each HDMI input that will receive HDR content. This permits the HDMI port to carry all the bandwidth needed for HDR and tells the TV to expect a 10-bit color signal on that input. If it is not turned on, some devices will not detect the Q6FN as being compatible with HDR. For HDR content, it is also preferable to set the 'Backlight' to maximum, set 'Local Dimming' to 'High', and set the 'Color Space Settings' to 'Auto'."
You aren’t feeding the HDMI output of your media player through a secondary box to interface to your Ambilight are you? If so that will almost certainly need to be HDR compatible… If it’s not it could well block all the HDR metadata to your TV, leaving it in SDR mode.
Guys, my particular model (Q6FAM) is well known for its pseudo-HDR abilities… I don't want HDR mode for my movies, because the colours are washed out and they also drive my LED lights with inaccurate colours… please give me a solution to turn off HDR mode with HDR2SDR like in the previous versions…