Please ssh to the device while it’s running CE20 and run dispinfo. Paste the link here, please.
There you go: http://ix.io/4iUx
Fresh install of CE20, I didn’t change any setting at all.
Can you try with the Matrix 19.3 release? You’ll find it on GitHub. I just want to rule something out.
There you go: http://ix.io/4iUH
Guess what? The HDR10+ logo is displayed with this version.
Can you retry with the latest CE20 nightly, switching HDMI ports (check the options on your ports to see if there’s an enhanced mode) and also trying other HDMI cables?
Here is the dispinfo from the latest CE20 nightly: http://ix.io/4iVA
Enhanced HDMI is ON (Input Signal Plus), cables are HDMI 2.1 certified.
Also, the setup shouldn’t be the problem, since everything works with 9.2.8 and 19.3, no?
There have been changes made since then. Try forcing the display colour depth to 10 bits in the video settings.
Well done Vasco! Thanks!
When set to 10 bits, the HDR10+ logo is back!
Do you think this is normal behaviour for HDR10+, or a bug?
I think your TV doesn’t support 12-bit. @Blakey mentioned it in our “staff backroom”.
Your TV doesn’t support HDR at 12-bit. In Nexus the display colour depth handling was reworked; it’s better, but not for all displays. Previous images like Leia and Matrix work at 10-bit, which is why you didn’t notice any issue.
It’s not as simple as that but it’s definitely linked, I agree.
According to the TV specs, it supports 12 bits, but not in 4:4:4, only in 4:2:2 and 4:2:0.
It’s detailed here:
What HDMI cable are you using?
I use a Belkin cable, certified HDMI 2.1 48 Gbps (this one exactly: https://www.amazon.fr/gp/product/B07GVQKJ9W/)
Okay, if the cable is okay, then the issue is with the TV or the firmware itself. CoreELEC CE-20 works great with HDR10+ at 12-bit 4:2:2. I’ve never had issues with that.
Indeed, I tested 12 bits 4:2:2 and HDR10+ works fine.
12 bits 4:4:4 kills HDR10+, but it is definitely a TV limitation (as seen in the TV specs).
The strange thing is, when CE is set to its default settings (both “force display colour depth” and “force colour subsampling” set to “auto”), it uses 12 bits 4:4:4 and kills HDR10+.
The initial report from dekesone was exactly the same issue I had, and I’m quite sure the solution is the same: don’t leave these 2 settings on “auto”, and either use 12 bits 4:2:2 or 10 bits 4:4:4, but not 12 bits 4:4:4.
Now, I wonder what’s best: 12 bits 4:2:2 or 10 bits 4:4:4. I will do some more testing on that.
From what I can read online, there is no benefit to using 4:4:4 for movies or TV shows…
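Out of curiosity, here is a quick back-of-the-envelope comparison of how much raw data each format pushes over the link. It’s only a rough sketch counting active pixels, so real HDMI link rates are higher once blanking and encoding overhead are added, and whether a given combination actually fits depends on the exact mode and on what the TV input accepts:

```python
# Rough raw-data-rate comparison for 4K output formats
# (active pixels only; real HDMI link rates are higher because of
# blanking intervals and link encoding overhead).
WIDTH, HEIGHT = 3840, 2160

# Average samples per pixel for each chroma subsampling scheme:
# 4:4:4 keeps full chroma, 4:2:2 halves it horizontally, 4:2:0 halves it both ways.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def raw_gbps(bit_depth: int, subsampling: str, fps: float) -> float:
    """Raw active-video bit rate in Gbit/s for the given format."""
    return WIDTH * HEIGHT * fps * SAMPLES_PER_PIXEL[subsampling] * bit_depth / 1e9

for fps in (23.976, 60):
    for depth, sub in [(12, "4:4:4"), (10, "4:4:4"), (12, "4:2:2"), (10, "4:2:0")]:
        print(f"4K{fps:g} {depth}-bit {sub}: ~{raw_gbps(depth, sub, fps):.1f} Gbit/s")
```

12 bits 4:4:4 is simply the heaviest of the three candidates, which would fit with the TV only accepting 12 bits with subsampling. And since movies and shows are encoded in 4:2:0 anyway (as far as I understand), 12 bits 4:2:2 vs 10 bits 4:4:4 is probably a wash in practice.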
Blakey and Vasco, thanks a lot to both of you for the precious information, I learned a lot today!
The default setting is 12-bit 4:4:4 for SDR (BT.709); for HDR10+ it switches to 12-bit 4:2:2 BT.2020, which is good, but HDR10+ at 10-bit 4:2:0 BT.2020 is more than enough. It’s not necessary to use 12-bit. 10-bit 4:4:4 is perfect for everything. You can find out later whether the problem is the TV or the HDMI cable.