CPM build (Help & Support)

In the case of projectors (PJs), they don’t support DV, so the video settings are mostly the same for both HDR10 and LLDV. This means we use the same black level settings for both scenarios.
In any case, we need a separate video mode for LLDV to compensate for this black clipping. I can conduct further testing later.

Yes, movies like 1917 will be a problem in DV because the RPU has no per-shot metadata, so even the dark scenes get 600-nit L1 metadata sent to the player/display, which will over-dim the image on any display with a target brightness lower than 600 nits. That’s actually a different issue, though, because that RPU’s source PQ is 3079.
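(For reference, the RPU stores that brightness as a 12-bit PQ code. Here is a rough sketch of the SMPTE ST 2084 (PQ) EOTF conversion, just to show why 3079 works out to roughly 1,000 nits; the Python function name is only illustrative:)

# Rough sketch: convert a 12-bit PQ code (e.g. the RPU's source max PQ) to nits
# using the SMPTE ST 2084 (PQ) EOTF. Constants are taken from ST 2084.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_code_to_nits(code, bits=12):
    e = code / (2 ** bits - 1)        # normalise the code value to 0..1
    e_pow = e ** (1 / M2)
    return 10000.0 * (max(e_pow - C1, 0.0) / (C2 - C3 * e_pow)) ** (1 / M1)

print(round(pq_code_to_nits(3079)))   # ~1000 nits
print(round(pq_code_to_nits(3696)))   # ~4000 nits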

I made myself a frame-by-frame RPU for that movie.



Well, you asked for a real example of 0.001 nits looking better than 0 nits. It does look better at 0.001 nits than at 0 nits, even after adjusting for the raised black level, at LLDV Maximum Display Luminance settings of 1,000 nits, 4,000 nits and 10,000 nits.

@cpm When you compiled the A4 version, were you using -O3 or -ftree-vectorize or anything like that?

EDIT: I’ll recompile from scratch.

Hah. No video processor has a Dolby Vision license yet. They can’t operate like LG TVs. TV-led Dolby Vision requires a Dolby module, such as dovi.ko, which they don’t have.

This raises an interesting question. If devices don’t use the colour space options (which devices have actually been tested?), and all use ICtCp for TV-led, why do TVs seem to support these options? Purely from a Dolby compliance perspective?

@doppingkoala, my guess is what you mentioned, Dolby compliance, or it could be a device implementation bug. As you mentioned on the other thread, you found that the tunnel modes could be YUV422_BIT12, RGB_8BIT, RGB_10_12BIT, and YUV444_10_12BIT. RGB_8BIT is for traditional TV-led and YUV422_BIT12 is for traditional LLDV, but as we know the Dolby standard also allows LLDV to operate in RGB_10_12BIT and YUV444_10_12BIT (by the way, the only player I know of that can do that is the Magnatar). So I guess that for some strange compliance reason Dolby is allowing the TV-led tunnels to operate with the LLDV modes as well. It could also be a device implementation bug, of which, as we know, there are some…

One question I have is: can you verify that, when using YUV422_BIT12 as a tunnel for TV-led, the TV is actually doing the tone mapping? Or is it just accepting the tunnel but still operating as if it were receiving a fully tone-mapped LLDV signal?

The TV was definitely doing the tone mapping. I could see clear responses as the metadata changed. Edit: and on my builds the device is also incapable of doing anything other than true TV-led.

Looking at the existing DV code, and the code that decodes the EDID from the TV, it clearly looks to me to be a supported mode for TV-led DV.

The only odd thing I found with that mode was that my TV actually supported it, even though it is not advertised as supported in the EDID. That is not the first thing I have discovered the TV actually supports without advertising it in the EDID, though…

Yes, very interesting indeed. I am not sure if this is an “unintended consequence”. As far as I know, you have the only device in existence that forces TV-led with a YUV422_BIT12 tunnel. No other device does that, and there is no published or internal Dolby documentation that I know of stating this is possible. So I believe the TV is receiving the TV-led stream in the tunnel you are forcing (YUV422_BIT12) and accepting it despite this not being part of the Dolby standard. On the thread where you published this and your code, a few folks said it was not working on their TVs, and if my memory serves me right you did another version with the traditional RGB_8BIT so that it worked on their TVs.

So when I say “unintended consequence” I mean this was not supposed to happen (although I am happy it did). The TV was supposed to reject the signal and show a blank screen, as happened in the other cases. So your TV accepted this non-standard signal and it worked, as an unintended consequence, probably because of its LLDV capability. I am not sure there will be another player device that can do what you coded, so it will not be available in any commercial form. Although I think more TVs would accept this signal, since from the TV’s perspective it could think it is an LLDV signal.

Just my thoughts on this very interesting situation.

In the existing Amlogic code there are options to enable TV-led DV using these other tunnel modes, so presumably Amlogic received documentation regarding them. There is also actually no difference between 8-bit RGB and 12-bit 4:2:2 YUV over HDMI aside from some metadata; the actual video signals are indistinguishable.


Correct, so clearly not all TVs support this mode. That is likely why there is a field in the EDID to advertise whether a TV supports it.


Not quite; the TV still needs to pull out, decode, and apply the metadata that is embedded in the signal for TV-led mode.

> In the existing Amlogic code there are options to enable TV-led DV using these other tunnel modes, so presumably Amlogic received documentation regarding them. There is also actually no difference between 8-bit RGB and 12-bit 4:2:2 YUV over HDMI aside from some metadata; the actual video signals are indistinguishable.

I have not been impressed by the Amlogic code, so I am not sure if they just “jumbled” Std-DV and DV-LL into one block of code.

> Correct, so clearly not all TVs support this mode. That is likely why there is a field in the EDID to advertise whether a TV supports it.

The VSVDB EDID is very well documented and there is no documentation (again official or internal) that says that this is possible. There is only one “Reserved” space which is not being used.

> Not quite; the TV still needs to pull out, decode, and apply the metadata that is embedded in the signal for TV-led mode.

You are forcing display-led (TV-led), which is documented in the VSVDB EDID, so it would behave this way.

Again, I am just adding my thoughts.

If you want to test different scenarios further, you could force both the RGB_10_12BIT and YUV444_10_12BIT tunnels in your code (on different builds) to see what happens, since they are clearly not documented in your TV’s EDID. My assumption is that this will not work, since your TV’s EDID only shows YUV422_BIT12 and RGB_8BIT compatibility, but if both of those tunnels work then there is something more afoot.

You may be right about that. But there are at least distinct, separate bits of code for Std-DV and LLDV, so who really knows?


Clearly.

The one that would be more interesting to me is actually trying to use EMP packets to send the DV metadata rather than embedding it in the video signal.

That is an option for HDMI 2.1, which would be really nice to use. I haven’t tried it yet though; my TV says it doesn’t support that mode (but it is HDMI 2.1, so maybe?), and I would probably also need someone who has that mode working on a licensed device to copy some data from in order to test it.


I must admit I find the above discussion about Dolby Vision quite interesting. However, I fail to see how it relates to the CPM build.

Am I missing something, or should this be split off into a different thread?

Currently, if a file has both DV and HDR10+, the choice is to prioritize the native DV over the HDR10+ to DV conversion. The thinking is that the native DV will always be better than the conversion.

However, in some edge cases the source content has static DV while the HDR10+ layer (presumably) is not static. In these cases one would prioritize the HDR10+ conversion over the native static DV; even regular HDR is better than static DV.

Is there any interest in adding an override toggle, “Prioritize HDR10+ Conversion”, to address this edge case? Basically, if the file has both DV and HDR10+, it would use the HDR10+ → DV conversion instead. The HDR10+ hybrid of Sicario is one such example: it has static DV and HDR10+, so the converted DV should be superior to the native, but static, DV.


Speaking of toggles, what I really miss is a toggle for the overall conversion. Sometimes conversion to DV is really great, but often it is quite meh, if not outright bad, more often than I’d like.
Being able to switch it off “live” (even with a playback restart) would be a very handy feature. Having to go back to settings to toggle the DV conversion leaves me with the strange feeling that this awesome feature is a bit of a pain to use, and that I would be better off without it altogether. Quite sad, to be honest.


This has been requested in the past, and it is not straightforward to build:

Yeah, I read that. I’m not versed in skin modding, but can’t we just reach the VS10 toggle in settings from the player menu? I understand the change of video format after pressing the button wouldn’t be seamless; that’s not the point. I just want some kind of speedy shortcut, like the passthrough toggle. Even needing to restart the file wouldn’t be an issue.
Edit: trying to clarify.


You can use the JSON-RPC API to control any of the system settings (all setting names are here). With it you would need to make a Python script that toggles whatever settings you want. For example, SDR8 and SDR10 DV off:

curl -X POST -H "Content-Type: application/json" -d '[
  {"jsonrpc":"2.0","method":"Settings.SetSettingValue","params":{"setting":"coreelec.amlogic.dolbyvision.vs10.sdr8","value":5},"id":1},
  {"jsonrpc":"2.0","method":"Settings.SetSettingValue","params":{"setting":"coreelec.amlogic.dolbyvision.vs10.sdr10","value":5},"id":2}
]' http://127.0.0.1:8080/jsonrpc

SDR8 and SDR10 DV enabled:

curl -X POST -H "Content-Type: application/json" -d '[
  {"jsonrpc":"2.0","method":"Settings.SetSettingValue","params":{"setting":"coreelec.amlogic.dolbyvision.vs10.sdr8","value":0},"id":1},
  {"jsonrpc":"2.0","method":"Settings.SetSettingValue","params":{"setting":"coreelec.amlogic.dolbyvision.vs10.sdr10","value":0},"id":2}
]' http://127.0.0.1:8080/jsonrpc

The script would then be mapped to a remote button with keymaps (/storage/.kodi/userdata/keymaps). The downside is that this approach requires enabling the web UI under
Settings > Control > Allow remote control via HTTP
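For example, a keymap file saved in that folder along these lines (the file name, button choice and script path are just placeholders for illustration) would run a script from the red remote button:

<keymap>
  <global>
    <remote>
      <red>RunScript(/storage/toggle_vs10.py)</red>
    </remote>
  </global>
</keymap>

Kodi picks the keymap up on the next restart.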

Alternatively, maybe the Kodi Keymap Editor addon could be modified to access the VS10 settings so that they could be mapped to the remote directly from that UI.

Lastly, you could make a separate Kodi addon just for toggling whatever settings you want. Since addons can access executeJSONRPC, turning on the webserver wouldn’t be required either.
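For instance, a toggle script along these lines (a rough, untested sketch; the setting names and the 0/5 values are the ones from the curl examples above, and the script name and notification text are just illustrative) could be saved as /storage/toggle_vs10.py and called from a keymap, or wrapped into an addon:

# toggle_vs10.py - rough sketch: flip the SDR8/SDR10 VS10 settings between
# DV (0) and off (5) via Kodi's in-process JSON-RPC, then show a notification.
import json
import xbmc
import xbmcgui

SETTINGS = (
    "coreelec.amlogic.dolbyvision.vs10.sdr8",
    "coreelec.amlogic.dolbyvision.vs10.sdr10",
)

def rpc(method, params):
    raw = xbmc.executeJSONRPC(json.dumps(
        {"jsonrpc": "2.0", "method": method, "params": params, "id": 1}))
    return json.loads(raw).get("result", {})

# Read the current state from the first setting and flip both together.
current = rpc("Settings.GetSettingValue", {"setting": SETTINGS[0]}).get("value")
new_value = 5 if current == 0 else 0
for name in SETTINGS:
    rpc("Settings.SetSettingValue", {"setting": name, "value": new_value})

xbmcgui.Dialog().notification(
    "VS10", "DV conversion " + ("off" if new_value == 5 else "on"))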


Whoa, thank you so much @YadaYada for your hints. Can’t wait to work on it during my next holidays!

As I try to implement this for everyone interested, I understand the addon would need to install a modified Keymap addon and then ask the user to set a keymap to call the RPC toggle function. Doesn’t that feel too complicated?
I’m trying to figure out a simpler way to achieve it.