EDID Override - Injecting a Dolby VSVDB Block

This is my HDFury string: EB0146D0004704005798A953 … can I use it?

The macros are enabled.

Yep, the bit you want is:

4704005798A953

Type that into the Payload.
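
For anyone wondering where that comes from: the HDFury string is the whole CTA-861 data block, so the Dolby payload is just what’s left after the header byte, the extended tag and the 3-byte Dolby OUI. A rough sketch of the slicing (the function name and assert messages are mine, not from any tool):

```python
# Minimal sketch: pull the 7-byte Dolby payload out of a full CTA-861
# vendor-specific video data block as exported by an HDFury device.
# Assumes the string is the whole block: header byte, extended tag,
# 3-byte OUI, then the payload.

DOLBY_OUI = bytes.fromhex("46D000")  # OUI 00-D0-46 stored little-endian

def vsvdb_payload(block_hex: str) -> str:
    block = bytes.fromhex(block_hex)
    tag, length = block[0] >> 5, block[0] & 0x1F
    # Tag 7 = "use extended tag"; extended tag 0x01 = vendor-specific video
    assert tag == 7 and block[1] == 0x01, "not an extended vendor-specific video block"
    assert block[2:5] == DOLBY_OUI, "OUI is not Dolby"
    payload = block[5:2 + length]  # length counts the bytes after the header byte
    return payload.hex().upper()

print(vsvdb_payload("EB0146D0004704005798A953"))  # -> 4704005798A953
```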

I’m a bit puzzled by the HDR metadata override. Isn’t it supposed to be as neutral as possible? I understand that the colour space should be specified (and that it should match the one in the DV VSVDB, preferably also the renderer’s native one), but regarding max and min luminance: if they are set, the renderer will do another tone mapping (unless maxlum happens to match both the VSVDB and the renderer, which is nearly impossible), and doing too many digital operations on a signal is detrimental.

Take the example of the Formovie Theater. I recently understood that it seems to be set to a max luminance of 500 nits to manage tone mapping and physical peak brightness (at contrast 50), so I can’t set the VSVDB lower than ~280 nits max luminance if I want to keep the projector’s physical peak luminance. It’s a shame, because the real max luminance of the projector with my screen is more like 150 nits, but then I would need to raise the contrast to an impossible value of 166… I could fix that by setting the HDR10 metadata to 280 nits, which would then remap to 500 and fix my peak-luminance issue, but it’s completely absurd. So I see no case for setting these max/min luminance HDR10 metadata to anything other than neutral (0/0), but maybe I’m wrong.

cc @DMDreview

Not sure of the context here - are you referring to something on the spreadsheet for VSVDB calculation? (I would guess not.) Maybe it’s connected to the bad metadata for HDR10 content you mentioned before?

As you said:

I assumed you wanted to add settings for this HDR10 source metadata override in Player Led (HDR) mode, like it’s possible when using HDFury in forced LLDV mode (output as HDR). The above is what I think about having maxlum/minlum/MaxCLL/MaxFALL HDR10 metadata settings in your build: I think it could be detrimental, as it would lead the receiver (TV/projector) to do an additional tone mapping on top of the one already done by the VS10 engine. Or maybe I’m wrong?

… Mmmm, sorry, I misunderstood what you said. I thought you wanted to add settings to modify HDR10 metadata, but in fact you just added a tab where some VSVDB values are already set and appropriate for Player Led (HDR) mode. OK, sorry for the mistake.

Yeah, no worries - always great to see people engaging and the community moving forward, all input is good in my book. I was just wondering how it was connected :slight_smile:

What I plan to do is allow easier VSVDB management directly in the UI with:

  • VSVDB Colour Space selection
  • VSVDB Max Lum selection
  • VSVDB Min Lum selection

And create the appropriate VSVDB payload from these - the rest of the payload can be fixed, and you can still always put in a custom payload for other reasons as well.
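
For the curious, the part that needs actual maths is turning nits into PQ code values - a minimal sketch of the SMPTE ST 2084 inverse EOTF below. The exact bit packing of those values into the 7-byte Dolby payload follows Dolby’s spec (as in the VSVDB calculation spreadsheet) and is not reproduced here:

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) inverse EOTF that a VSVDB
# generator needs: luminance in cd/m^2 -> normalised PQ value (0..1),
# then quantised to a 12-bit code value.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def nits_to_pq(nits: float) -> float:
    y = max(nits, 0.0) / 10000.0          # PQ is defined up to 10000 cd/m^2
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

def pq12(nits: float) -> int:
    return round(nits_to_pq(nits) * 4095)  # 12-bit code value

print(pq12(0.005), pq12(220), pq12(1000))  # example min / max luminance values
```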


Looking longer term at the “bad HDR10 metadata”: would it be appropriate to simply change the static metadata mastering display max from 4000 to 1000 when MaxCLL is below 1000, if that mastering display max headroom is being taken into consideration and leading to a dark image?

https://professionalsupport.dolby.com/s/article/Calculation-of-MaxFALL-and-MaxCLL-metadata?language=en_US
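
To make the idea concrete, the rule I have in mind is as trivial as this (function name and thresholds are illustrative, not an actual implementation):

```python
# Sketch of the rule floated above: if the mastering display max is
# 4000 nits but MaxCLL says the content never exceeds 1000 nits, shrink
# the advertised mastering max so the tone mapper stops reserving
# headroom that is never used. Purely illustrative logic.

def adjusted_mdl_max(mdl_max: int, max_cll: int) -> int:
    if mdl_max >= 4000 and 0 < max_cll <= 1000:
        return 1000
    return mdl_max

print(adjusted_mdl_max(4000, 850))   # -> 1000
print(adjusted_mdl_max(4000, 2100))  # -> 4000 (content really uses the range)
```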

If your projector has a certain brightness, for example 150 nits, then you have to tone map to that value. If the player tone maps to 200 nits, the projector will do additional tone mapping, or just clip values above 150.
If you tell the player to tone map to 120 nits, i.e. below the capabilities of your projector, there will no longer be any secondary tone mapping.
I think the optimal approach is to take the same video, preferably with grayscale and colours, and adjust the player first at 120 nits, then at 150 nits, then at 200 nits, and see which variant looks better. Keep in mind that, given the very low brightness of the projector, it makes no sense to preserve highlights above some value while killing the brightness of the mid-tones. In your case it is desirable to keep the range up to roughly 80-100 nits untouched in brightness, squeeze the rest of the range, and let the brightest details clip. At least that’s what I would do, because the most important information is in the details up to 100 nits, even in HDR. Although if the movie is mastered at 4000 nits there are important details up to 200 nits, which is beyond the capabilities of your projector - in that case brightness will drop even in important parts of the image.
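
To illustrate, a rough sketch of such a curve - the knee/peak/ceiling numbers here are only examples, not recommendations:

```python
# Sketch of the strategy described above: leave everything below a knee
# (~100 nits) untouched, compress the range between the knee and a chosen
# source ceiling into the remaining display headroom, and clip the rest.

def tone_map(nits: float, knee: float = 100.0,
             display_peak: float = 150.0, source_max: float = 1000.0) -> float:
    if nits <= knee:
        return nits                      # mid-tones pass through unchanged
    if nits >= source_max:
        return display_peak              # brightest details clip at the peak
    # linear squeeze of [knee, source_max] into [knee, display_peak]
    t = (nits - knee) / (source_max - knee)
    return knee + t * (display_peak - knee)

for v in (50, 100, 200, 500, 1000, 4000):
    print(v, "->", round(tone_map(v), 1))
```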

@cpm thanks, seems the issue is gone now.

I completely agree with the first part of your post, and theoretically, given my setup (relatively small screen), I calculated a real max brightness of 220 nits. So in theory I would set my VSVDB accordingly. But the issue is that my projector is set to reach its peak luminance when the input signal reaches 500 nits with the contrast slider at ~50 (or when the input signal is 250 nits with the contrast slider at 100). Consequently, I have two solutions:

  • 1/ Set the VSVDB to a minimum of 280 nits max luminance (maybe even 250, but not lower) and set my contrast to around 85.
  • 2/ Set the VSVDB to the real value (220 nits), but then I would also need to set the HDR10 metadata override (in the HDFury splitter) to maxlum 220 (or rather MaxCLL 220, since it seems to react to that parameter in priority) so that the signal is mapped back to the projector’s “capabilities”, which are wrongly set at 500. That way I could reach peak luminance.
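
For what it’s worth, those numbers are consistent with a simple inverse relation (my own back-of-the-envelope estimate, not something from the projector docs): the input level that hits peak light output is roughly 25000 / contrast, since 500 nits at contrast 50 and 250 nits at contrast 100 both fit. Solving for 150 nits gives 25000 / 150 ≈ 167 (the impossible value mentioned above), 25000 / 280 ≈ 89 is close to the contrast ~85 of solution 1/, and a real 220 nit target would need 25000 / 220 ≈ 114, beyond the slider.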

In the second part of your post, I understand you describe the consequences of a wrongly set tone mapping.
Indeed, with solution 1/ the image can seem a little dark, but it’s still all right to me.
With solution 2/ it would be better in terms of mid-tones - slightly more contrast and brightness there - but maybe at the expense of some banding due to the second tone mapping.

Yep, there are roughly two possible situations regarding “bad metadata” that lead to a dark image:

  • MD maxlum set too high and no MaxCLL data.
  • MD maxlum set too high and MaxCLL present but, for some weird reason, too high as well.

A workaround could be to force a computed MaxCLL value (rough sketch after this list):

  • based on a full movie scan (as said previously, with the help of the madVR tools)
  • based on some kind of frame-average light level for signals < 200 nits over at least, say, 10% of the movie (so the scan is not too long); MaxCLL would then be increased based on that result (with some kind of coefficient). It won’t be very accurate, but still better than a dark image.
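
As a starting point, the scan result could be reduced to MaxCLL/MaxFALL like this, following the definitions from the Dolby article linked earlier (MaxCLL = brightest pixel in any frame, MaxFALL = highest per-frame average). The `frame_stats` format is an assumption about whatever scanner is used:

```python
# Sketch of recomputing MaxCLL/MaxFALL from a per-frame scan.

def max_cll_fall(frame_stats):
    """frame_stats: iterable of (frame_max_nits, frame_avg_nits) tuples."""
    max_cll = max_fall = 0.0
    for frame_max, frame_avg in frame_stats:
        max_cll = max(max_cll, frame_max)   # brightest single pixel seen
        max_fall = max(max_fall, frame_avg) # highest frame-average light level
    return round(max_cll), round(max_fall)

print(max_cll_fall([(732.0, 118.5), (950.2, 160.1), (410.7, 90.3)]))
# -> (950, 160)
```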

But first I need to hack those values into a bad-HDR10-metadata MKV file that I have, and I need time to do so. This is to confirm that VS10 takes those values into account (and in which priority) when it does the HDR10-to-DV mapping.

What do you think?

I would be glad to have the madVR tool equivalent (MadVRhdrMeasure) but built for Linux… I don’t much like Windows. I haven’t found anything like that so far.

My man, this new calc works. Super dope, bro. You’re doing great work, and you’re one of the best reasons to use CoreELEC - the mapping is super helpful, not only for the S95B TV I use but mostly for my projector.

I have a request: what about presets for different displays? I move the player between places and wanted automatic switching, or at least presets for each of my displays, for example: Epson TW9400 (Player Led (HDR)), LG C2 (VS10 all), Samsung S95B (conversion from HDR10+ to DV - saw the other thread you’re working on it).

So if I get it correctly, you’re looking to recalculate the HDR10 static metadata to better fit the actual content in the case of these bad-metadata files.

Noting in this case that, as we no longer have access to the source master, we need to use the compressed video data as the basis for the calculation - not sure if that causes any material difference.

Then, as you say, we need to test the theory that those parameters are actively used by the DV Engine in the tone mapping.

I think a couple of parameters could be created on the kernel side to allow the override of the static metadata - i.e. in the code segment posted previously.


Thoughts/Speculation:

Not looked closely, but it sounds like this may fit with the tool set that @R3S3T_9999 puts together? E.g. there is already a way to extract the more “correct” HDR10 static metadata.

Maybe rather than everyone re-calculating their own values, it would be better to have a central database that could be consulted for well-known titles/content. If there is enough interest, the CE team might possibly be able to host it - but then again, maybe it’s too much of a niche within an already niche subject.

If we’re lucky there may be a way to get clever with that - for example, when someone watches the content a running max is kept (if the overhead is not too high) and published to the central DB at the end, with a consensus of the results kept for all to use.
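
Pure speculation, but the playback side could be as cheap as a running max per frame, with the consensus done server-side. Everything here - class names, the submission format, the median rule - is hypothetical:

```python
# Speculative sketch of the "running max" idea: keep per-frame maxima
# while the content plays, then submit the result to a (hypothetical)
# shared database and take a consensus of everyone's submissions.

from statistics import median

class RunningMeasure:
    def __init__(self):
        self.max_cll = 0.0
        self.max_fall = 0.0

    def on_frame(self, frame_max: float, frame_avg: float) -> None:
        # cheap per-frame update, so overhead stays low during playback
        self.max_cll = max(self.max_cll, frame_max)
        self.max_fall = max(self.max_fall, frame_avg)

def consensus(submissions: list[float]) -> float:
    # median is robust against the odd bad scan or truncated playback
    return median(submissions)

print(consensus([950.0, 948.0, 1203.0, 951.0]))  # -> 950.5
```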

500 nits is apparently the projector’s internal tone-mapping setting; when sending a CEAM signal after VS10, the MDL and other data are zero:
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Mastering display color primaries : R: x=0.000000 y=0.000000, G: x=0.000000 y=0.000000, B: x=0.000000 y=0.000000, White point: x=0.000000 y=0.000000
Mastering display luminance : min: 0.0000 cd/m2, max: 0 cd/m2
MaxFALL and MaxCLL are not sent.
This means the projector should accept the data as-is, so it probably should not tone map a signal of up to 200 nits and should output those 200 nits as 200 nits, i.e. based on its maximum brightness.
So you need to run a test with multiple brightnesses to make sure the projector receives them correctly and does not try to tone map again.

Feature Complete - Update 15

  • Dolby VSVDB Payload [for Player Led (HDR)]

    • New UI elements to create payload directly.
    • Usage:
      • Choose Type [Player Led(HDR)]
      • Enable Dolby VSVDB
        • Set Colour space
        • Set Minimum Luminance
        • Set Maximum Luminance
      • Payload string automatically generated for the above values.
      • Note: these steps will overwrite any prior payload - if you want to keep a custom (not generated) payload, don’t touch the Colour Space, Min Lum, or Max Lum.
    • Caveat: Designed for Player Led (HDR), other usage may give strange results.
  • Fix for DV mode switching not working on some setups when content allowed to run to finish and exit.


When testing, use the below tar to update a fresh install of CE-21 ng.

Update tar


If upgrading from an older version and you experience issues with the wrong playback mode, e.g. DV playing in HDR etc., then try:

  • Set - For Dolby Vision [SDR]
    • Play some DV content
  • Set - For Dolby Vision [off]
    • Play some DV content

For other settings-related issues, toggle the setting in question; if all else fails, you can try a clean install.

How do I find out the min/max luminance of a display with unknown values - pull down the EDID and read it from that?

I am not aware of any elements in the EDID outside the Dolby VSVDB that contain the min and max; if we can find one, then we can automate more.

You can try looking up the specs for the display - they may list it, maybe someone already tested the TV/panel for a review, or just ballpark guess and tune from there.

That’s what I meant. I exported the EDID/VSVDB hex values and plugged them into the edid-decode tool to find the min/max.

Could you follow the same method as the edid-decode tool (linked below) and find the min/max from that? Baking that functionality into CE would let you feed the min/max fields directly.

https://git.linuxtv.org/edid-decode.git/
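
If it helps, a rough sketch of wrapping edid-decode from a script. Note I’m assuming the “Target Min/Max PQ” labels in its Dolby VSVDB output and a typical DRM sysfs path, so check against your actual output first:

```python
# Sketch: run edid-decode on an EDID blob and fish out the Dolby min/max.
# Assumes edid-decode is installed and that its Dolby VSVDB output has
# lines mentioning "Target Min PQ" / "Target Max PQ" with a cd/m^2 value.

import re
import subprocess

def dolby_lum_from_edid(edid_path: str) -> dict:
    out = subprocess.run(["edid-decode", edid_path],
                         capture_output=True, text=True, check=True).stdout
    lums = {}
    for line in out.splitlines():
        m = re.search(r"Target (Min|Max) PQ.*?([\d.]+)\s*cd/m", line)
        if m:
            lums[m.group(1).lower()] = float(m.group(2))
    return lums

# Example path for an HDMI output on Linux DRM; adjust for your setup.
print(dolby_lum_from_edid("/sys/class/drm/card0-HDMI-A-1/edid"))
```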

Maybe I am missing the point, but the only min and max you can get via the EDID are from the Dolby VSVDB - and if you have a Dolby VSVDB from the display, then it is already DV-capable and there is no need for Player Led (HDR).

Note: if you do choose Player Led (HDR) with a display that is DV-capable, then the Dolby VSVDB from the display will be used - no need to inject one.

Was running through the process on my DV TV before helping a friend install.

If it’s an HDR-only display, then there’s no VSVDB info? I didn’t see any info on min/max luminance for his older Sony TV (X75H) and wanted to pull it from the EDID/VSVDB instead.