EDID Override - Injecting a Dolby VSVDB Block

Hi @cpm,
I got this sample somewhere on the internet forums. It gives me weird colours across the whole menu when HDR10 to Dolby Vision via VS10 is enabled.
The colours go back to normal when I open a normal Dolby Vision file.
Is it a wrongly encoded file?
Not a big deal, I'm just curious, because all the other real movies look good with those settings…
thx

I downloaded it, and I get a “purple and green” video with VS10: HDR → DV.

I saw this before with HDR10+ content, where I needed to strip the HDR10+ per-frame side metadata from the stream before the AMLogic would process it correctly through VS10 - basically making it HDR10-only frames when processing in VS10.

Not sure what is wrong with that sample, but the DV engine does not like it, and when attempting to go from HDR → DV it throws an error (hence it gets stuck in the purple and green).

May 28 00:34:44 CoreELEC kernel: DOLBY: update src fmt: HDR => DOVI, signal_type 0x35091009, src fmt 0
May 28 00:34:44 CoreELEC kernel: DOLBY: reset: src:2-0, dst:0-0, frame_count:120, flags:0x400d
May 28 00:34:44 CoreELEC kernel: DOLBY: video 0:3840x2160 setting 0->0(T:1-700): pri_mode=0, no_el=1, md=174, frame:120
May 28 00:34:44 CoreELEC kernel: DOLBY ERROR: control_path(0, 0) failed -1

Off the top of my head: maybe it thinks it is SDR (src:2-0) at the Dolby processing point, as 2 is SDR from memory (need to check the code to be sure), which conflicts with other code that thinks it is HDR (HDR => DOVI).
Overall it does look like something is different with that file.

It is a little worrying that the menus do not recover to the correct colours, but if it's not seen with other content then I won't think about it for now.

Edit:
FYI the original Blu-ray for that film is DV, so I would source that if you actually want to watch it.

1 Like

If I may ask a related question about Player-Led HDR mode, to anyone who can answer:
is the TV's HDR10 tone mapping applied (in addition to the PQ function, which is indeed applied) in this mode?
And if so, why, since tone mapping is already done on the player side (per my understanding)?
If HDR10 tone mapping is applied on the TV side, how can we be sure it's not detrimental? (Maybe I'm totally mistaken in this rough analysis, so please enlighten me :wink: )

@frodo19

I think I see the problem: if you open the file in MakeMKV it finds DV data and says it is DV Profile 7 FEL, so it looks like the file-level metadata is wrong (it is not HDR10 only), and that confuses the processing.

Re-muxed it in MakeMKV and it now plays correctly as a proper DV file, not HDR10 only - no need for VS10.

Tone mapping is done according to the Dolby VSVDB you set, as per normal LLDV; the only real difference here is that it flags the output as HDR10 so the TV switches over.

I guess it depends on whether you have any dynamic tone mapping enabled on your TV or not; the same would be true if you applied dynamic tone mapping to DV, I would presume.

Ok, thanks, so it is a bad file.

If VS10 conversion mode is enabled in the player and the output goes to an HDR TV, i.e. DV or HDR is output as HDR using VS10, the image will already be adjusted to the capabilities of your TV (if its peak brightness is correctly specified in the EDID data in the player).
If that brightness exceeds the capabilities of your TV, the TV will apply secondary tone mapping, or simply clip the brightness levels that are beyond its capabilities.
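
As a toy illustration of those two behaviours (a display that simply clips past its peak vs. one that rolls highlights off towards it), here is a minimal Python sketch; the peak, knee point and linear compression are arbitrary choices of mine, not any particular TV's algorithm:

```python
# Toy contrast of the two behaviours described above: hard clipping at the
# display peak vs. a generic highlight roll-off. Values are illustrative only.

PEAK = 700.0   # hypothetical display peak in nits
KNEE = 0.75    # start rolling off at 75% of the peak

def hard_clip(nits: float) -> float:
    return min(nits, PEAK)

def roll_off(nits: float, source_peak: float = 4000.0) -> float:
    knee_point = KNEE * PEAK
    if nits <= knee_point:
        return nits
    # Compress everything between the knee and the source peak into the remaining headroom.
    t = (nits - knee_point) / (source_peak - knee_point)
    return knee_point + t * (PEAK - knee_point)

for n in (100, 500, 1000, 4000):
    print(n, hard_clip(n), round(roll_off(n), 1))
```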

If the movie is mastered at PQ1000 nits and the TV is an OLED with 500-700 nits, there is no need to do tone mapping with VS10 at all. At PQ1000 the video usually does not go beyond 100 nits, and only small peaks can reach up to 1000 nits, which will not be noticeable to you compared to the TV's own tone mapping.
But for movies mastered at PQ4000, or some poorly done PQ1000 releases with real peaks above 2-4k nits, turning on VS10 in the player and converting the original HDR or DV to HDR with tone mapping can help show more image detail in highlights, especially if the TV doesn't apply a roll-off for such videos. Or, if you like to watch HDR with the peak brightness set to something other than max, then VS10 will help keep the detail in those peaks and you won't just see a white spot where you should be seeing picture elements.
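
For anyone who wants to sanity-check what those nit figures correspond to in the PQ signal, here is a minimal sketch of the SMPTE ST 2084 (PQ) EOTF in Python. The constants are the standard ST 2084 ones; the script itself is only illustrative and is not anything the player actually runs:

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF: normalised code value -> nits.
# Constants are the standard ST 2084 values; the script is only for illustration.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(e: float) -> float:
    """Map a normalised PQ signal value (0.0-1.0) to absolute luminance in cd/m2."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

if __name__ == "__main__":
    # A PQ signal of about 0.75 is roughly the 1000-nit level referred to above,
    # while full-scale 1.0 is the 10000-nit ceiling of the PQ container.
    for e in (0.5, 0.75, 1.0):
        print(f"PQ {e:.2f} -> {pq_to_nits(e):.0f} nits")
```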

But you should be aware that if the EDID you load into the player has a brightness higher than 1000 nits, CEAM (am6b+ CE) will overestimate the brightness, because in Player-Led mode all players with only CM2.9 will not work correctly with DV if the EDID brightness is higher than the brightness specified for the movie.
So if your TV reports 1100 nits or higher in its EDID, you should not turn on VS10 when you watch movies mastered at PQ1000; just watch the HDR version of the video so that it is output without tone mapping.
For movies mastered at PQ4000 this problem won't occur, unless your TV has a brightness above 4000 nits, which is still rare.
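
To make that rule of thumb concrete, here is a small hypothetical helper; the function name, parameters and logic are just my own way of encoding the advice above, not anything CoreELEC exposes:

```python
# Hypothetical helper illustrating the rule of thumb above.
# Neither the function nor the thresholds come from CoreELEC; they just encode
# "skip VS10 when the EDID peak exceeds the content's mastering peak on a CM2.9 device".

def should_use_vs10(edid_peak_nits: int, content_master_nits: int,
                    display_peak_nits: int, cm29_only: bool = True) -> bool:
    """Return True if converting HDR/DV to HDR via VS10 is likely worthwhile."""
    if cm29_only and edid_peak_nits > content_master_nits:
        # CM2.9 players reportedly misjudge brightness in this case,
        # so plain HDR passthrough is the safer choice.
        return False
    # VS10 tone mapping mainly helps when the content is mastered well
    # above what the display can actually show.
    return content_master_nits > display_peak_nits

# Examples matching the post: an 1100-nit EDID with a PQ1000 title -> leave VS10 off,
# but a PQ4000 title on a 700-nit panel with a 1000-nit EDID -> VS10 can help.
print(should_use_vs10(1100, 1000, 700))   # False
print(should_use_vs10(1000, 4000, 700))   # True
```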

2 Likes

Are the nits from the custom EDID mapped in Player-Led? Or only TV-Led?

Thanks a lot for your comprehensive answer.

Actually I'm using a projector on a 92" screen and it's around 200 nits, so no worries about the CM2.9 bug you described.
I think that in pure HDR10 my projector has weak tone mapping, or maybe no tone mapping at all, because I need to constantly adjust the contrast to reach the maximum before whites are crushed (sometimes it's 50, and sometimes 70+).

The interesting thing is that with proper LLDV settings loaded (initially with the HDFury splitter) I can leave the contrast at around 83 and it's good for all DV sources (plus HDR10 sources thanks to VS10). VS10 really does help with some HDR10 movies, and as a bonus I no longer need to worry about tweaking the contrast setting to reach peak luminance while keeping detail in bright spots.

So I'm really glad that I now have a player that does it all well and even compensates for the drawbacks of my projector, thanks @cpm.
The results are really amazing.

Though I need to check my “little or no tone mapping” hypothesis with the Spears & Munsil benchmark disc.

1 Like

I think there is a misunderstanding; my question was not very clear, but DMDreview kind of answered it. I was talking about the TV-managed HDR10 tone mapping. The difference I was trying to spot is:

  • real LLDV mode → the TV knows it's DV, so it won't apply additional tone mapping
  • LLDV trick mode (output as HDR) → the TV doesn't know and could potentially apply another round of tone mapping, but I guess no HDR10 metadata is present and it only does some tone mapping if the signal exceeds display capabilities (as DMDReview said)

If I'm not mistaken, the injected EDID (or VSVDB) data is only used (“mapped”) in Player-Led mode (both in real LLDV and in the HDR “trick” method).

In TV-Led mode, the TV already has all the data it needs internally and has no reason to read its own capabilities from a “foreign” EDID.

I guess a simple test would be to use a VSVDB with much higher nits than the display (there is a sketch after this list for checking which VSVDB the player actually sees).

  • Player Led (HDR): if it looks reasonable, then the display is mapping again to keep within its limits.
  • Player Led (DV-LL): we are presuming it will look blown out, as no further adjustment is made.
  • Display Led (DV-Std): should be unaffected, as it's not relevant.
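
For reference, here is a rough sketch of how you could double-check which Dolby VSVDB the player actually sees, by scanning a saved binary EDID dump for the Dolby data block (IEEE OUI 00-D0-46) in the CTA-861 extension. The file handling and the reading of the version bits are my assumptions for illustration, not anything CoreELEC ships:

```python
# Rough sketch: find the Dolby VSVDB in a binary EDID dump and print its payload.
# Assumes a 256-byte EDID with one CTA-861 extension; the interpretation of the
# version bits is an assumption, so verify against a known EDID.
import sys

DOLBY_OUI = bytes([0x46, 0xD0, 0x00])  # 00-D0-46, stored LSB first in the block

def find_dolby_vsvdb(edid: bytes) -> bytes | None:
    ext = edid[128:256]                 # first extension block
    if len(ext) < 4 or ext[0] != 0x02:  # 0x02 = CTA-861 extension tag
        return None
    dtd_offset = ext[2]                 # data block collection ends here
    i = 4
    while i < dtd_offset:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        payload = ext[i + 1 : i + 1 + length]
        # Tag 7 = "use extended tag"; extended tag 0x01 = Vendor-Specific Video Data Block
        if tag == 7 and length >= 4 and payload[0] == 0x01 and payload[1:4] == DOLBY_OUI:
            return payload[4:]          # Dolby-specific bytes after the OUI
        i += 1 + length
    return None

if __name__ == "__main__":
    edid = open(sys.argv[1], "rb").read()   # e.g. a dump saved with your EDID tool
    block = find_dolby_vsvdb(edid)
    if block is None:
        print("No Dolby VSVDB found")
    else:
        # To my understanding the version sits in the top 3 bits of the first byte (0/1/2).
        print("VSVDB version (assumed layout):", block[0] >> 5)
        print("Raw payload:", block.hex())
```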
1 Like

only player led

Don't focus on the Spears disc; its content is recorded very bright. In reality there are literally only a couple of movies like that, with 10k-nit peaks and average brightness above 1000-2000 nits.

@DMDReview

After finding out the Use Display Luminance setting is only effective when the Dolby VSVDB is V1 (and I think most would have V2 displays), I am now wondering: is it actually relevant/helpful?

Unfortunately the values it changes go into proprietary code, so I cannot ultimately tell what it is doing from the code side.

During your testing, did you see any difference with this setting on or off?

I am thinking of removing the UI setting and just having it always on in code to avoid user confusion. It has some small real-world value from the logging perspective, to check the VSVDB has actually changed for my V1 TV, and I recall that in graphics mode it did have some effect, but we always want video mode.

:warning: Experimental Build :warning:

Chucking this one out there for feedback; it has known issues.

  • Add DV Mode setting.

    • On → DV on full time (not from boot, but from kodi start)

      • Always on, except when VS10 has been set to off; then content plays as per its type, without DV.
    • On Demand → DV on when playing DV content or using VS10.

    • Off → DV completely off.

Issue 1: when switching from Off to another mode (On or On Demand), you will need to restart kodi to see the further options (not sure why the options are not coming up yet, still checking that). Once restarted you will be fine, unless you set it to Off again and reboot/restart.

Issue 2: coming out of suspend, the dreaded wrong colours! Not sure I will be able to fix that one; it may need to wait for boot-enabled DV.

Issue 3 (or maybe a feature!): menus may be too dark using DV Mode On, particularly if using Player Led (HDR). I may look to add a slider later to allow adjustment of the menu luminance in DV modes.

Issue 4: with VS10 for HDR HLG off, playing HLG content comes up as SDR; needs further checking.

Be on the lookout for other defects/issues.

This is more for fun, to check whether it helps with not switching modes in certain setups (I cannot really tell from my TV, so I'm soliciting some feedback).

3 Likes

Checked this setting; it really has no effect.
But there is a problem in the May 27 version. Conversion of original HDR via VS10 (DV) to HDR works perfectly: the output is HDR with tone mapping and PQ0.
But real original DV is output with tone mapping and the HDR flag correctly, yet the PQ value is not 0 but the original value of the DV version, e.g. PQ4000.
On the previous version, where the full string was inserted, it was PQ0.

It was like this
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Mastering display color primaries : R: x=0.000000 y=0.000000, G: x=0.000000 y=0.000000, B: x=0.000000 y=0.000000, White point: x=0.000000 y=0.000000
Mastering display luminance : min: 0.0000 cd/m2, max: 0 cd/m2

And this is how it looks now, for a DV source video:
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Mastering display color primaries : BT.2020
Mastering display luminance : min: 0.0001 cd/m2, max: 4000 cd/m2
Maximum Content Light Level : 4000 cd/m2
Maximum Frame-Average Light Level : 400 cd/m2

This is how the signal is output to the TV in true HDR mode:
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Mastering display color primaries : BT.2020
Mastering display luminance : min: 0.0001 cd/m2, max: 4000 cd/m2
Maximum Content Light Level : 4000 cd/m2
Maximum Frame-Average Light Level : 400 cd/m2

Weird, I don't think I reproduced this issue in Player-Led mode (output as HDR) for either:

  • real DV source
  • HDR10 source converted to DV (VS10)

(I remember because I looked for these parameters on the HDFury splitter overlay every time I played a video, and it was 0 everywhere :smiley: ).

But I'll check again in a few hours to see whether I can reproduce the same issue as you or not.

Edit: indeed I reproduced a similar issue, see my later comment.

The only difference from your setup is that I don't inject the VSVDB via CoreELEC, but through the EDID from the HDFury splitter.

Giving this a test, looks good!

Something I noticed is that when DV is enabled in the GUI, it seems to output in 12-bit (even in TV-Led mode). Maybe this explains the wrong colours under some circumstances?

Big thanks for always-on LLDV, cpm :raised_hands::raised_hands:

1 Like

“But you should be aware that if the EDID you load into the player has a brightness higher than 1000 nits, CEAM (am6b+ CE) will overestimate the brightness, because in Player-Led mode all players with only CM2.9 will not work correctly with DV if the EDID brightness is higher than the brightness specified for the movie.”

Interesting. @DMDreview It seems that I can't accomplish what I had in mind: have my Ugoos process FEL and send it with zero tone mapping via Player-Led to my capture card, where I would handle the tone mapping with MadVR/Envy. My plan was to use an EDID of 10,000 nits to absolutely avoid any tone mapping at the player level… Is there any other way to accomplish what I want: FEL properly applied by the Ugoos, but tone mapping handled by another box?