EDID Override - Injecting a Dolby VSVDB Block

No FEL support. I think most users will stick with the S922X-J.

The hybrid with the RPU from D+ is better in this regard. It is not the exact same frame, but you can see it is much less saturated.

VS12 on the S928X-J is closer to native DV. Shame there is no FEL on that SoC, but it may be worth checking out just for the VS12 upscaling of SDR/HDR content. It looks close enough to native DV.

Not good - the blacks are completely crushed.

I think there is no VS12 here. Apparently, when the original HDR video is processed, the compression is non-linear; the peak brightness is very high, but the brightness distribution on faces is close enough that the engine cannot figure out how to compress it. Maybe in the new library they decided to fix this, but it leads to a big drop in brightness: in some cases the overall brightness drops by a factor of 2, so it is not just the peaks that are compressed but the whole brightness range.
In other words, this solution is not really correct; it differs from the solution used by CM4.0 players, and the Dune 8K on the S928X is not such a player - it is CM2.9.

I used an EDID with 600 nits.
In any case, only CM4.0 can give more correct range compression, thanks to its three-section tone curve rather than a single range-compression curve.
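As a rough illustration of why the curve shape matters, here is a toy sketch (my own simplification, not Dolby's actual math) of a single global compression curve versus a three-segment one, mapping a 4000-nit source onto a 600-nit display:

# Toy comparison: single-curve vs. three-segment range compression.
# Purely illustrative - not the actual CM2.9/CM4.0 math.

SRC_PEAK = 4000.0   # mastering peak, nits
DST_PEAK = 600.0    # display peak from the EDID, nits

def single_curve(nits):
    # One global scale: mid-tones and faces dim along with the highlights.
    return nits * (DST_PEAK / SRC_PEAK)

def three_segment(nits):
    # Shadows pass through 1:1, mid-tones get a mild roll-off, and only
    # the highlights are squeezed into whatever output range is left.
    if nits <= 100.0:
        return nits
    if nits <= 400.0:
        return 100.0 + (nits - 100.0) * 0.9
    knee_out = 100.0 + 300.0 * 0.9   # output level at the second knee (370)
    return knee_out + (nits - 400.0) / (SRC_PEAK - 400.0) * (DST_PEAK - knee_out)

for n in (50, 200, 1000, 4000):
    print(f"{n:>5} nits -> single: {single_curve(n):6.1f}   three-segment: {three_segment(n):6.1f}")

With a single curve everything dims together; with three segments the shadows and mid-tones survive and only the highlights are squeezed.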

Harry Potter original

Dune 8k

CEAM

Very insightful as always, thanks.

FYI, the VS12 moniker has cropped up in the names for dovi.ko, so it exists in some form, if only as a filename! What it means, though, is unknown - from YadaYada:

So we are left guessing: vs10 and vs12 look like internal versioning/designations of some kind, probably related to versions of the Dolby software development kit used for SoC implementations.

My speculation would be that vs10 is CM2.9-era and vs12 is CM4.0-era.

The biggest question is whether the AM6 will still have FEL support in the 5.15 kernel.

I think vs10 is 10-bit processing and vs12 uses 12 bits - more accurate calculations for higher brightness. That makes more sense based on the name.

I did some additional research.
Look: it is the same video, with the same encoded HDR brightness levels; the only difference is the MaxCLL, MaxFALL and MDL metadata written into each file.
If MDL=4000 with MaxCLL=1000 and MaxFALL=400 (the standard values in DaVinci), there is no clipping, and the full EDID brightness range of 600 nits is used.
In the second case, MDL=1000 with MaxCLL=1000 and MaxFALL=400, there is clipping.

In the third case, MDL=4000 with MaxCLL=3187 and MaxFALL=230 (the calculated real values contained in the video), there is no clipping, but the brightness after tone mapping does not reach the maximum, i.e. VS10 leaves a margin for a possible brightness of 4000 nits and above.
And in the fourth case, MDL=1000 with MaxCLL=3187 and MaxFALL=230, there is clipping.
Hence, when converting, VS10 does not just use the display's EDID; it also uses MDL and the other two parameters. If MDL 1000 is signalled, VS10 will not try to preserve the whole brightness range and will clip instead, although the rest of the image will be brighter.
If the video has an MDL of 4000, VS10 will try to preserve a larger range, but MaxCLL and MaxFALL also have an effect.
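To make that concrete, here is a toy model of the behaviour I am describing (my guess at the logic from these four test files, not actual VS10 code): the tone curve's input range is anchored to the signalled MDL, so content above MDL clips, while a large MDL leaves headroom and dims the output:

# Toy model of the observed behaviour: the tone-mapping input range is
# anchored to the signalled MDL, not the real content peak. This is my
# guess at the logic from the four test files - not actual VS10 code.

DISPLAY_PEAK = 600.0  # nits, from the EDID

def vs10_like(pixel_nits, mdl):
    # Trust MDL as the source peak: anything above it clips, because
    # the curve never expects it; a large MDL leaves headroom instead.
    clipped = min(pixel_nits, mdl)           # values above MDL are lost
    return clipped * (DISPLAY_PEAK / mdl)    # naive single-curve scale

# Real content peak ~3187 nits (the measured MaxCLL):
for mdl in (1000.0, 4000.0):
    out = vs10_like(3187.0, mdl)
    note = "CLIPPED, but mid-tones brighter" if 3187.0 > mdl else "headroom kept, output dimmer"
    print(f"MDL={mdl:.0f}: 3187-nit highlight -> {out:.0f} nits ({note})")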

I did the same trick with the Harry Potter video: I re-encoded it in DaVinci with a different MDL and freshly calculated MaxCLL and MaxFALL values, and with MDL 4000, as in the original movie, the file started playing correctly on CEAM - even better than on the Dune 8K.
Even the variant without recalculation, using the DaVinci defaults (CLL 1000 and FALL 400), also works fine and the image is correct, without clipping.
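For anyone without Resolve, a similar metadata rewrite can be done at encode time with x265's HDR10 options; something along these lines (the master-display string uses chromaticity in 0.00002 units and luminance in 0.0001 cd/m2 units, so L(40000000,50) means MDL 4000 / 0.005 nits; the primaries shown are the usual P3 values, and max-cll=1000,400 matches the DaVinci defaults - adjust to your source):

ffmpeg -i input.mkv -c:a copy -c:v libx265 -x265-params "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:hdr10=1:master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(40000000,50):max-cll=1000,400" output.mkv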


It seems the real reason was on the surface: the files that clip are incorrectly identified by CEAM. For example, Harry Potter has MDL 4000, but in HDR mode CEAM identifies it as MDL 1000, hence the clipping. The second movie, Ocean's Twelve, also has MDL 4000 in the original, but CEAM identifies it as MDL 1000; since its real brightness does not go beyond 1000, it is output more or less correctly.
It also seems the player is not aware of MDL 10000 and does not know how to process it properly, and just cuts off everything above 600 nits (but that is just the Spears & Munsil test video).
Most likely the problem is caused by the ATEME library; the same defect is present on many old CM2.9 players - apparently an error occurs when reading it. The Zidoo Z9X Pro, by the way, unlike the Z9X, solved this problem.
Question: will it be possible to solve this problem on CEAM?

Original: CLIPPING

DaVinci default: NO CLIPPING

DaVinci with rescaled MaxCLL/MaxFALL: NO CLIPPING

Capture from the original Harry Potter playing in HDR mode: CLIPPING

I'm having a hard time following the explanations here… I admit.

Do you mean:

  • All HDR10 files with MDL 4000 are interpreted as MDL 1000 by CEAM VS10?
  • VS10 theoretically takes all 3 parameters (MDL, MaxCLL, MaxFALL) but only uses MDL???

Not all - only videos encoded with the ATEME library are recognized as MDL 1000 instead of MDL 4000.
No; as I understand it, it accepts all three, but if MDL is read incorrectly, the player takes the wrong MDL and does its tone mapping based on MDL as the most important parameter.

@cpm, is the option for a custom payload gone for LLDV?

No, still there.

Checked a Ugoos AM6B+ on Android with Kodi 21 Omega: a movie encoded with ATEME and MDL 4000 is detected correctly as MDL 4000.
So the problem seems to be in CE.
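For anyone who wants to double-check what a file actually carries, ffprobe can dump the mastering metadata of the first video frame, e.g.:

ffprobe -v quiet -select_streams v:0 -show_frames -read_intervals "%+#1" -show_entries frame=side_data_list -of json input.mkv

The side_data_list should include the mastering display min/max luminance and the content light level as signalled in the stream, which you can compare against what the player reports.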

OK, fixed by downgrading and then upgrading again. All good, thanks!

Had a look at the code, but could not even get it to debug the SEI parsing.
Maybe we need to wait for a new kernel and hope it is fixed there, or an experienced member of the CE team may have better luck.

As you said, the ATEME Titan file cannot be processed correctly regardless of the values, and then just gets the Amlogic defaults instead (BT.2020, 1000, 0.005):

CoreELEC kernel: DOLBY: HDR10: present 0, 0, 0, 0
CoreELEC kernel: DOLBY:         R = 8a48, 3908
CoreELEC kernel: DOLBY:         G = 2134, 9baa
CoreELEC kernel: DOLBY:         B = 1996, 08fc
CoreELEC kernel: DOLBY:         W = 3d13, 4042
CoreELEC kernel: DOLBY:         Max = 10000000
CoreELEC kernel: DOLBY:         Min = 50
CoreELEC kernel: DOLBY:         MCLL = 0
CoreELEC kernel: DOLBY:         MPALL = 0

A big hack would be to bypass Amlogic and pass the values in from Kodi-side parsing where the kernel cannot parse them correctly - but that is not a nice solution.

As you say, for non-ATEME Titan content it is working OK:

CoreELEC kernel: DOLBY: HDR10: present 1, 930, 0, 2
CoreELEC kernel: DOLBY:         R = 84d0, 3e80
CoreELEC kernel: DOLBY:         G = 33c2, 86c4
CoreELEC kernel: DOLBY:         B = 1d4c, 0bb8
CoreELEC kernel: DOLBY:         W = 3d13, 4042
CoreELEC kernel: DOLBY:         Max = 10000000
CoreELEC kernel: DOLBY:         Min = 1
CoreELEC kernel: DOLBY:         MCLL = 930
CoreELEC kernel: DOLBY:         MPALL = 197

Not sure about the default primaries for BT.2020, as they look different from those coming from a file as well.
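Those hex pairs decode cleanly if you assume the usual mastering-display SEI units (chromaticity in steps of 0.00002, luminance in steps of 0.0001 cd/m2); a quick sketch:

# Decode the DOLBY: kernel log values above, assuming mastering-display
# SEI units: chromaticity in 0.00002 steps, luminance in 0.0001 cd/m2.

def decode(name, x_hex, y_hex):
    print(f"{name}: x={int(x_hex, 16) * 0.00002:.4f}  y={int(y_hex, 16) * 0.00002:.4f}")

print("Amlogic defaults (first log):")
for n, x, y in [("R", "8a48", "3908"), ("G", "2134", "9baa"),
                ("B", "1996", "08fc"), ("W", "3d13", "4042")]:
    decode(n, x, y)   # decodes to the BT.2020 primaries and D65 white

print("Working file (second log):")
for n, x, y in [("R", "84d0", "3e80"), ("G", "33c2", "86c4"),
                ("B", "1d4c", "0bb8"), ("W", "3d13", "4042")]:
    decode(n, x, y)   # decodes to the DCI-P3 primaries and D65 white

print(f"Max = {10000000 * 0.0001:.0f} nits, default Min = {50 * 0.0001} nits")

If the units are right, the Amlogic defaults are the full BT.2020 primaries with a D65 white point, while the file signals DCI-P3 primaries (the typical mastering-display gamut), so the two looking different is expected.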

@Portisch - one to add to your long list, if you can take a look when you have time: it looks like HDR10 content is not being handled correctly, at least in the 4.9 kernel, for ATEME Titan encodes (which account for a very large proportion of HDR10 UHD content).

I think the SEI parsing is in the decoder here, but I could not get any debugging output working for it:

FYI the debug output above is from:

With the code altered to always output the debug logging, as the flag will never be set in this scenario.

Let me know if you need more details when you take a look.

Thx.

Then I think it is necessary to report this to the main CE developers, as this bug can cause problems even without using VS10.
When the player sends HDR to the TV without tone mapping, it still signals MDL 1000; if the TV uses MDL to tone-map HDR for its screen, the result will be wrong tone mapping, just performed by the TV itself.
And considering that the ATEME library encoded roughly 40% of all HDR video, the problem will be visible on many movies with MDL 4000.

Sorry, DMD, do you mean that the Zidoo Z9X Pro tone-maps correctly without clipping?
Sorry for being off-topic, it's just curiosity.

Yeah, they fixed the reading of ATEME-encoded video files.

Checked on a Ugoos X4Q Pro with Android: it has the same problem, ATEME files do not play correctly, even though I installed the latest version of Kodi.
Meanwhile, on the Amazon Fire TV Stick (2018) they fixed this defect; it also used to set wrong values for ATEME-encoded video.

@cpm, just an idea for now: could you add a place on the screen, as you did for the VSVDB block, to type in and inject an HDMI data packet containing all the HDR metadata values, like the HDFury devices do? This would not only allow a temporary fix but would also add the functionality and flexibility of the HDFury devices to the CE platform.
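For reference, the packet in question would be the CTA-861 Dynamic Range and Mastering (DRM) InfoFrame, which is what the HDFury devices override. A sketch of how its 26-byte payload is laid out (field order per CTA-861.3; the values here are illustrative only):

# Sketch of a CTA-861.3 Dynamic Range and Mastering (DRM) InfoFrame
# payload - the HDMI packet that carries the HDR10 metadata and that
# HDFury devices let you override. Field order per CTA-861.3; all
# values below are illustrative only.
import struct

def drm_infoframe_payload(eotf, primaries, white, max_dml, min_dml, maxcll, maxfall):
    # Chromaticities in 0.00002 units; max_dml in 1 cd/m2 units,
    # min_dml in 0.0001 cd/m2 units, as the spec defines them.
    body = struct.pack("<BB", eotf, 0)       # EOTF + Static_Metadata_Descriptor_ID 0
    for x, y in primaries + [white]:         # three primaries, then white point
        body += struct.pack("<HH", x, y)     # 16-bit values, LSB first
    body += struct.pack("<HHHH", max_dml, min_dml, maxcll, maxfall)
    return body                              # 26 bytes total

payload = drm_infoframe_payload(
    eotf=2,                                           # 2 = SMPTE ST 2084 (PQ)
    primaries=[(34000, 16000), (13250, 34500), (7500, 3000)],  # P3 R, G, B
    white=(15635, 16450),                             # D65
    max_dml=4000, min_dml=50,                         # MDL 4000 / 0.005 nits
    maxcll=1000, maxfall=400)
print(len(payload), payload.hex())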