Learning about Dolby Vision and CoreELEC development

Here are my findings,

DV triggered; not sure about the colors, as I didn’t check the same scene on my Ugoos.

I have a Philips 808 TV and took a picture of the info screen. It always reported 8-bit even though I changed the setting to 8, 10, and 12 bits.

The UI blacks out the screen, but I can catch a glimpse of it while it fades away. It said:

CE - OSD
8 bit - 16 bit
10 bit - 12 bit
12 bit - 10 bit

Seems random?

Also, I can’t help much other than with testing, but this progress is exciting. I have two cheap Chinese boxes, and with DV they would suddenly be equal to the AM6.

Edit:

x96 Max (it looked the same in all the bit depth settings)

AM6

Thanks for the tests. Some interesting results.


I wonder if the TV is set to always say that when it is receiving a DV signal. What does the info screen say before you trigger DV with the ssh commands?

Those commands don’t actually change the HDMI format, just the AVI packet and VSIF that are sent.


So we are seeing different behaviour here. My TV just exits DV mode but still shows an image, only with distorted colors.


Not random: the bit depth reported by CoreELEC depends on both the RGB/YCC indicator in the AVI packet and whether a 4:2:2 mode is used. No idea why, but that is why I wanted someone to check with an external device.


Differences between x96 Max and AM6:
I mistakenly made the file using the same chroma channel twice.

Corrected file: rise_of_gru_sample_8bit_embedded.mkv

I also made a 10-bit version (will only work in 10-bit and 12-bit output modes): rise_of_gru_sample_10bit_embedded.mkv


Whatever I set under system settings.

I can’t really remember if the TV showed 12-bit when we had such issues in older builds. I want to say yes, but I’m not sure.

Outside of DV, it does report different bit depths correctly in RGB mode.

I’ll test the newer files tomorrow.

When you test again, try running

echo DV_rgb_only > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

when a file is playing in YUV 4:2:2 mode, and then have a look at the info screen. This isolates the effect of enabling DV mode from the test, as that command only sets the AVI packet to full-range RGB.

After I run that command, it says 8bit Full RGB.

The updated file looks the same as the AM6 now.

Not sure what to make of that for the 8-bit and 10-bit 4:2:2 modes. It makes sense for the 12-bit 4:2:2 mode though.

Could someone give me a pointer as to where in the Linux code the buffer/frame data/pixel values for the GUI layer live?

I’ve tried to follow where the setting for GUI scaling in the menu leads to, but haven’t had any luck.


The idea would be to alter the pixel values in the GUI layer to embed the DV metadata.

Given the GUI layer appears to always be on and sits above the video layer, this seems like a sensible approach. Any thoughts on it from those who know CoreELEC/Kodi much better than I do?

Some info - not sure if it is any use or relevant, but more info is normally better:

The OSD (Kodi GUI) is on OSD 1 from the Linux pipeline point of view.

There are properties in the Dolby enhancement code which indicate this graphics layer is 1080p by default. In some testing I changed those parameters to 2160 to see if it had any effect on quality (it did not), but it may be important here.

Also to consider is the Kodi side; I think that is 1080p and scaled to 2160 if set to a 4K GUI, etc.

Good luck and hope you get some more concrete pointers.

@cpm I thought it was 1080p and scaled to 2160 by default, and that the disable scaling option makes the GUI a native 2160 layer?


Anyway, issues like this make me think the approach of using the second video layer in a picture-in-picture (pip) mode would be much better.

The idea here would be to enable the pip mode, set the pip to cover only the top few rows, and then feed it with data not from a video decoder but simply from an 8-bit YUV-formatted buffer that the metadata can be written to. The pip overlay would be set to black (+ metadata), which, for all content that is cropped/letterboxed, would have no visual impact on the original video content.

I think this approach would actually work for several reasons:

  • Avoids issues with non-16:9 videos
  • Avoids issues with colorspace conversion of the GUI layer
  • Avoids issues with chroma upsampling, either by setting a 4:2:2 format for the pip layer or by setting constant chroma data in the pip layer (testing shows this lets the metadata work even from a 4:2:0 format)
  • Should avoid any potential issues with synchronising the video frame and metadata, as the pip code seems to live in the same vsync function as the main video layer
  • The issue with the GUI layer being overlaid should be resolvable by setting the top few rows as fully transparent

Really, the only situation I think this approach would have issues with is content that is actually 16:9. Clearly this approach would black out the top few rows - I think this could be mitigated, though, by instead taking the original video data for these pixels and using it in an 8-bit approximation.

The only problem is that I can’t find any existing work that I can build from / copy, and I’ve had no luck trying to get the pip layer enabled …


Given that making TV-led DV work on unsupported devices seems like a large, overly complicated task with no clear starting point, which may put people off trying, I want to make it clear that embedding the metadata is the only remaining roadblock, and that I honestly can’t see why the approach outlined in this post would not work to get the metadata embedded.

In the interest of lowering the barrier to entry for anyone interested in trying to get TV-led DV working on unsupported devices: I’m making it clear that I think enabling the pip layer with data fed from a buffer appears to be a way of solving that remaining roadblock.


Breakthrough - Successfully embedded metadata
I can now embed the Dolby Vision metadata while playing any video. The metadata is put into the GUI layer and seems to be rock-solid.


Test build:
https://mega.nz/file/cJpzGYzI#B0ZbBJTQ3PEa53puTEcoEcwg5VD5DdQo_k2K_khs4K0


Usage:
Initial once-off setup: Set Kodi to use 12-bit 4:2:2 with a 3840 x 2160 GUI resolution. Enable the option “Disable GUI scaling”. Reboot.

Use:
Trigger DV with a video playing (any video, it doesn’t need to be DV) by running

echo Y > /sys/module/amvideo/parameters/inject_osd_metadata
echo Y > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_enable > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

You will need to do this for every video started.
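Since the three commands have to be re-entered for every video, they could be bundled into a small helper script. This is just a convenience sketch, not part of the build; the sysfs paths are the ones above, and the `SYSFS_ROOT` variable exists only so the script can be dry-run off the box:

```shell
#!/bin/sh
# Convenience wrapper around the three enable commands above.
# SYSFS_ROOT defaults to the real sysfs; override it to dry-run.
SYSFS_ROOT="${SYSFS_ROOT:-/sys}"

dv_enable() {
    echo Y > "$SYSFS_ROOT/module/amvideo/parameters/inject_osd_metadata"
    echo Y > "$SYSFS_ROOT/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet"
    echo DV_enable > "$SYSFS_ROOT/devices/virtual/amhdmitx/amhdmitx0/attr"
}

# Call dv_enable after starting each video.
```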

Note: when the video stops, the GUI will be in all the wrong colors. If you want to correct that, enter the SSH commands

echo N > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_disable_vsif > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

and then do something that triggers a resolution/refresh rate change.
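One hedged way to script that trigger is to bounce the display mode node: write a different mode, then restore the original. The `/sys/class/display/mode` path is an assumption based on typical Amlogic builds (verify it exists on your box); `MODE_NODE` is a variable only so the sketch can be tested against a dummy file:

```shell
#!/bin/sh
# Force a resolution/refresh-rate change by bouncing the display mode.
# MODE_NODE path is assumed from typical Amlogic builds, not confirmed here.
MODE_NODE="${MODE_NODE:-/sys/class/display/mode}"

bounce_display_mode() {
    current=$(cat "$MODE_NODE")     # remember the current mode
    echo 1080p60hz > "$MODE_NODE"   # switch away...
    echo "$current" > "$MODE_NODE"  # ...and back, triggering the change
}
```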


There are two buffers set up for the DV metadata; these are not yet being filled with metadata from video files. The two are initially loaded with metadata for a source in the IPT (i.e., profile 5) colorspace, with very different L1 data.

Change to suit profile 8.1 with

echo 0,0,0,0,82,0,0,37,102,0,0,53,234,37,102,249,252,235,28,37,102,68,202,0,0,1,0,0,0,8,0,0,0,8,0,0,0,28,54,34,67,1,134,14,70,48,142,5,20,0,0,1,166,62,90,255,255,0,0,0,0,0,0,0,0,12,0,1,0,0,62,11,134,42,42,1,0,0,0,6,1,0,20,0,50,0,35,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,11,120,97,107 > /sys/module/amvideo/parameters/metadata_buffer
echo 0,0,0,0,82,0,0,37,102,0,0,53,234,37,102,249,252,235,28,37,102,68,202,0,0,1,0,0,0,8,0,0,0,8,0,0,0,28,54,34,67,1,134,14,70,48,142,5,20,0,0,1,166,62,90,255,255,0,0,0,0,0,0,0,0,12,0,1,0,0,62,11,134,42,42,1,0,0,0,6,1,5,220,13,172,11,184,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,194,81,228,196 > /sys/module/amvideo/parameters/metadata_buffer2

Change back to IPT with

echo 0,0,0,0,82,0,0,32,0,3,31,6,145,32,0,252,91,4,67,32,0,1,11,234,87,0,0,0,0,8,0,0,0,8,0,0,0,66,185,254,163,254,163,254,163,66,185,254,163,254,163,254,163,66,185,255,255,0,0,0,0,0,0,0,0,12,2,1,1,0,62,11,134,0,42,1,0,0,0,6,1,0,20,0,50,0,35,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,189,151,135,54 > /sys/module/amvideo/parameters/metadata_buffer
echo 0,0,0,0,82,0,0,32,0,3,31,6,145,32,0,252,91,4,67,32,0,1,11,234,87,0,0,0,0,8,0,0,0,8,0,0,0,66,185,254,163,254,163,254,163,66,185,254,163,254,163,254,163,66,185,255,255,0,0,0,0,0,0,0,0,12,2,1,1,0,62,11,134,0,42,1,0,0,0,6,1,5,220,13,172,11,184,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,116,190,2,153 > /sys/module/amvideo/parameters/metadata_buffer2

By default, these two sets of metadata will be toggled every 125 frames. This timing can be set to whatever you like; for every 250 frames, use

 echo 250 > /sys/module/amvideo/parameters/toggle_metadata_count_limit
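Since toggle_metadata_count_limit is a frame count, a desired toggle interval in seconds can be converted using the video’s frame rate. A tiny sketch (the 24 fps / 10 s numbers are just an example, not from the build):

```shell
# Convert a desired toggle interval in seconds into a frame count.
fps=24
seconds=10
frames=$((fps * seconds))
echo "$frames"
# Apply it with:
#   echo "$frames" > /sys/module/amvideo/parameters/toggle_metadata_count_limit
```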

To disable the auto toggling use

 echo 0 > /sys/module/amvideo/parameters/toggle_metadata_count_limit

Can also manually toggle with

echo Y > /sys/module/amvideo/parameters/metadata_toggle

Testing
This build should provide TV-led DV from any device (licensed or not) paired with a TV that supports TV-led DV.

Success means the TV goes into DV mode, the brightness toggles according to toggle_metadata_count_limit, and, for profile 5 content, correct colors are displayed (when using the default metadata).

Testing has been very solid for me; at the risk of an onslaught of problems, try to break it …

As this work is starting to look like it may actually become useful, the more people that can test with different hardware and setups, the better.


Next steps
Next is getting the DV metadata from the video files. I know how to do this with the decoded metadata, but I need access to the decoded metadata. This will require getting dovi_tool to decode the metadata inside the primary_render_frame function in video.c.

I don’t have the slightest clue how to get access to the C interface of a Rust tool inside the Linux kernel and get it to build. I will need someone to (ideally) do this part for me, or at least provide significant help. Shoutout to anyone who could help …


Thanks @doppingkoala
However, I’m getting this on my Beelink GS King-X:


Also, to get the GUI back to normal, just play any file again, so I didn’t need the second command.

Can you see the metadata in the top pixels of the video? It may be more obvious if you only run

echo Y > /sys/module/amvideo/parameters/inject_osd_metadata

At this stage colors should be normal / no DV mode. The metadata should be visible as black/white pixel values in the top row.


Edit: scrap that. The issue is that your display mode is 1920 x 1080. I think you may have changed the whitelist so 1080p videos still output at 1080p rather than being upscaled, despite having a 4K GUI (probably the better setting from a video point of view).

You will need to do something to get a 4K display mode. The simplest is to play a 4K video. Or you could change the whitelist settings so a 4K display mode is used.

Hopefully this gets it working.


For this injection to work, the output resolution and the native GUI resolution need to match without scaling. It should be possible to use a 1080p display mode though (I haven’t tested). To do so:

  • Disable the “Disable GUI scaling” option. Set the Kodi GUI resolution to 1080p. Reboot.
  • Enter echo 8294400 > /sys/module/amvideo/parameters/osd_byte_offset
  • Instructions as before, but ensure that a 1080p display mode is used.
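For what it’s worth, 8294400 works out to exactly one 1080p framebuffer at 4 bytes per pixel, which is my guess (not confirmed anywhere in the source) at why that particular osd_byte_offset value is needed:

```shell
# 8294400 = 1920 x 1080 pixels x 4 bytes per pixel, i.e. one full 1080p
# 32-bit framebuffer - an assumption about why this offset is used.
offset=$((1920 * 1080 * 4))
echo "$offset"
```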

Hi @doppingkoala, you were right.
However, correcting it still gave wrong colours.


Or do you mean the file has to be 4k?

No, the file doesn’t need to be 4K - only the display mode, as your second photo shows. Can you confirm that you didn’t do any of the changes I talked about with respect to 1080p?

What you are seeing is what happens when the TV doesn’t go into DV mode. Get to the stage of your second photo where you have entered all three commands.

Can you then look at your TV at the same time as entering the last command

 echo DV_enable > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

again and see if your TV responds in any way / briefly flicks to DV mode?

@xmlcom As a troubleshooting step also try playing rise_of_gru_sample_10bit_embedded.mkv and using the below commands to trigger DV.

echo N > /sys/module/amvideo/parameters/inject_osd_metadata
echo Y > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_enable > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

The difference is this doesn’t use the on-the-fly metadata injection. Make sure you don’t have the GUI up at all while testing this file.

Hi, I tried again and quickly checked if DV was activated, but no dice. Do you need debug logs or anything?

You really would need to be looking at the TV at the same time as running the DV_enable command again.


I haven’t really added logging so that wouldn’t really help.

There is actually very little that can go wrong when using that file, as the heavy lifting is really done when making it. Your photos also show that all but one thing is working.

I’m wondering if there are some differences between TVs, as that test works for both my Sony TV and @Kaan’s Philips TV. @xmlcom What TV are you using?

I have a suspicion about what to change. I’ll let you know when I have a new build.

@xmlcom Try this new build https://mega.nz/file/sQwFVJ7J#oCc_vwU1JaRQgAAnP5QOYXrh7h63Zsy8QhpGP8WWAnU


Start with the rise_of_gru_sample_10bit_embedded.mkv test. When it is playing and no GUI is up, run

echo N > /sys/module/amvideo/parameters/inject_osd_metadata
echo Y > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_enable > /sys/devices/virtual/amhdmitx/amhdmitx0/attr
echo Y > /sys/module/am_vecm/parameters/always_send_tvled_DV_vsif

to enable DV. No need to look at the TV at the same time anymore.

Disable commands (if needed)

echo N > /sys/module/am_vecm/parameters/always_send_tvled_DV_vsif
echo N > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_disable_vsif > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

To try the newer on-the-fly metadata embedding again, use any video file.

Enable commands

echo Y > /sys/module/amvideo/parameters/inject_osd_metadata
echo Y > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_enable > /sys/devices/virtual/amhdmitx/amhdmitx0/attr
echo Y > /sys/module/am_vecm/parameters/always_send_tvled_DV_vsif

Disable commands (if needed)

echo N > /sys/module/am_vecm/parameters/always_send_tvled_DV_vsif
echo N > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_disable_vsif > /sys/devices/virtual/amhdmitx/amhdmitx0/attr
echo N > /sys/module/amvideo/parameters/inject_osd_metadata

If it doesn’t work, post a dmesg log.


Thanks for your endeavour.
But unfortunately, still the same wrong colours and no DV. My TV is an LG C6; it is really old, so that might be it.
dmesg:
https://paste.coreelec.org/AltarSkates

Do you have TV-led DV working on that TV with another CoreELEC device?