Learning about Dolby Vision and CoreELEC development

dv-p5-mag1.avi (Google Drive) - capture of DV P5 in RGB 8-bit mode
multi.pattern.DV.P5.mp4 (Google Drive) - original


Thanks, that capture shows that the 10-bit signal has been expanded to fill a 12-bit range. That matches the metadata in the embedded pixel sequence, which says 12-bit.

Seems to show that, despite the metadata having options for 8, 10, or 12-bit, only the 12-bit option is used by players. The only reason I can think of for that being useful is if the chroma upsampling process is not nearest-neighbour and uses greater-than-10-bit precision in its calculations…
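If the expansion is simple bit replication (an assumption on my part - the player could equally scale or just left-shift), it would look something like this sketch:

```python
def expand_10_to_12(v10: int) -> int:
    """Expand a 10-bit code value to span the full 12-bit range via
    bit replication (assumed method; the player may scale differently)."""
    return (v10 << 2) | (v10 >> 8)

# 10-bit min/max land on the 12-bit extremes
print(expand_10_to_12(0), expand_10_to_12(1023))  # 0 4095
```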

I’ll play around and see what response I get when changing the bitdepth in the metadata. It will be interesting to see whether TVs actually use this metadata, and how they handle a 10-bit signal sent as 12-bit with padded zeros.

This seems to be the case: my TV responds to changes in the bitdepth field and appears to use only the lower 8, 10, or 12 bits, matching the field in the metadata.
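If the TV really is keying off the metadata bitdepth field, the behaviour could be modelled as masking each 12-bit sample down to its lowest N bits (a guess based only on the responses I'm seeing, not on any spec):

```python
def displayed_sample(v12: int, bit_depth: int) -> int:
    """Keep only the lowest `bit_depth` bits of each 12-bit sample,
    mimicking the observed TV response to the metadata bitdepth field
    (an assumption drawn from these tests)."""
    return v12 & ((1 << bit_depth) - 1)

print(displayed_sample(0xFFF, 10))  # 1023: top two bits ignored in 10-bit mode
```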

@DMDreview Could you provide the original file and a capture of profile 5 content being played in CoreELEC, but with DV support disabled and the output forced to 12-bit 4:2:2?

I think I may have found a way to decode the compressed video signal when a 10-bit file is playing, and have an idea for writing the metadata, but I want to check via your captures if there are going to be any further show-stoppers (wrt changes in bitdepth, chroma sampling, etc) before I go too far.

Has anyone ever looked into enabling the second video layer, or picture-in-picture modes?

My idea is, rather than trying to embed metadata into the decoded video frame, to overlay a second video layer that contains the pixel sequence. I wanted to see if there was anything I could start from.

Ok, tomorrow


I disabled DV support in the CE settings, simulating playback on an HDR Samsung TV without DV support, with 12-bit 4:2:2 output set, and ran a DV profile 5 file. I see ITP colors in the image, and also places where the L2 metadata changes: jumping brightness changes.
On a TV with DV support, but with DV support disabled in the settings and Player led disabled, we get the same as with the HDR TV: ITP colors and jumping brightness changes due to RPU switching.
The same result if I leave Player led enabled but DV is disabled in the settings.
Source video 023-DV movie1_p5_DV.mp4
capture dvp5-dv dis2.mov

You need to keep DV enabled, as the system converts P5 to SDR/HDR when no DV-capable screen is connected. Use the latest nightly!

When you disable DV, no colour mapping happens at all, and P5 does not include an HDR fallback.

That is the test I asked for.

Thanks again for continuing to do tests.

I’m seeing that this capture is in a 10-bit 4:2:2 format. Shouldn’t I be seeing a 12-bit format to match the 12-bit 4:2:2 output you set in CoreELEC?

My capture card, like all others, can accept 12-bit 4:2:2 but will only write to file in 10-bit; there are no capture cards that capture 12-bit 4:2:2 YUV. Only in RGB mode are there capture cards with 12-bit support, but my card does not have it.

Can you just capture it the same way you were doing the DV tunnel captures, i.e., set the capture card to an 8-bit RGB 4:4:4 format, regardless of what the actual format is? If not, I’ll see if I can use your capture as it is, or I’ll make a build that alters the AVI infoframe so your capture card thinks the output is RGB.

This is really the same way you were doing the DV tunnel captures. The whole trick of packing a 12-bit 4:2:2 signal into the same transmission pattern/timing as 8-bit RGB isn’t actually a Dolby invention; it is how 12-bit 4:2:2 transmission is defined in the HDMI 1.4 specification. Dolby seems to just alter the AVI infoframe to say RGB instead of YUV.
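My understanding of that HDMI 1.4 layout (the exact nibble placement here is my reading of the spec, so treat it as an assumption): each 12-bit Y/C pair still fits one 24-bit "pixel", with the low nibbles of Y and C sharing one channel and the high 8 bits of each filling the other two. Roughly:

```python
def pack_422_12bit_pixel_pair(y0, y1, cb, cr):
    """Pack a 12-bit 4:2:2 pixel pair into two 24-bit words using the
    layout I believe HDMI 1.4 defines (nibble order is an assumption):
      ch0 = {C[3:0], Y[3:0]}, ch1 = Y[11:4], ch2 = C[11:4]
    Cb travels with the first pixel, Cr with the second."""
    def word(y, c):
        return (((c & 0xF) << 4) | (y & 0xF),  # low nibbles share a channel
                (y >> 4) & 0xFF,               # Y high bits
                (c >> 4) & 0xFF)               # C high bits
    return word(y0, cb), word(y1, cr)

print(pack_422_12bit_pixel_pair(0xABC, 0xDEF, 0x123, 0x456))
```

A capture card told "this is 8-bit RGB" would simply store those three channel bytes per pixel, which is why the packed signal survives an RGB capture intact.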

Ready. test422-12torgb8bit.avi

I’ve had a quick look at that file. Is that with the capture card set to accept a 12-bit 4:2:2 YUV input, but saving an 8-bit RGB output? i.e., the capture card is converting from 12-bit 4:2:2 YUV to 8-bit RGB?

If so, that is not what I meant. I actually meant set the capture card to accept an 8-bit RGB input and save as 8-bit RGB, even though the output from CoreELEC is actually 12-bit 4:2:2 YUV. Not sure if this is possible with the capture card, or if the AVI infoframe would need to be modified so the capture card believes it is receiving RGB, the same way tv-led DV is output.

Yes, the card does the format conversion. I don’t know if it can be disabled; it seems to be at the hardware level. I’ll check the other card tomorrow.

Transmission of tv-led DV:

So I may have found that tv-led Dolby Vision is not required to be a 12-bit 4:2:2 signal. It appears to work with 8-bit, 10-bit, and 12-bit 4:2:2 signals; at least, my TV goes into DV mode when I have the output set to each of these formats. I am fairly certain this is the case, as the only change I am making to the HDMI output format is in the AVI infoframe.

The issue I have is confirming that the signal format does not change when I trigger the DV mode. My Sony TV does not report data on the HDMI signal format, and the bitdepth info reported by CoreELEC looks incorrect once you start modifying the RGB/YCC indicator in the AVI infoframe (I think this is related to why some people were seeing CoreELEC report a 12-bit RGB signal in tv-led mode).
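For reference, the RGB/YCC indicator being modified is the Y field in data byte 1 of the CEA-861 AVI InfoFrame, and changing it also means recomputing the InfoFrame checksum. A hedged sketch (the byte offsets and Y[1:0] bit positions are from my reading of CEA-861-D; later revisions add a Y2 bit):

```python
def set_avi_colorspace_rgb(frame: bytes) -> bytes:
    """Clear the Y (RGB/YCbCr) field of an AVI InfoFrame to 'RGB' and
    recompute the checksum. Assumed layout: frame[0:3] = header
    (type 0x82, version, length), frame[3] = checksum, frame[4] = data
    byte 1 with Y[1:0] in bits 6:5 (per CEA-861-D; an assumption)."""
    buf = bytearray(frame)
    buf[4] &= 0xFF ^ (0x3 << 5)   # Y1Y0 = 00 -> RGB
    buf[3] = 0                    # zero out the old checksum...
    buf[3] = (-sum(buf)) & 0xFF   # ...then make all bytes sum to 0 mod 256
    return bytes(buf)
```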

Does someone have a TV that reports the HDMI signal format (bitdepth and chroma format), supports tv-led DV (I think LG TVs may be capable), and is willing to run some tests to confirm the output format? If so, let me know and I’ll provide a test build, sample videos with the metadata embedded, and instructions on how to trigger the DV mode.

Don’t spend too much time on it at the moment.

If someone can confirm (I am fairly certain, though) that tv-led DV works with a 10-bit 4:2:2 signal format (see above post), that would be a better mode to test once confirmed. Otherwise, if it becomes a problem later, I’ll make a build that modifies the AVI infoframe coming out of CoreELEC, which should make the 12-bit 4:2:2 signal appear as 8-bit RGB.

I have an LG CX.

All right, this is the test build. For anyone else that is interested in playing around, the following instructions should trigger tv-led DV mode on unlicensed devices.

So for each format (8-bit, 10-bit, and 12-bit 4:2:2), start by setting CoreELEC to that format. Then start playing rise_of_gru_sample_8bit_embedded.mkv (edit: this link has been updated to correct a mistake where the same chroma channel was used twice).

With the video playing, trigger tv-led DV with these ssh commands:

echo Y > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_enable > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

This will need to be redone each time a video is started.

Assuming this works, your TV should then go into DV mode and the image should have an obvious change to the correct colors.

At this point, for each of the signal format settings on CoreELEC (8-bit 4:2:2, 10-bit 4:2:2, and 12-bit 4:2:2), could you check the HDMI signal info on your TV to see what the reported signal format is (bitdepth, chroma format, etc.)?

Note: when the videos stop, the GUI will be in all the wrong colors. If you want to correct that, enter the ssh commands

echo N > /sys/module/am_vecm/parameters/DV_vsif_send_in_hdmi_packet
echo DV_disable_vsif > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

and then do something that triggers a resolution/refresh rate change.

I suggest starting with the “standard” 12-bit 4:2:2 format to confirm everything is working properly and to see what signal format your TV reports in that mode.


The test build should also allow you to capture 12-bit 4:2:2 as 8-bit RGB by altering the AVI infoframe that is output.

To use: Disable the DV stuff with the second lot of commands from the above post (if they were enabled since a reboot), set CoreELEC to 12-bit 4:2:2, start the test file, while playing enter the ssh command

echo DV_rgb_only > /sys/devices/virtual/amhdmitx/amhdmitx0/attr

The effect should be immediate and obvious on the image. Connected devices (TV, capture card, etc.) should now think they are receiving an 8-bit RGB signal, which you can capture (as you do for the tv-led captures) with the 12-bit 4:2:2 signal packed into it.
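To actually recover the 12-bit 4:2:2 samples from such a capture, the inverse of the HDMI 1.4 packing would apply (same caveat as before: the nibble placement is my reading of the spec, not confirmed):

```python
def unpack_rgb8_word(ch0, ch1, ch2):
    """Recover one 12-bit (Y, C) pair from a captured 24-bit 'RGB'
    word, assuming the HDMI 1.4 4:2:2 layout
    ch0 = {C[3:0], Y[3:0]}, ch1 = Y[11:4], ch2 = C[11:4]."""
    y = (ch1 << 4) | (ch0 & 0xF)
    c = (ch2 << 4) | (ch0 >> 4)
    return y, c

print(unpack_rgb8_word(0x3C, 0xAB, 0x12))  # (2748, 291) = (0xABC, 0x123)
```

Which channel of the capture file maps to ch0/ch1/ch2 likely depends on how the card orders its R, G, and B bytes, so that mapping would need to be worked out per card.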