I applied your recommended settings and now DV is working.
@cpm I noticed that the EOTF & gamut only show the DV std default, not BT2020nc or 709! How can I get DV BT2020nc via Display Led rather than Player Led?
But a question, @cpm: my TV is an LG G4 and supports 12-bit. I have many Profile 7 remux movies, but I only get 8-bit DV when playing with TV Led; if I change to Player Led, the pixel format shows 12-bit.
That's how it's supposed to be. TV-led is 8-bit and LL is 12-bit.
I see, thank you for clarifying.
8-bit DV is how Dolby originally designed Dolby Vision. It is actually called 8-bit RGB tunnelling: the 10-bit (plus potential extra 2-bit) YUV is recalculated into an 8-bit RGB colour space and passed through this tunnel to the display, which then unpacks it back into 10+2-bit YUV (TV Led). In LLDV, this 10+2-bit YUV is extracted and created in the player and passed to the TV, hence the term "Low Latency Dolby Vision". This is also why it doesn't matter if the 8-bit RGB shows the 709 colour space: the correct colour space is "in the tunnel" and decoded by the TV.
Not at all. It has always been a 12-bit 4:2:2 signal over HDMI. As part of distinguishing that DV data is being transmitted, Dolby decided to set the RGB/YUV indicator bits in the AVI InfoFrame to full-range RGB. This provides a distinction from a normal 4:2:2 signal, which would have the AVI InfoFrame flagged as YUV instead.
The whole "tunnelling" idea just seems to have caused confusion, and I guess that is simply because the HDMI encoding of 12-bit 4:2:2 (or 10-bit or 8-bit) video is indistinguishable from 8-bit 4:4:4 with respect to the bits of the video signal itself; the only difference is in the other info packets.
Seemingly some devices don't look at all the right info packets, just the AVI InfoFrame, so they simply see it as 8-bit RGB, which it is not.
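To see why the wire format is ambiguous, here is a minimal Python sketch of the per-pixel-clock arithmetic (standard HDMI packing, nothing device-specific): 12-bit 4:2:2 and 8-bit 4:4:4 both carry 24 bits per pixel clock, so at the link level they look identical and only the InfoFrames disambiguate them.

```python
# Sketch: bits carried per pixel clock on the HDMI link.
# 4:4:4 sends all three components every pixel; 4:2:2 sends luma every
# pixel but alternates the two chroma components (one per pixel).

def bits_per_pixel_clock(bit_depth: int, subsampling: str) -> int:
    if subsampling == "4:4:4":       # R,G,B or Y,Cb,Cr every pixel
        return 3 * bit_depth
    if subsampling == "4:2:2":       # Y every pixel, Cb/Cr alternating
        return 2 * bit_depth
    raise ValueError(subsampling)

print(bits_per_pixel_clock(8,  "4:4:4"))   # 24 -- 8-bit RGB
print(bits_per_pixel_clock(12, "4:2:2"))   # 24 -- 12-bit YCbCr 4:2:2
# Identical link payload: only the AVI InfoFrame (and, for DV, the
# vendor-specific InfoFrame) says which one is actually being sent.
```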
I tried this build. Completely lost DV capability. Back to nightly.
You just have to perform the procedure mentioned a couple of posts above: switch the VS10 DV mapping from off to SDR, play a DV file, then switch back to off and play a DV file again, and all should be OK. The same happened to me after the update.
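For anyone who wants to script that toggle rather than click through the GUI, Kodi's JSON-RPC Settings.SetSettingValue method can flip a setting remotely, assuming the Kodi web server is enabled (default port 8080) and no authentication is configured. The setting id below is a placeholder (I don't know CoreELEC's actual id for the VS10 mapping), so treat this purely as a sketch:

```python
import json, urllib.request

KODI_URL = "http://127.0.0.1:8080/jsonrpc"   # assumes web server enabled
SETTING_ID = "coreelec.dv_vs10_sdr"          # placeholder, not a real id

def set_setting(setting_id: str, value) -> dict:
    """Flip a Kodi setting via JSON-RPC (Settings.SetSettingValue)."""
    payload = json.dumps({
        "jsonrpc": "2.0", "id": 1,
        "method": "Settings.SetSettingValue",
        "params": {"setting": setting_id, "value": value},
    }).encode()
    req = urllib.request.Request(
        KODI_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# e.g. switch the mapping to SDR, then back to off after a test play:
# set_setting(SETTING_ID, "sdr"); ...; set_setting(SETTING_ID, "off")
```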
Think I probably found the root cause of this issue - will add to the list when doing the next release.
@cpm, I forgot to ask: what will the difference in quality (or DV output) be once you have implemented HDR10+ > DV P8, versus currently selecting the VS10 engine to output DV for HDR10+ content? I know that VS10 only uses the HDR10 stream, but I'm not sure what that means in terms of quality.
You'd have to capture the generated stream from VS10 and the one converted from HDR10+ and compare them, but even that won't tell you about general quality differences, as they may vary on a per-movie basis. In theory the HDR10+ conversion should be a bit better (if it was mastered correctly and not some sloppy job), as it's not generated with on-the-fly algorithms.
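If anyone wants to try that comparison, a useful first step is confirming the source actually carries HDR10+ dynamic metadata (SMPTE ST 2094-40 frame side data). A small Python sketch around ffprobe, assuming a reasonably recent FFmpeg build; the string match is crude but works against ffprobe's frame dump:

```python
import subprocess

def has_hdr10plus(path: str, frames: int = 10) -> bool:
    """Look for SMPTE2094-40 (HDR10+) side data in the first few frames."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-read_intervals", f"%+#{frames}",    # decode only the first N frames
         "-show_frames", path],
        capture_output=True, text=True).stdout
    return "SMPTE2094-40" in out

print(has_hdr10plus("movie.mkv"))   # True if HDR10+ metadata is present
```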
Hello, I came to play some movies today but hit the same issue as yesterday: no DV. I then followed your setup, but the device's GUI turned to rubbish colours; I restarted the device, it became normal, and DV was OK.
Test Build:
(amlogic-ng ce-21) update tar T4
Thank you so much. I'm facing issues with T3, as you can see in my last post; I hope this will fix it.
That's a huge fix. Thank you!
It's a "belt-and-braces" addition. After removing the wait for video to be playing before doing the resolution update (the existing solution in the nightlies), I had used the existing linux-amlogic toggle frame to make sure the device was in DV output mode, i.e. should be outputting VSIF packets, before doing a resolution update, and all before we attempt to send any video. For me this check is instant, as the VSIF is always already being output, and I have never seen anything except RGB 8-bit output. But maybe there is some other delay in other setups, hence I added this wait and a confirmation that the VSIF packets are being sent, according to debugfs (/sys/kernel/debug).
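For anyone curious what that wait looks like in practice, here is a rough Python equivalent of the check. The actual implementation is in the CoreELEC code, and the debugfs node path and marker string below are placeholders; the real node under /sys/kernel/debug depends on the amlogic kernel build:

```python
import time

VSIF_NODE = "/sys/kernel/debug/amhdmitx/hdmi_pkt"   # hypothetical path

def wait_for_vsif(timeout_s: float = 2.0, poll_s: float = 0.05) -> bool:
    """Poll debugfs until the HDMI driver reports DV VSIF packets going out."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with open(VSIF_NODE) as f:
                if "VSIF" in f.read():     # marker string is also an assumption
                    return True
        except OSError:
            pass                           # node not present yet
        time.sleep(poll_s)
    return False

# Only proceed with the resolution update once this returns True.
```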
Dear @cpm ,
Many thanks for your work!
I have a problem with my Hisense 55A9G and VS10 mode.
If I set it to Dolby Vision via VS10 for SDR or HDR, then when I play the movie the TV switches to SDR RGB 8-bit mode and there is no picture at all. Normal Dolby Vision mode works fine. I have attached the EDID of the TV; if you need logs I can send them privately.
hisense.bin (256 Bytes)
Is that for both Display Led and Player Led?
RGB 8-bit sounds like it is a Display Led signal, but with the display not switching into DV in this case.
The EDID looks normal, nothing unusual about it:
Vendor-Specific Video Data Block (Dolby), OUI 00-D0-46:
Version: 2 (12 bytes)
DM Version: 4.x
Backlt Min Luma: 100 cd/m^2
Interface: Standard + Low-Latency
Supports 10b 12b 444: Not supported
Target Min PQ v2: 0 (0.00000000 cd/m^2)
Target Max PQ v2: 2705 (447 cd/m^2)
Unique Rx, Ry: 0.67578125, 0.32031250
Unique Gx, Gy: 0.26953125, 0.67968750
Unique Bx, By: 0.14062500, 0.04687500
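As an aside, the cd/m^2 figures edid-decode prints next to those PQ codes come from the SMPTE ST 2084 (PQ) EOTF: normalise the 12-bit code to 0..1 and run it through the EOTF. A quick Python sketch; it lands near, though not exactly on, the 447 cd/m^2 above, as edid-decode may dequantise the v2 field slightly differently:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_code_to_nits(code: int, bits: int = 12) -> float:
    """Map a PQ-coded value to absolute luminance in cd/m^2."""
    e = code / (2 ** bits - 1)          # normalise code to 0..1
    ep = e ** (1 / M2)
    y = (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)
    return 10000.0 * y

print(pq_code_to_nits(2705))   # ~431 cd/m^2, near the 447 reported above
print(pq_code_to_nits(0))      # 0.0, the Target Min PQ line
```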
I tried T4 yesterday and got the same garbled-colour menu after finishing a movie.
In both cases it is wrong with VS10:
In this case the GUI is not displayed either.
If I don't use VS10 but play a DV file, the TV switches to DV mode correctly.
Checking the difference with the TV's info button: when there is no picture it shows HDR: No; when there is a picture, HDR: Dolby Vision.