Player-LED vs TV-LED Dolby Vision on LG OLED – Which Is More Accurate?

I’ve been testing Dolby Vision playback on my LG C4 OLED using CoreELEC (CPM A14 build) on a Dune/Homatics box, and I wanted to share some detailed results for discussion. I’ve captured screenshots from Furiosa: A Mad Max Saga (UHD Blu-ray remux) in both Player-LED (12-bit) and TV-LED (8-bit tunneling) modes, and the differences are more interesting than I expected.

Setup:

- Display: LG C4 OLED
- Player: Dune Homatics 4K Plus (CoreELEC)
- Source: Dolby Vision
- Audio: Dolby TrueHD 7.1 passthrough
- Display mode: 3840×2160p @ 23.976 Hz
- Disable Noise Reduction: Enabled
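For anyone who wants to double-check what their box is actually putting on the HDMI link in each mode, here is a minimal sketch. It assumes an Amlogic-based CoreELEC device where the HDMI TX driver exposes its state under `/sys/class/amhdmitx/amhdmitx0/`; the `attr` and `config` node names are my assumption based on typical Amlogic kernels and may differ on other builds:

```python
# Minimal sketch: read the Amlogic HDMI TX sysfs nodes to see what
# signal is actually being sent (colour format, bit depth, etc.).
# Assumption: an Amlogic-based CoreELEC box exposing these nodes;
# exact node names can vary between kernel/device-tree builds.
from pathlib import Path

HDMITX = Path("/sys/class/amhdmitx/amhdmitx0")

for node in ("attr", "config"):
    path = HDMITX / node
    try:
        print(f"--- {path} ---")
        print(path.read_text().strip())
    except OSError as err:
        print(f"could not read {path}: {err}")
```

Running this over SSH while switching between Player-LED and TV-LED should show the container changing (e.g. a 12-bit YCbCr attribute vs. an 8-bit RGB one), which is a more reliable check than the TV's info overlay.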

What’s interesting is that, despite the 12-bit output in Player-LED mode, the TV-LED image is visibly brighter and often more accurate in terms of highlight tone-mapping and perceived PQ. Is this how Dolby Vision tunneling is supposed to work? (A quick bit-budget sketch of my understanding is below.)
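As I understand it, the “8-bit” label in TV-LED mode describes only the HDMI container: the 12-bit 4:2:2 Dolby Vision signal is repacked into an 8-bit RGB 4:4:4 tunnel, and the per-pixel bit budget works out the same. A back-of-the-envelope check (my own arithmetic, not quoted from any spec):

```python
# Back-of-the-envelope check: bits per pixel of the Dolby Vision
# 12-bit YCbCr 4:2:2 signal vs. the 8-bit RGB 4:4:4 HDMI tunnel
# it is repacked into for TV-LED output.

# 4:2:2 -> one Y sample per pixel, Cb/Cr each shared between 2 pixels.
dv_bpp = 12 * (1 + 0.5 + 0.5)   # 24.0 bits per pixel

# 8-bit RGB 4:4:4 container as reported on the HDMI link.
tunnel_bpp = 8 * 3              # 24 bits per pixel

print(f"DV 12-bit 4:2:2 payload: {dv_bpp} bpp")
print(f"8-bit RGB 4:4:4 tunnel:  {tunnel_bpp} bpp")
assert dv_bpp == tunnel_bpp     # the tunnel is not throwing away precision
```

So if that reasoning is right, the “only 8-bit” readout shouldn’t by itself mean less precision; the brightness difference would come from who does the tone-mapping (player vs. TV), not from the container bit depth.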

Notice the top-right corner in the images below: the 8-bit TV-LED capture looks brighter than the 12-bit Player-LED one. Does that mean TV-LED is more accurate in my case?

Are PQ and tone-mapping often better with TV-LED on LG OLED panels, even if the reported output is “only 8-bit”? Please clarify.
