When Deep Colour/10-Bit is an illusion! (Cables Again)

Still figuring out how to get the best out of my boxes; I’m now running a Nexbox A95X as well as a Beelink Mini MX. Much of this is subjective, but hopefully some of you can confirm my observations and deductions.

If a cable says it’s High Speed it could be a fake; that’s China for you! So anyway, 10-Bit colour depth, AKA Deep Colour, should increase the colour palette, giving a smoother image and greater realism, right? So if you’re seeing more intense colours, that may be because your cable or your TV can’t deal with the extra information.

At first I thought, wow, that colour looks more vibrant with an old cable, but I later suspected it was not a true High Speed cable because of judder issues. Lower colour depth will look more cartoon-like and less natural. After swapping the cable I’m reasonably sure that I’m getting less judder, maybe none, and smoother though less vibrant colour, which suggests the inferior cable was having issues with the data. I find it hard to believe that even crap cables can’t handle 1080p at 23.98fps, but I suppose 10-Bit colour is a lot more information than we realise?
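
Out of curiosity I did a rough back-of-the-envelope sum, counting active pixels only (real HDMI links also carry blanking intervals and TMDS encoding overhead, so these are underestimates):

```python
def bandwidth_gbps(width, height, fps, bits_per_channel, channels=3):
    # Uncompressed video data rate for the active picture area only.
    bits_per_frame = width * height * channels * bits_per_channel
    return bits_per_frame * fps / 1e9

for depth in (8, 10):
    rate = bandwidth_gbps(1920, 1080, 23.98, depth)
    print(f"1080p @ 23.98fps, {depth}-bit: {rate:.2f} Gbit/s")

# 8-bit : ~1.19 Gbit/s
# 10-bit: ~1.49 Gbit/s (25% more data, still far below the
# ~10.2 Gbit/s a genuine High Speed cable is certified for)
```

So on paper even 10-Bit at 1080p should be well within spec, provided the cable really is what it claims to be.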

Anyway, I might order a cheap but High Speed compliant cable from Amazon and see if I can further improve my boxes’ output/video quality. Hey, it’s worth a punt for £5.

So the bottom line is that vibrant, intense colour probably isn’t 10-Bit/Deep Colour; the main improvement should be smoothness of the image, perhaps even a softening, but also more apparent detail. I’m wondering if the extra detail contributes to micro judder? I’m seeing hardly any judder now, if any, but an increase in that fractional micro judder which sensitive people can notice.

Any thoughts?

10-bit color depth is meaningless if the content you are playing is not 10-bit as well (such as HDR content).
Having good cables is important mostly for 4K resolutions. For 1080p, most cables, even cheapo ones, will work just fine.
A cable can’t cause judder or stutter, and it can’t make colors look better or worse.

A higher color depth helps reduce banding, as gradients can be much smoother with 4x more color shades per channel (1024 in 10-bit vs 256 in 8-bit). But the content must be real 10-bit in order to benefit from it.
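
To put rough numbers on that (a quick synthetic sketch; the ramp is generated, not taken from real content):

```python
import numpy as np

# Shades per channel at each bit depth.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} shades/channel, {levels ** 3:,} colors total")

# Quantize a smooth 0..1 ramp at each depth: the step count is
# what shows up as visible bands on slow gradients (skies, fades).
ramp = np.linspace(0.0, 1.0, 10_000)
for bits in (8, 10):
    levels = 2 ** bits - 1
    steps = len(np.unique(np.round(ramp * levels) / levels))
    print(f"{bits}-bit ramp: {steps} distinct steps")
```

Each band on a slow gradient is four times wider in 8-bit, which is why skies and fades are where you notice it first.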

While I’m prepared to accept that I might be in error and imagining things, I currently believe I am seeing a difference. It’s early days, though, and as I said, I might buy a new cable.

Finding a scene with graduated colour which displays the banding effect 8-Bit colour produces, and then swapping cables, might resolve the matter in my mind. It could be my old (2012) LG TV not handling 10-Bit very well. Yes, the TV is 10-Bit compliant and was expensive at the time.

Anyway, I’ll plod on, although I’m beginning to think the odd glitch/judder is almost impossible to resolve. Hey, it could just be my TV!

An HDMI cable can’t affect the image quality because the signal is digital. Issues with HDMI cables usually manifest themselves as signal loss and/or “white snow” over the image.

Yes, I get the digital signal bit, no pun intended, but can cables cause a bottleneck? It’s like the judder/stutter you sometimes get when streaming.

I have both 10 and 8-Bit content, but I’m just wondering what this ‘wide’ function in my old TV’s advanced settings is really about? 10-Bit would probably switch automatically, so perhaps the ‘wide’ colour setting is introducing some processing that’s possibly causing glitches?

No, judder/stutter isn’t caused by a bad HDMI cable.
It’s possible, but unlikely, for a TV to introduce the glitches you’re talking about.
Disable the feature and observe how well things work for a while, then try to enable it and observe again.

Well, turning the ‘Wide’ colour gamut off didn’t make any difference apart from less vibrant colour; yes, it seems to boost the colour intensity of 8-Bit content rather than the depth.
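
From what I’ve read, a ‘wide’ mode like this may simply be showing BT.709 content on wider primaries without remapping it, which would read as extra saturation. A rough sketch of the idea, if I’ve understood it right (the matrices are standard published BT.709 and Display P3 values; the sample colour is arbitrary):

```python
import numpy as np

# Linear RGB -> XYZ matrices (D65 white) for BT.709 and Display P3.
M_709 = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
M_P3 = np.array([[0.4866, 0.2657, 0.1982],
                 [0.2290, 0.6917, 0.0793],
                 [0.0000, 0.0451, 1.0439]])

def chromaticity(rgb, M):
    # Project a colour to (x, y) chromaticity coordinates.
    X, Y, Z = M @ rgb
    return X / (X + Y + Z), Y / (X + Y + Z)

rgb = np.array([0.8, 0.5, 0.4])  # an arbitrary linear BT.709 colour
print("intended (BT.709):    x=%.3f y=%.3f" % chromaticity(rgb, M_709))
print("misread as wide (P3): x=%.3f y=%.3f" % chromaticity(rgb, M_P3))
# The chromaticity drifts further from the D65 white point
# (x=0.313, y=0.329): same code values, more saturated colour.
```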

I’ll just chill and see how things go, and of course stop obsessing!! :crazy_face:

OK, I seem to have resolved the colour issue, which I think was down to the skin. I don’t like the default skin, Estuary, and was using Confluence, which I think might now be a bit long in the tooth! I switched to Eminence, which I really like, and the colour issue seems to have gone, as I am consistently getting ‘deep colour’ regardless of whether it’s 8-Bit or 10-Bit. Yes, I know 8-Bit is not Deep Colour, but both 10 and 8-Bit seem more vibrant, so I’m not sure what was occurring?

I still get the micro-judder/jitter, but that’s almost certainly dodgy encoding or simply over-compression. I often see excellent small file sizes producing amazing video quality, but it seems to be becoming rarer. The occasional judder is still unexplained, but that may just be hardware limitations?

If you haven’t tried the Eminence skin, take a look here: https://forum.kodi.tv/showthread.php?tid=237538

EDIT: This article suggests which skins are compatible with the newest versions of Kodi: https://www.technadu.com/best-kodi-skins/8372/

I still use Confluence, via a mod of a mod that gives a few basic extra features, and I still find it the best, with plenty of teeth left in it.

Whilst there are many amazing-looking skins, I never found one that retained the basic simplicity of Confluence while adding useful features, and so many I find awkward in operation.

But I have never heard of a skin affecting playback quality; the behaviour of other add-ons, yes, but never basic quality.

But I am curious enough to try this Eminence and see for myself, as there are times, especially with 4K material that is not HDR-based, where I get the same basic issues you get when tone mapping HDR to SDR, with the washed-out colours. So 4K has largely been a no-no in terms of matching the 720p and 1080p versions of the same footage.
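
For anyone wondering why tone-mapped HDR trends towards that dark, washed-out look, here is a minimal Reinhard-style sketch (an illustration only, not what Kodi actually implements; the 1000-nit peak and 100-nit SDR white are assumed values):

```python
def tonemap_reinhard(nits_in, peak_nits=1000.0, sdr_white=100.0):
    # Extended Reinhard curve: squeeze HDR luminance into SDR range.
    # Highlights are compressed hardest, mid-tones come out darker,
    # and with no chroma compensation the result looks flat.
    L = nits_in / sdr_white          # normalise so SDR white = 1.0
    L_max = peak_nits / sdr_white    # HDR peak on the same scale
    return L * (1 + L / L_max ** 2) / (1 + L)

for nits in (50, 100, 500, 1000):
    out = tonemap_reinhard(nits) * 100
    print(f"{nits:4d} nits in -> ~{out:5.1f} nits-equivalent out")

# 100-nit mid-tones land at roughly 50, which is exactly the dark,
# washed-out look described above.
```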

I’ll come back and post my impressions.

Well I have certainly learned something today.

I would never really have thought that a skin could affect playback quality, but today I saw it happen in spectacular fashion, albeit, for me, in a wholly negative way.

As far as the skin design is concerned, I did generally like the stripped-down feel of the colour scheme, although some elements I found frustrating, such as the black cursor that blends into the background too much; but that’s more something to report back to the dev as feedback for modifications.

But I have two clips that I used when initially testing 1080p vs 4K HDR playback with HDR-to-SDR tone mapping turned on.

Using Confluence, the 1080p footage is bright and vivid, just as I would want, whilst the 4K HDR was washed out and especially dark, with colours almost non-existent in some cases.

So for this skin test I used the same clips, starting with the 1080p.

To my astonishment, it not only looked awful but exhibited exactly the same issues the 4K HDR clip normally would, only even worse: darker again.

When it came to the 4K HDR clip, that was worse on another level and practically unwatchable.

Then I tested some normal, run-of-the-mill stuff, starting with live TV, and the darkness and washed-out effect was prevalent on pretty much everything except animation, which was darker but not to the same degree as everything else.

So it does, for me at least, introduce another element to think about in future when it comes to playback concerns.

Clearly for you, @PatrickJB, and your setup it works well, which I guess adds another potential level of complication as to why there can be such vast differences in experience.

Yes, it’s strange and must be something to do with the old LG TV and the ‘wide’ colour gamut setting. The HDR effect is almost overwhelming, but it’s preferable to washed-out video. It probably also has something to do with my box(es) and possibly the skin’s scripts and such technical stuff.

Flicking through a few movies and then shows, the quality difference with smaller file sizes is evident, especially the micro judder. There’s always a trade-off, and even though excellent quality can be achieved in small files, it’s an art/skill I suspect. Yep, even Netflix streams have the odd judder, with Amazon and Now TV being slightly better, although I only have ADSL because of where I live, no fibre.

I’ll go 4K when OLEDs are cheap and 60" screens are the norm; until then 1080p is fine for me.

So perhaps skins can be an issue for some, and swapping to a different one can improve the overall viewing experience?