N2+ poor deinterlacing quality

I’m running CoreELEC on both my Minix U9-H (S912) and my new ODroid N2+ (S922X). I would describe the U9-H’s deinterlacing as “dodgy”. When it locks onto the cadence properly it’s decent, but it often briefly mistakes 50i content for 25p content, resulting in some artefacts when the content shifts from mostly stationary to mostly motion. However, having used the N2+ for a few days, I’d describe its deinterlacing as “bad”.

For example, when watching tennis on an SD (576i) channel, the U9-H struggles to show the far baseline when the camera moves, but the ball in motion is smooth. With the N2+, the far baseline is rendered better but there are combing artefacts on the ball and moving racquets nearly all the time. When using the deinterlacer on my AMD Radeon Vega 56 (MPC-HC + MadVR), none of these problems exist. I know I shouldn’t expect the Amlogic chips to deinterlace as well as my GPU, but it doesn’t make sense to me that the S922X looks worse than the S912 in this regard.

Surely the S922X wouldn’t have a worse deinterlacing algorithm/chip than the S912, so I don’t understand why it looks worse. Aside from the chipset, the only difference I can think of between the two boxes is the N2+ uses a newer kernel (amlogic-ng build) - maybe this is causing a problem? Are there any settings that control this?

Have you tried “Deinterlace (half)”?

As far as I know, that is an option for software deinterlacing, not hardware deinterlacing. In any case, no I would not want to watch sports at 25 fps.

I don’t know if Kodi uses field combination or field extension deinterlacing when halving, but you should try it before you complain. As you know, nearly all movies use 24 FPS these days (rounded up) and you don’t see any motion problems there. I’m watching the French Open that way and am satisfied. To try it:

  1. Under Settings - Player - Processing: Turn off “Allow hardware acceleration”.
  2. When playing a TV channel, under Video set to “Deinterlace (half)”.

Deinterlace half will look jerky.
In my experience Amlogic is generally quite bad for interlaced tv streams. It’s unfixable IMO.

I’m glad you’re happy with it but half-rate deinterlacing, which results in you losing 50% of the motion resolution of the video, is unacceptable to me.

If there’s one thing I miss about my HTPC setup, it’s the high quality deinterlacing and SD upscaling. I don’t watch much SD these days so it’s not a huge deal but it really does seem bad on these Amlogic boxes. Watching 1080i footy looks great though, so maybe it’s the upscaler in combination with the deinterlacer that’s the problem, rather than just the deinterlacer.

I tried whitelisting 576p in Kodi to rule out the upscaler as the problem, but the result is a hugely distorted picture that’s cropped and stretched to something like 2:1 for no apparent reason. I’m not sure if that issue is caused by the ODroid box, my Denon AVR, or my LG TV, but for now I’ll have to stick with 1080p output.

I have switched for now to using software deinterlacing and upscaling for MPEG2. YADIF and Lanczos have their own problems but at least the tennis ball isn’t a constant comby mess in this mode.

One problem with SD resolutions and whitelisting is that the aspect ratio is not correct with fullscreen video. In my case, all my 4:3 stuff is displayed as 16:9 on the TV, so the signalling is not correct.

I’ve managed to fix the 576p aspect ratio issue with the “Video calibration” setting within Kodi. I had to set the pixel aspect ratio to 1.412 (!!) to get the 16:9 video to be fullscreen. This allows me to use the TV’s scaler instead of the ODroid’s and I must say it is 100x better. The lines of the tennis court are no longer jagged to hell, which was the case with both hardware and software scaling before. This doesn’t fix the deinterlacing issue, and the GUI is hella ugly in 576p, but it’s a start.
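For what it’s worth, that 1.412 figure is close to what you’d expect from basic anamorphic PAL arithmetic. A quick back-of-the-envelope check (my own sums, nothing from Kodi’s code):

```python
# Anamorphic 16:9 PAL: stored as 720x576, but meant to be shown at 16:9.
storage_w, storage_h = 720, 576
display_aspect = 16 / 9

# Pixel aspect ratio = display aspect / storage aspect.
par = display_aspect / (storage_w / storage_h)
print(round(par, 3))  # 1.422
```

So the theoretical value is about 1.422, and the 1.412 you landed on by eye is within a fraction of a percent of that (the exact figure depends on how much of the 720-pixel line the TV treats as active picture).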

Kodi doesn’t use a single deinterlacing approach.

If hardware deinterlacing is used then the deinterlace is based on the VPU/GPU combination in use and what that platform offers. On CoreElec on AMLogic devices you don’t get any real choice in deinterlacing configuration. It’s always a straight i25 to p50 or i29.97 to p59.94 deinterlace (so for native interlaced sources - or p50/p59.94 sources that have been interlaced - you get full 50Hz or 59.94Hz motion, not half-rate judder vision).

On Intel GPUs you often get a wider choice - letting you decide whether you want 25/29.97Hz or 50/59.94Hz motion - and whether Motion Adaptive or Motion Compensated deinterlacing is used (the latter is more demanding, and much older GPUs often could only do MC on SD content and had to use MA on HD. These days MC is standard for all SD and HD content).

If you use software deinterlace (which means you usually also have to use software decode, AIUI) then you get either YADIF 2x (for 50/59.94Hz motion) or YADIF (for 25/29.97Hz motion). The latter requires less CPU grunt - but is obviously terrible for native interlaced 50/59.94Hz content. (You also get Bob and no deinterlace as options - no deinterlace = weave, which is perfect for 2:2 content you know is correctly mastered.)
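A rough illustration of why weave is only safe for correctly mastered content (a toy example of my own, nothing to do with Kodi’s actual pipeline): weave interleaves the two fields as-is, so anything that moved between them shows up as alternating-line combing, while bob line-doubles one field and stays comb-free at the cost of vertical resolution.

```python
# Two fields of a tiny 4-line frame; an "object" (value 9) moves
# one pixel to the right between the top and bottom field.
top    = [[0, 9, 0, 0],   # picture lines 0 and 2
          [0, 9, 0, 0]]
bottom = [[0, 0, 9, 0],   # picture lines 1 and 3
          [0, 0, 9, 0]]

# Weave: interleave the fields line by line.
weave = [top[0], bottom[0], top[1], bottom[1]]

# Wherever there was inter-field motion, adjacent lines disagree:
# that disagreement is the combing artefact.
print(any(a != b for a, b in zip(weave, weave[1:])))  # True

# Bob: line-double a single field. No combing, half vertical detail.
bob = [top[0], top[0], top[1], top[1]]
print(any(a != b for a, b in zip(bob, bob[1:])))  # False
```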

One thing that can also influence this stuff is that highly compressed video - MPEG2 and h.264 - will often have interlace-related compression artefacts of its own. Both codecs can (depending on encoder configuration) exploit the static information in an interlaced stream using techniques like MBAFF: static macroblocks are sent as progressive picture, and the encoder switches to an interlace-friendly mode only on macroblocks that have intra-frame motion (i.e. motion between the two fields). Bad encodes make life more difficult for deinterlacers too (as do bad 50<->59.94Hz broadcast standards conversions that distort the motion).


I think you’re right that with crappy, low bitrate, broadcast SDTV there are probably artefacts in the source itself that don’t help matters. For now I’ve stuck with hardware deinterlacing and outputting at 576p, since that seems to be the best combination for quality. I tried YADIF 2x with 576p output but there are weird artefacts (if you imagine a diagonal line, that line looks as if there are breaks every now and then along the horizontal plane). Given HD interlaced content looks significantly better, I suspect the hardware deinterlacer is not the main source of the artefacts.

There does seem to be a bunch of deinterlacing options at the kernel level but I suspect they’re already set optimally anyway. The only one that seemed potentially interesting was use_2_interlace_buff, which is set to off by default. I can’t figure out what it actually does from the Amlogic source code though, or whether it’d improve or decrease quality.

Yep - YADIF has some limitations. In ffmpeg there are a couple of alternatives. The BBC R&D Weston 3-field is a good option (w3fdif in ffmpeg) - it does no adaptation, it’s just a cleverly designed vertical/temporal filter. For many years it was used in broadcast video effects boxes and frame rate converters as a good deinterlacing solution. It’s my deinterlacer of choice as it does no analysis of the video - and therefore doesn’t get ‘confused’. It’s not going to be perfect - and there are some edge cases where it doesn’t perform as well as others - but as a good ‘all round’ deinterlacer it’s pretty good. I built some Kodi builds that swapped YADIF for W3FDIF a while back and was pretty impressed. I think MrMC uses it - or has an option to use it too.
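To give a flavour of what a fixed vertical/temporal filter does - with illustrative coefficients of my own choosing, not the real W3FDIF taps (see ffmpeg’s w3fdif source for those) - the missing line is built as a weighted mix of the neighbouring lines in the current field and the co-sited line in the adjacent fields, with no motion analysis at all:

```python
# Reconstruct one missing line from three fields. The 0.25 weights
# are illustrative only (they just need to sum to 1); the real
# W3FDIF filter uses different, longer taps.
def interp_line(prev_line, cur_above, cur_below, next_line):
    spatial  = [0.25 * (a + b) for a, b in zip(cur_above, cur_below)]
    temporal = [0.25 * (p + n) for p, n in zip(prev_line, next_line)]
    return [s + t for s, t in zip(spatial, temporal)]

# On static content every contributing line agrees, so the missing
# line is reconstructed exactly - and because the filter never
# analyses the picture, it can never "mistake" the cadence either.
flat = [5.0, 5.0, 5.0]
print(interp_line(flat, flat, flat, flat))  # [5.0, 5.0, 5.0]
```

On moving content the temporal taps blur across fields instead, which is the trade-off: graceful softening rather than the hard combing or cadence misdetection you get from an adaptive deinterlacer guessing wrong.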

There is also BWDIF which combines W3FDIF and YADIF - and supposedly does better than both of them - but I’ve not really trialled it that much.

My guess is the use_2_interlace_buff may increase the number of buffered fields used for deinterlacing? Weston 3-field uses three fields (as its name suggests…)

Over-compressed video is harder to deinterlace cleanly - and these days SD is often heavily compressed (and often still uses MPEG2, which needs even higher bitrates to deliver decent quality). Plus HD artefacts will be much smaller than SD artefacts - so errors in deinterlacing will be less visible in HD?

If software alternative deinterlacers could be built into CoreELEC I’m all for it, naturally.

They already are built into Kodi (and I don’t think CoreElec disables them). If you disable hardware decode, then software decode and software deinterlacing are used (Deinterlace = YADIF in that situation, I believe).

It’s more difficult to hardware decode and software deinterlace I believe.

I don’t see W3fDIF with software decoding…

Ah - I misread what you meant (I thought you meant software deinterlacers as alternatives to hardware - not a choice of multiple software deinterlacers).

YADIF is the software deinterlacer used when hardware decoding is disabled and ‘Deinterlace’ is enabled I believe. There aren’t alternatives like Weston 3-field (aka w3fdif) as options in Kodi as standard.

I think you’re playing at being Captain Obvious now :innocent:


This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.