I have been struggling with this for some time. We use OpenMAX for video playback in MythTV. It used to work, but at some upgrade point deinterlaced videos started showing jagged edges, as if they were not being deinterlaced. The problem started either with a new Raspbian version or possibly with the RPi 3.
My testing shows that the advanced and fast deinterlace modes only work if the input to image_fx is tunneled from a decoder. Is this intentional, and is there any way to get deinterlacing to work when input buffers are supplied with OMX_EmptyThisBuffer calls?
I created some sample programs that play back video with image_fx advanced deinterlace. The three programs are at
https://github.com/bennettpeter/raspber ... nMAX/Video
(il_deint_t1) decode -> tunnel -> image_fx -> C code that empties and fills buffers -> render
(il_deint_t2) decode -> C code that empties and fills buffers -> image_fx -> tunnel -> render (this is how MythTV does it)
(il_deint_t1t2) decode -> tunnel -> image_fx -> tunnel -> render
il_deint_t1 and il_deint_t1t2 show the video perfectly deinterlaced. il_deint_t2 shows jaggies, as if it were not deinterlaced.
These three programs are a bit messy, as they are test programs. In particular, they take a parameter n that is meant to disable deinterlacing, but the code supporting it is incomplete. To build the programs you must first run make in /opt/vc/src/hello_pi/libs/ilclient/.
I have a test file (raw MPEG-2) at https://www.dropbox.com/s/82voelqrpvxr5 ... video?dl=1
If you need me to clean up the test program code let me know.