luc4
Posts: 51
Joined: Mon Nov 12, 2012 12:28 am

egl_render and vsync

Mon Jun 02, 2014 10:20 am

Hello,
has anybody noticed vsync issues while using the egl_render component recently? I never noticed any before, but on recent firmwares there seems to be some tearing in the texture rendered via the egl_render component. The tearing shows up only in that texture, not in the rest of the OpenGL graphics.
Do I have to ensure myself that OpenMAX is not rendering into the texture while I'm drawing it? I assumed the frame is drawn into a back buffer, so either no tearing should be visible or the entire OpenGL scene should tear — is that correct?
Thanks.

dom
Raspberry Pi Engineer & Forum Moderator
Posts: 5502
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge

Re: egl_render and vsync

Mon Jun 02, 2014 2:25 pm

If the texture being decoded to by OpenMAX is also the current texture being read from when rendering the current GL scene, then yes you will get tearing.

You will need to queue the textures that OpenMAX writes to, and only return them to OpenMAX for reuse once rendering has finished.

There is code that does video decode->texture in xbmc:
https://github.com/popcornmix/xbmc/blob ... xVideo.cpp

luc4
Posts: 51
Joined: Mon Nov 12, 2012 12:28 am

Re: egl_render and vsync

Mon Jun 02, 2014 4:13 pm

For some reason I never noticed this issue before now... Thanks for the information!

luc4
Posts: 51
Joined: Mon Nov 12, 2012 12:28 am

Re: egl_render and vsync

Sun Jun 08, 2014 3:12 pm

dom wrote:If the texture being decoded to by OpenMAX is also the current texture being read from when rendering the current GL scene, then yes you will get tearing.

You will need to queue the textures that OpenMAX writes to, and only return them to OpenMAX for reuse once rendering has finished.

There is code that does video decode->texture in xbmc:
https://github.com/popcornmix/xbmc/blob ... xVideo.cpp
Do you know of any possible reason why OMX_UseEGLImage may hang when setting the second EGLImage? If I set just one EGLImage everything works (except for the tearing of course), but when I try to set the second EGLImage OMX_UseEGLImage hangs. It never returns so I can't get any error value. Any possible reason for this behaviour?
Thanks.

dom
Raspberry Pi Engineer & Forum Moderator
Posts: 5502
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge

Re: egl_render and vsync

Sun Jun 08, 2014 4:52 pm

luc4 wrote:Do you know of any possible reason why OMX_UseEGLImage may hang when setting the second EGLImage? If I set just one EGLImage everything works (except for the tearing of course), but when I try to set the second EGLImage OMX_UseEGLImage hangs. It never returns so I can't get any error value. Any possible reason for this behaviour?
Thanks.
The xbmc code I linked to has multiple OMX_UseEGLImage textures. Perhaps have a look at the differences?

I wouldn't be surprised if getting the settings wrong (especially mismatched sizes or formats) causes hangs on the GPU
(obviously it shouldn't, but this API bridges two unrelated libraries, and they tend to trust that the information given is correct, which can result in memory corruption when it isn't).

Another gotcha: make sure all GL-related calls come from the same thread (or at least a thread with a valid context).

You may get some info on fatal errors from the GPU log:

Code: Select all

sudo vcdbg log

luc4
Posts: 51
Joined: Mon Nov 12, 2012 12:28 am

Re: egl_render and vsync

Sun Jun 08, 2014 9:05 pm

Thank you for your advice. Unfortunately, although my code is open and intentionally very similar to that of XBMC and omxplayer, it is still a lot of code, so I'm having a hard time working out what is causing this. I ran sudo vcdbg log msg, but got no messages while running my code. Is there any other way to monitor what the GPU is doing? Anything else that could give me a hint?
Thanks!

luc4
Posts: 51
Joined: Mon Nov 12, 2012 12:28 am

Re: egl_render and vsync

Sun Jun 08, 2014 11:18 pm

dom wrote:The xbmc code I linked to has multiple OMX_UseEGLImage textures. Perhaps have a look at the differences?
I installed XBMC following this: http://www.raspbian.org/RaspbianXBMC. I chose the prebuilt package, so I followed the instructions at http://michael.gorven.za.net/raspberrypi/xbmc as suggested. I don't know if this is related to what I'm experiencing, but it seems XBMC cannot play 1080p Big Buck Bunny. I suppose it should... correct? It plays the audio but no video. It's probably unrelated, though, as 720p seems to work fine...
If anyone else is aware of other ways of getting hints of what happens when calling OMX_UseEGLImage, please let me know :-)
Thanks!

dom
Raspberry Pi Engineer & Forum Moderator
Posts: 5502
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge

Re: egl_render and vsync

Mon Jun 09, 2014 12:42 pm

luc4 wrote:I installed XBMC following this: http://www.raspbian.org/RaspbianXBMC. I chose the prebuilt package, so I followed the instructions at http://michael.gorven.za.net/raspberrypi/xbmc as suggested. I don't know if this is related to what I'm experiencing, but it seems XBMC cannot play 1080p Big Buck Bunny. I suppose it should... correct? It plays the audio but no video. It's probably unrelated, though, as 720p seems to work fine...
If anyone else is aware of other ways of getting hints of what happens when calling OMX_UseEGLImage, please let me know :-)
Thanks!
Probably not enough GPU memory allocated.
omxplayer wants gpu_mem=128 for 1080p playback
dvdplayer wants gpu_mem=192 for 1080p playback (having a queue of 32-bpp textures is more expensive than 12-bpp YUV frames).
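For reference, the GPU memory split is set in /boot/config.txt and takes effect after a reboot; a minimal fragment matching the numbers above (the right value depends on which player you use):

```shell
# /boot/config.txt
# GPU memory in MB; 192 covers dvdplayer's queue of 32-bpp textures at 1080p
gpu_mem=192
```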

Note: only dvdplayer uses OMX_UseEGLImage (and its performance is worse than omxplayer's).

No promises, but if you produce an ARM test app that uses OMX_UseEGLImage and fails (binary only is okay), I can run it with a (GPU) debugger attached and see if anything obvious is failing.
But if that doesn't show anything obvious, I probably can't help much more. Getting OMX_UseEGLImage working with dvdplayer took a couple of weeks, and I don't have the time to go through that again with someone else's code.

I can only suggest you look very closely at the code I linked to before and make sure your code is as close to it as possible.

luc4
Posts: 51
Joined: Mon Nov 12, 2012 12:28 am

Re: egl_render and vsync

Tue Jan 06, 2015 1:41 pm

It took me a few hours, but I finally got it working. I was probably stuck because I hadn't set the total number of buffers I intended to use. This is the final result: http://youtu.be/SeJxQN-W2uA. I didn't expect such good performance, actually. And my bad/buggy code can surely be improved!
Thank you for pointing me to that sample code.

dom
Raspberry Pi Engineer & Forum Moderator
Posts: 5502
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge

Re: egl_render and vsync

Tue Jan 06, 2015 5:18 pm

Awesome!

cbratschi
Posts: 9
Joined: Mon Oct 17, 2016 3:19 pm

Re: egl_render and vsync

Fri Apr 07, 2017 12:19 pm

I am running into the same issue. So far I am using a single egl_render output buffer, but at 1080p@60 a 29.7 fps video does not run smoothly on an RPi 3. There are tearing effects, and the OpenGL thread can only deliver about 27 frames per second. I have tried a lot of code changes, but I guess more than one buffer is needed to get smooth playback.

I had a quick look at the XBMC code, which uses 4 output buffers. I have not yet checked how the textures are rotated to get smooth playback.

@luc4: is your code available online?

Thanks,
Christoph
