OpenMax rendering onto OpenGL texture


by Twinkletoes » Fri May 25, 2012 11:04 pm
So... I'm trying to get a video onto an OpenGL texture...

I've opened up the wonder that is hello_video, and I can sort of see what's going on. However, I can't work out how you would redirect the decoded frame from the video_render component onto a texture, or even into a memory buffer.

Anyone done this before elsewhere? I'm from a DirectShow background, and I thought that was badly documented... then I met OpenMAX. Lots of presentations saying lovely things about it, but no documentation or code examples that I can find. Am I missing a book or a site that goes into details? Google is not my friend today... :)

Bryan
Posts: 202
Joined: Fri May 25, 2012 9:44 pm
by dom » Fri May 25, 2012 11:30 pm
Like this?
http://www.youtube.com/watch?v=-y3m_HFg4Do
It is possible with OpenMAX, but it's tricky.
I hope to get some example code out, but it needs quite a lot of work.
Raspberry Pi Engineer & Forum Moderator
Posts: 3989
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge
by Twinkletoes » Tue May 29, 2012 9:04 am
Exactly like that!
Can you point me at the relevant docs or PM me your code? I'm happy to help clean it up into an example.
Posts: 202
Joined: Fri May 25, 2012 9:44 pm
by Twinkletoes » Tue May 29, 2012 4:29 pm
So... are you creating a new render component to render to a texture, or using bitBLT to push the data into a PBO, or something more sneaky?
Posts: 202
Joined: Fri May 25, 2012 9:44 pm
by dom » Tue May 29, 2012 5:23 pm
There is an egl_render OpenMAX IL component that can take input in place of video_render and produce a texture.
Note: the decoded video frame and the texture produced are both server-side, and the client (ARM) never needs to see the pixels.

Doing it by fetching the decoded video frames to the ARM, and passing them back in as texture data would be much less efficient.
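
A minimal sketch of that wiring, assuming the ilclient helpers from hello_video (port numbers 130/131 for video_decode and 220/221 for egl_render are the Broadcom ones; error handling and the data-feeding loop are omitted, so treat this as a shape, not working code):

```c
#include <string.h>
#include "bcm_host.h"
#include "ilclient.h"

// Sketch: tunnel video_decode's output into egl_render's input so the
// decoded frames stay on the GPU and end up in an EGLImage/texture.
static int setup_pipeline(void)
{
   ILCLIENT_T *client;
   COMPONENT_T *video_decode = NULL, *egl_render = NULL;
   TUNNEL_T tunnel[2];

   memset(tunnel, 0, sizeof(tunnel));
   bcm_host_init();

   if ((client = ilclient_init()) == NULL)
      return -1;
   if (OMX_Init() != OMX_ErrorNone)
      return -2;

   // video_decode takes compressed data on port 130 from the ARM side.
   ilclient_create_component(client, &video_decode, "video_decode",
      ILCLIENT_DISABLE_ALL_PORTS | ILCLIENT_ENABLE_INPUT_BUFFERS);

   // egl_render will write decoded frames into an EGLImage via port 221.
   ilclient_create_component(client, &egl_render, "egl_render",
      ILCLIENT_DISABLE_ALL_PORTS | ILCLIENT_ENABLE_OUTPUT_BUFFERS);

   // Decoder output (131) straight into egl_render input (220);
   // the pixels never cross over to the ARM.
   set_tunnel(tunnel, video_decode, 131, egl_render, 220);

   // ... feed data into port 130, wait for the port-settings-changed
   // event, call ilclient_setup_tunnel(tunnel, 0, 0), then pass an
   // EGLImage to port 221 with OMX_UseEGLImage and start filling.
   return 0;
}
```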
Raspberry Pi Engineer & Forum Moderator
Posts: 3989
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge
by Twinkletoes » Tue May 29, 2012 6:13 pm
I've googled eglrender openmax and not found anything - where should I be looking for details?

Are you able to send me messy code that I can start to help to clean up?
Posts: 202
Joined: Fri May 25, 2012 9:44 pm
by dom » Tue May 29, 2012 8:53 pm
Some OpenMAX documentation here:
https://github.com/raspberrypi/firmware ... umentation

Unfortunately the tidying of the code involves removing dependencies and adding appropriate licenses, so we can't release it until that's done.
Raspberry Pi Engineer & Forum Moderator
Posts: 3989
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge
by fabpica » Wed Jun 13, 2012 10:39 am
I am also very interested in using egl_render. Could you just post part of the code using this component? The Broadcom documentation is very poor.
How do you use the output of egl_render with OpenGL ES?
Thanks for your answer.
Posts: 3
Joined: Wed Jun 13, 2012 10:36 am
by mjdale » Wed Jun 27, 2012 4:09 pm
Hi, I am looking to do the exact same thing, but onto a flat surface.

Is there any more documentation that I can get hold of to enable this?

Could any code snippets be released to enable the connection between the egl_render component and an EGL surface? From what I can tell, all that is required is a callback from the component on render completion.

Also, can someone post an example of how to get the video_decoder component to decode into a buffer, rather than using a tunnel to the video_render component?

The Broadcom documentation is far from helpful regarding these features, considering video is the primary feature of the device.
Posts: 1
Joined: Wed Jun 27, 2012 3:54 pm
by andybastable » Wed Jul 04, 2012 11:09 am
+1 for any sort of sample code to help us here. The OpenMAX docs are not particularly clear!
Posts: 8
Joined: Thu May 17, 2012 12:16 pm
by JohannesTaelman » Thu Aug 02, 2012 12:46 pm
another +1 bump
Posts: 1
Joined: Thu Aug 02, 2012 12:43 pm
by dhorbury » Thu Aug 09, 2012 3:08 pm
Lack of progress is frustrating...

We've created an EGLImage from a 1280x720 GL 32-bit texture. The texture is initialised as a yellow checkerboard pattern.
Replaced the video_render component from a working video playback setup with an egl_render component.
Set the pNativeWindow member on the output to point at the EGLImage created previously.
(Should pNativeRender be set on the port, and if so, to what? It doesn't accept the EGLDisplay or DISPMANX_DISPLAY_HANDLE_T created when starting up OpenGL ES 2.)
But still no joy. The egl_render component is receiving something from the decoder, as it reports that the video stream on its input port has dimensions 1280x720.

The output port never gets enabled though. Is it supposed to be tunnelled to another component?
We've tried a null_sink and video_render, but ilclient_setup_tunnel reports -5, indicating the data format from the egl_render output port isn't acceptable.

The texture never has anything written to it.

Does anyone have any further pointers or a list of modules and steps required?

Many thanks,

Dan
Posts: 1
Joined: Thu May 31, 2012 1:30 pm
by andybastable » Fri Aug 10, 2012 12:14 pm
dom, seeing as you've had this working -- any chance you could give us some pointers?
Posts: 8
Joined: Thu May 17, 2012 12:16 pm
by Twinkletoes » Fri Aug 10, 2012 4:59 pm
Dom -
Is this code in user space or is it part of the GPU blob? Is this something it's worth trying ourselves, or should we wait for your code to be released? Any thoughts on timescale (e.g. not earlier than X)?

Bryan
Posts: 202
Joined: Fri May 25, 2012 9:44 pm
by asb » Fri Aug 10, 2012 5:15 pm
Twinkletoes wrote:Dom -
Is this code in user space or is it part of the GPU blob? Is this something it's worth trying ourselves, or should we wait for your code to be released? Any thoughts on timescale (e.g. not earlier than X)?

Bryan


The OpenMAX EGL_render extension is present and, as far as I know, functional in current firmware builds: https://github.com/raspberrypi/firmware ... ender.html
Raspberry Pi Engineer & Forum Moderator
Posts: 788
Joined: Fri Sep 16, 2011 7:16 pm
by Twinkletoes » Fri Aug 10, 2012 5:51 pm
How do I use an EGLImage from OpenGL? Can I define an EGLImage with its storage in a GL texture? Or would I need to blit every frame?
Posts: 202
Joined: Fri May 25, 2012 9:44 pm
by asb » Fri Aug 10, 2012 5:57 pm
Twinkletoes wrote:How do I use an EGLImage from OpenGL? Can I define an EGLImage with its storage in a GL texture? Or would I need to blit every frame?


I think you want eglCreateImageKHR for this.
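
A sketch of that approach, assuming an EGL context is already current ('display' and 'context' come from your own EGL setup, and the 1280x720 size is just an example):

```c
#define EGL_EGLEXT_PROTOTYPES
#include <stdint.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>

// Sketch: give an ordinary GL texture backing storage, then wrap it in
// an EGLImage that egl_render can write into.
static EGLImageKHR make_video_image(EGLDisplay display, EGLContext context,
                                    GLuint *tex_out)
{
   GLuint tex;
   glGenTextures(1, &tex);
   glBindTexture(GL_TEXTURE_2D, tex);
   // Storage must exist before the image is created from the texture.
   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1280, 720, 0,
                GL_RGBA, GL_UNSIGNED_BYTE, NULL);
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

   *tex_out = tex;
   return eglCreateImageKHR(display, context, EGL_GL_TEXTURE_2D_KHR,
                            (EGLClientBuffer)(intptr_t)tex, NULL);
}
```

The EGLImage handle then gets passed to egl_render's output port via OMX_UseEGLImage; drawing the video afterwards is just drawing that texture, with no per-frame blit.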
Raspberry Pi Engineer & Forum Moderator
Posts: 788
Joined: Fri Sep 16, 2011 7:16 pm
by elburger » Mon Oct 29, 2012 3:03 am
Bump.

Has anyone had any luck with this? Dom, I completely understand if you can't publish the code for any reason, but would it be possible for you to outline the "recipe" you are using?

What I'm currently doing:
- Create an OpenGL texture with 1280x720 resolution (matches my test video) and RGBA8888 format. Create an EGLImage that points to the texture.
- Create a video_decoder
- Create an egl_render
- Disable all ports on both components
- Set the pNativeWindow of the output port of the egl_render to a valid EGLDisplay handle
- Set the eCompressionFormat of the input port of the video_decoder. I believe the input of the decoder works just fine as I can see the frames if I use a video_render instead of the egl_render.
- Change the state of the video_decoder to OMX_StateIdle
- Allocate input buffers for the decoder using OMX_AllocateBuffer()
- Enable the input port of the decoder
- Change the state of the decoder to OMX_StateExecuting
- Feed the decoder some data until the decoder's output port changes settings
- Create a tunnel between the output port of the decoder and the input port of the egl_render
- Enable the decoder's output port
- Enable the egl_render's input port
- Change the egl_render's state to OMX_StateIdle
- Printing the definitions for all the ports here shows the frame dimensions are 1280x720 as expected, but for some reason the egl_render output port's pNativeWindow has become 0 (?!)
- Enable the output port of the egl_render
- Call OMX_UseEGLImage for the egl_render's output port
- Change egl_render's state to OMX_StateExecuting
- Call OMX_FillThisBuffer with the egl_render's handle and the buffer header I got from the OMX_UseEGLImage call
- Keep feeding the decoder more data as decoder's input buffers free up
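
In raw IL calls, the tail of that sequence looks roughly like this (a sketch: 'eglRender' is a hypothetical component handle, 221 is egl_render's output port, and the callback would be registered via the OMX_CALLBACKTYPE passed to OMX_GetHandle):

```c
#include <IL/OMX_Core.h>
#include <IL/OMX_Component.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

// Each time egl_render finishes writing a frame into the EGLImage,
// re-queue the buffer so the next frame gets rendered as well.
static OMX_ERRORTYPE fill_buffer_done(OMX_HANDLETYPE hComponent,
                                      OMX_PTR pAppData,
                                      OMX_BUFFERHEADERTYPE *pBuffer)
{
   return OMX_FillThisBuffer(hComponent, pBuffer);
}

// Hand the EGLImage to egl_render's output port and start the first fill.
static OMX_BUFFERHEADERTYPE *start_egl_render(OMX_HANDLETYPE eglRender,
                                              EGLImageKHR eglImage)
{
   OMX_BUFFERHEADERTYPE *eglBuffer = NULL;

   OMX_SendCommand(eglRender, OMX_CommandPortEnable, 221, NULL);
   // Use the EGLImage in place of an ordinary client-allocated buffer.
   OMX_UseEGLImage(eglRender, &eglBuffer, 221, NULL, eglImage);

   OMX_SendCommand(eglRender, OMX_CommandStateSet, OMX_StateExecuting, NULL);

   // One FillThisBuffer requests one frame; fill_buffer_done re-queues.
   OMX_FillThisBuffer(eglRender, eglBuffer);
   return eglBuffer;
}
```

(One general IL gotcha: each OMX_FillThisBuffer only requests a single fill, so if FillBufferDone doesn't re-queue the buffer you'd only ever get one frame.)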

The BufferFillDone callback never gets called. :(

The pixel format of the decoder is OMX_COLOR_FormatYUV420PackedPlanar. This isn't very convenient from a GLSL shader perspective. I looked at the different colour format options and OMX_COLOR_BRCMEGL seems interesting. I tried setting the decoder's output format to this, but doing it with OMX_IndexParamPortDefinition results in a bad parameter error. The same thing happens for RGB565, even though the video_decoder documentation mentions that this should be possible. I also tried setting the format with OMX_VIDEO_PARAM_PORTFORMATTYPE, but setting it that way doesn't seem to stick (no error, though).

Any ideas or advice?
Posts: 1
Joined: Mon Oct 22, 2012 8:04 am
Location: San Jose, CA
by luc4 » Sun Dec 02, 2012 3:48 pm
I'm stuck in the exact same situation, trying to decode an image and place it in the EGLImage. OMX_COLOR_FormatYUV420PackedPlanar is the colour format and FillBufferDone never gets called. Has anyone solved this?
Posts: 29
Joined: Mon Nov 12, 2012 12:28 am
by savuporo » Wed Dec 05, 2012 7:44 pm
elburger: did you check the number of buffers required? The XBMC project seemed to have working code at some point, see http://bit.ly/UodIIy

AllocOMXOutputEGLTextures
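
A sketch of that check against the standard IL headers ('eglRender' and port 221 are assumptions carried over from the earlier posts); the port won't finish enabling until it has been given nBufferCountActual buffers, so you'd want one OMX_UseEGLImage call per buffer reported:

```c
#include <stdio.h>
#include <string.h>
#include <IL/OMX_Core.h>
#include <IL/OMX_Component.h>

// Sketch: ask egl_render how many output buffers it expects before
// handing it EGLImages.
static void print_buffer_count(OMX_HANDLETYPE eglRender)
{
   OMX_PARAM_PORTDEFINITIONTYPE portdef;
   memset(&portdef, 0, sizeof(portdef));
   portdef.nSize = sizeof(portdef);
   portdef.nVersion.s.nVersionMajor = OMX_VERSION_MAJOR;
   portdef.nVersion.s.nVersionMinor = OMX_VERSION_MINOR;
   portdef.nPortIndex = 221; // egl_render output port

   if (OMX_GetParameter(eglRender, OMX_IndexParamPortDefinition,
                        &portdef) == OMX_ErrorNone)
      printf("port 221: nBufferCountActual=%u, nBufferCountMin=%u\n",
             (unsigned)portdef.nBufferCountActual,
             (unsigned)portdef.nBufferCountMin);
}
```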
Posts: 1
Joined: Wed Dec 05, 2012 7:41 pm
by dom » Sat Dec 08, 2012 3:03 pm
I've got permission to post the important bits of the source to videocube (as seen in the YouTube video posted earlier).

Unfortunately this is written to use a different library framework, so it is not directly usable, but there may be some nuggets that can be picked out to get your own code working.

Note: the "playback_il" interface uses a read_media OpenMAX component running on the GPU.
This isn't available (it is better done by the ARM), and you'll need video decode to be done more like hello_video or omxplayer.

The (obsolete) framework this uses is a message-passing, event-driven mechanism, where code starts from PLATFORM_MSG_INIT.
(I'd recommend ignoring that, and starting your code from main.)

https://dl.dropbox.com/u/3669512/temp/videocube.tar.gz
Raspberry Pi Engineer & Forum Moderator
Posts: 3989
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge
by Twinkletoes » Sat Dec 08, 2012 3:23 pm
Would we be allowed to see the documentation pngs in the \image folder?
Posts: 202
Joined: Fri May 25, 2012 9:44 pm
by dom » Sat Dec 08, 2012 3:33 pm
Twinkletoes wrote:Would we be allowed to see the documentation pngs in the \image folder?

Can you explain what \image folder you are referring to?
Raspberry Pi Engineer & Forum Moderator
Posts: 3989
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge
by Twinkletoes » Sat Dec 08, 2012 4:33 pm
Referenced in comments in playback.h
Posts: 202
Joined: Fri May 25, 2012 9:44 pm
by dom » Sat Dec 08, 2012 4:58 pm
Twinkletoes wrote:Referenced in comments in playback.h

Okay, here:
https://dl.dropbox.com/u/3669512/temp/p ... images.zip
Raspberry Pi Engineer & Forum Moderator
Posts: 3989
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge