6by9 wrote: ↑Wed Jun 26, 2019 4:19 pm
> OpenMAX egl_render talks under the hood between decoder and 3D - that won't work.
> Still bashing the last little bits into shape, but use MMAL or V4L2 to decode into a buffer, export that as a dmabuf, and then pass that into EGL or DRM for rendering. https://github.com/6by9/drm_mmal master branch should do that for DRM (ie do not start X), and the x11 or x11_export branches should do the right thing for EGL and X (x11_export will fail at present).

Unfortunately the master branch, which targets DRM, does not use OpenGL, and it is not immediately clear to me how to get the video frames into a texture. The x11 branch does not build out of the box; after fixing the Makefile to add the missing libraries it builds, but it does not play the video correctly. Also, the x11 branch uses eglCreateImageKHR from /opt/vc/lib/libbrcmEGL.so, whereas with fkms I need to link against /usr/lib/arm-linux-gnueabihf/libEGL.so, otherwise I get runtime errors such as "* failed to add service - already in use?".
Is there any example code showing how to get hardware-accelerated video playback into a texture that combines well with hardware-accelerated OpenGL(ES)?