Our OpenMAX-based application needs to take a few still JPEG pictures at 8 MP as fast as possible. The shortest time I have achieved between two JPEGs in sequence is about 700 ms, which is too long, so I want to use the picamera approach of capturing video frames instead of still images. How can I pipeline the IL components to connect camera -> video_splitter -> image_encoder? The video splitter documentation says:
When using proprietary communication on the input port, a single still image can be captured at one of the image output ports. The image is supplied as stripes via buffers on the output port. The image will be captured every time the port transitions from IDLE to EXECUTING.
But when I try to tunnel from video splitter output port 251 to image encoder input port 340, I get an error that these two ports are not compatible.
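For reference, this is roughly the tunneling code that fails (a minimal sketch, not my full program: handle acquisition via `OMX_GetHandle`, port disabling, and state transitions are omitted, and I am assuming the standard port numbers 71 for the camera video output, 250 for the splitter input, and 340 for the image_encode input):

```c
#include <IL/OMX_Core.h>

/* hCamera, hSplitter, hEncoder are handles previously obtained with
 * OMX_GetHandle for OMX.broadcom.camera, OMX.broadcom.video_splitter
 * and OMX.broadcom.image_encode respectively. */
OMX_ERRORTYPE tunnel_pipeline(OMX_HANDLETYPE hCamera,
                              OMX_HANDLETYPE hSplitter,
                              OMX_HANDLETYPE hEncoder)
{
    OMX_ERRORTYPE err;

    /* camera video output (port 71) -> video_splitter input (port 250):
     * this tunnel sets up fine. */
    err = OMX_SetupTunnel(hCamera, 71, hSplitter, 250);
    if (err != OMX_ErrorNone)
        return err;

    /* video_splitter output (port 251) -> image_encode input (port 340):
     * this is the call that fails with a ports-not-compatible error. */
    err = OMX_SetupTunnel(hSplitter, 251, hEncoder, 340);
    return err;
}
```

The second `OMX_SetupTunnel` call is the one that reports the incompatibility between ports 251 and 340.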