kkk627
Posts: 5
Joined: Mon Jan 28, 2019 5:38 am

Decode JPEG stream using MMAL or OpenMax

Mon Jan 28, 2019 6:15 am

Hello there.
I'm working on making a JPEG stream decoder in C++.
The program repeatedly decodes JPEG images streamed from a server.
Each decoded image is written to shared memory as RGB (not RGBA).
My Pi is a Raspberry Pi 3 Model B+ (Raspbian Stretch).

Currently the program uses libjpeg-turbo in two decoder threads (it can use NEON acceleration).
I'd like to make the program faster, so I'm trying to implement hardware decoding.
I found that I have to use MMAL or OpenMAX to do that, but the sample code is quite difficult for me.

So my questions are:
- How can I implement a JPEG decoder using MMAL or OpenMAX (to get a raw RGB image, not to display it)?
- Is hardware JPEG decoding faster than using libjpeg-turbo in multiple threads?
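
For context, my current software path looks roughly like this (a minimal TurboJPEG sketch only; threading, error handling and the shared-memory hand-off are omitted, and the caller is assumed to know the image size in advance):

Code: Select all

/* Rough sketch of the current libjpeg-turbo (TurboJPEG API) decode path.
 * rgb_out must be at least width*height*3 bytes. */
#include <turbojpeg.h>

int decode_jpeg_to_rgb(const unsigned char *jpeg, unsigned long jpeg_size,
                       unsigned char *rgb_out)
{
    tjhandle tj = tjInitDecompress();
    int width, height, subsamp, colorspace, ret = -1;

    if (tj &&
        tjDecompressHeader3(tj, jpeg, jpeg_size, &width, &height,
                            &subsamp, &colorspace) == 0)
        ret = tjDecompress2(tj, jpeg, jpeg_size, rgb_out,
                            width, 0 /* pitch 0 => width*3 */, height,
                            TJPF_RGB, TJFLAG_FASTDCT);
    if (tj)
        tjDestroy(tj);
    return ret;
}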

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 6422
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Decode JPEG stream using MMAL or OpenMax

Mon Jan 28, 2019 11:53 am

kkk627 wrote:
Mon Jan 28, 2019 6:15 am
- How can I implement a JPEG decoder using MMAL or OpenMAX (to get a raw RGB image, not to display it)?
https://github.com/6by9/userland/blob/h ... peg/jpeg.c
It will only decode to YUV, as it is taking the output direct out of the JPEG hardware block and JPEG generally encodes as YUV.
(I really must get around to cleaning those up and getting them merged into the main userland repo).
kkk627 wrote: - Is hardware JPEG decoding faster than using libjpeg-turbo in multiple threads?
It depends.
There is a setup phase required, so for small images that may take longer than just decoding it on the ARM. For larger images it should be faster. Try it and find out.
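
A rough way to measure it (decode_one_frame() here is just a placeholder for whichever path you're timing):

Code: Select all

/* Crude timing harness sketch; decode_one_frame() stands in for either the
 * libjpeg-turbo path or the MMAL path. */
#include <stdio.h>
#include <time.h>

static double now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1e6;
}

/* Example use:
 *   double t0 = now_ms();
 *   for (int i = 0; i < 100; i++)
 *      decode_one_frame();
 *   printf("%.2f ms per frame\n", (now_ms() - t0) / 100.0);
 */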
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

kkk627
Posts: 5
Joined: Mon Jan 28, 2019 5:38 am

Re: Decode JPEG stream using MMAL or OpenMax

Mon Jan 28, 2019 2:55 pm

6by9 wrote:
Mon Jan 28, 2019 11:53 am
https://github.com/6by9/userland/blob/h ... peg/jpeg.c
It will only decode to YUV, as it is taking the output direct out of the JPEG hardware block and JPEG generally encodes as YUV.
(I really must get around to cleaning those up and getting them merged into the main userland repo).
Actually, I have already tried to use this source code.
But I couldn't compile it because of missing header files and shared libraries.
Moreover, I couldn't work out how to modify it to decode a JPEG stream.

I'm embarrassed by my lack of skill...
Could you tell me how to use this code?

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 6422
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Decode JPEG stream using MMAL or OpenMax

Mon Jan 28, 2019 3:16 pm

kkk627 wrote:
Mon Jan 28, 2019 2:55 pm
6by9 wrote:
Mon Jan 28, 2019 11:53 am
https://github.com/6by9/userland/blob/h ... peg/jpeg.c
It will only decode to YUV, as it is taking the output direct out of the JPEG hardware block and JPEG generally encodes as YUV.
(I really must get around to cleaning those up and getting them merged into the main userland repo).
Actually, I have already tried to use this source code.
But I couldn't compile it because of missing header files and shared libraries.
Moreover, I couldn't work out how to modify it to decode a JPEG stream.

I'm embarrassed by my lack of skill...
Could you tell me how to use this code?
See https://github.com/6by9/userland/blob/h ... _pi/README for how to build all the hello_pi apps (although for some reason the IL ones fail to build cleanly at the moment).

Full instructions

Code: Select all

git clone https://github.com/6by9/userland
cd userland
git checkout -t -b hello_mmal origin/hello_mmal
cd host_applications/linux/apps/hello_pi/hello_mmal_jpeg
make
./hello_mmal_jpeg.bin foo.jpg
https://github.com/6by9/userland/blob/h ... peg.c#L348 is where data is read in from the file and sent to the GPU in chunks.
https://github.com/6by9/userland/blob/h ... peg.c#L426 is where you would want to do something with the decoded image.
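
As a rough sketch of what "do something with the decoded image" could look like (here just appending the raw YUV payload to a file; save_yuv is only an illustrative name, and copying into shared memory follows the same pattern):

Code: Select all

/* Sketch only: dump the decoded YUV payload of one output buffer.
 * Assumes the usual MMAL and stdio headers are already included. */
static void save_yuv(MMAL_BUFFER_HEADER_T *buffer, FILE *out)
{
   if (mmal_buffer_header_mem_lock(buffer) != MMAL_SUCCESS)
      return;                               /* lock before touching ->data */
   fwrite(buffer->data + buffer->offset, 1, buffer->length, out);
   mmal_buffer_header_mem_unlock(buffer);
}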
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

kkk627
Posts: 5
Joined: Mon Jan 28, 2019 5:38 am

Re: Decode JPEG stream using MMAL or OpenMax

Mon Jan 28, 2019 6:04 pm

6by9 wrote:
Mon Jan 28, 2019 3:16 pm

See https://github.com/6by9/userland/blob/h ... _pi/README for how to build all the hello_pi apps (although for some reason the IL ones fail to build cleanly at the moment).

Full instructions

Code: Select all

git clone https://github.com/6by9/userland
cd userland
git checkout -t -b hello_mmal origin/hello_mmal
cd host_applications/linux/apps/hello_pi/hello_mmal_jpeg
make
./hello_mmal_jpeg.bin foo.jpg
https://github.com/6by9/userland/blob/h ... peg.c#L348 is where data is read in from the file and sent to the GPU in chunks.
https://github.com/6by9/userland/blob/h ... peg.c#L426 is where you would want to do something with the decoded image.
Thank you for your advice.
But now I have some new questions:
- I'd like to know how to use the mmal_queue.
I think the mmal_queue is important for decoding JPEG frames repeatedly.
- At line 426, how can I access the RGB data in the variable "buffer"?
To copy the data, is "const unsigned char *buf = buffer->data;" OK?

zephyr5415
Posts: 6
Joined: Tue Jan 08, 2019 7:39 am

Re: Decode JPEG stream using MMAL or OpenMax

Fri Feb 01, 2019 5:45 am

I recently tried to use MMAL to decode a JPEG stream (MJPEG without timestamps) on a Raspberry Pi, and I have already tried the source code from
https://github.com/6by9/rpi_mmal_examples.git
My question: I simply modified graph_decode_render.c, which originally decodes an H264 file and outputs it to the monitor, and changed the format of the input port to MMAL_ENCODING_MJPEG as follows:

/* Set format of video decoder input port */
format_in = decoder->input[0]->format;
format_in->type = MMAL_ES_TYPE_VIDEO;
format_in->encoding = MMAL_ENCODING_MJPEG;
format_in->es->video.width = 1920 ;
format_in->es->video.height = 1080 ;
format_in->es->video.frame_rate.num = 0;
format_in->es->video.frame_rate.den = 1;
format_in->es->video.par.num = 1;
format_in->es->video.par.den = 1;

But I got an error from control_callback with the status MMAL_STATUS_EIO.
Why, and what am I missing?
P.S.: the input file is a single 1920x1080 JPEG file.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 6422
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Decode JPEG stream using MMAL or OpenMax

Fri Feb 01, 2019 12:31 pm

kkk627 wrote:
Mon Jan 28, 2019 6:04 pm
6by9 wrote:
Mon Jan 28, 2019 3:16 pm

See https://github.com/6by9/userland/blob/h ... _pi/README for how to build all the hello_pi apps (although for some reason the IL ones fail to build cleanly at the moment).

Full instructions

Code: Select all

git clone https://github.com/6by9/userland
cd userland
git checkout -t -b hello_mmal origin/hello_mmal
cd host_applications/linux/apps/hello_pi/hello_mmal_jpeg
make
./hello_mmal_jpeg.bin foo.jpg
https://github.com/6by9/userland/blob/h ... peg.c#L348 is where data is read in from the file and sent to the GPU in chunks.
https://github.com/6by9/userland/blob/h ... peg.c#L426 is where you would want to do something with the decoded image.
Thank you for your advice.
But now I have some new questions:
- I'd like to know how to use the mmal_queue.
I think the mmal_queue is important for decoding JPEG frames repeatedly.
mmal_queue is just a helper library that offers simple buffer management. mmal_pool is built on top of it and also handles allocating the buffers in the first place. There is Doxygen markup in the headers, which someone has rendered to HTML at http://www.jvcref.com/files/PI/document ... index.html

Call mmal_queue_get(pool->queue) and you'll be given the first buffer header from the pool. When you're done with it, call mmal_buffer_header_release(buffer) and it will be returned to the pool. You can also register an optional callback that is invoked when buffers are released back to the pool.
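
A minimal sketch of that flow (fill_with_jpeg_data is a placeholder for your own code; error checks omitted):

Code: Select all

/* Create a pool sized for the input port, then recycle its headers forever. */
MMAL_POOL_T *pool_in = mmal_port_pool_create(decoder->input[0],
                                             decoder->input[0]->buffer_num,
                                             decoder->input[0]->buffer_size);

MMAL_BUFFER_HEADER_T *buffer = mmal_queue_get(pool_in->queue); /* NULL if all buffers are in flight */
if (buffer)
{
   buffer->length = fill_with_jpeg_data(buffer->data, buffer->alloc_size); /* placeholder */
   buffer->offset = 0;
   mmal_port_send_buffer(decoder->input[0], buffer);
}

/* In the input port callback the component hands the header back, so: */
/*    mmal_buffer_header_release(buffer);   -> returned to pool_in->queue */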
kkk627 wrote: - At line 426, how can I access the RGB data in the variable "buffer"?
To copy the data, is "const unsigned char *buf = buffer->data;" OK?
You won't get RGB data back from image_decode if decoding a JPEG, only YUV.
Yes, buffer->data will point at the data.
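
If you really do need RGB in shared memory you'll have to convert on the ARM (or via another component). A naive, unoptimised per-pixel sketch, assuming the output really is planar I420 and that the strides are taken from the output port format:

Code: Select all

/* Naive I420 -> packed RGB24 conversion sketch (BT.601-style coefficients).
 * Strides must come from the output port format; this is far slower than
 * libjpeg-turbo's NEON path, so only use it if RGB is really required. */
#include <stdint.h>

static void i420_to_rgb24(const uint8_t *y_p, int y_stride,
                          const uint8_t *u_p, const uint8_t *v_p, int uv_stride,
                          uint8_t *rgb, int width, int height)
{
   for (int j = 0; j < height; j++)
   {
      for (int i = 0; i < width; i++)
      {
         int y = y_p[j * y_stride + i];
         int u = u_p[(j / 2) * uv_stride + i / 2] - 128;
         int v = v_p[(j / 2) * uv_stride + i / 2] - 128;
         int r = y + ((91881 * v) >> 16);
         int g = y - ((22554 * u + 46802 * v) >> 16);
         int b = y + ((116130 * u) >> 16);
         uint8_t *px = rgb + (j * width + i) * 3;
         px[0] = r < 0 ? 0 : r > 255 ? 255 : r;
         px[1] = g < 0 ? 0 : g > 255 ? 255 : g;
         px[2] = b < 0 ? 0 : b > 255 ? 255 : b;
      }
   }
}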
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 6422
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Decode JPEG stream using MMAL or OpenMax

Fri Feb 01, 2019 12:38 pm

zephyr5415 wrote:
Fri Feb 01, 2019 5:45 am
I recently tried to use MMAL to decode a JPEG stream (MJPEG without timestamps) on a Raspberry Pi, and I have already tried the source code from
https://github.com/6by9/rpi_mmal_examples.git
My question: I simply modified graph_decode_render.c, which originally decodes an H264 file and outputs it to the monitor, and changed the format of the input port to MMAL_ENCODING_MJPEG as follows:

/* Set format of video decoder input port */
format_in = decoder->input[0]->format;
format_in->type = MMAL_ES_TYPE_VIDEO;
format_in->encoding = MMAL_ENCODING_MJPEG;
format_in->es->video.width = 1920 ;
format_in->es->video.height = 1080 ;
format_in->es->video.frame_rate.num = 0;
format_in->es->video.frame_rate.den = 1;
format_in->es->video.par.num = 1;
format_in->es->video.par.den = 1;

But I got an error from control_callback with the status MMAL_STATUS_EIO.
Why, and what am I missing?
P.S.: the input file is a single 1920x1080 JPEG file.
video_decode is really designed for decoding MJPEG or H264, not single JPEGs. It would probably work and ignore EXIF headers and the like, but image_decode is intended to decode single images (and provides access to EXIF metadata, thumbnails, and the like).
In either case the input port width/height/frame_rate are irrelevant as that information is encoded in the input bitstream. The output format will be changed based on the bitstream, and you'll get a MMAL_EVENT_FORMAT_CHANGED event callback (buffer->cmd != 0) from the output port.

Without your code I can only make guesses. MMAL_STATUS_EIO normally means that you have submitted a buffer that is smaller than port->buffer_size.
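
For reference, spotting the format-changed event in an output port callback looks roughly like this (output_callback is only illustrative, and the actual port reconfiguration - disable, resize/recreate buffers, commit the new format, re-enable - is omitted):

Code: Select all

/* Sketch of handling MMAL_EVENT_FORMAT_CHANGED in the output port callback.
 * Assumes the usual MMAL and stdio headers are already included. */
static void output_callback(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer)
{
   if (buffer->cmd == MMAL_EVENT_FORMAT_CHANGED)
   {
      MMAL_EVENT_FORMAT_CHANGED_T *ev = mmal_event_format_changed_get(buffer);
      fprintf(stderr, "new output format %ux%u\n",
              ev->format->es->video.width, ev->format->es->video.height);
      /* Copy ev->format into port->format and reconfigure the port here
         before sending any more output buffers. */
      mmal_buffer_header_release(buffer);
      return;
   }

   /* buffer->cmd == 0: a normal decoded frame - hand it to the main thread,
      then release it back to its pool when done. */
   mmal_buffer_header_release(buffer);
}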
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

zephyr5415
Posts: 6
Joined: Tue Jan 08, 2019 7:39 am

Re: Decode JPEG stream using MMAL or OpenMax

Tue Feb 12, 2019 2:37 am

Hi @6by9,
Sorry for the late reply.

The code I referenced is from here:
https://github.com/6by9/rpi_mmal_exampl ... e_render.c
which uses a graph connection to connect the decode and render components.

I made only minor changes, shown below; I think I must be missing something needed to go from playing an H264 file to decoding MJPEG from memory.

Code: Select all

#ifdef VERBOSE
#undef VERBOSE
#endif

#include "bcm_host.h"
#include "mmal.h"
#include "util/mmal_graph.h"
#include "util/mmal_default_components.h"
#include "util/mmal_util_params.h"
#include "util/mmal_util.h"

#include <stdio.h>
#include "interface/vcos/vcos.h"

#include <sys/time.h>


#define USE_1080P 1

static unsigned int get_ms()
{
    struct timeval t ;
    gettimeofday(&t,NULL) ;
    return ((t.tv_sec * 1000) + (t.tv_usec/ 1000)) ;
}

static int loadFile(const char * fname, char * buffer) 
{
    int sz = 0 ;
    FILE * fp = fopen(fname,"rb") ;
    if ( fp ) {
        fseek(fp,0,SEEK_END) ;
        sz = ftell(fp) ;
        fseek(fp,0,SEEK_SET) ;
        fread(buffer,1,sz,fp) ;
        fclose(fp) ;
    }
    return sz ;
}

static int loadFiles( char ** buf, unsigned int *szs)
{
    buf[0] = calloc(1920*1080*2,1) ;
    buf[1] = calloc(1920*1080*2,1) ;
    #if USE_1080P
    szs[0] = loadFile("./bg.jpg",buf[0]) ;
    szs[1] = loadFile("./bg2.jpg",buf[1]) ; 
    #else
    szs[0] = loadFile("./sj1.jpg",buf[0]) ;
    szs[1] = loadFile("./sj2.jpg",buf[1]) ;
    #endif
    fprintf(stdout,"buf[0]:%p sz:%u, buf[1]:%p sz:%u\n",buf[0],szs[0],buf[1],szs[1]) ;
    return 1 ;
}

#define CHECK_STATUS(status, msg) if (status != MMAL_SUCCESS) { fprintf(stderr, msg"\n"); goto error; }


static uint8_t codec_header_bytes[128];
static unsigned int codec_header_bytes_size = sizeof(codec_header_bytes);

static FILE *source_file;

/* Macros abstracting the I/O, just to make the example code clearer */
#define SOURCE_OPEN(uri) \
    source_file = fopen(uri, "rb"); if (!source_file) goto error;
#define SOURCE_READ_CODEC_CONFIG_DATA(bytes, size) \
    size = fread(bytes, 1, size, source_file); rewind(source_file)
#define SOURCE_READ_DATA_INTO_BUFFER(a) \
    a->length = fread(a->data, 1, a->alloc_size - 128, source_file); \
    a->offset = 0; a->pts = a->dts = MMAL_TIME_UNKNOWN
#define SOURCE_CLOSE() \
    if (source_file) fclose(source_file)

/** Context for our application */
static struct CONTEXT_T {
    VCOS_SEMAPHORE_T semaphore;
    MMAL_QUEUE_T *queue;
    MMAL_STATUS_T status;
} context;

/** Callback from the decoder input port.
 * Buffer has been consumed and is available to be used again. */
static void input_callback(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer)
{
    struct CONTEXT_T *ctx = (struct CONTEXT_T *)port->userdata;

    /* The decoder is done with the data, just recycle the buffer header into its pool */
    mmal_buffer_header_release(buffer);

    /* Kick the processing thread */
    vcos_semaphore_post(&ctx->semaphore);

    //fprintf(stderr,"decoder input callback\n");
}

/** Callback from the control port.
 * Component is sending us an event. */
static void control_callback(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer)
{
   struct CONTEXT_T *ctx = (struct CONTEXT_T *)port->userdata;

   switch (buffer->cmd)
   {
   case MMAL_EVENT_EOS:
      /* Only sink component generate EOS events */
      break;
   case MMAL_EVENT_ERROR:
      /* Something went wrong. Signal this to the application */
      do {
          ctx->status = *(MMAL_STATUS_T *)buffer->data;
      } while (0) ;
      //fprintf(stderr,"error status:%s\n",mmal_status_to_string(ctx->status) ) ;
      break;
   case MMAL_EVENT_FORMAT_CHANGED :
      fprintf(stderr,"format changed\n");
      break ;
   default:
      break;
   }

   /* Done with the event, recycle it */
   mmal_buffer_header_release(buffer);

   //fprintf(stderr,"control cb. status %u\n", ctx->status);
}

int main(int argc, char* argv[]) {

    MMAL_STATUS_T status;
    MMAL_GRAPH_T *graph = 0;
    MMAL_COMPONENT_T *decoder = 0, *renderer=0;
    MMAL_POOL_T *pool_in = 0;
    MMAL_ES_FORMAT_T * format_in=0, * format_out=0 ;
    MMAL_PARAMETER_BOOLEAN_T zc;
    MMAL_BOOL_T eos_sent = MMAL_FALSE;
    MMAL_CONNECTION_T * connection = NULL ;

    char * bufs[2] = {NULL,NULL} ;
    unsigned int szs[2] = {0,0} ;
    loadFiles(bufs,szs) ;    

    bcm_host_init();
    
    vcos_semaphore_create(&context.semaphore, "example", 1);

    //SOURCE_OPEN("test.h264_2")


    /* Create the graph */
    status = mmal_graph_create(&graph, 0);
    CHECK_STATUS(status, "failed to create graph");

    status = mmal_graph_new_component(graph, MMAL_COMPONENT_DEFAULT_VIDEO_DECODER, &decoder);
    CHECK_STATUS(status, "failed to create decoder");

    status = mmal_graph_new_component(graph, MMAL_COMPONENT_DEFAULT_VIDEO_RENDERER, &renderer);
    CHECK_STATUS(status, "failed to create renderer");

    /* Enable control port so we can receive events from the component */
    decoder->control->userdata = (struct MMAL_PORT_USERDATA_T*)(void *)&context;
    status = mmal_port_enable(decoder->control, control_callback);
    CHECK_STATUS(status, "failed to enable control port");

    /* Set the zero-copy parameter on the input port */
    /*
    zc = {{MMAL_PARAMETER_ZERO_COPY, sizeof(zc)}, MMAL_TRUE};
    status = mmal_port_parameter_set(decoder->input[0], &zc.hdr);
    fprintf(stderr, "status: %i\n", status);*/
    status = mmal_port_parameter_set_boolean(decoder->input[0], MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);
    CHECK_STATUS(status, "input zero copy failed.");

    /* Set the zero-copy parameter on the output port */
    status = mmal_port_parameter_set_boolean(decoder->output[0], MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);
    CHECK_STATUS(status, "output zero copy failed.");

    /* Set format of video decoder input port */
    format_in = decoder->input[0]->format;
    format_in->type = MMAL_ES_TYPE_VIDEO;
    format_in->encoding = MMAL_ENCODING_MJPEG;
    //format_in->encoding_variant = MMAL_ENCODING_I420 ;
    #if USE_1080P
    format_in->es->video.width = 1920 ;
    format_in->es->video.height = 1080 ;
    #else
    format_in->es->video.width = 640 ;
    format_in->es->video.height = 590 ;
    #endif
    format_in->es->video.frame_rate.num = 0;
    format_in->es->video.frame_rate.den = 1;
    format_in->es->video.par.num = 1;
    format_in->es->video.par.den = 1;

    /* If the data is known to be framed then the following flag should be set:*/
    //format_in->flags |= MMAL_ES_FORMAT_FLAG_FRAMED;

    //format_out = decoder->output[0]->format;
    ////format_out->type = MMAL_ES_TYPE_VIDEO;
    //format_out->encoding = MMAL_ENCODING_I420;
    
    /*
    SOURCE_READ_CODEC_CONFIG_DATA(codec_header_bytes, codec_header_bytes_size);
    status = mmal_format_extradata_alloc(format_in, codec_header_bytes_size);
    CHECK_STATUS(status, "failed to allocate extradata");
    format_in->extradata_size = codec_header_bytes_size;
    if (format_in->extradata_size)
      memcpy(format_in->extradata, codec_header_bytes, format_in->extradata_size);
    */
    
    status = mmal_port_format_commit(decoder->input[0]);
    CHECK_STATUS(status, "failed to commit format");

    status = mmal_port_format_commit(decoder->output[0]);
    CHECK_STATUS(status, "failed to commit format");
    
    decoder->input[0]->buffer_num = decoder->input[0]->buffer_num_min;
    decoder->input[0]->buffer_size = decoder->input[0]->buffer_size_min;
    fprintf(stderr,"input buffer min num:%u min size:%u\n",decoder->input[0]->buffer_num_min,decoder->input[0]->buffer_size_min) ;

    #define MJPEG_BUF_SIZE (1920*1080*4)
    if ( decoder->input[0]->buffer_size < MJPEG_BUF_SIZE )
        decoder->input[0]->buffer_size = MJPEG_BUF_SIZE ;
        
    decoder->output[0]->buffer_num = decoder->output[0]->buffer_num_min;
    decoder->output[0]->buffer_size = decoder->output[0]->buffer_size_min;
    fprintf(stderr,"output buffer min num:%u min size:%u\n",decoder->output[0]->buffer_num_min,decoder->output[0]->buffer_size_min) ;
    if ( decoder->output[0]->buffer_size < MJPEG_BUF_SIZE )
        decoder->output[0]->buffer_size = MJPEG_BUF_SIZE ;

    pool_in = mmal_pool_create(decoder->input[0]->buffer_num,decoder->input[0]->buffer_size);

    context.queue = mmal_queue_create();

    /* Store a reference to our context in each port (will be used during callbacks) */
    decoder->input[0]->userdata = (struct MMAL_PORT_USERDATA_T*)(void *)&context;

    status = mmal_port_enable(decoder->input[0], input_callback);
    CHECK_STATUS(status, "failed to enable input port")


    /* connect them up - this propagates port settings from outputs to inputs */
    status = mmal_graph_new_connection(graph, decoder->output[0], renderer->input[0],
             0,  &connection);
             //MMAL_CONNECTION_FLAG_TUNNELLING, &connection);
    CHECK_STATUS(status, "failed to connect decoder to renderer");
    /*
    if ( connection ) {
        fprintf(stderr,"Connection:\n") ;
        fprintf(stderr,"    flag of tunnelling:%s\n",(connection->flags&MMAL_CONNECTION_FLAG_TUNNELLING)?"yes":"no") ;
        fprintf(stderr,"    enabled:%s\n",(connection->is_enabled)?"yes":"no") ;
        fprintf(stderr,"    callback:%p\n",connection->callback) ;
        //connection->flags |= MMAL_CONNECTION_FLAG_TUNNELLING ;
        //mmal_connection_enable(connection) ;
        fprintf(stderr,"    enabled:%s\n",(connection->is_enabled)?"yes":"no") ;
    }
    */

     /* Start playback */
    fprintf(stderr, "start");
    status = mmal_graph_enable(graph, NULL, NULL);
    CHECK_STATUS(status, "failed to enable graph");

    if ( connection ) {
        fprintf(stderr,"Connection:\n") ;
        fprintf(stderr,"    flag of tunnelling:%s\n",(connection->flags&MMAL_CONNECTION_FLAG_TUNNELLING)?"yes":"no") ;
        fprintf(stderr,"    enabled:%s\n",(connection->is_enabled)?"yes":"no") ;
        fprintf(stderr,"    callback:%p\n",connection->callback) ;
        connection->flags |= MMAL_CONNECTION_FLAG_TUNNELLING ;
        //mmal_connection_enable(connection) ;
        fprintf(stderr,"    enabled:%s\n",(connection->is_enabled)?"yes":"no") ;
    }


    /* load jpeg data */
    int count = 0 ;
    unsigned int buf_remain = szs[0] ;
    unsigned int buf_offset = 0 ;
    char * buf = bufs[0] ;
    
    unsigned int t0 = get_ms() ;
    fprintf(stdout,"t0:%u\n",t0) ;

    while(count < 10 && (context.status == MMAL_SUCCESS) )
    //while ( !eos_sent ) 
    {
        MMAL_BUFFER_HEADER_T *buffer;
        //fprintf(stderr,"context status:%d\n",context.status) ;
        
        /* Wait for buffer headers to be available on either the decoder input or the encoder output port */
        vcos_semaphore_wait(&context.semaphore);

        /* Send data to decode to the input port of the video decoder */
        if ((buffer = mmal_queue_get(pool_in->queue)) != NULL)
        {
            //SOURCE_READ_DATA_INTO_BUFFER(buffer);
           int s = buffer->alloc_size - 128  ;
           if ( s > buf_remain ) s = buf_remain ;
           memcpy(buffer->data, buf+buf_offset, s) ;
           buf_offset += s ;
           buf_remain -= s ;
           buffer->length = s ;
           buffer->offset = 0 ;
           buffer->pts = buffer->dts = MMAL_TIME_UNKNOWN ;

           if(!buffer->length) {
              int idx = ++count%2 ;
              buf = bufs[idx] ;
              buf_remain = szs[idx] ; 
              buf_offset = 0 ;
              //buffer->flags |= MMAL_BUFFER_HEADER_FLAG_FRAME_END ;
              fprintf(stderr,"eos %d\n",count) ;
           }

           if(!buffer->length) eos_sent = MMAL_TRUE;

           buffer->flags = buffer->length ? 0 : MMAL_BUFFER_HEADER_FLAG_EOS;
           buffer->pts = buffer->dts = MMAL_TIME_UNKNOWN;
           //fprintf(stderr, "sending %i bytes\n", (int)buffer->length);
           status = mmal_port_send_buffer(decoder->input[0], buffer);
           CHECK_STATUS(status, "failed to send buffer");
        }
        else {
            fprintf(stderr,"input pool is not empty\n");
        }
    }

    /* Stop decoding */
    fprintf(stderr, "stop decoding\n");
    unsigned int t1 = get_ms() ;
    fprintf(stderr, "%d frames/%u(ms), fps:%f\n",count,t1-t0,(float) count * 1000 / (t1-t0));

    fprintf(stderr, "press any key to quit\n");
    getchar() ;

    /* Stop everything. Not strictly necessary since mmal_component_destroy()
       * will do that anyway */
    mmal_port_disable(decoder->input[0]);
    mmal_port_disable(decoder->control);

    /* Stop everything */
    fprintf(stderr, "stop\n");
    mmal_graph_disable(graph);

error:
    /* Cleanup everything */
    if (        decoder)
        mmal_component_release(decoder);
    if (renderer)
        mmal_component_release(renderer);
    if (graph)
        mmal_graph_destroy(graph);
    
    if ( bufs[0] ) 
        free(bufs[0]) ;
    if ( bufs[1] )
        free(bufs[1]) ;
        
    return status == MMAL_SUCCESS ? 0 : -1;
}

zephyr5415
Posts: 6
Joined: Tue Jan 08, 2019 7:39 am

Re: Decode JPEG stream using MMAL or OpenMax

Tue Feb 12, 2019 9:29 am

zephyr5415 wrote:
Tue Feb 12, 2019 2:37 am
Hi @6by9,
Sorry for the late reply.

The code I referenced is from here:
https://github.com/6by9/rpi_mmal_exampl ... e_render.c
which uses a graph connection to connect the decode and render components.

I made only minor changes, shown below; I think I must be missing something needed to go from playing an H264 file to decoding MJPEG from memory.
BTW, I ran mmal_vc_diag to check my code, and it reported:

Code: Select all

/opt/vc/bin $ sudo ./mmal_vc_diag components
48 created, 46 destroyed (0 destroying), 0 create failures
ril.video_decod                 : created: pid 23965 address 0x3f5058e4 pool mem alloc size 0
ril.video_rende                 : created: pid 23965 address 0x3f504fac pool mem alloc size 0

Code: Select all

/opt/vc/bin $ sudo ./mmal_vc_diag mmal-stats
component               port            buffers         fps     delay
ril.video_render        0 [in ]rx       0                0.0    0
ril.video_render        0 [in ]tx       0                0.0    0
ril.video_decode        0 [in ]rx       0                0.0    0
ril.video_decode        0 [in ]tx       0                0.0    0
ril.video_decode        0 [out]rx       3               210526.3        10
ril.video_decode        0 [out]tx       0                0.0    0

The mem alloc size there is 0, which looks abnormal to me.

zephyr5415
Posts: 6
Joined: Tue Jan 08, 2019 7:39 am

Re: Decode JPEG stream using MMAL or OpenMax

Wed Feb 13, 2019 3:29 am

6by9 wrote:
Fri Feb 01, 2019 12:38 pm

video_decode is really designed for decoding MJPEG or H264, not single JPEGs. It would probably work and ignore EXIF headers and the like, but image_decode is intended to decode single images (and provides access to EXIF metadata, thumbnails, and the like).
In either case the input port width/height/frame_rate are irrelevant as that information is encoded in the input bitstream. The output format will be changed based on the bitstream, and you'll get a MMAL_EVENT_FORMAT_CHANGED event callback (buffer->cmd != 0) from the output port.

Without your code I can only make guesses. MMAL_STATUS_EIO normally means that you have submitted a buffer that is smaller than port->buffer_size.
So can MMAL decode MJPEG composed of a series of JPEG files?

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 6422
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Decode JPEG stream using MMAL or OpenMax

Wed Feb 13, 2019 10:43 am

zephyr5415 wrote:
Wed Feb 13, 2019 3:29 am
So can MMAL decode MJPEG composed of a series of JPEG files?
The video_decode component set to MJPEG will most likely accept that, ignoring all the EXIF headers.
The image_decode component set to JPEG will accept it.

Note that video_decode on MJPEG will normally expect all the images coming through to be of the same resolution. You are going to have to check for resolution change events, and reconfigure ports as required.
You're going to have to listen for those events on image_decode too, but it'll make fewer assumptions about the resolution.
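
As a sketch, switching to image_decode only really changes the component name and the input encoding; everything else follows the same pattern as the video_decode setup (error handling omitted):

Code: Select all

/* Sketch: create image_decode instead of video_decode for a stream of JPEGs.
 * Width/height/frame_rate can be left at zero - they come from the bitstream
 * and are signalled back via MMAL_EVENT_FORMAT_CHANGED on the output port. */
MMAL_COMPONENT_T *decoder = NULL;
MMAL_STATUS_T status;

status = mmal_component_create(MMAL_COMPONENT_DEFAULT_IMAGE_DECODER, &decoder);

if (status == MMAL_SUCCESS)
{
   decoder->input[0]->format->encoding = MMAL_ENCODING_JPEG;  /* JPEG, not MJPEG */
   status = mmal_port_format_commit(decoder->input[0]);
}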
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.
