HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Nov 07, 2017 6:47 pm

Over the weekend I successfully modified raspiraw.c for automatic camera tilt calibration on my caterpillar robot (camera frames are analyzed in order to calibrate the camera tilt via a servo motor):
viewtopic.php?f=43&t=189661&p=1231577#p1231151

Modifying raspiraw.c was the easiest way to achieve this (the diff is only 114 lines: frame analysis, wiringPi servo control, ...).

I always wanted to learn how to write my own gstreamer plugins, partly to do what I did via the raspiraw.c modification (abuse?).
At the GStreamer Conference 2017 I learned in discussions that an appsrc type plugin might be one of the first I should try. It would allow me to use raspiraw to receive raw Bayer frames from the Raspberry camera and push them into a gstreamer pipeline. And I learned that further processing of the raw Bayer data in the gstreamer pipeline is possible via the bayer2rgb plugin as well:

Code: Select all

pi@raspberrypi02:~ $ gst-inspect-1.0 | grep bayer2rgb
bayer:  bayer2rgb: Bayer to RGB decoder for cameras
pi@raspberrypi02:~ $ 
So the plan is to build an appsrc type gstreamer plugin, use it to get raspiraw's raw Bayer frames into a gstreamer pipeline (either directly or from .raw files stored on SD card), and then play a bit with bayer2rgb to get some hands-on gstreamer plugin development experience.

I read the gstreamer Plugin Writer's Guide and started here:
https://gstreamer.freedesktop.org/docum ... oiler.html

I followed the instructions to create my first gstreamer plugin, which worked after I installed these missing packages on Raspbian:

Code: Select all

sudo apt-get install autoconf automake autotools-dev
sudo apt-get install libtool
sudo apt-get install libgstreamer1.0-dev
sudo apt-get install libgstreamer-plugins-base1.0-dev
(there are no packages named "autoreconf" or "autotools"; the autoreconf tool comes with the autoconf/automake packages)
After "make" and "sudo make install" I was able to run the new "myfilter" plugin this way:

Code: Select all

pi@raspberrypi02:~ $ GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0 gst-launch-1.0 -v -m fakesrc ! myfilter ! fakesink silent=TRUE | head -20 | tail -5
Got message #24 from element "pipeline0" (async-done): GstMessageAsyncDone, running-time=(guint64)18446744073709551615;
Setting pipeline to PLAYING ...
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
pi@raspberrypi02:~ $ 
I read that this is the old method for starting new plugin development:
Note

FIXME: this section is slightly outdated. gst-template is still useful as an example for a minimal plugin build system skeleton. However, for creating elements the tool gst-element-maker from gst-plugins-bad is recommended these days.
Anyway, it worked, and more importantly, it took me less than an hour (I would have expected to need much more time to get a first plugin built and running).

Next step is to get the minimal appsrc sample running:
https://gstreamer.freedesktop.org/docum ... a-pipeline

Hermann.

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Nov 08, 2017 12:17 am

Got the appsrc example running; these are the simple steps needed (not that easy to find this time):
  1. "ssh -X pi@..." into your Pi Zero (with USB2RJ45 adapter or USB WLAN stick) or Pi Zero W
  2. Copy and paste the appsrc code into appsrc.c
    https://gstreamer.freedesktop.org/docum ... rc-example
  3. gcc appsrc.c -o appsrc `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`
  4. GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0 ./appsrc
    This launches an "appsrc" window on your host Linux machine (via X11 forwarding, the -X flag),
    of dimensions 384x288, which shows 0.5s of full black, then 0.5s of full white, repeating.
Looking at the code I cannot believe it is so simple to push data into a gstreamer pipeline via appsrc ...

Hermann.

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Nov 08, 2017 1:35 am

Just learned how to access and modify a GstBuffer, a must for filling the buffer 16-bit-wise instead of byte-wise with RGB16 colors. This simple diff is all that is needed to blink red/blue instead of white/black, so easy!

Code: Select all

pi@raspberrypi02:~/gst-template/gst-plugin/src $ diff appsrc.c appsrc.rb.c 
12a13,14
>   GstMapInfo info;
>   guint8 *p;
21d22
<   gst_buffer_memset (buffer, 0, white ? 0xff : 0x0, size);
22a24,26
>   gst_buffer_map(buffer, &info, GST_MAP_WRITE); 
>   for(p = info.data + info.size; p > info.data; )
>     *((guint16*)(--p,--p)) = white ? 0xf800 : 0x001f;  // RGB16: rgb 5-6-5 bits
pi@raspberrypi02:~/gst-template/gst-plugin/src $ 
Hermann.

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Nov 14, 2017 7:20 pm

In the previous appsrc samples, simple single-colored frames were pushed into the gstreamer pipeline.
I plan to make raspiraw push GRAY8-encoded frames generated from raw Bayer frames into the gstreamer pipeline.

As a pre-study I compared the 3 ways I know of to process Raspberry camera video in gstreamer.
Result: raspividyuv is slowest, and requesting GRAY8 is what separates v4l2src from rpicamsrc.
These numbers are fps for 640x480 video forced to 90fps:

Code: Select all

             without GRAY8   with GRAY8
raspividyuv       25             25
v4l2src           88             42
rpicamsrc         89             89


These are the complete command lines, run on a fresh SD card image with the latest RASPBIAN STRETCH LITE (2017-09-07):

Code: Select all

pi@raspberrypi02:~ $ gst-launch-1.0 -v v4l2src ! videoconvert ! video/x-raw,width=640,height=480,framerate=90/1 ! fpsdisplaysink video-sink="fakesink"
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1162, dropped: 0, current: 88.86, average: 87.92
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1162, dropped: 0, current: 88.86, average: 87.92
...

pi@raspberrypi02:~ $ gst-launch-1.0 -v v4l2src ! videoconvert ! video/x-raw,format=GRAY8,width=640,height=480,framerate=90/1 ! fpsdisplaysink video-sink="fakesink"
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1099, dropped: 0, current: 41.82, average: 42.39
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1099, dropped: 0, current: 41.82, average: 42.39
...


pi@raspberrypi02:~ $ gst-launch-1.0 -v rpicamsrc ! videoconvert ! video/x-raw,width=640,height=480,framerate=90/1 ! fpsdisplaysink video-sink="fakesink"
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1136, dropped: 0, current: 89.51, average: 89.67
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1136, dropped: 0, current: 89.51, average: 89.67
...

pi@raspberrypi02:~ $ gst-launch-1.0 -v rpicamsrc ! videoconvert ! video/x-raw,format=GRAY8,width=640,height=480,framerate=90/1 ! fpsdisplaysink video-sink="fakesink" 
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1128, dropped: 0, current: 89.49, average: 89.05
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1128, dropped: 0, current: 89.49, average: 89.05
...



pi@raspberrypi02:~ $ raspividyuv -t 0 -w 640 -h 480 -fps 90 -o - | gst-launch-1.0 -v fdsrc ! videoparse format=i420 ! videoconvert ! video/x-raw,format=GRAY8 ! fpsdisplaysink video-sink="fakesink"
..
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1133, dropped: 0, current: 25.24, average: 25.03
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1133, dropped: 0, current: 25.24, average: 25.03
...

pi@raspberrypi02:~ $ raspividyuv -t 0 -w 640 -h 480 -fps 90 -o - | gst-launch-1.0 -v fdsrc ! videoparse format=i420 ! videoconvert ! fpsdisplaysink video-sink="fakesink"
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1042, dropped: 0, current: 24.85, average: 25.03
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1042, dropped: 0, current: 24.85, average: 25.03
...

P.S.:
GRAY8 is so easy; this is the small diff that changes the original RGB16 black/white switching to GRAY8 dark/bright grey switching:

Code: Select all

pi@raspberrypi02:~/gst-template/gst-plugin/src $ diff appsrc.c appsrc.gray8.c 
16c16
<   size = 385 * 288 * 2;
---
>   size = 385 * 288;
21c21
<   gst_buffer_memset (buffer, 0, white ? 0xff : 0x0, size);
---
>   gst_buffer_memset (buffer, 0, white ? 0xAA : 0x55, size);
58c58
<                      "format", G_TYPE_STRING, "RGB16",
---
>                      "format", G_TYPE_STRING, "GRAY8",
pi@raspberrypi02:~/gst-template/gst-plugin/src $ 

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Nov 14, 2017 9:36 pm

I wanted to keep creating the appsrc in code, but provide the rest of the pipeline on the command line the way "gst-launch" does.

This is how the new pipeline gets launched:

Code: Select all

$ gcc -Wall -pedantic appsrc-launch.c -o appsrc-launch `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`
$ 
$ ./appsrc-launch "appsrc name=_ ! videoconvert ! xvimagesink"
This is the small diff that makes it work [the key line is "pipeline = gst_parse_launch(argv[1], NULL);"]:

Code: Select all

$ diff appsrc.c appsrc-launch.c 
16c16
<   size = 385 * 288 * 2;
---
>   size = 384 * 288 * 2;
43c43
<   GstElement *pipeline, *appsrc, *conv, *videosink;
---
>   GstElement *pipeline, *appsrc;
50,53c50,52
<   pipeline = gst_pipeline_new ("pipeline");
<   appsrc = gst_element_factory_make ("appsrc", "source");
<   conv = gst_element_factory_make ("videoconvert", "conv");
<   videosink = gst_element_factory_make ("xvimagesink", "videosink");
---
>   g_assert(argc > 1);
>   pipeline = gst_parse_launch(argv[1], NULL); 
>   g_assert(pipeline);
55a55,56
>   appsrc = gst_bin_get_by_name (GST_BIN(pipeline), "_");
>   g_assert(appsrc);
63,64d63
<   gst_bin_add_many (GST_BIN (pipeline), appsrc, conv, videosink, NULL);
<   gst_element_link_many (appsrc, conv, videosink, NULL);
$ 
Next step: add -l "gstreamer pipeline" to raspiraw.

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Nov 14, 2017 10:55 pm

I really like gstreamer appsrc development !

Look up what you are interested in, e.g. fpsdisplaysink:
https://gstreamer.freedesktop.org/data/ ... ysink.html

Find what you want:
  • The “fps-measurements” signal
  • The “signal-fps-measurements” property
Add it and you are done -- if an fpsdisplaysink with name "#" is present in the pipeline, the fps measurement data gets printed:

Code: Select all

$ gcc -Wall -pedantic appsrc-launch2.c -o appsrc-launch2 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`
$ 
$ ./appsrc-launch2 'appsrc name=_ ! videoconvert ! fpsdisplaysink video-sink="fakesink"'
^C
$ ./appsrc-launch2 'appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink="fakesink"'
fps-measurements connected
dropped: 0, current: 5.97, average: 5.97
dropped: 0, current: 1.67, average: 3.63
dropped: 0, current: 2.22, average: 3.00
dropped: 0, current: 1.87, average: 2.76
dropped: 0, current: 2.07, average: 2.57
dropped: 0, current: 1.89, average: 2.48
dropped: 0, current: 1.99, average: 2.43
^C
$ 
So easy:

Code: Select all

$ diff appsrc-launch.c appsrc-launch2.c 
5a6,16
> cb_fps_measurements(GstElement *fpsdisplaysink,
>                     gdouble arg0,
>                     gdouble arg1,
>                     gdouble arg2,
>                     gpointer user_data)
> {
>   g_print("dropped: %.0f, current: %.2f, average: %.2f\n", arg1, arg0, arg2);
> }
> 
> 
> static void
43c54
<   GstElement *pipeline, *appsrc;
---
>   GstElement *pipeline, *appsrc, *fpsdisplaysink;
54a66
>   fpsdisplaysink = gst_bin_get_by_name (GST_BIN(pipeline), "#");
69a82,88
> 
>   /* setup fpsdisplaysink "#" if present */
>   if (fpsdisplaysink) {
>     g_object_set (G_OBJECT (fpsdisplaysink), "signal-fps-measurements", TRUE, NULL);
>     g_signal_connect (fpsdisplaysink, "fps-measurements", G_CALLBACK (cb_fps_measurements), NULL);
>     g_print("fps-measurements connected\n");
>   }
$ 

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Nov 15, 2017 12:17 am

I found an interesting GStreamer-devel thread "appsrc usage (push and pull mode)":
http://gstreamer-devel.966125.n4.nabble ... 62768.html

Sebastian and Tim agreed that an application driven push into gstreamer pipeline should work [via gst_app_src_push_buffer() ].

I tried it and modified appsrc-launch2.c to appsrc-launch3.c, both files are attached.

This is the small diff:

Code: Select all

$ diff appsrc-launch2.c appsrc-launch3.c 
1a2,4
> #include <gst/app/gstappsrc.h>
> #include <stdlib.h>
> #include <unistd.h>
15d17
< 
17,19c19
< cb_need_data (GstElement *appsrc,
<           guint       unused_size,
<           gpointer    user_data)
---
> read_data (gpointer    user_data)
25a26
>   GstAppSrc *appsrc = user_data;
41,42c42
<   g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
<   gst_buffer_unref (buffer);
---
>   ret = gst_app_src_push_buffer(appsrc, buffer);
81d80
<   g_signal_connect (appsrc, "need-data", G_CALLBACK (cb_need_data), NULL);
88a88,89
> 
>   g_idle_add ((GSourceFunc) read_data, appsrc);
$ 
Executing

Code: Select all

$ ./appsrc-launch2 "appsrc name=_ ! videoconvert ! xvimagesink"
opens a small window that changes from black to white and back at 2fps
(the SSH session to the Pi Zero was started via "ssh -X").

Executing

Code: Select all

$ ./appsrc-launch3 "appsrc name=_ ! videoconvert ! xvimagesink"
opens a small window, displays a black frame, and then freezes.

Compiling appsrc-launch3.c needs an additional library:

Code: Select all

$ gcc -Wall -pedantic appsrc-launch3.c -o appsrc-launch3 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0` -lgstapp-1.0
What can be the reason for the freeze?
What am I missing to make appsrc-launch3 do the same as appsrc-launch2?
Attachments
appsrc-launch23.zip
contains appsrc-launch2.c and appsrc-launch3.c
(2.47 KiB) Downloaded 4 times

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Nov 15, 2017 6:33 pm

I asked in the GStreamer-devel thread shown above as well and got a response from Antonio on what I did wrong:
http://gstreamer-devel.966125.n4.nabble ... l#a4685320

My mistake was changing the callback function without looking up the documentation and noticing that the function prototype differs: this callback has to return a gboolean, and returning FALSE stops it. So returning G_SOURCE_CONTINUE (or TRUE) was the trick.

Since I run the gstreamer pipeline on a Raspberry Pi Zero and only allow display output via "ssh -X", I did all testing of the new code with:

Code: Select all

$ ./appsrc-launch3 1000 "appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'"
The first argument is the target fps rate for pushing frames into the pipeline [implemented via usleep(1000000/fps)].

No graphical display output, just some fps measurements from time to time, and a notification on every 100th read_data() call.
This is the output after I stopped the pipeline, having let it run for more than 7 minutes:

Code: Select all

...
dropped: 0, current: 637.03, average: 626.86
read_data(440600)
read_data(440700)
read_data(440800)
dropped: 0, current: 632.32, average: 626.86
read_data(440900)
read_data(441000)
read_data(441100)
^C
$
Before I got the rate limiting right I nearly froze the Pi Zero several times by consuming all memory [not a good idea to push 100,000 frames per second into a gstreamer pipeline :) ]. I monitored the complete run, and both virtual and resident memory remained stable:

Code: Select all

pi@raspberrypi02:~ $ top -p856
top - 17:56:13 up 56 min,  3 users,  load average: 2.22, 1.58, 1.14
Tasks:   1 total,   1 running,   0 sleeping,   0 stopped,   0 zombie
%Cpu(s): 38.7 us, 11.3 sy,  0.0 ni, 48.7 id,  0.0 wa,  0.0 hi,  1.3 si,  0.0 st
KiB Mem :   379572 total,   271060 free,    41792 used,    66720 buff/cache
KiB Swap:   102396 total,    84920 free,    17476 used.   289960 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S %CPU %MEM     TIME+ COMMAND      
  856 pi        20   0   50100  25484   9012 R 38.2  6.7   4:19.69 appsrc-laun+ 
The working appsrc-launch3.c is attached; here is the difference to the previous appsrc-launch2.c
(push vs. pull of frames):

Code: Select all

$ diff -pruN appsrc-launch2.c appsrc-launch3.c 
--- appsrc-launch2.c	2017-11-14 22:48:13.205222996 +0000
+++ appsrc-launch3.c	2017-11-15 17:44:25.560371444 +0000
@@ -1,6 +1,10 @@
 #include <gst/gst.h>
+#include <gst/app/gstappsrc.h>
+#include <stdlib.h>
+#include <unistd.h>
 
 static GMainLoop *loop;
+static gint fps, cnt=0;
 
 static void
 cb_fps_measurements(GstElement *fpsdisplaysink,
@@ -12,17 +16,15 @@ cb_fps_measurements(GstElement *fpsdispl
   g_print("dropped: %.0f, current: %.2f, average: %.2f\n", arg1, arg0, arg2);
 }
 
-
-static void
-cb_need_data (GstElement *appsrc,
-          guint       unused_size,
-          gpointer    user_data)
+gboolean
+read_data (gpointer    user_data)
 {
   static gboolean white = FALSE;
   static GstClockTime timestamp = 0;
   GstBuffer *buffer;
   guint size;
   GstFlowReturn ret;
+  GstAppSrc *appsrc = user_data;
 
   size = 384 * 288 * 2;
 
@@ -34,17 +36,23 @@ cb_need_data (GstElement *appsrc,
   white = !white;
 
   GST_BUFFER_PTS (buffer) = timestamp;
-  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 2);
+  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, fps);
 
   timestamp += GST_BUFFER_DURATION (buffer);
 
-  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
-  gst_buffer_unref (buffer);
+  ret = gst_app_src_push_buffer(appsrc, buffer);
 
   if (ret != GST_FLOW_OK) {
     /* something wrong, stop pushing */
     g_main_loop_quit (loop);
   }
+
+  if (++cnt % 100 == 0)
+    g_print("read_data(%d)\n",cnt);
+
+  usleep(1000000/fps);
+
+  return G_SOURCE_CONTINUE;
 }
 
 gint
@@ -58,8 +66,9 @@ main (gint   argc,
   loop = g_main_loop_new (NULL, FALSE);
 
   /* setup pipeline */
-  g_assert(argc > 1);
-  pipeline = gst_parse_launch(argv[1], NULL); 
+  g_assert(argc == 3);
+  fps = atoi(argv[1]); 
+  pipeline = gst_parse_launch(argv[2], NULL); 
   g_assert(pipeline);
 
   /* setup */
@@ -78,7 +87,6 @@ main (gint   argc,
   g_object_set (G_OBJECT (appsrc),
         "stream-type", 0,
         "format", GST_FORMAT_TIME, NULL);
-  g_signal_connect (appsrc, "need-data", G_CALLBACK (cb_need_data), NULL);
 
   /* setup fpsdisplaysink "#" if present */
   if (fpsdisplaysink) {
@@ -87,6 +95,8 @@ main (gint   argc,
     g_print("fps-measurements connected\n");
   }
 
+  g_idle_add ((GSourceFunc) read_data, appsrc);
+
   /* play */
   gst_element_set_state (pipeline, GST_STATE_PLAYING);
   g_main_loop_run (loop);
$ 
So what did I do here?
I tried to push 384*288*2 bytes = 216KiB frames at 1000fps into this pipeline:

Code: Select all

appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'
I am positively surprised by the measured average of 626fps!

My real application (a modification of raspiraw) will push only a third of that frame size, and at only 90fps (maybe 120fps), into the gstreamer pipeline (320x240 GRAY8 frames). The measured 626fps at three times the target frame size indicates this will easily be possible.
Attachments
appsrc-launch3.zip
Working source discussed in this posting
(1.34 KiB) Downloaded 4 times

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Thu Nov 16, 2017 2:36 am

It took quite some time to get the timestamps right (raspiraw timestamps are in [us], gstreamer timestamps in [ns]).
I just took appsrc-launch3.c and spread its different parts over raspiraw.c ;-)
Now raspiraw pushes 640x480 raw10-sized frames (384000 bytes, black/white blinking as in appsrc-launch3) into the gstreamer pipeline. And that pipeline measures 60fps on average, which is the minimal fps rate of mode 7 that I started raspiraw with!

raspiraw gets started with a timeout of 5000ms (-t), mode 7 (640x480, 60-90fps) and saverate 1 (every frame received from the camera triggers creation of a new black/white frame of the same size, which is pushed into the gstreamer pipeline). raspiraw ends after 5.6s; the stdout output from the gstreamer callback and fpsdisplaysink gets "tee"d into the file "out". Grepping for "dropped" shows the framerates, and the tail of "out" shows that the callback was triggered 302 times (5s x 60fps):

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 5000 -sr 1 -o foobar | tee out
...
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154870253, flags 0004
mmal: Buffer 0x72f340 returned, filled 384000, timestamp 14154870253, flags 0084
read_data(183,3028046000,5000000000)
mmal: Buffer 0x72f518 returned, filled 384000, timestamp 14154886890, flags 0004
mmal: Buffer 0x72f6f0 returned, filled 384000, timestamp 14154886890, flags 0084
read_data(184,3044683000,5000000000)
dropped: 0, current: 67.65, average: 60.38
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154903528, flags 0004
read_data(185,3061321000,5000000000)
mmal: Buffer 0x72f340 returned, filled 384000, timestamp 14154903528, flags 0084
mmal: Buffer 0x72f518 returned, filled 384000, timestamp 14154920166, flags 0004
mmal: read_data(186,3077959000,5000000000)
Buffer 0x72f6f0 returned, filled 384000, timestamp 14154920166, flags 0084
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154936803, flags 0004
mmal: read_data(187,3094596000,5000000000)
Buffer 0x72f340 returned, filled 384000, timestamp 14154936803, flags 0084
...
mmal: mmal_port_disconnect: vc.ril.video_render:in:0(0x72d8b0) is not connected
mmal: mmal_component_destroy_internal: vc.ril.video_render 2
mmal: mmal_port_free: vc.ril.video_render:in:0 at 0x72d8b0
mmal: mmal_port_free: vc.ril.video_render:ctr:0 at 0x72d590

real	0m5.662s
user	0m2.440s
sys	0m0.700s
pi@raspberrypi02:~/userland-rawcam $ grep dropped out 
dropped: 0, current: 63.07, average: 63.07
dropped: 0, current: 29.39, average: 46.18
dropped: 0, current: 56.73, average: 49.71
dropped: 0, current: 73.32, average: 55.57
dropped: 0, current: 72.40, average: 58.95
dropped: 0, current: 67.65, average: 60.38
dropped: 0, current: 60.14, average: 60.35
dropped: 0, current: 60.12, average: 60.32
dropped: 0, current: 60.05, average: 60.29
pi@raspberrypi02:~/userland-rawcam $ tail -3 out
read_data(300,4974646000,5000000000)
read_data(301,4991284000,5000000000)
read_data(302,5007922000,5000000000)
pi@raspberrypi02:~/userland-rawcam $ 
This shows the gstreamer pipeline used, how camera streaming gets started, and how the gstreamer main loop is entered:

Code: Select all

...
  pipeline = gst_parse_launch("appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'", NULL);
...
  g_idle_add ((GSourceFunc) read_data, appsrc);

  /* play */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

        start_camera_streaming(sensor, sensor_mode);

  g_main_loop_run (gloop);

  /* clean up */
  gst_element_set_state (pipeline, GST_STATE_NULL);
...
This shows the gstreamer callback, called every 4ms, which pushes a frame into the gstreamer pipeline if one is available.
It also shuts down the gstreamer main loop in case of a problem, and more importantly when the timeout is reached:

Code: Select all

gboolean
read_data (gpointer    user_data)
{
  if (gbuffer) { // frame available to push

    GstAppSrc *appsrc = user_data;
    GstFlowReturn ret = gst_app_src_push_buffer(appsrc, gbuffer);

    if (++gcnt % 100 == 0) {}  /* note: the stray "{}" makes the g_print below run on every call */
      g_print("read_data(%d,%llu,%llu)\n",gcnt,GST_BUFFER_PTS (gbuffer),ggtimeout);

    if ((ret != GST_FLOW_OK) || (ggtimeout && (GST_BUFFER_PTS (gbuffer) > ggtimeout))) {
      /* something wrong or timeout, stop pushing */
      g_main_loop_quit (gloop);
    }

    gbuffer = NULL; // ready for new frame
  }

  vcos_sleep(4); // TODO 250fps max?

  return G_SOURCE_CONTINUE;
}
This is the current code that replaced the "SD card write code block" of raspiraw:

Code: Select all

  if (!gbuffer) { // no frame left to send, take new one

    GstBuffer *buff;
    guint size;

    size = 800 * 480;

    buff = gst_buffer_new_allocate (NULL, size, NULL);

    gst_buffer_memset (buff, 0, white ? 0xff : 0x0, size);

    white = !white;

    if (!ggtimeout) {
      gbase     = buffer->pts; // us
      ggtimeout = gst_util_uint64_scale_int (cfg->timeout, GST_MSECOND, 1);
    }

    GST_BUFFER_PTS (buff) = (buffer->pts - gbase) * 1000; // us -> ns
    GST_BUFFER_DURATION (buff) = 16666666; // TODO value?  1/60 s in [ns]

    gbuffer = buff; // new frame ready for push
  }
None of this is very nice yet, and most importantly, copying the raw Bayer frame captured from the Raspberry camera (in buffer->data) into the gst_buffer created for pushing into the pipeline is still missing.

That will be done tomorrow evening; now it's far too late here, I have to stop and sleep.

Although it took some time, it was not that difficult to make raspiraw push its stream into a gstreamer pipeline, and it already partially works ...

Code: Select all

$ diff raspiraw.c.orig raspiraw.c | wc --lines
136
$ 

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Thu Nov 16, 2017 8:16 pm

The solution from last night worked, but it was not really a "push" solution; it was more "poll 250 times a second whether a push is allowed, then push".

I completely eliminated the gstreamer read_data() callback and no longer register any callback function via g_idle_add(). Instead, the raspiraw callback routine, which is called every 1/60th second, directly pushes the buffer into the gstreamer pipeline via a (global) appsrc variable.

Instead of just counting the framerate via fpsdisplaysink in the pipeline, I wanted to see "some" video frames. A 640x480 raw10 frame has width 640*5/4=800 and height 480. I just initialized the appsrc caps with

Code: Select all

        gst_caps_new_simple ("video/x-raw",
                     "format", G_TYPE_STRING, "GRAY8",
                     "width", G_TYPE_INT, 800,
                     "height", G_TYPE_INT, 480,
                     "framerate", GST_TYPE_FRACTION, 0, 1,
                     NULL), NULL);
cheating a bit by stating that the 640x480 raw10 frame should be taken as 800x480 GRAY8. This does a simple gray conversion for each 2x2 "rg/Gb" raw Bayer pixel block. Only every 5th column is junk (the low 2 bits of the 10-bit values of the 4 pixels to the left). This is one of the first frames I saw and captured, pointed straight up at the lamp on the ceiling, plus a hand (similar to https://pbs.twimg.com/media/DOno31cXkAEwnn-.jpg:large):
Image

This is how raspiraw got started -- yes, the gstreamer pipeline is now passed after raspiraw's -o parameter on the shell command line.
15 seconds of recording with saverate 30, translating into 2 frames per second at 60fps camera speed:

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 15000 -sr 30 -o "appsrc name=_ ! videoconvert ! xvimagesink"
Next: better raw10 conversion. I have seen problems with bayer2rgb before, and read that gstreamer needs to be recompiled for raw10 support; a way to go ...

P.S.:
Below is the problem I have seen before with bayer2rgb and gstreamer (there is only one lamp on the ceiling, not four).

Compiled without cheating:

Code: Select all

  g_object_set (G_OBJECT (appsrc), "caps",
        gst_caps_new_simple ("video/x-bayer",
                     "format", G_TYPE_STRING, "rggb",
                     "width", G_TYPE_INT, 640,
                     "height", G_TYPE_INT, 480,
                     "framerate", GST_TYPE_FRACTION, 0, 1,
                     NULL), NULL);
Command line used:

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 15000 -sr 30 -o "appsrc name=_ ! bayer2rgb ! videoconvert ! xvimagesink"
Image

This is the real view, taken with raspistill (-w 640 -h 480):
Image

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Thu Nov 16, 2017 9:04 pm

Instead of "memcpy"ing the raspiraw frame to the gstreamer buffer, I now manually convert raw10 to raw8.
This code seems to do the right thing according to
https://developer.xamarin.com/api/field ... mat.Raw10/

Code: Select all

    gst_buffer_map(buff, &info, GST_MAP_WRITE);
//    memcpy(info.data, buffer->data, buffer->length);
  p=buffer->data;
  q=info.data;
  for(i=0; i<640*480; i+=4)
  {
    k = (((unsigned int)p[0])<<2) + ((p[4]>>0)&0x03);
    if (k>255) k=255;
    q[0] = k;
    k = (((unsigned int)p[1])<<2) + ((p[4]>>2)&0x03);
    if (k>255) k=255;
    q[1] = k;
    k = (((unsigned int)p[2])<<2) + ((p[4]>>4)&0x03);
    if (k>255) k=255;
    q[2] = k;
    k = (((unsigned int)p[3])<<2) + ((p[4]>>6)&0x03);
    if (k>255) k=255;
    q[3] = k;
    p+=5; q+=4;
  }


After recompiling, the same bayer2rgb raspiraw command line as before produces the image below.
At least the 640x480 format looks right, proving that bayer2rgb expects raw8 format.
But the image is a bit dark ...
Image

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Thu Nov 16, 2017 11:45 pm

Oh oh, I thought it must be a programming error on my side, or a misunderstanding of video/x-bayer or GRAY8 conversion.
But it was not; it was an incorrect test setup.
Now I added the flashlight of my smartphone to the game and all looks good!
For all 3 cases below I determined the full 10-bit pixel value, capped it at 255 and stored it.

This is with 640x480 video/x-bayer "bggr" (raw8) mode, and "...! bayer2rgb !..." in pipeline:
Image

This is with 640x480 video/x-raw "GRAY8" mode, with just "...! videoconvert !..." in pipeline:
Image

And this is with 320x240 video/x-raw "GRAY8" mode, where I took the left bottom (brightest) green pixel of each 2x2 bg/Gr tile:
Image

I was just lucky when doing the automatic robot camera tilt calibration 11 days ago:
viewtopic.php?f=43&t=189661&p=1218763#p1231151

On the caterpillar robot I have a night-vision v1 camera module, and more importantly a 3W infrared LED mounted at the camera.
Calibration (taking useful pictures from the raw Bayer data) worked in darkness thanks to this LED:
Image

Image

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Fri Nov 17, 2017 12:46 am

I added the ability to send the same 640x480 bytes, extracted from the 800x480 raw10 captured camera frame, as either "video/x-bayer,format='bggr'" or "video/x-raw,format='GRAY8'", selected by the name of the appsrc element in the gstreamer pipeline passed to raspiraw's -o option:

Code: Select all

  /* setup */
  if ( (appsrc = (GstAppSrc*) gst_bin_get_by_name (GST_BIN(pipeline), "x-raw")) ) {
    g_object_set (G_OBJECT (appsrc), "caps",
          gst_caps_new_simple ("video/x-raw",
                       "format", G_TYPE_STRING, "GRAY8",
                       "width", G_TYPE_INT, 640,
                       "height", G_TYPE_INT, 480,
                       "framerate", GST_TYPE_FRACTION, 0, 1,
                       NULL), NULL);
  } else {
    g_assert( appsrc = (GstAppSrc*) gst_bin_get_by_name (GST_BIN(pipeline), "x-bayer") );
    g_object_set (G_OBJECT (appsrc), "caps",
          gst_caps_new_simple ("video/x-bayer",
                       "format", G_TYPE_STRING, "bggr",
                       "width", G_TYPE_INT, 640,
                       "height", G_TYPE_INT, 480,
                       "framerate", GST_TYPE_FRACTION, 0, 1,
                       NULL), NULL);
  }
I had heard in discussions at the gstreamer Prague 2017 conference that bayer2rgb might be slow. Here now is the confirmation (output of raspiraw goes to stderr, output from gstreamer to stdout; redirection to files is necessary since terminal output would otherwise slow things down).

Use of bayer2rgb drops the 60fps down to only 27fps, measured at the end of the pipeline:

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 15000 -sr 1 -o "appsrc name=x-bayer ! bayer2rgb ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'" 2>err 1>out

real	0m15.662s
user	0m13.140s
sys	0m2.170s
pi@raspberrypi02:~/userland-rawcam $ (grep dropped out && tail -3 out) | tail -6
dropped: 0, current: 27.15, average: 27.55
dropped: 0, current: 27.23, average: 27.54
dropped: 0, current: 27.37, average: 27.54
read_data(582,14973466000)
dropped: 0, current: 27.37, average: 27.54
read_data(583,14990104000)
pi@raspberrypi02:~/userland-rawcam $ 
GRAY8 x-raw only reduces the rate slightly, from 60fps to a measured 57fps:

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 15000 -sr 1 -o "appsrc name=x-raw ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'" 2>err 1>out

real	0m15.649s
user	0m9.640s
sys	0m1.630s
pi@raspberrypi02:~/userland-rawcam $ (grep dropped out && tail -3 out) | tail -6
dropped: 0, current: 58.30, average: 57.21
dropped: 0, current: 56.23, average: 57.17
dropped: 0, current: 56.24, average: 57.14
read_data(853,14923566000)
read_data(854,14956840000)
read_data(855,14973478000)
pi@raspberrypi02:~/userland-rawcam $ 

HermannSW
Posts: 277
Joined: Fri Jul 22, 2016 9:09 pm

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Sat Nov 18, 2017 9:57 pm

The raspiraw2gstreamer approach was motivated by the possibility of getting raspiraw to capture video at more than 90fps, and then pushing those frames into a gstreamer pipeline for further processing in the context of robot control.

I removed the bayer mode, because that can be done better with rpicamsrc.

This new version of raspiraw.c is based on the Aug 8 version of the rawcam branch:
https://github.com/6by9/userland/tree/rawcam

Find attached the new modified version, as well as a patchfile (run "patch -p0 < patch.txt" in the raspiraw.c directory).

This version generates a 320x240 x-raw GRAY8 stream, currently at 60fps with md=7 (42fps with md=6):

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ ./camera_i2c 
setting GPIO for board revsion: 9000c1
PiZero / PiZero W - I2C 0 on GPIO 28 & 29. GPIOs 40 & 44 for LED and power
pi@raspberrypi02:~/userland-rawcam $ 
pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 5000 -sr 1 -o "appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'" 2>err 1>out

real	0m5.557s
user	0m1.750s
sys	0m0.580s
pi@raspberrypi02:~/userland-rawcam $ (grep dropped out && tail -3 out) | tail -6
dropped: 0, current: 56.39, average: 59.27
dropped: 0, current: 59.94, average: 59.35
dropped: 0, current: 58.32, average: 59.23
read_data(294,4957991000)
read_data(295,4974628000)
read_data(296,4991265000)
pi@raspberrypi02:~/userland-rawcam $ 
Until this version of raspiraw is made able to capture at more than 90fps, rpicamsrc or v4l2src are preferred; they achieve GRAY8 format at 90fps:

Code: Select all

$ gst-launch-1.0 rpicamsrc ! video/x-raw,format=GRAY8,width=640,height=480 !  ... 
$ gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! video/x-raw,format=GRAY8,width=640,height=480 ! ... 
What is nice is that the current version requires modifications to raspiraw.c in only 3 places.

The first includes the needed gstreamer headers and defines the cb_fps_measurements() callback:

Code: Select all

42a43,57
> #include   <gst/gst.h>
> GMainLoop  *gloop  = NULL;
> #include   <gst/app/gstappsrc.h>
> GstAppSrc  *appsrc = NULL;
> 
> void
> cb_fps_measurements(GstElement *fpsdisplaysink,
>               gdouble arg0,
>               gdouble arg1,
>               gdouble arg2,
>               gpointer user_data)
> {
>       g_print("dropped: %.0f, current: %.2f, average: %.2f\n", arg1, arg0, arg2);
> }
> 
The second pushes the captured frame directly into the gstreamer pipeline; this is done only if gstreamer output was selected on the command line. I love how simple it finally turned out to be to push the converted 320x240 GRAY8 frame (from the 640x480 raw10 bayer frame captured by the camera) into the gstreamer pipeline!

Code: Select all

362a378,379
>               if (!appsrc)   // to keep SD card code indentation
>               {
379a397,443
>               else
>               {
>                       static GstClockTime ggtimeout = 0, gbase = 0;
> 
>                       GstBuffer *buff;
>                       GstMapInfo info;
>                       unsigned char *p, *q;
>                       static gint gcnt=0;
>                       unsigned int i,j,k;
> 
>                       buff = gst_buffer_new_allocate (NULL, 320*(240+1), NULL);
>                       gst_buffer_map(buff, &info, GST_MAP_WRITE);
> 
>                       p=buffer->data;
>                       q=info.data;
>                       for(i=1; i<480; i+=2) {
>                               p+=800;
>                               for(j=0; j<800; p+=5,j+=5) {
>                                       k = (((unsigned)p[0])<<2) + ((p[4]>>0)&0x03);
>                                       if (k>255) k=255;
>                                       *q++ = k;
> 
>                                       k = (((unsigned)p[2])<<2) + ((p[4]>>4)&0x03);
>                                       if (k>255) k=255;
>                                       *q++ = k;
>                               }
>                       }
> 
>                       if (!ggtimeout) {           // take baseline timestamp
>                               gbase     = buffer->pts;  // us
>                               ggtimeout = gst_util_uint64_scale_int (cfg->timeout, GST_MSECOND, 1);
>                       }
> 
>                       GST_BUFFER_PTS (buff)      = (buffer->pts - gbase) * 1000; // us -> ns
>                       GST_BUFFER_DURATION (buff) = gst_util_uint64_scale_int (1, GST_SECOND, 100);
> 
>                       if (GST_BUFFER_PTS (buff) > ggtimeout) {
>                               g_main_loop_quit (gloop);              // timeout, stop pushing
>                       } else {
>                               if (GST_FLOW_OK != gst_app_src_push_buffer(appsrc, buff)) {
>                                       g_main_loop_quit (gloop);            // something wrong, stop pushing
>                               } else {
>                                       g_print("read_data(%d,%llu)\n", ++gcnt, GST_BUFFER_PTS (buff));
>                               }
>                       }
>               }   // to keep SD card code indentation
>               }
The third checks whether gstreamer output is requested [-o present, its argument starting with "appsrc ", and mode 6 or 7 (the 640x480 modes)]. After all the gstreamer objects have been created, start_camera_streaming() gets called, followed by g_main_loop_run(gloop). This loop gets torn down when the timeout is reached or a gstreamer error happens in the 2nd part:

Code: Select all

982,984c1046,1092
<       start_camera_streaming(sensor, sensor_mode);
< 
<       vcos_sleep(cfg.timeout);
---
>       if (cfg.output && (cfg.mode>>1 == 3) && (strncmp(cfg.output, "appsrc ", 7)==0)) {
>               /* init GStreamer */
>               GstElement *pipeline, *fpsdisplaysink;
>               gst_init (&argc, (char ***)&argv);
>               gloop = g_main_loop_new (NULL, FALSE);
> 
>               /* setup pipeline */
>               g_assert( pipeline = gst_parse_launch(cfg.output, NULL) ); 
> 
>               /* setup */
>               g_assert( appsrc = (GstAppSrc*) gst_bin_get_by_name (GST_BIN(pipeline), "_") );
>               g_object_set (G_OBJECT (appsrc), "caps",
>                       gst_caps_new_simple ("video/x-raw",
>                                       "format", G_TYPE_STRING, "GRAY8",
>                                       "width", G_TYPE_INT, 320,
>                                       "height", G_TYPE_INT, 240,
>                                       "framerate", GST_TYPE_FRACTION, 0, 1,
>                                       NULL), NULL);
> 
>               /* setup appsrc */
>               g_object_set (G_OBJECT (appsrc),
>                       "stream-type", 0,
>                       "format", GST_FORMAT_TIME, NULL);
> 
>               /* setup fpsdisplaysink "#" if present */
>               fpsdisplaysink = gst_bin_get_by_name (GST_BIN(pipeline), "#");
>               if (fpsdisplaysink) {
>                       g_object_set (G_OBJECT (fpsdisplaysink), "signal-fps-measurements", TRUE, NULL);
>                       g_signal_connect (fpsdisplaysink, "fps-measurements", G_CALLBACK (cb_fps_measurements), NULL);
>                       g_print("fps-measurements connected\n");
>               }
> 
>               /* play */
>               gst_element_set_state (pipeline, GST_STATE_PLAYING);
> 
>               start_camera_streaming(sensor, sensor_mode);
> 
>               g_main_loop_run (gloop);
> 
>               /* clean up */
>               gst_element_set_state (pipeline, GST_STATE_NULL);
>               gst_object_unref (GST_OBJECT (pipeline));
>               g_main_loop_unref (gloop);
>       } else {
>               start_camera_streaming(sensor, sensor_mode);
>               vcos_sleep(cfg.timeout);
>       }
P.S:
I run the Pi Zero (W) headless, but when an HDMI monitor is connected, this command works:

Code: Select all

$ time raspiraw -md 7 -t 5000 -sr 1 -o "appsrc name=_ ! videoconvert ! fbdevsink device=/dev/fb0"
Image
Attachments
raspiraw-appsrc.zip
modified raspiraw.c and patchfile
(11.27 KiB)
