User avatar
HermannSW
Posts: 3958
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Nov 07, 2017 6:47 pm

Over the weekend I successfully modified raspiraw.c for automatic camera tilt calibration on my caterpillar robot (camera frames are analyzed in order to calibrate the camera tilt via a servo motor):
viewtopic.php?f=43&t=189661&p=1231577#p1231151

Modifying raspiraw.c was the easiest way to achieve this (the diff is only 114 lines, including frame analysis, wiringPi servo control, ...).

I always wanted to learn how to write my own gstreamer plugins, in part for doing what I did via that raspiraw.c modification (abuse?).
At the gstreamer 2017 conference I learned in discussions that an appsrc-type element might be one of the first I should try. It would allow me to use raspiraw to receive raw Bayer frames from the Raspberry camera and push them into a gstreamer pipeline. And I learned that even further processing of the raw Bayer data in the gstreamer pipeline is possible via the bayer2rgb plugin:

Code: Select all

pi@raspberrypi02:~ $ gst-inspect-1.0 | grep bayer2rgb
bayer:  bayer2rgb: Bayer to RGB decoder for cameras
pi@raspberrypi02:~ $ 
So the plan is to build an appsrc-type gstreamer element, use it to get raspiraw raw Bayer frames (either directly, or from .raw files stored on SD card) into a gstreamer pipeline, and then play a bit with bayer2rgb to get some hands-on gstreamer plugin development experience.

I read the gstreamer Plugin Writer's Guide and started here:
https://gstreamer.freedesktop.org/docum ... oiler.html

I followed the instructions to create my first gstreamer plugin, which worked after I installed these missing packages on Raspbian:

Code: Select all

sudo apt-get install autotools-dev autoconf libtool libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev
After "make" and "sudo make install" I was able to run new "myfilter" plugin this way:

Code: Select all

pi@raspberrypi02:~ $ GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0 gst-launch-1.0 -v -m fakesrc ! myfilter ! fakesink silent=TRUE | head -20 | tail -5
Got message #24 from element "pipeline0" (async-done): GstMessageAsyncDone, running-time=(guint64)18446744073709551615;
Setting pipeline to PLAYING ...
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
I'm plugged, therefore I'm in.
pi@raspberrypi02:~ $ 
I read that this is the old method of starting new plugin development:
Note

FIXME: this section is slightly outdated. gst-template is still useful as an example for a minimal plugin build system skeleton. However, for creating elements the tool gst-element-maker from gst-plugins-bad is recommended these days.
Anyway, it worked, and more importantly it took me less than an hour (I would have expected to need much more time to get a first plugin built and running).

Next step is to get minimal appsrc plugin sample running:
https://gstreamer.freedesktop.org/docum ... a-pipeline

Hermann.
Last edited by HermannSW on Wed Nov 22, 2017 8:13 pm, edited 1 time in total.
https://stamm-wilbrandt.de/2wheel_balancing_robot
https://stamm-wilbrandt.de/en#raspcatbot
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://github.com/Hermann-SW/raspiraw
https://stamm-wilbrandt.de/en/Raspberry_camera.html


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Nov 08, 2017 12:17 am

Got the appsrc example running; these are the simple steps needed (not that easy to find this time):
  1. "ssh -X pi@..." into your Pi Zero (with USB2RJ45 connector or USB WLAN stick) or Pi Zero W
  2. Copy and paste Appsrc code into appsrc.c
    https://gstreamer.freedesktop.org/docum ... rc-example
  3. gcc appsrc.c -o appsrc `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`
  4. GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0 ./appsrc
    this will launch an "appsrc" window on your host Linux (via the "X11 forwarding" -X flag),
    of dimensions 384x288, that shows 0.5s full black, then 0.5s full white, repeating
Looking at the code, I cannot believe it is so simple to push data into a gstreamer pipeline via appsrc ...

Hermann.


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Nov 08, 2017 1:35 am

Just learned how to access and modify a GstBuffer, a must for filling the buffer 16-bit-wise instead of byte-wise with RGB16 colors. This simple diff is all that is needed to blink red/blue instead of white/black, so easy!

Code: Select all

pi@raspberrypi02:~/gst-template/gst-plugin/src $ diff appsrc.c appsrc.rb.c 
12a13,14
>   GstMapInfo info;
>   guint8 *p;
21d22
<   gst_buffer_memset (buffer, 0, white ? 0xff : 0x0, size);
22a24,26
>   gst_buffer_map(buffer, &info, GST_MAP_WRITE); 
>   for(p = info.data + info.size; p > info.data; )
>     *((guint16*)(--p,--p)) = white ? 0xf800 : 0x001f;  // RGB16: rgb 5-6-5 bits
pi@raspberrypi02:~/gst-template/gst-plugin/src $ 
Hermann.


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Nov 14, 2017 7:20 pm

In the previous appsrc samples, simple single-colored frames were pushed into the gstreamer pipeline.
I plan to make raspiraw push GRAY8-encoded frames, generated from the raw Bayer frames, into a gstreamer pipeline.

As a pre-study I compared the three ways I know of to process Raspberry camera video in gstreamer.
Result: raspividyuv is slowest, and GRAY8 makes the difference between v4l2src and rpicamsrc.
These numbers are fps for 640x480 video forced to 90fps:

Code: Select all

             format=GRAY8
              w/o   w/ 
raspividyuv   25    25
v4l2src       88    42
rpicamsrc     89    89


These are the complete command lines, run on a fresh SD card image with the latest RASPBIAN STRETCH LITE (2017-09-07):

Code: Select all

pi@raspberrypi02:~ $ gst-launch-1.0 -v v4l2src ! videoconvert ! video/x-raw,width=640,height=480,framerate=90/1 ! fpsdisplaysink video-sink="fakesink"
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1162, dropped: 0, current: 88.86, average: 87.92
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1162, dropped: 0, current: 88.86, average: 87.92
...

pi@raspberrypi02:~ $ gst-launch-1.0 -v v4l2src ! videoconvert ! video/x-raw,format=GRAY8,width=640,height=480,framerate=90/1 ! fpsdisplaysink video-sink="fakesink"
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1099, dropped: 0, current: 41.82, average: 42.39
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1099, dropped: 0, current: 41.82, average: 42.39
...


pi@raspberrypi02:~ $ gst-launch-1.0 -v rpicamsrc ! videoconvert ! video/x-raw,width=640,height=480,framerate=90/1 ! fpsdisplaysink video-sink="fakesink"
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1136, dropped: 0, current: 89.51, average: 89.67
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1136, dropped: 0, current: 89.51, average: 89.67
...

pi@raspberrypi02:~ $ gst-launch-1.0 -v rpicamsrc ! videoconvert ! video/x-raw,format=GRAY8,width=640,height=480,framerate=90/1 ! fpsdisplaysink video-sink="fakesink" 
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1128, dropped: 0, current: 89.49, average: 89.05
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1128, dropped: 0, current: 89.49, average: 89.05
...



pi@raspberrypi02:~ $ raspividyuv -t 0 -w 640 -h 480 -fps 90 -o - | gst-launch-1.0 -v fdsrc ! videoparse format=i420 ! videoconvert ! video/x-raw,format=GRAY8 ! fpsdisplaysink video-sink="fakesink"
..
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1133, dropped: 0, current: 25.24, average: 25.03
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1133, dropped: 0, current: 25.24, average: 25.03
...

pi@raspberrypi02:~ $ raspividyuv -t 0 -w 640 -h 480 -fps 90 -o - | gst-launch-1.0 -v fdsrc ! videoparse format=i420 ! videoconvert ! fpsdisplaysink video-sink="fakesink"
...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 1042, dropped: 0, current: 24.85, average: 25.03
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 1042, dropped: 0, current: 24.85, average: 25.03
...

P.S.:
GRAY8 is so easy; this is the small diff that changes the original RGB16 black/white switching to GRAY8 dark/bright grey switching:

Code: Select all

pi@raspberrypi02:~/gst-template/gst-plugin/src $ diff appsrc.c appsrc.gray8.c 
16c16
<   size = 385 * 288 * 2;
---
>   size = 385 * 288;
21c21
<   gst_buffer_memset (buffer, 0, white ? 0xff : 0x0, size);
---
>   gst_buffer_memset (buffer, 0, white ? 0xAA : 0x55, size);
58c58
<                      "format", G_TYPE_STRING, "RGB16",
---
>                      "format", G_TYPE_STRING, "GRAY8",
pi@raspberrypi02:~/gst-template/gst-plugin/src $ 


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Nov 14, 2017 9:36 pm

I wanted to create the appsrc, but provide the pipeline on the command line like "gst-launch" does.

This is how the new pipeline gets launched:

Code: Select all

$ gcc -Wall -pedantic appsrc-launch.c -o appsrc-launch `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`
$ 
$ ./appsrc-launch "appsrc name=_ ! videoconvert ! xvimagesink"
This is the small diff that makes it work [the key line is "pipeline = gst_parse_launch(argv[1], NULL);"]:

Code: Select all

$ diff appsrc.c appsrc-launch.c 
16c16
<   size = 385 * 288 * 2;
---
>   size = 384 * 288 * 2;
43c43
<   GstElement *pipeline, *appsrc, *conv, *videosink;
---
>   GstElement *pipeline, *appsrc;
50,53c50,52
<   pipeline = gst_pipeline_new ("pipeline");
<   appsrc = gst_element_factory_make ("appsrc", "source");
<   conv = gst_element_factory_make ("videoconvert", "conv");
<   videosink = gst_element_factory_make ("xvimagesink", "videosink");
---
>   g_assert(argc > 1);
>   pipeline = gst_parse_launch(argv[1], NULL); 
>   g_assert(pipeline);
55a55,56
>   appsrc = gst_bin_get_by_name (GST_BIN(pipeline), "_");
>   g_assert(appsrc);
63,64d63
<   gst_bin_add_many (GST_BIN (pipeline), appsrc, conv, videosink, NULL);
<   gst_element_link_many (appsrc, conv, videosink, NULL);
$ 
Next step: add a -l "gstreamer pipeline" option to raspiraw.


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Nov 14, 2017 10:55 pm

I really like gstreamer appsrc development!

Look up what you are interested in, e.g. fpsdisplaysink:
https://gstreamer.freedesktop.org/data/ ... ysink.html

Find what you want:
  • The “fps-measurements” signal
  • The “signal-fps-measurements” property
Add it and you are done: if an fpsdisplaysink with name "#" is present in the pipeline, then fps measurement data gets printed:

Code: Select all

$ gcc -Wall -pedantic appsrc-launch2.c -o appsrc-launch2 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`
$ 
$ ./appsrc-launch2 'appsrc name=_ ! videoconvert ! fpsdisplaysink video-sink="fakesink"'
^C
$ ./appsrc-launch2 'appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink="fakesink"'
fps-measurements connected
dropped: 0, current: 5.97, average: 5.97
dropped: 0, current: 1.67, average: 3.63
dropped: 0, current: 2.22, average: 3.00
dropped: 0, current: 1.87, average: 2.76
dropped: 0, current: 2.07, average: 2.57
dropped: 0, current: 1.89, average: 2.48
dropped: 0, current: 1.99, average: 2.43
^C
$ 
So easy:

Code: Select all

$ diff appsrc-launch.c appsrc-launch2.c 
5a6,16
> cb_fps_measurements(GstElement *fpsdisplaysink,
>                     gdouble arg0,
>                     gdouble arg1,
>                     gdouble arg2,
>                     gpointer user_data)
> {
>   g_print("dropped: %.0f, current: %.2f, average: %.2f\n", arg1, arg0, arg2);
> }
> 
> 
> static void
43c54
<   GstElement *pipeline, *appsrc;
---
>   GstElement *pipeline, *appsrc, *fpsdisplaysink;
54a66
>   fpsdisplaysink = gst_bin_get_by_name (GST_BIN(pipeline), "#");
69a82,88
> 
>   /* setup fpsdisplaysink "#" if present */
>   if (fpsdisplaysink) {
>     g_object_set (G_OBJECT (fpsdisplaysink), "signal-fps-measurements", TRUE, NULL);
>     g_signal_connect (fpsdisplaysink, "fps-measurements", G_CALLBACK (cb_fps_measurements), NULL);
>     g_print("fps-measurements connected\n");
>   }
$ 


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Nov 15, 2017 12:17 am

I found an interesting GStreamer-devel thread "appsrc usage (push and pull mode)":
http://gstreamer-devel.966125.n4.nabble ... 62768.html

Sebastian and Tim agreed that an application-driven push into a gstreamer pipeline should work [via gst_app_src_push_buffer()].

I tried it and modified appsrc-launch2.c to appsrc-launch3.c, both files are attached.

This is the small diff:

Code: Select all

$ diff appsrc-launch2.c appsrc-launch3.c 
1a2,4
> #include <gst/app/gstappsrc.h>
> #include <stdlib.h>
> #include <unistd.h>
15d17
< 
17,19c19
< cb_need_data (GstElement *appsrc,
<           guint       unused_size,
<           gpointer    user_data)
---
> read_data (gpointer    user_data)
25a26
>   GstAppSrc *appsrc = user_data;
41,42c42
<   g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
<   gst_buffer_unref (buffer);
---
>   ret = gst_app_src_push_buffer(appsrc, buffer);
81d80
<   g_signal_connect (appsrc, "need-data", G_CALLBACK (cb_need_data), NULL);
88a88,89
> 
>   g_idle_add ((GSourceFunc) read_data, appsrc);
$ 
Executing

Code: Select all

$ ./appsrc-launch2 "appsrc name=_ ! videoconvert ! xvimagesink"
opens a small window that changes from black to white and back at 2fps
(the SSH session to the Pi Zero was started via "ssh -X").

Executing

Code: Select all

$ ./appsrc-launch3 "appsrc name=_ ! videoconvert ! xvimagesink"
opens a small window, displays a black frame, and then freezes.

Compiling appsrc-launch3.c needs an additional library:

Code: Select all

$ gcc -Wall -pedantic appsrc-launch3.c -o appsrc-launch3 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0` -lgstapp-1.0
What can be the reason for the freezing?
What am I missing to get appsrc-launch3 to do the same as appsrc-launch2?
Attachments
appsrc-launch23.zip
contains appsrc-launch2.c and appsrc-launch3.c
(2.47 KiB) Downloaded 110 times


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Nov 15, 2017 6:33 pm

I asked in the mentioned GStreamer-devel thread as well and got a response from Antonio on what I did wrong:
http://gstreamer-devel.966125.n4.nabble ... l#a4685320

My mistake was changing the callback function without looking up the documentation and noticing that the function prototype differs. This callback function needs to return a gboolean, and returning FALSE stops it. So returning G_SOURCE_CONTINUE (or TRUE) was the trick.

Since I run the gstreamer pipeline on a Raspberry Pi Zero and only allow for display output via "ssh -X", I did all testing of the new code with:

Code: Select all

$ ./appsrc-launch3 1000 "appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'"
The first argument (1000) is the target fps rate for pushing frames into the pipeline [enforced by usleep(1000000/fps)].

No graphical display output, just some fps measurements from time to time, and a notification on every 100th read_data() call.
This is the output after I stopped execution of the pipeline after more than 7 minutes:

Code: Select all

...
dropped: 0, current: 637.03, average: 626.86
read_data(440600)
read_data(440700)
read_data(440800)
dropped: 0, current: 632.32, average: 626.86
read_data(440900)
read_data(441000)
read_data(441100)
^C
$
Before I got the rate limiting right, I nearly froze the Pi Zero (several times) by consuming all memory [not a good idea to push 100,000 frames per second into a gstreamer pipeline :) ]. I monitored the complete run, and virtual as well as resident memory remained stable:

Code: Select all

pi@raspberrypi02:~ $ top -p856
top - 17:56:13 up 56 min,  3 users,  load average: 2.22, 1.58, 1.14
Tasks:   1 total,   1 running,   0 sleeping,   0 stopped,   0 zombie
%Cpu(s): 38.7 us, 11.3 sy,  0.0 ni, 48.7 id,  0.0 wa,  0.0 hi,  1.3 si,  0.0 st
KiB Mem :   379572 total,   271060 free,    41792 used,    66720 buff/cache
KiB Swap:   102396 total,    84920 free,    17476 used.   289960 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S %CPU %MEM     TIME+ COMMAND      
  856 pi        20   0   50100  25484   9012 R 38.2  6.7   4:19.69 appsrc-laun+ 
The working appsrc-launch3.c is attached; here is the difference to the previous appsrc-launch2.c
(push instead of pull of frames):

Code: Select all

$ diff -pruN appsrc-launch2.c appsrc-launch3.c 
--- appsrc-launch2.c	2017-11-14 22:48:13.205222996 +0000
+++ appsrc-launch3.c	2017-11-15 17:44:25.560371444 +0000
@@ -1,6 +1,10 @@
 #include <gst/gst.h>
+#include <gst/app/gstappsrc.h>
+#include <stdlib.h>
+#include <unistd.h>
 
 static GMainLoop *loop;
+static gint fps, cnt=0;
 
 static void
 cb_fps_measurements(GstElement *fpsdisplaysink,
@@ -12,17 +16,15 @@ cb_fps_measurements(GstElement *fpsdispl
   g_print("dropped: %.0f, current: %.2f, average: %.2f\n", arg1, arg0, arg2);
 }
 
-
-static void
-cb_need_data (GstElement *appsrc,
-          guint       unused_size,
-          gpointer    user_data)
+gboolean
+read_data (gpointer    user_data)
 {
   static gboolean white = FALSE;
   static GstClockTime timestamp = 0;
   GstBuffer *buffer;
   guint size;
   GstFlowReturn ret;
+  GstAppSrc *appsrc = user_data;
 
   size = 384 * 288 * 2;
 
@@ -34,17 +36,23 @@ cb_need_data (GstElement *appsrc,
   white = !white;
 
   GST_BUFFER_PTS (buffer) = timestamp;
-  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 2);
+  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, fps);
 
   timestamp += GST_BUFFER_DURATION (buffer);
 
-  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
-  gst_buffer_unref (buffer);
+  ret = gst_app_src_push_buffer(appsrc, buffer);
 
   if (ret != GST_FLOW_OK) {
     /* something wrong, stop pushing */
     g_main_loop_quit (loop);
   }
+
+  if (++cnt % 100 == 0)
+    g_print("read_data(%d)\n",cnt);
+
+  usleep(1000000/fps);
+
+  return G_SOURCE_CONTINUE;
 }
 
 gint
@@ -58,8 +66,9 @@ main (gint   argc,
   loop = g_main_loop_new (NULL, FALSE);
 
   /* setup pipeline */
-  g_assert(argc > 1);
-  pipeline = gst_parse_launch(argv[1], NULL); 
+  g_assert(argc == 3);
+  fps = atoi(argv[1]); 
+  pipeline = gst_parse_launch(argv[2], NULL); 
   g_assert(pipeline);
 
   /* setup */
@@ -78,7 +87,6 @@ main (gint   argc,
   g_object_set (G_OBJECT (appsrc),
         "stream-type", 0,
         "format", GST_FORMAT_TIME, NULL);
-  g_signal_connect (appsrc, "need-data", G_CALLBACK (cb_need_data), NULL);
 
   /* setup fpsdisplaysink "#" if present */
   if (fpsdisplaysink) {
@@ -87,6 +95,8 @@ main (gint   argc,
     g_print("fps-measurements connected\n");
   }
 
+  g_idle_add ((GSourceFunc) read_data, appsrc);
+
   /* play */
   gst_element_set_state (pipeline, GST_STATE_PLAYING);
   g_main_loop_run (loop);
$ 
So what did I do here?
I tried to push 384*288*2 bytes = 216KB frames at 1000fps into this pipeline:

Code: Select all

appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'
I am positively surprised by the 626fps average measured!

My real application (a modification of raspiraw) will push only 1/3rd of that frame size, and at only 90fps (maybe 120fps), into the gstreamer pipeline (320x240 GRAY8 frames). The measured result (626fps at roughly 3 times the target frame size) indicates this will easily be possible.
Attachments
appsrc-launch3.zip
Working source discussed in this posting
(1.34 KiB) Downloaded 125 times


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Thu Nov 16, 2017 2:36 am

It took quite some time to get the timestamps right (raspiraw timestamps are in [us], gstreamer timestamps in [ns]).
I just took appsrc-launch3.c and spread its different parts over raspiraw.c ;-)
Now raspiraw sends 640x480 raw10-sized frames (384000 bytes, black/white blinking as in appsrc-launch3) into the gstreamer pipeline. And that pipeline measures 60fps on average, which is the minimal fps rate for mode 7 that I started raspiraw with!
<ADD>in order to build raspiraw/fast I had to append "-lgstreamer-1.0 -lgobject-2.0 -lglib-2.0 -lgstapp-1.0" to "userland-rawcam/build/raspberry/release/host_applications/linux/apps/raspicam/CMakeFiles/raspiraw.dir/link.txt" and "-I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 -I/usr/lib/arm-linux-gnueabihf/glib-2.0/include" to "userland-rawcam/build/raspberry/release/host_applications/linux/apps/raspicam/CMakeFiles/raspiraw.dir/flags.make"</ADD>

raspiraw gets started with a timeout of 5000ms (-t), mode 7 (640x480, 60-90fps) and saverate 1 (every frame received from the camera triggers creation of a new black/white frame of the same size, which gets pushed into the gstreamer pipeline). raspiraw ends after 5.6s; the stdout output from the gstreamer callback and fpsdisplaysink gets "tee"d into file "out". Grepping for "dropped" shows the framerates, and the tail of "out" shows that the callback was triggered 302 times (5s x 60fps):

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 5000 -sr 1 -o foobar | tee out
...
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154870253, flags 0004
mmal: Buffer 0x72f340 returned, filled 384000, timestamp 14154870253, flags 0084
read_data(183,3028046000,5000000000)
mmal: Buffer 0x72f518 returned, filled 384000, timestamp 14154886890, flags 0004
mmal: Buffer 0x72f6f0 returned, filled 384000, timestamp 14154886890, flags 0084
read_data(184,3044683000,5000000000)
dropped: 0, current: 67.65, average: 60.38
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154903528, flags 0004
read_data(185,3061321000,5000000000)
mmal: Buffer 0x72f340 returned, filled 384000, timestamp 14154903528, flags 0084
mmal: Buffer 0x72f518 returned, filled 384000, timestamp 14154920166, flags 0004
mmal: read_data(186,3077959000,5000000000)
Buffer 0x72f6f0 returned, filled 384000, timestamp 14154920166, flags 0084
mmal: Buffer 0x72f168 returned, filled 384000, timestamp 14154936803, flags 0004
mmal: read_data(187,3094596000,5000000000)
Buffer 0x72f340 returned, filled 384000, timestamp 14154936803, flags 0084
...
mmal: mmal_port_disconnect: vc.ril.video_render:in:0(0x72d8b0) is not connected
mmal: mmal_component_destroy_internal: vc.ril.video_render 2
mmal: mmal_port_free: vc.ril.video_render:in:0 at 0x72d8b0
mmal: mmal_port_free: vc.ril.video_render:ctr:0 at 0x72d590

real	0m5.662s
user	0m2.440s
sys	0m0.700s
pi@raspberrypi02:~/userland-rawcam $ grep dropped out 
dropped: 0, current: 63.07, average: 63.07
dropped: 0, current: 29.39, average: 46.18
dropped: 0, current: 56.73, average: 49.71
dropped: 0, current: 73.32, average: 55.57
dropped: 0, current: 72.40, average: 58.95
dropped: 0, current: 67.65, average: 60.38
dropped: 0, current: 60.14, average: 60.35
dropped: 0, current: 60.12, average: 60.32
dropped: 0, current: 60.05, average: 60.29
pi@raspberrypi02:~/userland-rawcam $ tail -3 out
read_data(300,4974646000,5000000000)
read_data(301,4991284000,5000000000)
read_data(302,5007922000,5000000000)
pi@raspberrypi02:~/userland-rawcam $ 
This shows the gstreamer pipeline used, how camera streaming gets started, and how the gstreamer main loop is entered:

Code: Select all

...
  pipeline = gst_parse_launch("appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'", NULL);
...
  g_idle_add ((GSourceFunc) read_data, appsrc);

  /* play */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

        start_camera_streaming(sensor, sensor_mode);

  g_main_loop_run (gloop);

  /* clean up */
  gst_element_set_state (pipeline, GST_STATE_NULL);
...
This shows the gstreamer callback, called every 4ms, which pushes a frame into the gstreamer pipeline if one is available.
It also shuts down the gstreamer main loop in case of a problem, and more importantly when the timeout is reached:

Code: Select all

gboolean
read_data (gpointer    user_data)
{
  if (gbuffer) { // frame available to push

    GstAppSrc *appsrc = user_data;
    GstFlowReturn ret = gst_app_src_push_buffer(appsrc, gbuffer);

    if (++gcnt % 100 == 0) {}
      g_print("read_data(%d,%llu,%llu)\n",gcnt,GST_BUFFER_PTS (gbuffer),ggtimeout);

    if ((ret != GST_FLOW_OK) || (ggtimeout && (GST_BUFFER_PTS (gbuffer) > ggtimeout))) {
      /* something wrong or timeout, stop pushing */
      g_main_loop_quit (gloop);
    }

    gbuffer = NULL; // ready for new frame
  }

  vcos_sleep(4); // TODO 250fps max?

  return G_SOURCE_CONTINUE;
}
This is the current code that replaced the "SD card write code block" of raspiraw:

Code: Select all

  if (!gbuffer) { // no frame left to send, take new one

    GstBuffer *buff;
    guint size;

    size = 800 * 480;

    buff = gst_buffer_new_allocate (NULL, size, NULL);

    gst_buffer_memset (buff, 0, white ? 0xff : 0x0, size);

    white = !white;

    if (!ggtimeout) {
      gbase     = buffer->pts; // us
      ggtimeout = gst_util_uint64_scale_int (cfg->timeout, GST_MSECOND, 1);
    }

    GST_BUFFER_PTS (buff) = (buffer->pts - gbase) * 1000; // us -> ns
    GST_BUFFER_DURATION (buff) = 16666666; // TODO value?  1/60 s in [ns]

    gbuffer = buff; // new frame ready for push
  }
None of this is very nice yet, and most importantly, copying the raw Bayer frame captured from the Raspberry camera (in buffer->data) into the gst_buffer created for pushing into the pipeline is still missing.

That will be done tomorrow evening; now it is far too late here, I have to stop and get some sleep.

Although it took some time, it was not that difficult to make raspiraw push its stream into a gstreamer pipeline, and it already works partially ...

Code: Select all

$ diff raspiraw.c.orig raspiraw.c | wc --lines
136
$ 
Last edited by HermannSW on Wed Nov 22, 2017 8:21 pm, edited 2 times in total.


Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Thu Nov 16, 2017 8:16 pm

Last night's solution worked, but it was not really a "push" solution, more a "poll 250 times a second whether a push is allowed, then push".

I completely eliminated the gstreamer read_data() callback and no longer register any callback function via g_idle_add(). Instead, the raspiraw callback routine, called every 1/60th of a second, directly pushes the buffer into the gstreamer pipeline via a (global) appsrc variable.

Instead of just counting the framerate via fpsdisplaysink in the pipeline, I wanted to see "some" video frames. A 640x480 raw10 frame has width 640*5/4=800 and height 480. I just initialized the appsrc caps with

Code: Select all

        gst_caps_new_simple ("video/x-raw",
                     "format", G_TYPE_STRING, "GRAY8",
                     "width", G_TYPE_INT, 800,
                     "height", G_TYPE_INT, 480,
                     "framerate", GST_TYPE_FRACTION, 0, 1,
                     NULL), NULL);
cheating a bit by stating that the 640x480 raw10 frame should be taken as 800x480 GRAY8. This gives a simple grayscale rendering of each 2x2 "rg/Gb" raw Bayer pattern; only every 5th column is junk (the lowest 2 bits of the 10-bit values of the 4 pixels to its left). This is one of the first frames I saw and captured: straight up, directed at the lamp on the ceiling, and a hand (similar to https://pbs.twimg.com/media/DOno31cXkAEwnn-.jpg:large):
Image

This is how raspiraw got started; yes, the gstreamer pipeline now gets passed after raspiraw's -o parameter on the shell command line.
15 seconds of recording, with saverate 30, translating into 2 frames per second at 60fps camera speed:

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 15000 -sr 30 -o "appsrc name=_ ! videoconvert ! xvimagesink"
Next: better raw10 conversion. I saw problems with bayer2rgb before, and read that gstreamer needs to be recompiled for raw10 support; a way to go ...

P.S.:
Below is the problem I have seen before with bayer2rgb and gstreamer (there is only one lamp on the ceiling, not four).

Compiled without cheating:

Code: Select all

  g_object_set (G_OBJECT (appsrc), "caps",
        gst_caps_new_simple ("video/x-bayer",
                     "format", G_TYPE_STRING, "rggb",
                     "width", G_TYPE_INT, 640,
                     "height", G_TYPE_INT, 480,
                     "framerate", GST_TYPE_FRACTION, 0, 1,
                     NULL), NULL);
Command line used:

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 15000 -sr 30 -o "appsrc name=_ ! bayer2rgb ! videoconvert ! xvimagesink"
Image

This is the real view, taken with raspistill (-w 640 -h 480):
Image
https://stamm-wilbrandt.de/2wheel_balancing_robot
https://stamm-wilbrandt.de/en#raspcatbot
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://github.com/Hermann-SW/raspiraw
https://stamm-wilbrandt.de/en/Raspberry_camera.html

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Thu Nov 16, 2017 9:04 pm

Instead of "memcpy"ing the raspiraw frame to the gstreamer buffer, I now manually convert raw10 to raw8.
This code seems to be doing the right thing according to
https://developer.xamarin.com/api/field ... mat.Raw10/

Code: Select all

    gst_buffer_map(buff, &info, GST_MAP_WRITE);
//    memcpy(info.data, buffer->data, buffer->length);
  p=buffer->data;   /* raw10: 4 pixels packed into 5 bytes */
  q=info.data;
  for(i=0; i<640*480; i+=4)
  {
    /* pixel n has its high 8 bits in p[n], low 2 bits in bits 2n..2n+1 of p[4] */
    k = (((unsigned int)p[0])<<2) + ((p[4]>>0)&0x03);
    if (k>255) k=255;
    q[0] = k;
    k = (((unsigned int)p[1])<<2) + ((p[4]>>2)&0x03);
    if (k>255) k=255;
    q[1] = k;
    k = (((unsigned int)p[2])<<2) + ((p[4]>>4)&0x03);
    if (k>255) k=255;
    q[2] = k;
    k = (((unsigned int)p[3])<<2) + ((p[4]>>6)&0x03);
    if (k>255) k=255;
    q[3] = k;
    p+=5; q+=4;
  }


After recompiling, the same bayer2rgb raspiraw command line as before produces the image below.
At least the 640x480 format looks right, proving that bayer2rgb expects raw8 format.
But the image is a bit dark ...
Image
https://stamm-wilbrandt.de/2wheel_balancing_robot
https://stamm-wilbrandt.de/en#raspcatbot
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://github.com/Hermann-SW/raspiraw
https://stamm-wilbrandt.de/en/Raspberry_camera.html

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Thu Nov 16, 2017 11:45 pm

Oh oh, I thought it must be a programming error on my side, or a misunderstanding of video/x-bayer or GRAY8 conversion.
But it was not -- it was an incorrect test setup.
Now I added the flashlight from my smartphone to the game and all looks good!
For all 3 cases below I determined the full 10-bit pixel value, capped it at 255 and stored it.

This is with 640x480 video/x-bayer "bggr" (raw8) mode, and "...! bayer2rgb !..." in pipeline:
Image

This is with 640x480 video/x-raw "GRAY8" mode, with just "...! videoconvert !..." in pipeline:
Image

And this is with 320x240 video/x-raw "GRAY8" mode, where I took the left bottom (brightest) green pixel of each 2x2 bg/Gr tile:
Image

I was just lucky when doing automatic robot camera tilt calibration 11 days ago:
viewtopic.php?f=43&t=189661&p=1218763#p1231151

On the caterpillar robot I have a night vision v1 camera module and, more importantly, a 3W infrared LED mounted at the camera.
Calibration (taking useful pictures from the raw Bayer data) worked in darkness due to this LED:
Image

Image

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Fri Nov 17, 2017 12:46 am

I added the ability to send the same 640x480 bytes, extracted from the 800x480 raw10 captured camera frame, as either "video/x-bayer,format='bggr'" or "video/x-raw,format='GRAY8'", selected by the name of the appsrc element in the gstreamer pipeline passed to raspiraw's -o option:

Code: Select all

  /* setup */
  if ( (appsrc = (GstAppSrc*) gst_bin_get_by_name (GST_BIN(pipeline), "x-raw")) ) {
    g_object_set (G_OBJECT (appsrc), "caps",
          gst_caps_new_simple ("video/x-raw",
                       "format", G_TYPE_STRING, "GRAY8",
                       "width", G_TYPE_INT, 640,
                       "height", G_TYPE_INT, 480,
                       "framerate", GST_TYPE_FRACTION, 0, 1,
                       NULL), NULL);
  } else {
    g_assert( appsrc = (GstAppSrc*) gst_bin_get_by_name (GST_BIN(pipeline), "x-bayer") );
    g_object_set (G_OBJECT (appsrc), "caps",
          gst_caps_new_simple ("video/x-bayer",
                       "format", G_TYPE_STRING, "bggr",
                       "width", G_TYPE_INT, 640,
                       "height", G_TYPE_INT, 480,
                       "framerate", GST_TYPE_FRACTION, 0, 1,
                       NULL), NULL);
  }
I had heard before at the gstreamer Prague 2017 conference that bayer2rgb might be slow.
Here now is the confirmation (output of raspiraw goes to stderr, that of gstreamer to stdout; redirecting to files is necessary since console output would otherwise slow things down).

Use of bayer2rgb drops 60fps to only 27fps measured at the end of the pipeline:

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 15000 -sr 1 -o "appsrc name=x-bayer ! bayer2rgb ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'" 2>err 1>out

real	0m15.662s
user	0m13.140s
sys	0m2.170s
pi@raspberrypi02:~/userland-rawcam $ (grep dropped out && tail -3 out) | tail -6
dropped: 0, current: 27.15, average: 27.55
dropped: 0, current: 27.23, average: 27.54
dropped: 0, current: 27.37, average: 27.54
read_data(582,14973466000)
dropped: 0, current: 27.37, average: 27.54
read_data(583,14990104000)
pi@raspberrypi02:~/userland-rawcam $ 
GRAY8 x-raw reduces the rate only slightly, from 60fps to a measured 57fps:

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 15000 -sr 1 -o "appsrc name=x-raw ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'" 2>err 1>out

real	0m15.649s
user	0m9.640s
sys	0m1.630s
pi@raspberrypi02:~/userland-rawcam $ (grep dropped out && tail -3 out) | tail -6
dropped: 0, current: 58.30, average: 57.21
dropped: 0, current: 56.23, average: 57.17
dropped: 0, current: 56.24, average: 57.14
read_data(853,14923566000)
read_data(854,14956840000)
read_data(855,14973478000)
pi@raspberrypi02:~/userland-rawcam $ 

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Sat Nov 18, 2017 9:57 pm

The raspiraw2gstreamer approach was motivated by the possibility of raspiraw capturing video at higher than 90fps, and then pushing that into a gstreamer pipeline for further processing in the context of robot control.

I removed the bayer mode, because that can be done better with rpicamsrc.

This new version of raspiraw.c is based on the Aug 8 version of rawcam:
https://github.com/6by9/userland/tree/rawcam

Find attached the new modified version, as well as a patchfile (do "patch -p0 < patch.txt" in the raspiraw.c directory).

This version generates a 320x240 x-raw GRAY8 format stream, currently at 60fps (with md=7; 42fps with md=6):

Code: Select all

pi@raspberrypi02:~/userland-rawcam $ ./camera_i2c 
setting GPIO for board revsion: 9000c1
PiZero / PiZero W - I2C 0 on GPIO 28 & 29. GPIOs 40 & 44 for LED and power
pi@raspberrypi02:~/userland-rawcam $ 
pi@raspberrypi02:~/userland-rawcam $ time raspiraw -md 7 -t 5000 -sr 1 -o "appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'" 2>err 1>out

real	0m5.557s
user	0m1.750s
sys	0m0.580s
pi@raspberrypi02:~/userland-rawcam $ (grep dropped out && tail -3 out) | tail -6
dropped: 0, current: 56.39, average: 59.27
dropped: 0, current: 59.94, average: 59.35
dropped: 0, current: 58.32, average: 59.23
read_data(294,4957991000)
read_data(295,4974628000)
read_data(296,4991265000)
pi@raspberrypi02:~/userland-rawcam $ 
Until this version of raspiraw is able to capture at more than 90fps, use of rpicamsrc or v4l2src is preferred. They achieve GRAY8 format at 90fps:

Code: Select all

$ gst-launch-1.0 rpicamsrc ! video/x-raw,format=GRAY8,width=640,height=480 !  ... 
$ gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! video/x-raw,format=GRAY8,width=640,height=480 ! ... 
What is nice is that the current version needs modifications to raspiraw.c in only 3 places.

The first includes the needed gstreamer headers and defines the cb_fps_measurements() callback:

Code: Select all

42a43,57
> #include   <gst/gst.h>
> GMainLoop  *gloop  = NULL;
> #include   <gst/app/gstappsrc.h>
> GstAppSrc  *appsrc = NULL;
> 
> void
> cb_fps_measurements(GstElement *fpsdisplaysink,
>               gdouble arg0,
>               gdouble arg1,
>               gdouble arg2,
>               gpointer user_data)
> {
>       g_print("dropped: %.0f, current: %.2f, average: %.2f\n", arg1, arg0, arg2);
> }
> 
The second deals with pushing the captured frame directly into the gstreamer pipeline. This is done only if gstreamer output was selected on the command line. I love how simple it finally turned out to be to push the converted 320x240 GRAY8 frame (from the 640x480 raw10 bayer frame captured by the camera) into the gstreamer pipeline!

Code: Select all

362a378,379
>               if (!appsrc)   // to keep SD card code indentation
>               {
379a397,443
>               else
>               {
>                       static GstClockTime ggtimeout = 0, gbase = 0;
> 
>                       GstBuffer *buff;
>                       GstMapInfo info;
>                       unsigned char *p, *q;
>                       static gint gcnt=0;
>                       unsigned int i,j,k;
> 
>                       buff = gst_buffer_new_allocate (NULL, 320*(240+1), NULL);
>                       gst_buffer_map(buff, &info, GST_MAP_WRITE);
> 
>                       p=buffer->data;
>                       q=info.data;
>                       for(i=1; i<480; i+=2) {
>                               p+=800;
>                               for(j=0; j<800; p+=5,j+=5) {
>                                       k = (((unsigned)p[0])<<2) + ((p[4]>>0)&0x03);
>                                       if (k>255) k=255;
>                                       *q++ = k;
> 
>                                       k = (((unsigned)p[2])<<2) + ((p[4]>>4)&0x03);
>                                       if (k>255) k=255;
>                                       *q++ = k;
>                               }
>                       }
> 
>                       if (!ggtimeout) {           // take baseline timestamp
>                               gbase     = buffer->pts;  // us
>                               ggtimeout = gst_util_uint64_scale_int (cfg->timeout, GST_MSECOND, 1);
>                       }
> 
>                       GST_BUFFER_PTS (buff)      = (buffer->pts - gbase) * 1000; // us -> ns
>                       GST_BUFFER_DURATION (buff) = gst_util_uint64_scale_int (1, GST_SECOND, 100);
> 
>                       if (GST_BUFFER_PTS (buff) > ggtimeout) {
>                               g_main_loop_quit (gloop);              // timeout, stop pushing
>                       } else {
>                               if (GST_FLOW_OK != gst_app_src_push_buffer(appsrc, buff)) {
>                                       g_main_loop_quit (gloop);            // something wrong, stop pushing
>                               } else {
>                                       g_print("read_data(%d,%llu)\n", ++gcnt, GST_BUFFER_PTS (buff));
>                               }
>                       }
>               }   // to keep SD card code indentation
>               }
The third checks whether gstreamer output is requested [-o present, its argument starting with "appsrc ", and mode 6 or 7 (the 640x480 modes)]. After creation of all the gstreamer stuff, start_camera_streaming() gets called, followed by g_main_loop_run(gloop). This loop gets torn down when the timeout is reached or a gstreamer error happened in the 2nd part:

Code: Select all

982,984c1046,1092
<       start_camera_streaming(sensor, sensor_mode);
< 
<       vcos_sleep(cfg.timeout);
---
>       if (cfg.output && (cfg.mode>>1 == 3) && (strncmp(cfg.output, "appsrc ", 7)==0)) {
>               /* init GStreamer */
>               GstElement *pipeline, *fpsdisplaysink;
>               gst_init (&argc, (char ***)&argv);
>               gloop = g_main_loop_new (NULL, FALSE);
> 
>               /* setup pipeline */
>               g_assert( pipeline = gst_parse_launch(cfg.output, NULL) ); 
> 
>               /* setup */
>               g_assert( appsrc = (GstAppSrc*) gst_bin_get_by_name (GST_BIN(pipeline), "_") );
>               g_object_set (G_OBJECT (appsrc), "caps",
>                       gst_caps_new_simple ("video/x-raw",
>                                       "format", G_TYPE_STRING, "GRAY8",
>                                       "width", G_TYPE_INT, 320,
>                                       "height", G_TYPE_INT, 240,
>                                       "framerate", GST_TYPE_FRACTION, 0, 1,
>                                       NULL), NULL);
> 
>               /* setup appsrc */
>               g_object_set (G_OBJECT (appsrc),
>                       "stream-type", 0,
>                       "format", GST_FORMAT_TIME, NULL);
> 
>               /* setup fpsdisplaysink "#" if present */
>               fpsdisplaysink = gst_bin_get_by_name (GST_BIN(pipeline), "#");
>               if (fpsdisplaysink) {
>                       g_object_set (G_OBJECT (fpsdisplaysink), "signal-fps-measurements", TRUE, NULL);
>                       g_signal_connect (fpsdisplaysink, "fps-measurements", G_CALLBACK (cb_fps_measurements), NULL);
>                       g_print("fps-measurements connected\n");
>               }
> 
>               /* play */
>               gst_element_set_state (pipeline, GST_STATE_PLAYING);
> 
>               start_camera_streaming(sensor, sensor_mode);
> 
>               g_main_loop_run (gloop);
> 
>               /* clean up */
>               gst_element_set_state (pipeline, GST_STATE_NULL);
>               gst_object_unref (GST_OBJECT (pipeline));
>               g_main_loop_unref (gloop);
>       } else {
>               start_camera_streaming(sensor, sensor_mode);
>               vcos_sleep(cfg.timeout);
>       }
P.S.:
I run the Pi Zero (W) headless, but when an HDMI monitor is connected, this command works:

Code: Select all

$ time raspiraw -md 7 -t 5000 -sr 1 -o "appsrc name=_ ! videoconvert ! fbdevsink device=/dev/fb0"
Image
Attachments
raspiraw-appsrc.zip
modified raspiraw.c and patchfile
(11.27 KiB) Downloaded 135 times

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Jul 18, 2018 11:25 pm

It has been 9 months since I last updated this thread.
That was before I made v1 and v2 cameras capture at much more than 90fps.
The highest framerates with raspiraw are 750fps with the v1 camera and 1007fps with the v2 camera!

In a recent thread I detailed my plans of adding a Pi ZeroW as a backpack to an Eachine E52 drone and making it the drone pilot, by wirelessly connecting to the drone AP and controlling drone flight as well as capturing the drone FPV camera video (see animation below):
viewtopic.php?f=43&t=190407&p=1341629#p1341895

I will see on the weekend whether the code pieces I found will work with the E52 drone unchanged.
If not, it will be easy to replicate what others did (capture the Wifi traffic between the Android drone app and the drone).
There are tons of packet capturing tools for Android, and the drone Wifi traffic is completely unsecured (no SSL).
I use Wireshark a lot at work (for analyzing traffic to and from DataPower appliances), so analyzing the packet captures will be easy.

The 3rd planned project from the above posting requires an appsink in a gstreamer pipeline to capture drone camera frames and process them. Today I used this thread to refresh my gstreamer plugin and appsrc programming skills, by compiling, running and looking into all sample codes of this thread in order to understand again how this all works.

Before going to appsink programming I stumbled over the 600-something fps upper limit for the framerate of appsrc-launch3. I realized that appsrc-launch3 did appsrc programming in "pull" mode. I changed that to "push" mode, and the result is appsrc-launch4 (also attached), which can do much higher framerates. It generates 384x288 frames (all black, then all white, and repeat) and sends them to fpsdisplaysink just for measuring the real framerate -- no frame processing so far, just a videoconvert. As you can see, 7500fps can easily be handled (on a Pi 3B+ this time; last time it was a Pi Zero). But the 3B+ to Zero compute power difference does not explain the 12 times higher framerate possible. It seems that appsrc "push" mode is the way to go for high framerates:

Code: Select all

pi@raspberrypi3BplusX:~ $ gcc -O6 appsrc-launch4.c -o appsrc-launch4 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0` 
pi@raspberrypi3BplusX:~ $ ./appsrc-launch4 7500 "appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'"
fps-measurements connected
dropped: 0, current: 7494.68, average: 7494.68
dropped: 0, current: 6650.59, average: 7072.63
dropped: 0, current: 7697.56, average: 7280.92
dropped: 0, current: 7985.07, average: 7456.95
dropped: 0, current: 7609.61, average: 7487.48
dropped: 0, current: 7557.27, average: 7499.12
dropped: 0, current: 7500.79, average: 7499.36
dropped: 0, current: 7499.25, average: 7499.34
^C
pi@raspberrypi3BplusX:~ $ 

This is the small diff between appsrc-launch[34].c:

Code: Select all

pi@raspberrypi3BplusX:~ $ diff -u appsrc-launch3.c appsrc-launch4.c
--- appsrc-launch3.c	2018-07-18 23:50:20.415578525 +0200
+++ appsrc-launch4.c	2018-07-19 00:02:35.950553641 +0200
@@ -16,15 +16,16 @@
   g_print("dropped: %.0f, current: %.2f, average: %.2f\n", arg1, arg0, arg2);
 }
 
-gboolean
-read_data (gpointer    user_data)
+static void
+cb_need_data (GstElement *appsrc,
+          guint       unused_size,
+          gpointer    user_data)
 {
   static gboolean white = FALSE;
   static GstClockTime timestamp = 0;
   GstBuffer *buffer;
   guint size;
   GstFlowReturn ret;
-  GstAppSrc *appsrc = user_data;
 
   size = 384 * 288 * 2;
 
@@ -40,19 +41,13 @@
 
   timestamp += GST_BUFFER_DURATION (buffer);
 
-  ret = gst_app_src_push_buffer(appsrc, buffer);
+  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
+  gst_buffer_unref (buffer);
 
   if (ret != GST_FLOW_OK) {
     /* something wrong, stop pushing */
     g_main_loop_quit (loop);
   }
-
-  if (++cnt % 100 == 0)
-    g_print("read_data(%d)\n",cnt);
-
-  usleep(1000000/fps);
-
-  return G_SOURCE_CONTINUE;
 }
 
 gint
@@ -96,7 +91,7 @@
     g_print("fps-measurements connected\n");
   }
 
-  g_idle_add ((GSourceFunc) read_data, appsrc);
+  g_signal_connect (appsrc, "need-data", G_CALLBACK (cb_need_data), NULL);
 
   /* play */
   gst_element_set_state (pipeline, GST_STATE_PLAYING);
pi@raspberrypi3BplusX:~ $ 

If you skip the video-sink property of fpsdisplaysink in the gstreamer pipeline, the framerate will be greatly reduced, but you will be able to see the alternating all-black/all-white frames for verification purposes:

Code: Select all

./appsrc-launch4 10 "appsrc name=_ ! videoconvert ! fpsdisplaysink name=#"

Still a way to go to appsink live video frame analysis, but a good first step.

Image
Attachments
appsrc-launch4.zip
contains appsrc-launch4.c
(1.29 KiB) Downloaded 105 times

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Fri Jul 20, 2018 4:55 pm

There is a difference between the framerates achievable with the Pi Zero and the 3B+.
While the 3B+ is capable of 7500fps, the Pi Zero can only do 2600fps -- but that is more than 4 times what appsrc "pull" mode achieved before:

Code: Select all

pi@raspberrypi02:~ $ grep Revision /proc/cpuinfo 
Revision	: 900093
pi@raspberrypi02:~ $ gcc -O6 appsrc-launch4.c -o appsrc-launch4 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`
pi@raspberrypi02:~ $ ./appsrc-launch4 2600 "appsrc name=_ ! videoconvert ! fpsdisplaysink name=# video-sink='fakesink'"
fps-measurements connected
dropped: 0, current: 2363.19, average: 2363.19
dropped: 0, current: 2397.82, average: 2380.48
dropped: 0, current: 2676.93, average: 2479.23
dropped: 0, current: 2703.89, average: 2535.35
dropped: 0, current: 2695.02, average: 2567.27
dropped: 0, current: 2705.73, average: 2590.34
dropped: 0, current: 2653.55, average: 2599.36
dropped: 0, current: 2600.29, average: 2599.48
dropped: 0, current: 2599.61, average: 2599.49
^C
pi@raspberrypi02:~ $ 

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Fri Jul 20, 2018 6:30 pm

It turned out that for (drone control) gstreamer frame analysis an appsink is not needed.
Instead, the handoff signal of the identity element can be used to inspect the frames.
The advantage is that the final fakesink (or any other sink) will do all the needed gstreamer pipeline cleanup automatically.
In case the final gstreamer pipeline element really is a fakesink, its handoff signal can be used alternatively.

Here you can see the handoff callback output, really called once per complete GstBuffer (the 384x288 rgb16 frames consume 384*288*2=221184 bytes):

Code: Select all

pi@raspberrypi02:~ $ ./appsrc-launch5 5 "appsrc name=_ ! identity name== ! fpsdisplaysink name=# video-sink='fakesink'"
identity handoff
fps-measurements connected
buf(221184) buf(221184) buf(221184) buf(221184) buf(221184) dropped: 0, current: 8.29, average: 8.29
buf(221184) buf(221184) buf(221184) dropped: 0, current: 5.00, average: 6.65
buf(221184) buf(221184) buf(221184) dropped: 0, current: 5.00, average: 6.10
buf(221184) buf(221184) buf(221184) dropped: 0, current: 5.00, average: 5.83
buf(221184) buf(221184) buf(221184) dropped: 0, current: 5.00, average: 5.66
buf(221184) buf(221184) buf(221184) dropped: 0, current: 5.00, average: 5.55
buf(221184) buf(221184) ^C
pi@raspberrypi02:~ $ 

This is the small diff between appsrc-launch[45].c; appsrc-launch5.c is attached as well:

Code: Select all

pi@raspberrypi02:~ $ diff -u appsrc-launch4.c appsrc-launch5.c
--- appsrc-launch4.c	2018-07-18 23:05:12.000000000 +0000
+++ appsrc-launch5.c	2018-07-20 17:01:31.747177110 +0000
@@ -6,6 +6,19 @@
 static GMainLoop *loop;
 static gint fps, cnt=0;
 
+static void 
+handoff(GstElement *object, 
+GstBuffer *arg0, 
+GstPad *arg1, 
+gpointer user_data) 
+{ 
+  GstMapInfo info;
+
+  gst_buffer_map (arg0, &info, GST_MAP_READ);
+  g_print("buf(%d) ",info.size);
+  gst_buffer_unmap (arg0, &info);
+} 
+
 static void
 cb_fps_measurements(GstElement *fpsdisplaysink,
                     gdouble arg0,
@@ -54,7 +67,7 @@
 main (gint   argc,
       gchar *argv[])
 {
-  GstElement *pipeline, *appsrc, *fpsdisplaysink;
+  GstElement *pipeline, *appsrc, *identity, *fpsdisplaysink;
 
   /* init GStreamer */
   gst_init (&argc, &argv);
@@ -67,6 +80,7 @@
   g_assert(pipeline);
 
   /* setup */
+  identity = gst_bin_get_by_name (GST_BIN(pipeline), "=");
   fpsdisplaysink = gst_bin_get_by_name (GST_BIN(pipeline), "#");
   appsrc = gst_bin_get_by_name (GST_BIN(pipeline), "_");
   g_assert(appsrc);
@@ -84,6 +98,14 @@
         "stream-type", 0,
         "format", GST_FORMAT_TIME, NULL);
 
+  /* setup identity "=" if present */
+  if (identity) {
+    g_object_set(G_OBJECT(identity), "signal-handoffs", TRUE, NULL); 
+//  g_object_set(G_OBJECT(identity), "dump", TRUE, NULL); 
+    g_signal_connect(identity, "handoff", G_CALLBACK(handoff), NULL); 
+    g_print("identity handoff\n");
+  }
+
   /* setup fpsdisplaysink "#" if present */
   if (fpsdisplaysink) {
     g_object_set (G_OBJECT (fpsdisplaysink), "signal-fps-measurements", TRUE, NULL);
pi@raspberrypi02:~ $ 

P.S.:
appsrc-launch5.c requires an appsrc, but it can be sent to a fakesink, and a videotestsrc can be used as the source for the rest of the pipeline. videotestsrc generates 320x240 rgb16 frames of size 320*240*2=153600 bytes. As you can see, the conversion to GRAY8 really halves the GstBuffer frame size:

Code: Select all

pi@raspberrypi02:~ $ ./appsrc-launch5 1 "appsrc name=_ ! fakesink videotestsrc  ! video/x-raw,format=GRAY8 ! identity name== ! videoconvert ! fpsdisplaysink name=# "
identity handoff
fps-measurements connected
buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) dropped: 5, current: 9.04, average: 9.04
buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) dropped: 7, current: 5.31, average: 7.16
buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) buf(76800) dropped: 7, current: 7.29, average: 7.20
...

Of course the colors of videotestsrc are gone with the GRAY8 conversion, and the handoff callback will not see the fpsdisplaysink overlay messages:
Image
Attachments
appsrc-launch5.zip
contains appsrc-launch5.c
(1.45 KiB) Downloaded 86 times

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Mon Aug 13, 2018 10:59 pm

Recently I reverse engineered the wireless protocol between the Android UFO app and the Eachine E52 FPV drone. I was able to make the Pi capture the drone camera's h264 video. The target is to let a Pi ZeroW lift off on the drone and fly it wirelessly (fully autonomously) as its pilot:
https://github.com/Hermann-SW/wireless- ... -E52-drone


Today I remembered appsrc-launch5.c from this thread and gave it a try with drone video:

Code: Select all

pi@raspberrypi3BplusX:~ $ ./wireless-control-Eachine-E52-drone/pull_video 2>err | ./appsrc-launch5 1 "appsrc name=_ ! fakesink fdsrc fd=0 ! h264parse ! omxh264dec ! identity name== ! fpsdisplaysink name=#  video-sink='fakesink'"
identity handoff
fps-measurements connected
buf(622080) buf(622080) buf(622080) buf(622080) buf(622080) buf(622080) buf(622080) buf(622080) buf(622080) buf(622080) ^C
pi@raspberrypi3BplusX:~ $ 

As can be seen, the handoff callback gets called for each frame of the drone's h264 video (the drone camera gives 25fps). And the buffer size makes sense: the drone camera gives 720x576 i420 video, which has 12 bits per pixel (720*576*12/8=622080).

Here it is run with conversion to GRAY8 (1 byte per pixel, 720*576=414720):

Code: Select all

pi@raspberrypi3BplusX:~ $ ./wireless-control-Eachine-E52-drone/pull_video 2>err | ./appsrc-launch5 1 "appsrc name=_ ! fakesink fdsrc fd=0 ! h264parse ! omxh264dec ! videoconvert ! video/x-raw,format=GRAY8 ! identity name== ! fpsdisplaysink name=#  video-sink='fakesink'"
identity handoff
fps-measurements connected
buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) buf(414720) ^C
pi@raspberrypi3BplusX:~ $ 

The gstreamer pipeline is fast through hardware accelerated h264 decoding (omxh264dec). Today I added the watch_video.sh tool that displays either the drone camera or a raspivid h264 video stream fast, by omxh264dec and direct output to the /dev/fb0 framebuffer. That works for X11 display as well; the video was taken with a v1 NoIR camera (640x480), and you can see surrounding parts from the previous 720x576 drone video as well:
Image


P.S.:
The vacation house's Trip Trap child chair on top of the desk brings the Waveshare 7inch Raspberry Pi LCD into direct horizontal view ... [no problem using the child chair for the Raspberry, our 3rd child is 18yo ;-)]
Image

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Tue Jul 30, 2019 6:02 pm

Back in Denmark for vacation: same house, same Trip Trap, same 7" lcd display, but a Pi 3A+ instead of last year's Pi 3B+.

I looked into this thread and was surprised by the 7500fps that appsrc-launch4.c was able to push through a gstreamer pipeline.
I looked into watch_video.c and wondered how much real video could be pushed through.

With the new script wv (below) I was able to view videotestsrc at 960fps without any frame drop (by utilizing the fast fbdevsink). Uncommenting the conversion to GRAY8 still shows 370fps:
Image


This is the script used: fpsdisplaysink measures the framerate; text-overlay has to be false in order to not drop the framerate; "-v" is verbose and shows framerate measurements on the console; setting fpsdisplaysink's video-sink to fbdevsink shows videotestsrc fast:

Code: Select all

#!/bin/sh
launch="gst-launch-1.0"
     v="-v"
   src="videotestsrc"

framerate="video/x-raw,framerate=960/1"
#   gray8=",format=GRAY8 ! videoconvert"

textoverlay="text-overlay=false"
  videosink='video-sink="fbdevsink device=/dev/fb0"'
$launch $v $src ! $framerate $gray8 ! fpsdisplaysink $textoverlay $videosink

HermannSW

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Wed Jul 31, 2019 2:45 pm

Added all appsrc* demos of this thread to a gitlab repo:
https://gitlab.freedesktop.org/HermannS ... rc/READMEa

With a "compile" script, and a "demo" script providing the correct command arguments:

Code: Select all

pi@raspberrypi3Aplus:~/gst-template/appsrc $ ./demo
0) appsrc
1) appsrc-launch
2) appsrc-launch2
3) appsrc-launch3
4) appsrc-launch4
5) appsrc-launch5
6) appsrc.rb
4
fps-measurements connected
dropped: 0, current: 7492.29, average: 7492.29
dropped: 0, current: 5001.78, average: 6247.15
dropped: 0, current: 7796.45, average: 6763.61
^C
pi@raspberrypi3Aplus:~/gst-template/appsrc $ 

raun0
Posts: 6
Joined: Thu May 21, 2020 8:15 am

Re: gstreamer plugin dev / appsrc / raspiraw / Raspberry camera

Sun Feb 28, 2021 9:24 am

I patched the newer raspiraw 0.0.3 with your code in the hope of getting better camera support. I ran into an issue with the imx219 when running:
./raspiraw -n -md 7 -t 5000 -sr 1 -o 'appsrc name=_ ! videoconvert ! ximagesink'

Code: Select all

Using I2C device /dev/i2c-10
RaspiRaw: Probing sensor ov5647 on addr 36
RaspiRaw: Probing sensor imx219 on addr 10
RaspiRaw: Found sensor imx219 at address 10
RaspiRaw: Encoding 41414270
No AWB
mmal: Set pack to 0, unpack to 0
mmal: Timing 6/2, 2/6/0, 0/0
mmal: Create pool of 6 buffers of size 384000
mmal: Create pool of 6 buffers of size 384000
mmal: Buffer 0x1af7bf0 returned, filled 460800, timestamp 453724673, flags 0004
mmal: Buffer 0x1af7dc8 returned, filled 460800, timestamp 453746834, flags 0004
mmal: Buffer 0x1af7fa0 returned, filled 460800, timestamp 453780075, flags 0004
mmal: Buffer 0x1af8178 returned, filled 460800, timestamp 453824399, flags 0004
mmal: Buffer 0x1af8350 returned, filled 460800, timestamp 453857639, flags 0004
mmal: Buffer 0x1af8528 returned, filled 460800, timestamp 453890881, flags 0004
mmal: Buffer 0x1af7bf0 returned, filled 460800, timestamp 453935204, flags 0004
mmal: Buffer 0x1af7dc8 returned, filled 460800, timestamp 453968446, flags 0004
Segmentation fault

Code: Select all

Using I2C device /dev/i2c-10
RaspiRaw: Probing sensor ov5647 on addr 36
RaspiRaw: Probing sensor imx219 on addr 10
RaspiRaw: Found sensor imx219 at address 10
RaspiRaw: Encoding 41414270
No AWB
mmal: Set pack to 0, unpack to 0
mmal: Timing 6/2, 2/6/0, 0/0
mmal: Create pool of 6 buffers of size 384000
mmal: Create pool of 6 buffers of size 384000
0:00:00.000560363  1204   0x463800 INFO                GST_INIT gst.c:586:init_pre: Initializing GStreamer Core Library version 1.14.4
0:00:00.000876560  1204   0x463800 INFO                GST_INIT gst.c:587:init_pre: Using library installed in /usr/lib/arm-linux-gnueabihf
0:00:00.001054164  1204   0x463800 INFO                GST_INIT gst.c:607:init_pre: Linux raspberrypi 5.10.11-v7+ #1399 SMP Thu Jan 28 12:06:05 GMT 2021 armv7l
0:00:00.002825617  1204   0x463800 INFO                GST_INIT gstmessage.c:127:_priv_gst_message_initialize: init messages
0:00:00.006451077  1204   0x463800 INFO                GST_INIT gstcontext.c:84:_priv_gst_context_initialize: init contexts
0:00:00.007976594  1204   0x463800 INFO      GST_PLUGIN_LOADING gstplugin.c:317:_priv_gst_plugin_initialize: registering 0 static plugins
0:00:00.008796331  1204   0x463800 INFO      GST_PLUGIN_LOADING gstplugin.c:225:gst_plugin_register_static: registered static plugin "staticelements"
0:00:00.008972789  1204   0x463800 INFO      GST_PLUGIN_LOADING gstplugin.c:227:gst_plugin_register_static: added static plugin "staticelements", result: 1
0:00:00.009206122  1204   0x463800 INFO            GST_REGISTRY gstregistry.c:1727:ensure_current_registry: reading registry cache: /home/pi/.cache/gstreamer-1.0/registry.arm.bin
mmal: Buffer 0x460bf0 returned, filled 460800, timestamp 811351049, flags 0004
mmal: Buffer 0x460dc8 returned, filled 460800, timestamp 811373209, flags 0004
mmal: Buffer 0x460fa0 returned, filled 460800, timestamp 811406451, flags 0004
mmal: Buffer 0x461178 returned, filled 460800, timestamp 811439694, flags 0004
0:00:00.191065605  1204   0x463800 INFO            GST_REGISTRY gstregistrybinary.c:621:priv_gst_registry_binary_read_cache: loaded /home/pi/.cache/gstreamer-1.0/registry.arm.bin in 0.181581 seconds
0:00:00.191679614  1204   0x463800 INFO            GST_REGISTRY gstregistry.c:1583:scan_and_update_registry: Validating plugins from registry cache: /home/pi/.cache/gstreamer-1.0/registry.arm.bin
mmal: Buffer 0x461350 returned, filled 460800, timestamp 811506177, flags 0004
0:00:00.203804583  1204   0x463800 INFO            GST_REGISTRY gstregistry.c:1685:scan_and_update_registry: Registry cache has not changed
0:00:00.204113019  1204   0x463800 INFO            GST_REGISTRY gstregistry.c:1762:ensure_current_registry: registry reading and updating done, result = 1
0:00:00.204247863  1204   0x463800 INFO                GST_INIT gst.c:807:init_post: GLib runtime version: 2.58.3
0:00:00.204340883  1204   0x463800 INFO                GST_INIT gst.c:809:init_post: GLib headers version: 2.58.1
0:00:00.204404321  1204   0x463800 INFO                GST_INIT gst.c:810:init_post: initialized GStreamer successfully
0:00:00.204759684  1204   0x463800 INFO            GST_PIPELINE gstparse.c:337:gst_parse_launch_full: parsing pipeline description 'appsrc name=_ ! videoconvert ! ximagesink'
0:00:00.207191501  1204   0x463800 INFO      GST_PLUGIN_LOADING gstplugin.c:901:_priv_gst_plugin_load_file_for_registry: plugin "/usr/lib/arm-linux-gnueabihf/gstreamer-1.0/libgstapp.so" loaded
0:00:00.207379261  1204   0x463800 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:361:gst_element_factory_create: creating element "appsrc"
0:00:00.209243996  1204   0x463800 INFO        GST_ELEMENT_PADS gstelement.c:670:gst_element_add_pad:<GstBaseSrc@0x5e4160> adding pad 'src'
0:00:00.217990067  1204   0x463800 INFO      GST_PLUGIN_LOADING gstplugin.c:901:_priv_gst_plugin_load_file_for_registry: plugin "/usr/lib/arm-linux-gnueabihf/gstreamer-1.0/libgstvideoconvert.so" loaded
0:00:00.218204702  1204   0x463800 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:361:gst_element_factory_create: creating element "videoconvert"
0:00:00.220841102  1204   0x463800 INFO        GST_ELEMENT_PADS gstelement.c:670:gst_element_add_pad:<GstBaseTransform@0x5ee2a0> adding pad 'sink'
0:00:00.221128393  1204   0x463800 INFO        GST_ELEMENT_PADS gstelement.c:670:gst_element_add_pad:<GstBaseTransform@0x5ee2a0> adding pad 'src'
0:00:00.235931480  1204   0x463800 INFO      GST_PLUGIN_LOADING gstplugin.c:901:_priv_gst_plugin_load_file_for_registry: plugin "/usr/lib/arm-linux-gnueabihf/gstreamer-1.0/libgstximagesink.so" loaded
0:00:00.236149865  1204   0x463800 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:361:gst_element_factory_create: creating element "ximagesink"
mmal: Buffer 0x461528 returned, filled 460800, timestamp 811550499, flags 0004
0:00:00.237457674  1204   0x463800 INFO        GST_ELEMENT_PADS gstelement.c:670:gst_element_add_pad:<GstBaseSink@0x5f1d30> adding pad 'sink'
0:00:00.237820746  1204   0x463800 INFO     GST_ELEMENT_FACTORY gstelementfactory.c:361:gst_element_factory_create: creating element "pipeline"
0:00:00.238667619  1204   0x463800 INFO            GST_PIPELINE grammar.y:652:gst_parse_perform_link: linking some pad of GstAppSrc named _ to some pad of GstVideoConvert named videoconvert0 (0/0) with caps "(NULL)"
0:00:00.238901316  1204   0x463800 INFO        GST_ELEMENT_PADS gstutils.c:1774:gst_element_link_pads_full: trying to link element _:(any) to element videoconvert0:(any)
0:00:00.239080222  1204   0x463800 INFO                GST_PADS gstutils.c:1035:gst_pad_check_link: trying to link _:src and videoconvert0:sink
0:00:00.239363346  1204   0x463800 INFO                GST_PADS gstpad.c:4232:gst_pad_peer_query:<videoconvert0:src> pad has no peer
0:00:00.250046079  1204   0x463800 INFO                GST_PADS gstutils.c:1588:prepare_link_maybe_ghosting: _ and videoconvert0 in same bin, no need for ghost pads
0:00:00.250739046  1204   0x463800 INFO                GST_PADS gstpad.c:2378:gst_pad_link_prepare: trying to link _:src and videoconvert0:sink
0:00:00.251113576  1204   0x463800 INFO                GST_PADS gstpad.c:4232:gst_pad_peer_query:<videoconvert0:src> pad has no peer
0:00:00.263144744  1204   0x463800 INFO                GST_PADS gstpad.c:2586:gst_pad_link_full: linked _:src and videoconvert0:sink, successful
0:00:00.263499013  1204   0x463800 INFO               GST_EVENT gstevent.c:1517:gst_event_new_reconfigure: creating reconfigure event
0:00:00.263678023  1204   0x463800 INFO               GST_EVENT gstpad.c:5808:gst_pad_send_event_unchecked:<_:src> Received event on flushing pad. Discarding
0:00:00.264445209  1204   0x463800 INFO            GST_PIPELINE grammar.y:652:gst_parse_perform_link: linking some pad of GstVideoConvert named videoconvert0 to some pad of GstXImageSink named ximagesink0 (0/0) with caps "(NULL)"
0:00:00.264752500  1204   0x463800 INFO        GST_ELEMENT_PADS gstutils.c:1774:gst_element_link_pads_full: trying to link element videoconvert0:(any) to element ximagesink0:(any)
0:00:00.264982447  1204   0x463800 INFO                GST_PADS gstutils.c:1035:gst_pad_check_link: trying to link videoconvert0:src and ximagesink0:sink
mmal: Buffer 0x460bf0 returned, filled 460800, timestamp 811583741, flags 0004
0:00:00.275582212  1204   0x463800 INFO                GST_PADS gstutils.c:1588:prepare_link_maybe_ghosting: videoconvert0 and ximagesink0 in same bin, no need for ghost pads
0:00:00.276004502  1204   0x463800 INFO                GST_PADS gstpad.c:2378:gst_pad_link_prepare: trying to link videoconvert0:src and ximagesink0:sink
0:00:00.286391038  1204   0x463800 INFO                GST_PADS gstpad.c:2586:gst_pad_link_full: linked videoconvert0:src and ximagesink0:sink, successful
0:00:00.286739735  1204   0x463800 INFO               GST_EVENT gstevent.c:1517:gst_event_new_reconfigure: creating reconfigure event
0:00:00.286920881  1204   0x463800 INFO               GST_EVENT gstpad.c:5808:gst_pad_send_event_unchecked:<videoconvert0:src> Received event on flushing pad. Discarding
0:00:00.287247755  1204   0x463800 INFO           GST_PARENTAGE gstbin.c:4468:gst_bin_get_by_name: [pipeline0]: looking up child element _
0:00:00.287884837  1204   0x463800 INFO              GST_STATES gstbin.c:2506:gst_bin_element_set_state:<ximagesink0> current NULL pending VOID_PENDING, desired next READY
mmal: Buffer 0x460dc8 returned, filled 460800, timestamp 811616984, flags 0004
Segmentation fault
I have tried to figure this issue out myself, but I have run out of ideas.
