HermannSW
Posts: 281
Joined: Fri Jul 22, 2016 9:09 pm

[Solved] How? PS3eye from 100fps to 187fps with Pi ZeroW?

Mon Sep 11, 2017 9:42 pm

I was able to get the PS3 Eye camera to record 100fps 320x240 video with a Raspberry Pi ZeroW, and in this posting I will describe what I found out and did. In the next posting I will list the questions that remain.
(animated .gif with ugly colors; it plays slowed down by a factor of 16, which lets you see the 0.01s steps of the timeoverlay)
[animated .gif]

Why PS3 Eye as webcam?
I really like taking 90fps 640x480 videos with the Pi camera v1. I also want to go to higher fps with that camera, and 6by9's cool "raspiraw" might even make >300fps possible.

Last week I googled for alternative high-framerate cameras and stumbled upon this posting:
https://www.zonetrigger.com/articles/hi ... e-cameras/

It mentioned the Sony PS3 Eye camera as being able to do 150fps@320x240 as a USB webcam:
https://en.wikipedia.org/wiki/PlayStation_Eye

Then I found this 2014 article
http://www.phoronix.com/scan.php?page=n ... px=MTg3NTM

that speaks of 187fps support being added to the Linux kernel for the PS3 Eye:
http://lkml.iu.edu/hypermail/linux/kern ... 01236.html

At that point I searched for where to get one and at what price, and ordered one on Amazon for $10.

PS3 Eye modes available under Linux/Raspbian
After I received the PS3 Eye, I attached it to the Pi ZeroW, and some googling showed how to determine the available modes. I was not impressed by the 640x480 modes, but I definitely was by the many >100fps 320x240 modes:
pi@raspberrypi03:~ $ v4l2-ctl --list-devices
USB Camera-B4.09.24.1 (usb-20980000.usb-1):
/dev/video0

pi@raspberrypi03:~ $ v4l2-ctl --list-formats-ext -d /dev/video0
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'YUYV'
Name : YUYV 4:2:2
Size: Discrete 320x240
Interval: Discrete 0.005s (187.000 fps)
Interval: Discrete 0.007s (150.000 fps)
Interval: Discrete 0.007s (137.000 fps)
Interval: Discrete 0.008s (125.000 fps)
Interval: Discrete 0.010s (100.000 fps)
Interval: Discrete 0.013s (75.000 fps)
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.020s (50.000 fps)
Interval: Discrete 0.027s (37.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 640x480
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.020s (50.000 fps)
Interval: Discrete 0.025s (40.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)

pi@raspberrypi03:~ $
Capturing and storing 100fps PS3 Eye 320x240 video with Pi ZeroW
I tried ffmpeg to capture a normal low-fps video from the PS3 Eye, but only ran into problems: too many lost frames, unreliable.

Then I remembered the gstreamer library (I showed how to stream with it on the Pi here). After some playing I was able to set up a 100fps pipeline that stored the video as a file. Unfortunately a 3s video took more than 5s to record. It turned out that storing on the SD card was the bottleneck. I resolved that by writing the file to /dev/shm (memory filesystem). Now only 0.1s of overhead is added to the recording time (Execution ended after 3116775270 ns):

Code:

pi@raspberrypi03:~ $ gst-launch-0.10 -v v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw-yuv,width=320,height=240, framerate=100/1 ! videoscale ! timeoverlay ! avimux ! filesink location=/dev/shm/x.avi
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)100/1
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)100/1
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)100/1
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)100/1
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)100/1
/GstPipeline:pipeline0/GstTimeOverlay:timeoverlay0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)100/1
/GstPipeline:pipeline0/GstTimeOverlay:timeoverlay0.GstPad:video_sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)100/1
/GstPipeline:pipeline0/GstAviMux:avimux0.GstPad:video_00: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)100/1
/GstPipeline:pipeline0/GstAviMux:avimux0.GstPad:src: caps = video/x-msvideo
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-msvideo
Got EOS from element "pipeline0".
Execution ended after 3116775270 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstAviMux:avimux0.GstPad:video_00: caps = NULL
/GstPipeline:pipeline0/GstAviMux:avimux0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstTimeOverlay:timeoverlay0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstTimeOverlay:timeoverlay0.GstPad:video_sink: caps = NULL
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = NULL
Setting pipeline to NULL ...
Freeing pipeline ...
pi@raspberrypi03:~ $ 
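A note for anyone reproducing this: /dev/shm on the ZeroW is not large, so it is worth checking the free space before recording and moving the file off to the SD card (or another machine) afterwards. A minimal sketch with standard tools (the file name is just the one used above):

Code:

# how much RAM-backed space is left for the recording?
df -h /dev/shm

# after recording, move the file out of RAM so the next run has room
mv /dev/shm/x.avi ~/x.avi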


I uploaded the video to YouTube (the original x.avi is available here as well, 44MB):
https://www.youtube.com/watch?v=WwBwgtl ... e=youtu.be

In order to keep the YouTube upload from dropping a lot of frames from the .avi, I had to do this strange conversion to .mp4 with a fake 25fps framerate:

Code:

$ ffmpeg -i x.avi frame.%03d.png; ffmpeg -r 25 -f image2 -i frame.%03d.png -vcodec libx264 -crf 25 -pix_fmt yuv420p frame.mp4
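A possible one-step alternative (just a sketch, not tested on this particular file) is to slow the stream down with ffmpeg's setpts filter instead of going through PNG files; multiplying the timestamps by 4 turns the 100fps material into 25fps playback:

Code:

ffmpeg -i x.avi -filter:v "setpts=4.0*PTS" -r 25 -vcodec libx264 -crf 25 -pix_fmt yuv420p frame.mp4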
You have to pause the YouTube video and then single-frame step forward/backward with the '.'/',' keys to see the details. The timeoverlay I added to the pipeline shows that there really are 0.01s between successive frames.

Or you can choose 0.25 as the speed in the YouTube settings; then the video plays slowed down by a factor of 16 and you can see the 0.01s steps of the timeoverlay as well.

Here you can see that from the first frame where the match glow is visible (0.866s) it takes until 1.026s before the flame gets bigger:
[image]

The framerate is shaky until 0.2s, and a bit shaky until 0.7s. It turned out that the camera framerate really is a steady 100fps, and the shaky timestamps are a result of using the gstreamer timeoverlay at all ...:

Code:

pi@raspberrypi03:~ $ gst-launch-0.10 -v v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw-yuv,width=320,height=240, framerate=100/1 ! videoscale ! fakesink | grep duration | cut -f 5-8 -d\: | cut -f2 -d, | tail -3
 timestamp: 0:00:02.970570154
 timestamp: 0:00:02.980650070
 timestamp: 0:00:02.990441988
pi@raspberrypi03:~ $ gst-launch-0.10 -v v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw-yuv,width=320,height=240, framerate=100/1 ! videoscale ! fakesink | grep duration | cut -f 5-8 -d\: | cut -f2 -d, | tail -3
 timestamp: 0:00:02.970263156
 timestamp: 0:00:02.980386072
 timestamp: 0:00:02.990262989
pi@raspberrypi03:~ $ gst-launch-0.10 -v v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw-yuv,width=320,height=240, framerate=100/1 ! videoscale ! timeoverlay ! fakesink | grep duration | cut -f 5-8 -d\: | cut -f2 -d, | tail -3
 timestamp: 0:00:03.073793293
 timestamp: 0:00:03.083821210
 timestamp: 0:00:03.093823126
pi@raspberrypi03:~ $ gst-launch-0.10 -v v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw-yuv,width=320,height=240, framerate=100/1 ! videoscale ! timeoverlay ! fakesink | grep duration | cut -f 5-8 -d\: | cut -f2 -d, | tail -3
 timestamp: 0:00:03.044038542
 timestamp: 0:00:03.054030458
 timestamp: 0:00:03.064040375
pi@raspberrypi03:~ $ 
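To look at the jitter more systematically than with the last three timestamps, the frame-to-frame intervals can be computed from the same verbose fakesink output. A sketch, assuming the "timestamp: H:MM:SS.NNNNNNNNN" messages shown above:

Code:

# print the delta between successive frame timestamps, in seconds;
# drop "timeoverlay !" to compare the jitter with and without it
gst-launch-0.10 -v v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw-yuv,width=320,height=240,framerate=100/1 ! videoscale ! timeoverlay ! fakesink \
| grep -o 'timestamp: [0-9:.]*' \
| sed 's/timestamp: //' \
| awk -F: '{ t=$1*3600+$2*60+$3; if (NR>1) printf "%.6f\n", t-prev; prev=t }'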
Hermann.
Last edited by HermannSW on Wed Sep 13, 2017 4:22 pm, edited 7 times in total.

HermannSW
Posts: 281
Joined: Fri Jul 22, 2016 9:09 pm

Re: How? PS3eye from 100fps to 187fps with Pi ZeroW?

Mon Sep 11, 2017 10:00 pm

You may ask why I used gst-launch-0.10 instead of gst-launch-1.0 (I had installed the 0.10 version in the past for the streaming posting I linked to). I tried both, but the 1.0 version takes >5s to store a 3s video even when storing into /dev/shm.
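For anyone who wants to reproduce the comparison: the caps syntax changed between 0.10 and 1.0 (video/x-raw-yuv with a fourcc became video/x-raw with a format field), so a rough 1.0 equivalent of the pipeline above would look like this sketch (performance not verified here):

Code:

gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw,format=YUY2,width=320,height=240,framerate=100/1 ! videoconvert ! timeoverlay ! avimux ! filesink location=/dev/shm/x.avi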

Why is the 1.0 version so much slower than the 0.10 version?
Can a 1.0 gstreamer pipeline be constructed that has performance similar to 0.10?

More important question:
How can gstreamer make use of >100fps modes?

The v4l2 driver only supports modes up to 100fps (for 0.10 as well as for 1.0):

Code:

pi@raspberrypi03:~ $ gst-inspect-0.10 v4l2src | grep framerate | wc --lines
30
pi@raspberrypi03:~ $ gst-inspect-0.10 v4l2src | grep framerate | sort -u
              framerate: [ 0/1, 100/1 ]
pi@raspberrypi03:~ $ gst-inspect-1.0 v4l2src | grep framerate | sort -u
              framerate: [ 0/1, 100/1 ]
pi@raspberrypi03:~ $ 
Is there a better v4l2 driver available somewhere?
Can a driver other than v4l2 be used in a PS3 Eye gstreamer pipeline in order to support >100fps?

HermannSW
Posts: 281
Joined: Fri Jul 22, 2016 9:09 pm

Re: How? PS3eye from 100fps to 187fps with Pi ZeroW?

Mon Sep 11, 2017 10:07 pm

I installed the qv4l2 test program on Raspbian. While it did show all modes up to 187fps, this program cannot really be used for anything, at least on the Pi ZeroW. Even at low framerates the preview window does not really work on X Windows (HDMI display).

I installed qv4l2 on RHEL Linux (qv4l2-1.6.2-3.el6.x86_64.rpm), and on the 2.8GHz laptop the program works well. Unfortunately the maximal framerate in that program is 125fps. As you can see in the bottom line of qv4l2, the program really does 125fps:
[qv4l2 screenshot]

How can qv4l2 be made to work on the Pi ZeroW?

What other video capture software is able to capture at 187fps on the Pi ZeroW?

HermannSW
Posts: 281
Joined: Fri Jul 22, 2016 9:09 pm

Re: How? PS3eye from 100fps to 187fps with Pi ZeroW?

Tue Sep 12, 2017 4:45 pm

Just did a https://en.wikipedia.org/wiki/Video_feedback recording with gstreamer into fpsdisplaysink:

Code:

$ gst-launch-0.10 -v v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240, framerate=100/1 ! videoscale ! timeoverlay ! fpsdisplaysink
It shows a 100fps measurement for the pipeline:
[fpsdisplaysink screenshot]

I will run qv4l2 (with the 187fps mode) on a colleague's Raspberry Pi 3 in the next few days, hopefully achieving a 187fps capture rate.

In addition to the questions already stated above:
In case qv4l2 is able to capture at 187fps, can it be used as a command-line tool for capturing raw frames, or is there another command-line tool that can do the same?

HermannSW
Posts: 281
Joined: Fri Jul 22, 2016 9:09 pm

Re: How? PS3eye from 100fps to 187fps with Pi ZeroW?

Wed Sep 13, 2017 4:20 pm

I was able to play with a Raspberry Pi 3 and the PS3 Eye camera in the office today. I installed and started "qv4l2", and the result was really ugly. The video frame was strange (the PS3 Eye looked directly at the Pi3 with cables connected), with a totally reduced set of colors. And as you can see in the bottom line, the real framerate was only 2fps despite the selected 187fps.

But now the good news: I got 187fps to work with a gstreamer pipeline. Unfortunately there was a problem with the display, so I was not able to use fpsdisplaysink as yesterday. Instead I used fakesink for a poor man's fps determination:

Code:

pi@Pi3 $ gst-launch-0.10 -v v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240 ! fakesink > out
^C
(gst-launch-0.10:10377): GLib-CRITICAL **: Source ID 22 was not found when attempting to remove it
pi@Pi3 $ grep timestamp out | head -1
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "chain   ******* (fakesink0:sink) (153600 bytes, timestamp: 0:00:00.000199831, duration: 0:00:00.005347593, offset: 0, offset_end: 1, flags: 32 discont ) 0x75701c48"
pi@Pi3 $ grep timestamp out | tail -1
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "chain   ******* (fakesink0:sink) (153600 bytes, timestamp: 0:00:32.548699584, duration: 0:00:00.005347593, offset: 6064, offset_end: 6065, flags: 0 ) 0x75702148"
pi@Pi3 $ grep timestamp out | wc --lines
6065
pi@Pi3 $
And the measurements showed 186.3fps, matching the duration of 0.005347593s shown above:

Code:

pi@Pi3 $ bc -ql
1/((32.548699584-0.000199831)/6064)
186.30659004309654045391
1/0.005347593
187.00002038300222174724
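The same computation can be scripted; here is a sketch that derives the average fps directly from the fakesink log "out" captured above, assuming the "timestamp: H:MM:SS.NNNNNNNNN" message format shown:

Code:

grep -o 'timestamp: [0-9:.]*' out \
| sed 's/timestamp: //' \
| awk -F: '{ t=$1*3600+$2*60+$3; if (NR==1) first=t; last=t; n++ }
           END { printf "frames=%d  avg fps=%.2f\n", n, (n-1)/(last-first) }'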
Now I wanted to see whether qv4l2 was needed at all. I shut down the Pi3, disconnected the PS3 Eye, started the Pi3 again, reconnected the camera, and then immediately executed the above gstreamer pipeline after login. The result was the same: 187fps! What I learned from this is that gstreamer and the camera seem to negotiate a framerate, and as long as I don't specify a framerate in the gstreamer pipeline, that is the maximal camera framerate.

I will look into getting streaming with fpsdisplaysink working tomorrow, and am really looking forward to doing 187fps capturing with the Pi Zero when back at home on the weekend.

Hermann.
[image]

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 4542
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: [Solved] How? PS3eye from 100fps to 187fps with Pi ZeroW?

Wed Sep 13, 2017 6:11 pm

In the Capture menu, disable OpenGL.
Don't expect massive framerates, as it will be using the CPU to convert YUYV to RGB for the frame buffer.

The command line version is v4l2-ctl.
--help-all will give you the full help text, but you'll be wanting -v to set the resolution and pixel format, and -p to set the frame rate.
To capture: --stream-mmap=3 --stream-count=<number of frames> --stream-to <file>
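Putting those options together, a concrete command might look like the sketch below; the pixelformat name, frame count and output path are example values, so check the v4l2-ctl --list-formats-ext output and adapt:

Code:

# assumed example: 320x240 YUYV at 187fps, 1122 frames (about 6s), raw output kept in RAM
v4l2-ctl -d /dev/video0 -v width=320,height=240,pixelformat=YUYV -p 187 \
         --stream-mmap=3 --stream-count=1122 --stream-to /dev/shm/ps3eye.raw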
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

HermannSW
Posts: 281
Joined: Fri Jul 22, 2016 9:09 pm

Re: [Solved] How? PS3eye from 100fps to 187fps with Pi ZeroW?

Thu Sep 14, 2017 12:50 pm

Thanks for that 6by9, I did not find those options because I ran "v4l2-ctl --help" and expected to see all of the help. I have never seen a program before where "--help-all" is needed.

So you can capture a video with just v4l2-ctl and specify resolution, fps, filename, ... and in the end the file contains the raw YUYV frame data. I did that last night, and the size of the generated file was #frames * 153600 bytes for 320x240 resolution, 4 bytes per 2 pixels:
https://www.linuxtv.org/downloads/v4l-d ... -YUYV.html
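For reference, 320x240 YUYV is 320*240*2 = 153600 bytes per frame, so a single frame can be cut out of the raw capture with dd; a sketch (file names are just examples):

Code:

# extract the first 153600-byte YUYV frame from the raw capture
dd if=file.raw of=frame0.raw bs=153600 count=1

# or extract e.g. frame number 100 (skip counts in blocks of bs)
dd if=file.raw of=frame100.raw bs=153600 count=1 skip=100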

I googled a lot on how to convert either this raw video or only a single 153600-byte frame from YUYV to e.g. .avi or .png, tried many commands, and nothing worked (I could see videos or images, but with completely wrong coloring, shifted, ...).

What is the correct ffmpeg command to convert raw YUYV
  • (153600 byte) frame to .png?
  • video to .avi?
Hermann.

P.S:
As stated yesterday, I successfully recorded a 187fps video (5s, 137MB) with a gstreamer pipeline on the Pi3:
https://stamm-wilbrandt.de/en/forum/pi3.187fps.avi

P.P.S:
I created a falling-water-drop animated .gif from a PS3 Eye 320x240@125fps gstreamer video with timeoverlay, intentionally slowed down by a factor of 125:
[animated .gif]

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 4542
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: [Solved] How? PS3eye from 100fps to 187fps with Pi ZeroW?

Thu Sep 14, 2017 2:42 pm

HermannSW wrote:
Thu Sep 14, 2017 12:50 pm
Thanks for that 6by9, I did not find those options because I ran "v4l2-ctl --help" and expected to see all of the help. I have never seen a program before where "--help-all" is needed.
You didn't read the help text very well then, seeing as it includes:
-h, --help display this help message
--help-all all options
--help-io input/output options
--help-misc miscellaneous options
--help-overlay overlay format options
--help-sdr SDR format options
--help-selection crop/selection options
--help-stds standards and other video timings options
--help-streaming streaming options
--help-tuner tuner/modulator options
--help-vbi VBI format options
--help-vidcap video capture format options
--help-vidout vidout output format options
--help-edid edid handling options
Seeing as there are over 500 lines of help, breaking it down into sections is very sensible.
It's not that uncommon, seeing as "git --help" also only gives you an overview.
HermannSW wrote:So you can capture a video with just v4l2-ctl and specify resolution, fps, filename, ... and in the end the file contains the raw YUYV frame data. I did that last night, and the size of the generated file was #frames * 153600 bytes for 320x240 resolution, 4 bytes per 2 pixels:
https://www.linuxtv.org/downloads/v4l-d ... -YUYV.html

I googled a lot on how to convert either this raw video or only a single 153600-byte frame from YUYV to e.g. .avi or .png, tried many commands, and nothing worked (I could see videos or images, but with completely wrong coloring, shifted, ...).

What is the correct ffmpeg command to convert raw YUYV
  • (153600 byte) frame to .png?
  • video to .avi?
Pass on ffmpeg, although you can try following https://askubuntu.com/questions/258744/ ... ing-ffmpeg. I'm surprised that that thread doesn't include reference to "ffmpeg -pixel_format yuv420p -video_size 720x576 -framerate 25 -i …" or similar, as the YUYV stream has no framing information (this thread may also be useful).
Be aware that some containers can't handle raw video frames.
PNG is a bad choice as it is RGB rather than YUV.
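That said, for the single-frame case something along these lines should work, since ffmpeg does the YUV to RGB conversion itself when writing a PNG (a sketch; frame0.raw is assumed to contain exactly one 153600-byte YUYV frame):

Code:

ffmpeg -f rawvideo -pixel_format yuyv422 -video_size 320x240 -i frame0.raw frame0.png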

I always use Vooya for viewing raw images, although sadly there isn't an ARM version.

HermannSW
Posts: 281
Joined: Fri Jul 22, 2016 9:09 pm

Re: [Solved] How? PS3eye from 100fps to 187fps with Pi ZeroW?

Thu Sep 14, 2017 5:53 pm

Thanks for the comments, I found that my installed ffmpeg had a problem.
I downloaded version 3.3.4 and built it from scratch.

Thanks to your help, this is how I recorded a 3-second 75fps raw video with just v4l2-ctl:

Code:

$ v4l2-ctl -d /dev/video1 -p 75 -v width=320,height=240,pixelformat=,field=none,bytesperline=640 --stream-mmap=3 --stream-count=225 --stream-to file.raw
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 75 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 75 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Frame rate set to 75.000 fps
$ 
For the ffmpeg conversion of the video I seem to be missing something though; any help appreciated:

Code:

$ ffmpeg -pixel_format yuv420p -video_size 320x240 -framerate 75 -i file.raw file.avi
ffmpeg version 3.3.4 Copyright (c) 2000-2017 the FFmpeg developers
  built with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-18)
  configuration: --disable-yasm
  libavutil      55. 58.100 / 55. 58.100
  libavcodec     57. 89.100 / 57. 89.100
  libavformat    57. 71.100 / 57. 71.100
  libavdevice    57.  6.100 / 57.  6.100
  libavfilter     6. 82.100 /  6. 82.100
  libswscale      4.  6.100 /  4.  6.100
  libswresample   2.  7.100 /  2.  7.100
[image2 @ 0x2243480] Format image2 detected only with low score of 5, misdetection possible!
Input #0, image2, from 'file.raw':
  Duration: 00:00:00.01, start: 0.000000, bitrate: 20736518 kb/s
    Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 320x240, 75 tbr, 75 tbn, 75 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (rawvideo (native) -> mpeg4 (native))
Press [q] to stop, [?] for help
Output #0, avi, to 'file.avi':
  Metadata:
    ISFT            : Lavf57.71.100
    Stream #0:0: Video: mpeg4 (FMP4 / 0x34504D46), yuv420p, 320x240, q=2-31, 200 kb/s, 75 fps, 75 tbn, 75 tbc
    Metadata:
      encoder         : Lavc57.89.100 mpeg4
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame=    1 fps=0.0 q=5.8 Lsize=      38kB time=00:00:00.01 bitrate=23489.4kbits/s speed= 2.8x    
video:33kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 17.079882%
$ 
Here is the file.raw generated by the first command, for trying it out (33MB):
https://stamm-wilbrandt.de/en/forum/file.raw

Frames should look similar to this frame taken from a gstreamer video:
[image]

Hermann.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 4542
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: [Solved] How? PS3eye from 100fps to 187fps with Pi ZeroW?

Thu Sep 14, 2017 7:58 pm

I always have to Google for options around ffmpeg.

The pixel format is yuyv422.

Code:

ffmpeg -pixel_format yuyv422 -video_size 320x240 -framerate 75 -i file.raw file.avi
translates one frame of the file, but MPEG4-encodes it. End the command with "-i file.raw -vcodec copy file.avi" and it doesn't encode.
To get it to accept multiple frames from the same file, you need to specify the rawvideo format at the input.

Code:

ffmpeg -f rawvideo -pixel_format yuyv422 -video_size 320x240 -framerate 75 -i file.raw -vcodec copy file.avi
VLC appears to be happy to play the result.

Actually ffplay can play the raw stream:

Code:

ffplay -f rawvideo -pix_fmt yuyv422 -s 320x240 -i file.raw
I'll leave it as a task for you to work out how to set the frame rate.

HermannSW
Posts: 281
Joined: Fri Jul 22, 2016 9:09 pm

Re: [Solved] How? PS3eye from 100fps to 187fps with Pi ZeroW?

Sat Sep 16, 2017 9:40 pm

Thanks for your commands, I immediately used them in the "Light bulb power line flicker observed with 125fps video" posting, with a laptop and the PS3 Eye.

At home I tested the PS3 Eye with the Pi Zero W, but the gstreamer pipeline did not achieve 187fps at first, although the target file was placed under /dev/shm. The reason was the use of timeoverlay, which was already described above as adding overhead. After leaving out timeoverlay, the gstreamer pipeline did record at 187fps. But without timeoverlay I see little point in a gstreamer pipeline, at least in this "just save video at 187fps" case. So from now on I will use your v4l2-ctl command lines with -p 187, as done in the other thread.
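Adapting the earlier 75fps command to 187fps looks roughly like this (a sketch based on the command above; 1122 frames is 6s at 187fps, and the output goes to /dev/shm as before, file name just an example):

Code:

v4l2-ctl -d /dev/video0 -p 187 -v width=320,height=240,field=none,bytesperline=640 \
         --stream-mmap=3 --stream-count=1122 --stream-to /dev/shm/file.raw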

I thought of a simple-to-set-up scenario for recording high speed (>40km/h) at 187fps, and chose free fall. The only problem I had was that /dev/shm only allows storing 6 seconds of raw 187fps 320x240 video. It was not easy to trigger a delayed recording, then quickly climb a high ladder on the 1st floor and drop a marble so that it reaches the basement floor more than 6m below, inside the 6s recording time.
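For reference, the raw data rate explains that limit; a quick bc check (same style as above) gives roughly 27 MiB/s at 187fps, i.e. about 164 MiB for 6 seconds, which is presumably close to the space available in /dev/shm on the Zero W:

Code:

$ bc -ql
187*153600/1024/1024
27.39257812500000000000
6*187*153600/1024/1024
164.35546875000000000000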

I had to tidy up after that try because my family came home. For some reason there is an offset/displacement in the video; the right part belongs on the left side, but you can see it anyway. The plan was for the marble to land in the open box on the ground, but by accident the marble hit one of the closing flaps, which catapulted the marble away. This animated .gif is created from the video frames with activity, with a slowdown factor of 187:
[animated .gif]

So 187fps recording works on the Pi Zero W, but adding timeoverlay at that framerate is not possible with the Pi Zero W.

Hermann.

P.S:
I have to correct myself: there is a reason to prefer a gstreamer pipeline even without timeoverlay. It is the only way to verify the recording execution time, and if that is only a few milliseconds above #frames/187 seconds, then the recording was really done at 187fps (6s = 1122 frames):

Code:

pi@raspberrypi03:~ $ gst-launch-0.10 -v v4l2src device=/dev/video0 num-buffers=1122 ! video/x-raw-yuv,width=320,height=240 ! videoscale ! avimux ! filesink location=/dev/shm/file.187.avi | egrep \(Execution\|framerate\) | tail -2
/GstPipeline:pipeline0/GstAviMux:avimux0.GstPad:video_00: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)320, height=(int)240, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)187/1
Execution ended after 6001407037 ns.
pi@raspberrypi03:~ $ 
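The quick check for the run above, in the same bc style as before (1122 frames at 187fps is exactly 6s, so the measured 6001407037 ns means only about 1.4ms of overhead):

Code:

$ bc -ql
1122/187
6.00000000000000000000
6001407037/10^9 - 1122/187
.00140703700000000000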
