I had some problems myself interpreting data from motion vectors according to the guide http://www.raspberrypi.org/vectors-from ... estimation
(it was a great help, but it didn't accurately describe how the vectors can be read from the stream).
So I wrote a program which reads motion vector data from raspivid
and streams it over a websocket to be interpreted (https://github.com/elhigu/wsstreamprocessor).
The browser reads the vector data from the websocket and plots the motion vector information to the screen, with color selected by vector angle and intensity by vector length.
Main findings about the raspivid -x
format are described in README.md:
Motion vectors come from raspicam in the following format:
signed char dx; // values seem to be around +-80
signed char dy; // values seem to be around +-80
signed short sad; // values seem to be around 0..512, little endian
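The three fields above pack into 4 bytes per vector. A minimal sketch of decoding one such record in the browser, assuming the websocket message has already been wrapped in a DataView (the function name `readVector` is just for illustration):

```javascript
// Decode one 4-byte motion-vector record at the given byte offset.
// Layout: signed char dx, signed char dy, signed short sad (little endian).
function readVector(dataView, offset) {
  return {
    dx: dataView.getInt8(offset),               // signed char
    dy: dataView.getInt8(offset + 1),           // signed char
    sad: dataView.getInt16(offset + 2, true)    // little-endian signed short
  };
}
```

For example, a record with bytes set via `setInt8(0, -5)`, `setInt8(1, 12)`, `setInt16(2, 300, true)` decodes to `{ dx: -5, dy: 12, sad: 300 }`.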
The number of motion vectors can be calculated from your video resolution:
var vectorsPerLine = Math.floor(imageWidth/16)+1
var vectorLines = Math.floor(imageHeight/16)+1
So for 1920x1080 FullHD video there are 121x68 vectors. The first vector in a frame is from the bottom-left corner.
e.g. the vectors for a 65x31 video stream (5x2 vectors) would be:
v6 v7 v8 v9 v10
v1 v2 v3 v4 v5
And the frame size would be 40 bytes per frame.
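Putting the formulas together, a small sketch that computes the vector grid and per-frame byte count for any resolution (the function name `vectorFrameInfo` is my own, not from the repo):

```javascript
// Compute motion-vector grid dimensions and bytes per frame
// from the video resolution, using the formulas above.
function vectorFrameInfo(imageWidth, imageHeight) {
  var vectorsPerLine = Math.floor(imageWidth / 16) + 1;
  var vectorLines = Math.floor(imageHeight / 16) + 1;
  return {
    vectorsPerLine: vectorsPerLine,
    vectorLines: vectorLines,
    frameBytes: vectorsPerLine * vectorLines * 4  // 4 bytes per vector
  };
}
```

This gives 121x68 vectors (32912 bytes per frame) for 1920x1080, and 5x2 vectors (40 bytes per frame) for the 65x31 example.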