User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Camera controlled automatic PT approach for a landing capturing

Thu Sep 19, 2019 7:47 pm

Two years ago I played with a PT (Pan Tilt) bracket for the Raspberry camera with my caterpillar robot (only the tilt part):
https://www.raspberrypi.org/forums/view ... 1#p1231151

The raspiraw-based automatic tilt procedure brought the camera into a defined position after starting.
The camera went up, then vertically down, and then stepwise up until the black robot front became the bottom of the FoV:
Image


My secondary living place in Holzgerlingen/Germany (near the IBM Böblingen lab) has a nice view out of a slanted roof window onto the approach lane of Stuttgart airport. As a new project I wanted (as the title says) to do automatic PT control for (centered) capturing of an approach for landing. Recently I received some new $1 PT brackets from aliexpress for that (I have some SG90 servos at home already):
Image

As a pilot study I took a short video with the Raspberry camera of an approach for landing at dusk, with this command:

Code: Select all

$ raspivid -t 0 -md 1 -w 1920 -h 1080 -o tst.h264 -fps 30

Then I converted it to .mp4:

Code: Select all

$ ffmpeg -framerate 30 -i tst.h264 -c copy out.mp4

Finally I cut out the part from second 16 to second 44:

Code: Select all

$ ffmpeg -ss 16 -i out.mp4 -t 28 -c copy out.1.mp4

And converted that to a nice animated .gif (scaled down to 640x360) using this technique:
http://blog.pkh.me/p/21-high-quality-gi ... fmpeg.html
Image


I uploaded the complete 2MP@30fps .mp4 to youtube as well.
In one of the first frames you can see the rooftop of the next house and the airplane at top left:
https://www.youtube.com/watch?v=sW4VpSM ... e=youtu.be
Image


I am really happy that capturing airplane video at dusk works, and that I captured the airplane quite well given that I did it blindly, with no video control during capturing.

Yesterday I took some screenshots from https://www.flightradar24.com. In this screenshot I marked the airplanes on approach for landing at Stuttgart airport (white circle) with blue circles (you can click on an airplane and get departure and destination airport information). Holzgerlingen is marked with a filled red circle:
Image


As you can see there is a steady stream of airplanes landing, so automatic PT capturing might move the camera to the left, wait for an airplane to come in sight, follow it centered until it is out of sight, and then repeat -- roughly the loop sketched below.
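
That plan is essentially a small patrol/track loop; a minimal sketch, where all three helpers are hypothetical placeholders for code that does not exist yet:

Code: Select all

/* hypothetical helpers -- placeholders for code still to be written */
int  airplane_in_sight(void);   /* heuristic on the current frame     */
void pan_far_left(void);        /* move PT bracket to start position  */
void track_centered(void);      /* follow airplane until out of sight */

void patrol(void)
{
    for (;;) {                  /* endless patrol                     */
        pan_far_left();
        while (!airplane_in_sight())
            ;                   /* wait for the next approach         */
        track_centered();       /* keep airplane centered in frame    */
    }
}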

Today was a good pilot study, more to come.

P.S:
This is a good "slow speed" (the servo motors, not the airplanes) live camera video processing excercise before doing live camera video processing to control high speed caterpillar robot.
https://stamm-wilbrandt.de/en/Raspberry_camera.html
https://stamm-wilbrandt.de/en#raspcatbot
https://github.com/Hermann-SW/raspiraw
https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://stamm-wilbrandt.de/github_repo_i420toh264

User avatar
DougieLawson
Posts: 39551
Joined: Sun Jun 16, 2013 11:19 pm
Location: A small cave in deepest darkest Basingstoke, UK
Contact: Website Twitter

Re: Camera controlled automatic PT approach for a landing capturing

Fri Sep 20, 2019 2:15 pm

Get yourself an RTL-SDR USB radio and a copy of PiAware. From that you can get JSON for every aircraft in your local area - with speed, heading, altitude and lat/lon.
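
For illustration, a minimal libcurl sketch for fetching that JSON; the URL is an assumption, since the port and path of the aircraft JSON differ between dump1090 versions:

Code: Select all

/* Sketch: fetch the PiAware/dump1090 aircraft JSON with libcurl.
   The URL is an assumption -- port and path differ between dump1090
   versions; adjust to your installation. */
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    curl_easy_setopt(curl, CURLOPT_URL,
                     "http://localhost:8080/data/aircraft.json");
    /* default write callback prints the response body to stdout */
    CURLcode res = curl_easy_perform(curl);
    if (res != CURLE_OK)
        fprintf(stderr, "fetch failed: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
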
Note: Any requirement to use a crystal ball or mind reading will result in me ignoring your question.

Criticising any questions is banned on this forum.

Any DMs sent on Twitter will be answered next month.
All fake doctors are on my foes list.

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Sat Sep 21, 2019 2:19 pm

While that sounds interesting, it does not help to get centered video of each approach for landing.
And this project will be an exercise in live video frame feature extraction and live control of the PT camera.
Once capturing the blinking plane at dusk works (the easier case), the next step could be capturing the airplane in daylight as well.

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Wed Oct 02, 2019 5:56 pm

Today I was at the secondary living place earlier, in daylight. And I took video of an airplane that took off in Stuttgart with destination Milan, as it passed Holzgerlingen. This time with the smartphone camera, so audio is available:
https://www.youtube.com/watch?v=Akf9BqgvXac
Image


It is better to capture the airplane in daylight (a dark dot before a bright background) than capturing only the airplane lights in darkness. I will try the 3.98° M12 mount lens to see more of the plane (if the PT platform can handle the 70mm lens weight):
"70mm F1.6 3.98° lens allows for clear 500m distant video"
https://www.raspberrypi.org/forums/view ... p?t=227166
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Fri Oct 11, 2019 4:58 pm

I just measured the weight of the v1 camera with the 70mm M12 lens mounted: 19.1g.
Not sure whether the PT bracket can carry that and still function:
Image


I don't know the shortest distance between the secondary living place's slanted roof window and a passing airplane.
But in the "WoodenBoardPi" thread I just took a photo with the 70mm lens showing the house of my friend, 1.73km distant!
This is promising for "automatic capturing of airplane" size on image and quality.
(passing airplanes seem to be closer than 1.7km at their shortest distance)
https://www.raspberrypi.org/forums/view ... 9#p1550469
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Sat Oct 12, 2019 2:18 pm

I am not in Holzgerlingen right now, but flightradar24 allowed me to watch some departures from Stuttgart airport.
Four airplanes heading to southeast Europe or Italy took the lower trajectory, but the Airbus A319 to Lisbon took the upper trajectory:
Image


I constructed the trajectory from two known points in gmaps pedometer in order to find the minimal distance point, marked in blue above. The distance to my slanted roof window is 1.46km. Assuming the airplane is still below 1km height, the total distance [sqrt(1.46^2+1^2)=1.77km] is slightly more than to the house of my friend in the previous posting. The Airbus A319 has a length of 34m, perhaps 3 times the width of my friend's house. The fuselage height is 4m, a little less than 2 floors of my friend's house.

The expected size of the airplane is roughly 300 pixels width and 60 pixels height -- we will see later how well this rule-of-thumb calculation matches real airplane captures:
Image


P.S:
Just building the PT camera system. Controlling two SG90 servos with the pigpio pigs command is so simple:
"flag_semaphore att rest R A S P B E R R Y ready"
https://www.raspberrypi.org/forums/view ... 7&t=254180
Image
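
For reference, the same servo control from C instead of pigs; a sketch assuming the pigpiod_if2 API (with pigpiod running), where the GPIO numbers and the 1500µs center position are assumptions:

Code: Select all

/* Sketch: C equivalent of "pigs s <gpio> <pulsewidth>" via pigpiod_if2.
   GPIO 18/17 and the 1500us center position are assumptions. */
#include <stdio.h>
#include <pigpiod_if2.h>

int main(void)
{
    int pi = pigpio_start(NULL, NULL);   /* connect to local pigpiod */
    if (pi < 0) { fprintf(stderr, "no pigpiod\n"); return 1; }

    set_servo_pulsewidth(pi, 18, 1500);  /* first servo to center    */
    set_servo_pulsewidth(pi, 17, 1500);  /* second servo (assumed)   */
    time_sleep(1.0);                     /* let the servos move      */

    pigpio_stop(pi);
    return 0;
}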

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Wed Oct 16, 2019 10:37 pm

Initial build of the PT camera system with 70mm lens done.

First, a matching gearwheel connector for the ground plate was missing.
As a workaround I simply superglued the existing connector to the desk.
Two drops of superglue on the Raspberry camera module connector,
and some superglue on the top plastic line of the bracket, connect the camera to the PT (Pan Tilt) camera platform easily:
Image


The next question, whether the PT system can deal with the long 70mm lens, got answered -- it can:
Image


I forgot to bring the 3 male/female jumper cables needed to connect at least one SG90 servo.
So this is a real hack, just using what I have here:
https://www.youtube.com/watch?v=Tt7uMuvfItk
Image


With that working, the next problem was that I cannot get any sharp image inside the room with that lens, and it is dark outside.
So I decided to take a video patrolling the underside of the window curtain ;-)
Not a sharp video inside the room, but it works.

First I started raspivid to capture 25s of 1920x1080 video at 30fps:

Code: Select all

pi@raspberrypi2B:~ $ raspivid -md 1 -w 1920 -h 1080 -fps 30 -t 25000 -o tst.h264
pi@raspberrypi2B:~ $ 

Then I started endless patrolling (after having started "sudo pigpiod"); the loop sweeps the servo pulse width on GPIO 18 between 1180µs and 1450µs:

Code: Select all

pi@raspberrypi2B:~ $ while true
> do
> for((i=1180; i<1450; ++i)); do pigs s 18 $i; done
> for((i=1450; i>1180; --i)); do pigs s 18 $i; done
> done

This is the resulting video, uploaded to youtube, played at 25fps, i.e. at 0.833 times real speed:
https://www.youtube.com/watch?v=bYRJ0Ie1alU
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Fri Oct 25, 2019 7:09 pm

I was concentrating on Stuttgart airport because of my secondary living place. I totally forgot about our primary living place in Eberbach, 50km east of Mannheim. Mannheim airport is only a small airport, but the airplanes landing in Mannheim typically use Eberbach as fly-over point, perhaps because it is located at the river Neckar. I just heard an airplane flying over Eberbach, opened flightradar24 and determined the trajectory of an airplane that started in Berlin and landed in Mannheim. It turned out that the shortest distance between our house and the airplane is less than 300m; I am not sure how high the airplane is flying here, only 50km away from landing. The Dornier 328-110 is shorter than an Airbus, but with its length of 23.21m it should be visible quite well on a photo taken with the 70mm lens:
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Sun Oct 27, 2019 9:40 pm

I do have the basis for YUV frame processing while recording (with raspividyuv), before storing as .h264 video (with the i420toh264 tool utilizing the VPU), in this posting:
https://www.raspberrypi.org/forums/view ... 4#p1556034

Unfortunately I am in hospital with a broken leg. While I have an old (Ubuntu) laptop here, I don't have a Pi with camera here.

It turned out that the pipeline framework developed in the thread pointed to above has a big advantage -- as long as I can get YUV I420 format video data, I can work on the plugin without a Pi(!), and that is what I did today.

I just took the video from an earlier posting in this thread:
https://www.raspberrypi.org/forums/view ... 6#p1545992

Not the video uploaded to youtube.com, but the original stored on the smartphone.
I cut out 574 frames from the smartphone .mp4 and converted them to I420 format; dividing the raw size by the I420 frame size of 640*480*3/2 bytes confirms the frame count:

Code: Select all

$ time ( echo "`bzcat day.yuv.bz2 | wc --bytes`/(640*480*3/2)" | bc -ql )
574.00000000000000000000

real	0m10.085s
user	0m9.509s
sys	0m1.444s
$ 

I did commit&push that (30MB compressed) sample input, as well as initial sample_yuv_airplane.c to github repo:
https://github.com/Hermann-SW2/userland ... airplane.c

Later the code will control the PT camera servos to get a recording that is always centered on the airplane.
For now the sample just determines a dark point in the airplane (see the code for the simple initial heuristic) and marks a 2x2 area there white in each frame.

I uploaded the 640x480 video, generated with these two commands (the 2nd command is the i420toh264 replacement without a Pi), ...

Code: Select all

$ ./sample_yuv_airplane 640 480 320 20 < <(bzip2 -dc day.yuv.bz2) >out.yuv 2>err
$ ffmpeg -video_size 640x480 -pixel_format yuv420p -i out.yuv out.h264 -y 2>err
$
... demonstrating that this works pretty well, to youtube:
https://www.youtube.com/watch?v=QdGaEZIitUY

I used makeagif.com to quickly get an animated .gif without running ffmpeg myself.
The generated animation is 320x240 -- good that I marked the 2x2 area on the airplane white in the 640x480 frames, now the white dot can be seen well (it is not present in the video the sample input was derived from):
Image


Of course the youtube 640x480 video looks better than the 4.6MB animation.

A real i420toh264 pipeline with 640x480 resolution (bigger resolutions will be used later) looks like this:

Code: Select all

$ raspividyuv -md 7 -w 640 -h 480 -o - -fps 90 |
> ./sample_yuv_airplane 640 480 320 20 |
> ./i420toh264 tst.h264 640 480 > /dev/null

This should work without issues because the heuristic does not even look at a full frame, so it is very time efficient.

The code in line 52 and lines 64-84 demonstrates how to address a pixel (row=r, column=c) in the Y, U and V planes:
https://github.com/Hermann-SW2/userland ... lane.c#L64
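
For readers without the repo at hand, this is the I420 plane layout that the addressing relies on; a minimal sketch (not the repo code) that sets one pixel to white:

Code: Select all

/* I420 layout: Y plane (w*h bytes), then U (w/2 x h/2), then V.
   Sketch, not the repo code: set pixel (r,c) to white. */
#include <stdint.h>

static void set_white(uint8_t *frame, int w, int h, int r, int c)
{
    uint8_t *Y = frame;                  /* full resolution          */
    uint8_t *U = frame + w * h;          /* quarter resolution       */
    uint8_t *V = U + (w / 2) * (h / 2);

    Y[r * w + c] = 255;                  /* maximal luma             */
    U[(r / 2) * (w / 2) + c / 2] = 128;  /* neutral chroma           */
    V[(r / 2) * (w / 2) + c / 2] = 128;
}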


P.S:
The heuristic tries to avoid getting stuck in local brightness minima not belonging to the airplane, similar to threshold accepting:
https://github.com/Hermann-SW2/userland ... lane.c#L42
http://comisef.wikidot.com/concept:thresholdaccepting
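
For reference, the threshold accepting idea as a minimal sketch (my illustration, not the committed heuristic): a random walk toward darkness that also accepts steps up to T brighter, so it can escape small local brightness minima:

Code: Select all

#include <stdint.h>
#include <stdlib.h>

/* Sketch, not the committed code: threshold-accepting style walk on
   the Y plane; a step is accepted even if it is up to T brighter. */
static int ta_dark_walk(const uint8_t *Y, int w, int h, int T, int iters)
{
    int r = h / 2, c = w / 2;                /* start in frame center */
    for (int i = 0; i < iters; ++i) {
        int nr = r + rand() % 9 - 4;         /* random step, +-4 px   */
        int nc = c + rand() % 9 - 4;
        if (nr < 0 || nr >= h || nc < 0 || nc >= w) continue;
        if (Y[nr * w + nc] < Y[r * w + c] + T) {
            r = nr;                          /* accept: darker, or at */
            c = nc;                          /* most T brighter       */
        }
    }
    return r * w + c;                        /* index of end point    */
}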

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Thu Nov 07, 2019 11:42 am

I added section "Locating airplane" to github repo:
https://github.com/Hermann-SW2/userland ... g-airplane

The 640x480 animation for part of the video is overlaid with the flow of operation (normal raspividyuv input, as well as other i420 frame input):
(GraphvizFiddle share link)
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Wed Nov 13, 2019 12:51 am

I will switch the 70mm lens onto a newly built PT camera system:
servos have a limited range of 0..180° and 1 step/°;
the 28BYJ-48 stepper motor has 4096(!) half steps per revolution, unlimited range and 11.4 steps/°:
"4 drops of superglue result in high precision PT camera system"
https://forum.arduino.cc/index.php?topic=647703.0
(the demo youtube video shows independent as well as collective stepper motor movement)
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Wed Nov 13, 2019 10:29 am

70mm lens moved to stepper PT camera system already:
https://forum.arduino.cc/index.php?topi ... msg4369656
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Fri Nov 15, 2019 4:20 pm

I switched the lens back (temporarily, for in-house algorithm development) to the normal v1 camera.
You have seen in the previous posting how smoothly the stepper motors move.

In this posting I described how the first 1920x1080@30fps stepper PT camera system video was taken:
https://www.raspberrypi.org/forums/view ... 8#p1566138
1920x1080 got scaled down to 320x180 in order to reduce the animated .gif size from >100MB to 3.5MB.
I reduced the animation speed to 3fps for easy inspection.
The pendulum is used as airplane replacement for the dev work:
Image


The simple heuristic in sample_yuv_airplane.c for locating a point inside the "airplane" had problems with the ingrain wallpaper. Since those problems do not exist with blue sky, I used a new white-only background. The next problem was that the heuristic detected different balls of the pendulum in different frames, so I reduced it to only one ball. Finally the ball reflected the bright light. I painted most of the ball with Black 3.0 (absorbs 98% of visible light), which resolved the issue:
Image


The point determined by the simple heuristic has two properties. The first, good one: it lies inside the "airplane". The negative one: it jumps heavily inside the ball between frames, not a good basis for moving the camera onto it. So I added a 2nd heuristic: a configurable-size square is centered at the determined point, and inside that square a configurable threshold is applied to get a b&w bitmap. All black point coordinates are summed and divided by their count to get the mean coordinate. The animation below was generated from a 640x480@90fps video (slowed down to 3fps for easy inspection). I went down to 640x480 because the 30fps limit at 1920x1080 poses a stepper move problem: a stepper motor step needs a pause of at least 0.8ms to complete, so only (1000/30)/0.8 < 42 steps could be done between frames. In the end no modification of the frame will be done; what you see is for visual debugging only. The determined mean point is marked with a 2x2 blue area. The center of the hopping squares is the point determined by the 1st heuristic (in reality the pendulum is 30 times faster than this):
Image
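
The 2nd heuristic as a minimal sketch (my condensed reading of the description above, not the committed code):

Code: Select all

#include <stdint.h>

typedef struct { int r, c; } point;

/* Sketch of the 2nd heuristic: threshold a side x side square around
   the seed point on the Y plane and return the mean coordinate of all
   dark ("black") pixels. Not the committed code. */
static point dark_mean(const uint8_t *Y, int w, int h,
                       point seed, int side, int thresh)
{
    long sr = 0, sc = 0, cnt = 0;
    for (int r = seed.r - side / 2; r < seed.r + side / 2; ++r)
        for (int c = seed.c - side / 2; c < seed.c + side / 2; ++c)
            if (r >= 0 && r < h && c >= 0 && c < w &&
                Y[r * w + c] < thresh) {
                sr += r; sc += c; ++cnt;
            }
    if (cnt == 0) return seed;         /* nothing dark: keep the seed */
    return (point){ (int)(sr / cnt), (int)(sc / cnt) };
}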


I committed an intermediate version; it is nice how simple the non-trivial 2nd heuristic looks:
https://github.com/Hermann-SW2/userland ... rplane.2.c

These are the commands I used -- later sample_yuv_airplane.2 will be part of the i420toh264 bash pipeline, without ffmpeg involved:

Code: Select all

$ time ./sample_yuv_airplane.2 640 480 380 100 30 60 50 < tst.yuv >out.yuv 
$ time ffmpeg -video_size 640x480 -pixel_format yuv420p -i out.yuv out.h264 -y 2>err
$ ffmpeg -i out.h264 ana/frame.%04d.png
$ ffmpeg -r 3 -i out.h264 -c copy out.mp4

Work in progress -- next, the stepper motors need to move depending on the mean point coordinate relative to the frame center, for each frame ...

P.S:
I determined the blue point x-coordinates for frames around the pendulum passing its lowest point (maximal speed).
Most of the time the difference is 8 pixels, two times it is 10 pixels.

P.P.S:
I took a 640x480 photo with raspistill.
Moved the motor 10 steps left, took a 2nd photo.
Moved the motor 10 steps right, took a 3rd photo.

The x-coordinate of the cord at the ball was identical in the 1st and 3rd photo.
The difference between the 1st and 2nd photo x-coordinates is 10.
So at least around the pendulum equilibrium position there is a 1:1 relation (at the given distance between pendulum and camera) between horizontal stepper motor steps and pixels in the frame ...

And 10 steps with 0.8ms delay each would fit into the 11.1ms available between frames captured at 90fps.

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Sun Nov 17, 2019 7:12 pm

I made progress, although it turned out that processing 90fps video was too aggressive.
Now the pipeline processes 640x480@60fps, runs both heuristics for each frame and adds (only) the blue marker.
In addition it does a usleep(10000), which waits 1/100 second before processing the next frame.
This is the time reserved for the stepper motor move per frame.
All stepper motor code is integrated already, but not yet used.
https://github.com/Hermann-SW2/userland ... 58d67c4792
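
The per-frame structure is roughly this; a sketch with hypothetical helper names (the committed code differs in detail):

Code: Select all

#include <stdint.h>
#include <unistd.h>

typedef struct { int r, c; } point;

/* hypothetical helpers, standing in for the committed code: */
int   read_frame(uint8_t *buf, int n);          /* full read of stdin  */
void  write_frame(const uint8_t *buf, int n);   /* write to stdout     */
point heuristic1(const uint8_t *f);             /* dark seed point     */
point heuristic2(const uint8_t *f, point seed); /* mean of dark square */
void  mark_blue(uint8_t *f, point m);           /* 2x2 debug marker    */

enum { W = 640, H = 480, FRAME = W * H * 3 / 2 };  /* I420 frame size  */

void process_frames(void)
{
    static uint8_t frame[FRAME];
    while (read_frame(frame, FRAME)) {  /* raspividyuv on stdin        */
        point m = heuristic2(frame, heuristic1(frame));
        mark_blue(frame, m);            /* visual debugging only       */
        usleep(10000);                  /* 10ms reserved for steppers  */
        write_frame(frame, FRAME);      /* on to i420toh264            */
    }
}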

This is the pipeline I used:

Code: Select all

$ raspividyuv -md 7 -w 640 -h 480 -p 22,50,640,480 -fps 60 -o - | 
> sudo ./sample_yuv_airplane.2 640 480 380 100 10 150 50 2>err2 |
> ./i420toh264 tst.h264 640 480 2>err
The animated .gif was created from part of the recorded video, and is played at 15fps, 4 times slower than real.
I had used a too small square side length for the 2nd heuristic; after correcting it there is no hopping:
Image


P.S:
I made the P[an] stepper motor move -- it follows the moving black ball automatically, a bit shaky, but it follows!
https://www.youtube.com/watch?v=dL2TQGBm6Fg

P.P.S:
The top line is the x-coordinate of the determined blue dot in the black ball for a similar recording.
It follows the movement, but oscillates (320=width/2 is the target):
Image

P.P.P.S:
The same diagram for a non-moving black ball:
https://github.com/Hermann-SW2/userland ... bdd93cb6c0
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Mon Nov 18, 2019 7:25 am

The explanation for the oscillation is simple.
I had implemented a P-only PID controller with a too high K_p constant.
It seems that I will have to implement my first PID controller for this application.
As a 1st step I separated the camera and stepper coordinate systems and set K_p=0.5:
https://github.com/Hermann-SW2/userland ... dca55db85a
This gives much less oscillation.
In order to reduce the size of the animated .gif I scaled down to 320x240, played 2 times slower than real:
Image

Next step: implement the complete PID controller, for the P and T stepper motors, and do manual tuning:
https://en.wikipedia.org/wiki/PID_contr ... ual_tuning
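
For reference, the discrete PID step to start from; a sketch along the lines of the Wikipedia article, with K_p/K_i/K_d as placeholders for manual tuning (not values from my code):

Code: Select all

/* Discrete PID update, called once per frame (dt = frame period).
   Kp/Ki/Kd are placeholders to be tuned manually. */
typedef struct { double Kp, Ki, Kd, integral, prev_err; } pidctl;

static double pid_update(pidctl *s, double err, double dt)
{
    s->integral += err * dt;                  /* I: accumulated error */
    double deriv = (err - s->prev_err) / dt;  /* D: error change rate */
    s->prev_err = err;
    return s->Kp * err + s->Ki * s->integral + s->Kd * deriv;
}

/* e.g. pan stepper at 60fps, P-only so far:
   pidctl pan = { 0.5, 0.0, 0.0, 0, 0 };
   int steps = (int)pid_update(&pan, 320 - blue_x, 1.0 / 60); */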

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Mon Nov 18, 2019 4:25 pm

As a first step I enabled P-only PID control for the T[ilt] stepper motor of the stepper PT camera system, in addition to the P[an] stepper (it was surprisingly simple):
https://github.com/Hermann-SW2/userland ... cb54e14215

I was able to overlay a fine crosshair reticle in the i420toh264 bash pipeline with sample_yuv_alpha.c, but that negatively affected the stepper motor control. Since the reticle was needed only for inspecting how the control algorithm did, it was no problem to add it in post processing. I modified gordon77's Python script, which overlays a reticle on the live camera stream, to work file2file:
https://www.raspberrypi.org/forums/view ... 0#p1568030

That way I added the reticle to the recorded .h264 video (and slowed it down by factor 4 to 15fps). I took the first 7 seconds and converted them with gifenc.sh to an animated .gif, scaling down to 320x240 to keep the size at 4MB; full video here:
https://www.youtube.com/watch?v=Yj9Dlw7YB8E
Image


The missing I and D parts of the PID controller result in some oscillation, but maybe the current state would already be good enough to automatically center on an airplane on approach for landing (the white dot painted on the airplane is determined by the same 1st heuristic in use right now):
Image


Btw, this is the whole setup; on the right is the stepper PT camera system ("4 drops of superglue result in high precision PT camera system"), with the steppers controlled from the Pi2B, which also does the video processing. On the left you see the white area playing the role of the bright sky, and the black ball playing the role of the airplane. The crutches in the background are the reason why I currently have much more time than normally to play with Pis, steppers and cameras:
Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Fri Nov 22, 2019 5:28 pm

Based on the image of the 1.7km distant house of my friend (see below), further above in this thread I estimated the horizontal length of an Airbus A319 at the same distance as 300 pixels in a 2MP mode1 1920x1080 frame, when captured through the 70mm 3.98° lens.

First I computed roughly how many pixels of frame movement a single horizontal step of the stepper PT camera system produces.
For a v2 camera with 62.2° horizontal FoV it would be slightly less than 3 pixels, for a v1 camera with 53.5° slightly more than 3:

Code: Select all

1920/(53.5/360*4096)
3.15420560747663551404
1920/(62.2/360*4096)
2.71302250803858520912
Next I did the same calculation for the 3.98° lens, and here a single horizontal stepper motor step already moves 42 pixels. So 8 stepper steps would allow to go from just behind the A319 to just before it:

Code: Select all

1920/(3.98/360*4096)
42.39949748743718595095
The maximal speed of an A319 is 871km/h or 242m/s. In a single second the A319 will move 2145 pixels horizontally. With 42 pixels per stepper motor step, 51 stepper motor steps per second suffice to keep track of the airplane. The stepper motor can easily do 1000 steps per second, so even for an A319 viewed through the 3.98° lens it is totally sufficient:

Code: Select all

871/3.6
241.94444444444444444444
871/3.6/33.84*300
2144.89755713159968479000
871/3.6/33.84*300/(1920/(3.98/360*4096))
50.58780608820009920696
Image

Image

User avatar
HermannSW
Posts: 2767
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: Camera controlled automatic PT approach for a landing capturing

Sat Nov 23, 2019 1:58 pm

I did some more calculations. While capturing an airplane is fine (with and without the 70mm lens) with the stepper PT camera system, it turned out that a simple gravity pendulum is not a good choice for developing the PID controller for the airplane use case -- it is just too fast for the stepper to keep the black ball centered!

Image
This is the pendulum shown further above in the thread, this time at original speed.

https://en.wikipedia.org/wiki/Pendulum
The period of 0.78s does not depend on the pendulum start angle (for small angles), only on the pendulum length of 0.15m.

Code: Select all

$ bc -ql 
pi=4*a(1)
g=9.80665
l=0.15
2*pi*sqrt(l/g)
.77707897758731580516

https://physics.bu.edu/~redner/211-sp06 ... dulum.html
The maximal speed of 0.463m/s depends on length and start angle.
I used gimp to determine the start angle as 22°:

Code: Select all

deg=pi/180
define cos(x) { return c(x) }
a=22*deg
sqrt(2*g*l*(1-cos(a)))
.46284418089005321374

I did not want to lose the black ball in front of the white background, and remembered that people have used two motors for creating a table drawing robot. I just want to move the black ball around, not draw.

Two drops of superglue fixated the two stepper motors, and I used a Dremel to drill two holes of 2mm diameter into a Lego brick. Then I screwed the Arduino Uno with two M3x16 screws onto the Lego brick, and fixated the Uno at the top of the white background. The Uno has two (different) ULN2003 driver boards that drive the two steppers:
Image


I used this small Arduino sketch to enter stepper motor position pairs and let the steppers move the black ball to the desired position:

Code: Select all

#include <AccelStepper.h>

// two 28BYJ-48 steppers on ULN2003 driver boards
AccelStepper L(AccelStepper::FULL4WIRE, 2, 4, 3, 5);
AccelStepper R(AccelStepper::FULL4WIRE, 9, 11, 10, 12);

void setup()
{
  L.setMaxSpeed(600);        // steps/s
  L.setAcceleration(300);    // steps/s^2
  R.setMaxSpeed(600);
  R.setAcceleration(300);
  Serial.begin(9600);
}

void loop()
{
  // accept a new target pair only when both moves have completed
  if (L.distanceToGo() == 0 &&
      R.distanceToGo() == 0)
  {
    Serial.print("pos? ");
    while (!Serial.available()) { delay(50); }

    int Lpos = Serial.parseInt();
    Serial.read();                  // skip the ',' separator
    int Rpos = Serial.parseInt();
    if (Serial.read() == '\n') {
      Serial.print(Lpos);
      Serial.print(",");
      Serial.println(Rpos);
      L.moveTo(Lpos);
      R.moveTo(Rpos);
    }
  }

  L.run();                          // at most one step per call
  R.run();
}


I took a v2-camera mode4 1640x1232 video with the v1 camera (by upscaling its 1296x972 mode), at 5fps framerate:

Code: Select all

$ raspivid -md 4 -w 1640 -h 1232 -p 22,50,820,616 -t 0 -fps 5 -o tst.h264
Motion is slow, so I sped up the video by factor 5 for the animated .gif, and scaled the frame size down by a factor of 10:
Image


At original speed the motion is even slower, but it should be good enough for airplane tracking PID development.
I entered 4 pairs of coordinates into the Arduino serial monitor to produce this animation:

Code: Select all

0,0
12000,-4000
-4000,12000
6500,6500

P.S:
The not always perfect moves of the black ball will be perfect for verifying in the end that the PID controller is well tuned.
