This is another Raspberry Pi motion tracker based on Python, picamera and opencv.
It makes heavy use of the motion blocks generated by the GPU of the Raspberry Pi.
A target crossing event is signalled on a GPIO port, and a picture of the event is transmitted via nginx to a smartphone, for example.
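For readers who want to see what "using the GPU motion blocks" means in code, here is a minimal, illustrative sketch (not the actual piCAMTracker code; the threshold and block count are arbitrary example values) of reading the per-frame motion vectors with picamera:

```python
import numpy as np
import picamera
import picamera.array

class MotionDetector(picamera.array.PiMotionAnalysis):
    def analyse(self, a):
        # a['x'] and a['y'] are the per-macroblock motion vectors,
        # a['sad'] is the sum of absolute differences of each block.
        magnitude = np.sqrt(a['x'].astype(np.float64) ** 2 +
                            a['y'].astype(np.float64) ** 2)
        # Threshold of 10 and count of 4 are arbitrary example values.
        moving = (magnitude > 10).sum()
        if moving > 4:
            print("movement detected in %d blocks" % moving)

with picamera.PiCamera(resolution=(640, 480), framerate=90) as camera:
    # Video goes to /dev/null; only the motion vector side channel is analysed.
    camera.start_recording('/dev/null', format='h264',
                           motion_output=MotionDetector(camera))
    camera.wait_recording(30)
    camera.stop_recording()
```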
2021-04-13: V0.7.1 image released today. (pi password is still Olav01)
- capture sensitivity generally improved by using additional static data from the camera and an improved collection algorithm.
- to reduce noise, the sensitivity may be lowered via the configuration file and/or the web interface.
- improved nearby crossing detection with a new fast mode, which can be switched via the config file and/or the web interface (V1: 640x480 @ 90 fps; V2: 928x704 @ 90 fps)
- web interface completely rewritten.
- snapshots are transferred on change only
- you can scroll through the last 50 snapshots by swiping from left to right
- The Raspberry Pi 4 Model B is the new standard model. (This image still runs on the Raspberry Pi 3, but I will stop testing on that platform in the future.)
- The Raspberry Pi can provide a WiFi hotspot (PICAM/Olav0101) in parallel to the standard WiFi client.
- If enabled, you can connect to the piCAMTracker web interface via http://192.168.16.1.
- The object detection works quite well under different light conditions.
- Low-light performance is much improved by variable framerate adaptation.
- Dark backgrounds may be lightened by manually configurable exposure compensation. (positive values)
- Backlit shot accuracy may be enhanced by manually configurable exposure compensation. (negative values; see the picamera sketch below)
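Purely as an illustration of the exposure compensation knob mentioned above (in piCAMTracker it is set via the config file or web interface, not by hand like this), picamera exposes it as follows:

```python
import picamera

with picamera.PiCamera() as camera:
    camera.exposure_compensation = 6    # positive: lighten dark backgrounds
    # camera.exposure_compensation = -6 # negative: protect backlit shots
    # valid range is -25..25, each step is 1/6 of a stop
```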
- The object detection has limits:
- The camera cannot distinguish between birds/bees and planes. All moving things are evaluated.
- moving objects in the turn area can lead to false positives. (grass, bushes, flags, etc)
- The minimum distance for fast-moving objects should be 7 meters. Objects moving too fast in front of the camera cannot be detected by the internal algorithm. (Rule of thumb: trackMaturity x 2 in meters for a plane flying at 40 m/s; see the sketch below.)
- to make faster movements possible I implemented a bypass of the full tracking.
- in fast mode nearby crossings should be detected with more accuracy.
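To make the rule of thumb above concrete, here is a tiny back-of-the-envelope calculation; the trackMaturity value of 4 is only an assumed example, not a recommended setting:

```python
# Back-of-the-envelope check of the rule of thumb above.
track_maturity = 4                 # frames a track must live (assumed example value)
plane_speed = 40.0                 # m/s, the speed quoted in the rule of thumb
min_distance = 2 * track_maturity  # metres, per "trackMaturity x 2 in meters"

print("a %.0f m/s plane should stay at least %d m away" % (plane_speed, min_distance))
```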
- The fisheye setup with the V1 camera seems to be the best setup for our purposes. (F3F model air racing)
- I am currently using a V2 camera from Innomaker with a 144° lens in permanently enabled fast mode. (I will report once I have more experience.)
- If you want to track objects further away (F3B speed, for example), the newer V2 camera is the better choice. There are lenses with a longer focal length which enabled my testers to run speed mode quite well.
- To improve nearby crossing detection, the camera can be mounted straight. (Not rotated by 90 degrees as used before.)
- Crossing is now in the X direction (viewAngle: 0, xCross: 40, yCross: -1; see the sketch below)
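As an illustrative sketch of the straight-mount setup, the parameters named above can be written down like this (key names taken from the line above; the comments are my reading, and the actual configuration file format may differ):

```python
# Illustrative sketch only: the crossing parameters named above as a dict.
crossing_setup = {
    "viewAngle": 0,   # camera mounted straight, no 90 degree rotation
    "xCross": 40,     # crossing line position in X
    "yCross": -1,     # presumably disables the Y crossing line (assumption)
}
```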
- V1 camera from Waveshare with fisheye lens (module G);
- "normal" mode: 1280x960 pixels, 42 f/s; full FOV; 192MB GPU memory
- fast mode: 640x480 @ 90 fps; full FOV
- Mode 5 (1280x720p @ 49 fps) does not center the frames horizontally
- V2 camera with standard lens;
- "normal" mode: 1632x896, 40 f/s; full FOV; 192MB GPU memory.
- fast mode: 928x704 @ 90 fps; reduced (but acceptable) FOV
- 1280x720p @ 62 f/s (FOV is very small)
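Purely for illustration (piCAMTracker selects the mode itself based on its configuration), the numbers above correspond to the following picamera settings:

```python
import picamera

# The 192 MB GPU memory mentioned above is set separately via
# gpu_mem=192 in /boot/config.txt.

# V1 camera, "normal" mode: full FOV, 1280x960 at 42 fps
with picamera.PiCamera(resolution=(1280, 960), framerate=42) as camera:
    pass

# V1 camera, fast mode: full FOV, 640x480 at 90 fps
with picamera.PiCamera(resolution=(640, 480), framerate=90) as camera:
    pass

# V2 camera, fast mode: reduced FOV, 928x704 at 90 fps
with picamera.PiCamera(resolution=(928, 704), framerate=90) as camera:
    pass
```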
- In general the V2 camera needs more light than the V1 with the fisheye
- In stormy conditions you need to mount the camera very rigidly, otherwise a lot of false positives are generated.
- The web interface supports only the most rudimentary controls for the camera.
- For each debug signal, a motion data file and a video file are saved to disk (to a USB stick if connected; see the sketch further below).
- we have a printed cover available. (see wiki section)
- you can download the old V0.6.4 image here (pi password: Olav01)
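piCAMTracker writes the debug files mentioned above itself; purely as an illustration of the mechanism, picamera can record a video file and the raw motion vector data in one call (the paths here are made up for the example):

```python
import picamera

# Illustration only: record a video file together with the raw motion vector
# data, similar in spirit to what a debug signal triggers.
with picamera.PiCamera(resolution=(640, 480), framerate=90) as camera:
    camera.start_recording('/media/usb/debug.h264',
                           motion_output='/media/usb/debug.data')
    camera.wait_recording(5)   # keep recording for 5 seconds
    camera.stop_recording()
```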
see FAQ section
see wiki section
- The next version will support tracking with a Kalman filter.