Replies: 4 comments 4 replies
-
Honestly, you would be better off starting from scratch. The picamera2 module is completely different, and I'm not sure it can be made to work in the same fashion.
-
Here is a picamera2 version:
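The code block for this comment did not survive extraction. A minimal sketch in the spirit of the picamera2 `mjpeg_server_2.py` example might look like the following; the `StreamingOutput` buffer is plain Python, while the commented-out wiring at the bottom assumes a Raspberry Pi with picamera2 installed (untested here):

```python
import io
from threading import Condition


class StreamingOutput(io.BufferedIOBase):
    """File-like sink: each write() replaces the latest JPEG frame
    and wakes any clients waiting on it."""

    def __init__(self):
        self.frame = None
        self.condition = Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()
        return len(buf)


def wait_for_frame(output, timeout=None):
    """Block until a new frame arrives, then return it."""
    with output.condition:
        output.condition.wait(timeout)
        return output.frame


# On a Pi (assumes picamera2 is installed; hardware-dependent sketch):
# from picamera2 import Picamera2
# from picamera2.encoders import MJPEGEncoder
# from picamera2.outputs import FileOutput
#
# picam2 = Picamera2()
# picam2.configure(picam2.create_video_configuration(main={"size": (640, 480)}))
# output = StreamingOutput()
# picam2.start_recording(MJPEGEncoder(), FileOutput(output))
```

Each HTTP handler would then call `wait_for_frame(output)` in a loop to serve the multipart MJPEG stream.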
-
I'll check out your fork @chradev. Thanks for the example @nikola-j. I'm also working on a picamera2 derivative, but am using FastAPI rather than Tornado. I haven't used jmuxer; instead I'm closely following the picamera2 example mjpeg_server_2.py. This essentially takes video off picamera2, encodes it as MJPEG, captures it in a custom buffer, and then uses a frame generator to render the MJPEG video directly in the browser.
My thinking, since I'm trying to make all this work on a Pi Zero 2, is to use hardware encoding and hardware buffering as much as I can, leaving CPU headroom for other work.
I've got the stream working in FastAPI, but I'm having some instability issues with thread locks, particularly when two browser sessions hit it or when a user accesses the endpoint/route for the stream directly (vs. through the HTML index page). I think that's probably because the picamera2 example was not designed for async, so I'm down a rabbit hole now on making the buffer/frame generator asynchronous.
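One way to attack the thread-lock instability described above is to replace the threading-based buffer with an `asyncio.Condition`, so each browser session gets its own reader coroutine instead of contending for one lock. This is a hedged sketch; the name `AsyncFrameBuffer` and the FastAPI wiring in the comments are my own, not from the picamera2 examples:

```python
import asyncio


class AsyncFrameBuffer:
    """Hypothetical async-safe latest-frame buffer: one writer
    (the camera callback), any number of reader coroutines."""

    def __init__(self):
        self._frame = None
        self._cond = asyncio.Condition()

    async def write(self, frame: bytes):
        # Replace the latest frame and wake every waiting client.
        async with self._cond:
            self._frame = frame
            self._cond.notify_all()

    async def frames(self):
        # Each client awaits the next frame independently; the frame is
        # copied out before yielding so the lock is never held while a
        # slow client is being written to.
        while True:
            async with self._cond:
                await self._cond.wait()
                frame = self._frame
            yield frame


# In a FastAPI route this generator could feed a multipart stream
# (sketch only, assuming `app` and `buffer` exist):
#
# @app.get("/stream")
# async def stream():
#     async def mjpeg():
#         async for frame in buffer.frames():
#             yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"
#                    + frame + b"\r\n")
#     return StreamingResponse(
#         mjpeg(), media_type="multipart/x-mixed-replace; boundary=frame")
```

Because the camera side calls `await buffer.write(...)` from the event loop (or via `loop.call_soon_threadsafe` from a camera thread), two simultaneous browser sessions simply wait on the same condition rather than deadlocking on a shared `threading.Lock`.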
-
Hi all, just food for thought:
I like the Amcrest $99 turret cameras; they have an on-board WiFi config interface, and no stupid download app to configure. A better option than the Pi would be the NVIDIA Jetson SBC if you want to do your own motion detection/object detection image processing on the stream. Power consumption is a beast, but so is the processing. If you just process the motion clips stored on the camera, you might be able to take that $250 board and be running AI on multiple cameras on the cheap.
And I had a lot of issues with WiFi H264 feeds on the Pi. It would work for a bit and then get into some 'resend the packet' feedback loop that would take down the Pi. Did much better with a WavLink POE HAT and a hardwired ethernet connection. You can get away with a wireless bridge like a TP-Link 800 Megabit, but there is still video ghosting at only 300 feet. Be careful of powering any larger 12V fans off that HAT; the EMF will 'warp' the video coming across those ribbons. Those ribbons were also another headache. I don't know why they didn't just use USB3!
-
I attempted to run this on my Raspberry Pi 4 with Picamera2 and couldn't get it to work.
The picamera2 docs only have an MJPEG server example and a separate H264 quantiser example.
Is there another branch/version of this repo where this works with picamera2?
I suspect the changes are something like the following; however, I don't speak Python fluently.
To be added:
and then the camera references should be removed or replaced by picam2,
where picam2 is initialised and starts streaming.
Also in the try:
If anyone has any hints on getting this to work, I'm willing to give it some attempts, but it's not familiar to me; I would need some guidance.
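To make the suspected changes concrete, here is a hypothetical sketch of the initialisation described above (the function name and parameters are mine, not from this repo, and the picamera2 calls are untested without camera hardware); the import guard lets it degrade gracefully off-Pi:

```python
def start_picamera2(output, size=(1280, 720), bitrate=2_000_000):
    """Hypothetical helper: initialise picamera2 and start streaming
    H.264 into `output` (a file-like object). Returns the Picamera2
    instance, or None when picamera2 is not available."""
    try:
        from picamera2 import Picamera2
        from picamera2.encoders import H264Encoder
        from picamera2.outputs import FileOutput
    except ImportError:
        return None  # not running on a Raspberry Pi

    picam2 = Picamera2()
    picam2.configure(picam2.create_video_configuration(main={"size": size}))
    picam2.start_recording(H264Encoder(bitrate=bitrate), FileOutput(output))
    return picam2
```

Elsewhere in the script, references to the old `camera` object would then be replaced by the returned `picam2` instance, with `picam2.stop_recording()` called on shutdown.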