I have set up a Raspberry Pi to stream an H.264 video feed from a camera to a remote host running Red5 Pro. The program is written in Python and uses ffmpeg to transmit the camera feed. On the backend, I use OpenCV's cv::VideoCapture to read the stream and process it. I'm currently using RTMP and seeing roughly a one-second delay. Along with RTMP, Red5 Pro also supports WebRTC, which has lower latency.
Can anyone explain how to send video from a camera to a remote host using WebRTC? All the information I have found is about streaming video to a local browser.
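For reference, the RTMP side of a setup like this can be sketched by spawning ffmpeg from Python. The device path, encoder flags, and server URL below are illustrative assumptions, not details taken from the question:

```python
import subprocess

def build_ffmpeg_cmd(device, rtmp_url):
    """Build an ffmpeg command that pushes a camera feed to an RTMP endpoint.

    The flags below are one reasonable low-latency choice, not the only one.
    """
    return [
        "ffmpeg",
        "-f", "v4l2",            # capture from a Video4Linux2 device
        "-i", device,
        "-c:v", "libx264",       # encode to H.264
        "-preset", "ultrafast",  # trade compression efficiency for latency
        "-tune", "zerolatency",
        "-f", "flv",             # RTMP carries an FLV container
        rtmp_url,
    ]

if __name__ == "__main__":
    # Hypothetical device and server address.
    cmd = build_ffmpeg_cmd("/dev/video0", "rtmp://example.com/live/stream")
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment on the Pi to start streaming
```

The one-second delay mentioned above typically comes from encoder buffering plus the RTMP/FLV muxing path, which is why WebRTC (designed around sub-second latency) is attractive here.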
I would like to send sensor data (a string) and live video from the Raspberry Pi camera over Bluetooth, using Python. I would like to display the video on the PC in a Tkinter window. Any ideas?
Raspberry Pi: Pi 3
PC: Windows 10 with Python 3.8.3 64 bit
Video might be a bit tricky. More details at: https://stackoverflow.com/a/64062680/7721752
Bluetooth on Windows with Python is not that well supported. However, with Python 3.9 there is support for the Serial Port Profile (SPP). There are details of how to create the client at: https://stackoverflow.com/a/62815818/7721752
To create the Server on the Raspberry Pi (RPi) I would use the Bluedot library:
https://bluedot.readthedocs.io/en/latest/btcommapi.html#bluetoothserver
(Bluedot only works on RPi)
Developing both ends of a Bluetooth link at the same time is tricky if you are new to Bluetooth development. My suggestion would be to create the server on the RPi first and use an app like Serial Bluetooth Terminal on your phone to test it. Once that is working, start developing the client.
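A minimal sketch of the RPi server side with Bluedot might look like the following. Bluedot only runs on the RPi, so the import is guarded here; the newline-framing helper is my own assumption for separating sensor strings on the client, not part of the Bluedot API:

```python
try:
    from bluedot.btcomm import BluetoothServer  # available on the RPi only
except ImportError:
    BluetoothServer = None

def frame_sensor_reading(reading):
    """Append a newline so the PC client can split concatenated readings."""
    return reading.strip() + "\n"

def data_received(data):
    # Called by Bluedot whenever the client sends data;
    # echo a framed sensor reading back as a placeholder.
    server.send(frame_sensor_reading(data))

if __name__ == "__main__" and BluetoothServer is not None:
    from signal import pause
    server = BluetoothServer(data_received)
    pause()  # keep the server alive until interrupted
```

On the Windows side, the Python 3.9 client from the linked answer would connect over RFCOMM and split the received bytes on newlines to recover individual readings.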
I have a face recognition application that I want to run on a client PC, but it shows an error like this:
I think the program cannot connect to any camera device on my client. In my Python code I use this:
cap = cv2.VideoCapture(0)
I have installed the Logitech camera driver on my client PC, but it still cannot detect any camera.
How can I solve this?
I've had this issue before, and what solved it for me was using a different USB port on the PC for the webcam. It was due to the driver being bound to certain ports that OpenCV read from.
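If switching ports does not help, it can be worth probing several device indices to see which ones OpenCV can actually open. This is a generic sketch; the opener is passed in as a parameter (normally `cv2.VideoCapture`) so the scanning logic itself is plain Python:

```python
def find_camera_indices(open_capture, max_index=5):
    """Return the device indices that open successfully.

    open_capture should behave like cv2.VideoCapture: it takes an index
    and returns an object with isOpened() and release().
    """
    found = []
    for index in range(max_index):
        cap = open_capture(index)
        if cap.isOpened():
            found.append(index)
        cap.release()
    return found

if __name__ == "__main__":
    try:
        import cv2  # requires opencv-python
    except ImportError:
        cv2 = None
    if cv2:
        print(find_camera_indices(cv2.VideoCapture))
```

If the list comes back empty, the problem is below OpenCV (driver or OS level); if a non-zero index appears, just pass that index to `VideoCapture` instead of 0.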
I am working on a small project on Ubuntu 14.04 LTS. I want to capture a UDP multicast stream and grab a frame from it for image processing. I am using OpenCV 3.3.1 and Python 3.4. My code:
import numpy, cv2
cap = cv2.VideoCapture('udp://#239.0.0.1:1234')
print(cap.isOpened())
Right now I want to check whether the stream is captured at all. However, the program freezes at the VideoCapture line. My OpenCV build has both ffmpeg and GStreamer installed. Can anyone help?
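One common workaround when VideoCapture blocks on a bare URL is to hand OpenCV an explicit GStreamer pipeline instead of letting it guess a backend. The element names below are a sketch that assumes the multicast carries H.264 in an MPEG-TS container; a different payload would need different demux/decode elements:

```python
def build_gst_pipeline(group, port):
    """Build a GStreamer pipeline string for H.264-in-MPEG-TS over UDP multicast."""
    return (
        "udpsrc multicast-group={} port={} ! ".format(group, port)
        + "tsdemux ! h264parse ! avdec_h264 ! "
        + "videoconvert ! appsink"
    )

if __name__ == "__main__":
    try:
        import cv2
    except ImportError:
        cv2 = None
    if cv2:
        pipeline = build_gst_pipeline("239.0.0.1", 1234)
        # A GStreamer-enabled OpenCV build recognises pipeline strings;
        # newer builds also accept cv2.CAP_GSTREAMER as a second argument.
        cap = cv2.VideoCapture(pipeline)
        print(cap.isOpened())
```

Unlike the ffmpeg backend, a GStreamer pipeline that cannot negotiate the stream tends to fail and return an unopened capture rather than blocking indefinitely.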
I have made a P2P video streaming application using Node.js. Now I want to deploy that system on an Orange Pi, but there are no browsers on the Orange Pi that support live streaming. So I was thinking: instead of opening a browser, can I write a script in Node.js or Python to capture the stream directly on my Orange Pi and send it to the other peer? Is this possible?
Thanks
I've been trying to capture an H.264 video file and serve a live video stream accessible via HTTP or RTP, using Python and the picamera module.
Currently, the example I am following is http://blog.miguelgrinberg.com/post/video-streaming-with-flask, however it only works for JPEG frames.
Is there a similar approach for serving a live H.264 stream over an HTTP connection?
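picamera can write H.264 directly to any file-like object, so one sketch is to record into a network connection and relay the raw bytes in chunks. The chunking helper below is plain Python; the picamera wiring is guarded and untested here, and note that browsers will generally not play raw H.264 without a container, so the consumer would be something like ffplay or VLC:

```python
import io

def iter_chunks(stream, chunk_size=4096):
    """Yield fixed-size chunks from a file-like object until it is exhausted."""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        yield chunk

if __name__ == "__main__":
    try:
        import picamera  # only available on the Raspberry Pi
        import socket
    except ImportError:
        picamera = None
    if picamera:
        # Hypothetical wiring: record H.264 straight into a TCP connection.
        camera = picamera.PiCamera(resolution=(640, 480), framerate=24)
        server = socket.socket()
        server.bind(("0.0.0.0", 8000))
        server.listen(0)
        conn = server.accept()[0].makefile("wb")
        camera.start_recording(conn, format="h264")
        camera.wait_recording(60)   # stream for 60 seconds
        camera.stop_recording()
```

The same `start_recording(file_like, format="h264")` call works with any writable object, so the target could equally be the response body of an HTTP handler that streams chunks to the client.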