Pipe streaming video from FFMPEG into OpenCV - python

I am using a Logitech C920 webcam with a BeagleBone Black running Debian. I have successfully written an OpenCV Python script that lets me get video from the camera and track objects. Unfortunately, the resolution, FPS, and overall video quality are poor with this method. I have found code that lets me use the webcam's H.264 capability outside of OpenCV.
I've been trying to figure out how to pipe the output from ffmpeg INTO openCV so that I can utilize the camera's H264 capabilities as my video input for my OpenCV script. My commands look like this:
./capture -F -o -c0 | avconv -re -i - -vcodec copy -f rtp rtp://192.168.7.1:1234/ | python camera.py
The first part of the pipe is using the linux v4l2 program that allows me to utilize my camera's H264 capabilities. The output from that is then fed to ffmpeg and then streamed to RTP. I want to retrieve the RTP stream in my OpenCV script but I haven't had success in getting it to work. Would I initialize something like the following in my script:
capture = cv2.VideoCapture("rtp://192.168.7.1:1234/")
Please let me know if I am even going about this the right way. Is there a way to capture the stream directly from ffmpeg instead of streaming to RTP? Any suggestions are greatly appreciated. Thank you!
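One approach that sometimes helps with OpenCV's FFmpeg backend is to describe the RTP stream in a small SDP file and open that file instead of the bare rtp:// URL, which the backend often fails to probe. Below is a minimal sketch; the IP, port, and payload type 96 are assumptions matching the command above, so adjust them to whatever your avconv invocation actually sends:

```python
def make_sdp(ip="192.168.7.1", port=1234, payload_type=96):
    # Minimal SDP description of an H.264-over-RTP stream; OpenCV's
    # FFmpeg backend can often open this file where rtp:// URLs fail.
    lines = [
        "v=0",
        f"o=- 0 0 IN IP4 {ip}",
        "s=H264 Stream",
        f"c=IN IP4 {ip}",
        "t=0 0",
        f"m=video {port} RTP/AVP {payload_type}",
        f"a=rtpmap:{payload_type} H264/90000",
    ]
    return "\n".join(lines) + "\n"

with open("stream.sdp", "w") as f:
    f.write(make_sdp())

# Then, in camera.py (requires OpenCV built with FFmpeg support):
#   import cv2
#   capture = cv2.VideoCapture("stream.sdp")
#   ok, frame = capture.read()
```

If the SDP route works, the rest of your tracking script can stay unchanged, since it only ever sees decoded frames from `capture.read()`.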

Related

Stream Webcam Feed to Server - webRTC

I have set up a Raspberry Pi to stream an H.264 video feed from a camera to a remote host running Red5 Pro. The program is written in Python and uses ffmpeg to transmit the camera feed. On the backend, I use OpenCV's cv::VideoCapture to read the stream and process it. I'm currently using RTMP and seeing a ~1 sec delay. Along with RTMP, Red5 Pro also supports WebRTC, which has lower latency.
Can anyone explain how to send video from a camera to a remote host using WebRTC? All the information I have found is about streaming video to a local browser.

Capture UDP multicast stream with Python and OpenCV

I am working on a small project on Ubuntu 14.04 LTS. I want to capture a UDP multicast stream and grab a frame from it for image processing. I use OpenCV 3.3.1 and Python 3.4. My code:
import numpy,cv2
cap = cv2.VideoCapture('udp://@239.0.0.1:1234')
print (cap.isOpened())
Right now I want to check whether the stream is captured at all. However, the program freezes at the VideoCapture line. My OpenCV build has both FFmpeg and GStreamer support. Can anyone help?
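If the open call blocks forever, one thing worth trying (assuming OpenCV is actually using its FFmpeg backend, and a recent enough OpenCV build) is passing a UDP read timeout through the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable, which the backend parses as "key;value" pairs separated by "|". It must be set before cv2 is imported:

```python
import os

# "timeout" here is FFmpeg's UDP read timeout in microseconds; with it
# set, VideoCapture can give up instead of freezing when no multicast
# packets arrive. Option names depend on your FFmpeg build, so treat
# this as a sketch rather than a guaranteed fix.
os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "timeout;5000000"

# import cv2  # import only after the variable is set
# cap = cv2.VideoCapture("udp://@239.0.0.1:1234", cv2.CAP_FFMPEG)
# print(cap.isOpened())
```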

picamera Python module: How to simultaneously capture H264 video to file and live stream

I've been trying to capture an H.264 video file and a live video stream accessible via HTTP or RTP, using Python and the picamera module.
Currently, the example I am following is http://blog.miguelgrinberg.com/post/video-streaming-with-flask, however it works only for JPEG.
Is there a similar approach for an HTTP connection and live streaming of H.264 video?
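picamera can record to more than one destination at once via its splitter ports, so one recording can go to a file while a second H.264 feed is written to a network socket. A rough sketch of that idea follows; it can only actually run on a Pi with picamera installed, and the port number and resolution are assumptions:

```python
def record_and_stream(filename="capture.h264", port=8000):
    """Record H.264 to a file and serve the same feed over a TCP socket."""
    import socket
    import picamera  # only available on a Raspberry Pi

    camera = picamera.PiCamera(resolution=(1280, 720), framerate=30)
    server = socket.socket()
    server.bind(("0.0.0.0", port))
    server.listen(1)
    conn, _ = server.accept()
    stream = conn.makefile("wb")
    try:
        # splitter_port lets two recordings run simultaneously
        camera.start_recording(filename, format="h264", splitter_port=1)
        camera.start_recording(stream, format="h264", splitter_port=2)
        camera.wait_recording(60, splitter_port=1)
    finally:
        camera.stop_recording(splitter_port=2)
        camera.stop_recording(splitter_port=1)
        stream.close()
        conn.close()
        server.close()
        camera.close()
```

The socket carries raw H.264 NAL units, not a proper HTTP response; wrapping it for an HTTP client would mean fronting it with a small server, along the lines of the Flask example but pushing H.264 instead of JPEG frames.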

Recording video from raspberry pi and saving it to a external hard drive

I have a Raspberry Pi Model B and a Raspberry Pi camera module.
I also have either a 3 TB external hard drive or an Apple Time Capsule.
What I want is to be able to record a video remotely (via ssh to issue commands) and then I want it to record for an unlimited time until I issue a command to stop the recording. I want the video to be streamed and saved directly to the time capsule if possible.
So, the easy way of explaining what I want:
I plug in the Raspberry Pi and connect to it via ssh.
I tell the Raspberry Pi to start recording video at 1080p at 30 fps.
While the video is being recorded, it is saved directly onto the Time Capsule.
I have a live preview on my Mac as the video is being recorded, so I can see if I need to adjust anything.
I issue a stop command to end the recording.
Storage space is not really an issue for me.
This is what I have to work with:
Raspberry Pi Model B
8 GB SD card
Something similar to this (I don't know if it's exactly the same one): http://www.amazon.co.uk/Time-Capsule-500GB-Hard-Drive/dp/B00132B0MG
A network card: Edimax EW-7811UN 150Mbps Wireless Nano USB Adapter
Mac or PC
This is my first real question and I've been searching for an answer, so please excuse me if I have done something wrong or haven't put in enough detail.
The Raspberry Pi Forums have some info on how it could be done (note: all examples here are run on the Pi, assuming the correct software is installed, etc.).
You could stream the video with the following command to get a live stream, and use a script on your Mac to pull in and save the data:
raspivid -t -0 -w 1280 -h 720 -fps 25 -b 2000000 -o - | ffmpeg -i - -vcodec copy -an -f flv -metadata streamName=myStream tcp://0.0.0.0:6666
Some investigation into the tee command will get the camera piping to a file as well as to the stream. This question has an answer which explains tee like so:
echo "foo bar" | sudo tee -a /path/to/some/file
So, combining the two, this may work for you:
raspivid -t -0 -w 1280 -h 720 -fps 25 -b 2000000 -o - |tee -a /home/pi/my_big_hard_drive/my_video.h264 | ffmpeg -i - -vcodec copy -an -f flv -metadata streamName=myStream tcp://0.0.0.0:6666
Now, you wrap that line up in a script, so that you can start it remotely, like this (easier if you transfer your ssh keys over first, so you don't have to enter passwords):
ssh -f pi@my_pi /home/pi/bin/my_awesome_streamer.sh
Another script can then be used to kill raspivid as and when necessary; something as simple as
sudo killall -QUIT raspivid
should kill the program.
If you want to play with the stream directly on the Mac, you can poke around the ssh man page and find the flag combination that will let you stream the data directly through the ssh connection to the Mac as well.
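On the Mac side, "pull in and save data" can be as simple as a small TCP client that connects to the stream port and appends everything it receives to a file. A minimal Python sketch follows; the host name and port are the assumptions from the commands above:

```python
import socket

def save_stream(host, port, path, chunk_size=64 * 1024):
    """Connect to the Pi's TCP stream and append the raw bytes to a file."""
    total = 0
    with socket.create_connection((host, port)) as sock, open(path, "ab") as out:
        while True:
            chunk = sock.recv(chunk_size)
            if not chunk:  # server closed the connection
                break
            out.write(chunk)
            total += len(chunk)
    return total

# Hypothetical usage, matching the FLV stream from the ffmpeg command:
#   save_stream("my_pi", 6666, "my_video.flv")
```

Since the file is opened in append mode, restarting the script resumes writing to the same file, though the resulting FLV may then need remuxing before playback.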

Streaming ffmpeg output over HTTP

As I am trying to connect the VLC Python bindings with ffmpeg (see Exchange data between ffmpeg and video player), I thought that making ffmpeg output the RTSP stream to STDOUT, "catching" it with a Python script, and serving it over HTTP would be a good idea. So I made a tiny HTTP server using SimpleHTTPServer that reads FFmpeg's output from STDIN and "outputs" it to the web.
This is the syntax I am using:
ffmpeg.exe -y -i rtsp://fms30.mediadirect.ro/live/utv/utv?tcp -acodec copy -vcodec copy -f flv - | \Python27\python.exe -u stdin2http.py
This seems to work: I can access the stream, but neither video nor audio plays. I tried with VLC on Windows, and with VLC and MPlayer on Linux, with no success. Simply running
ffmpeg.exe -y -i rtsp://fms30.mediadirect.ro/live/utv/utv?tcp -acodec copy -vcodec copy -f flv - | vlc.exe -
works perfectly. So the problem seems to be in how I write the data from stdin to the web server. What am I doing wrong?
I played around with your stdin2http.py script. First, I was able to stream a media file (an H.264 .mp4 file in my case) from a simple web server (webfsd) to VLC over HTTP. Then I tried the same thing using 'stdin2http.py < h264-file.mp4'. It didn't work.
I used the 'ngrep' utility to study the difference in the network conversations between the two servers. I think if you want to make this approach work, you will need to make stdin2http.py smarter and handle HTTP content ranges (which might involve buffering some of the stdin data in your script in order to deal with possible jumps in the stream).
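For a live pipe, an alternative to teaching stdin2http.py about ranges is to refuse ranges outright: a live FLV stream is not seekable anyway, so the server can advertise "Accept-Ranges: none" and simply relay stdin to the client as it arrives. A minimal sketch of that relay follows; the handler wiring, port, and MIME type are assumptions:

```python
import io
from http.server import BaseHTTPRequestHandler, HTTPServer

SOURCE = io.BytesIO()  # replace with sys.stdin.buffer when piping from ffmpeg

def copy_stream(src, dst, chunk_size=64 * 1024):
    """Relay bytes from src to dst until EOF; returns the byte count."""
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    return total

class StreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A live FLV pipe cannot seek, so tell the client ranges are
        # unsupported and push the bytes through as they arrive.
        self.send_response(200)
        self.send_header("Content-Type", "video/x-flv")
        self.send_header("Accept-Ranges", "none")
        self.end_headers()
        copy_stream(SOURCE, self.wfile)

# Hypothetical wiring when run for real:
#   import sys
#   SOURCE = sys.stdin.buffer
#   HTTPServer(("", 8090), StreamHandler).serve_forever()
```

Note that SimpleHTTPServer (and http.server here) is single-threaded, so one connected player will monopolize the stream; serving several clients would need a threading server plus per-client buffering.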
