Capture UDP multicast stream with Python and OpenCV - python

I am working on a small project on Ubuntu 14.04 LTS. I want to capture a UDP multicast stream and grab a frame from it for image processing. I am using OpenCV 3.3.1 and Python 3.4. My code:
import numpy,cv2
cap = cv2.VideoCapture('udp://#239.0.0.1:1234')
print (cap.isOpened())
Right now I just want to check whether the stream is captured at all. However, the program freezes at the VideoCapture line. My OpenCV build has FFmpeg and GStreamer installed. Can anyone help?
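
For reference, the kind of read loop this is building toward once the capture opens; a minimal sketch, assuming the stream is reachable and written with a plain udp:// URL (without the '#', which is an assumption on my part):

import cv2

# Sketch only: the plain udp:// address is an assumption,
# matching the form FFmpeg expects for a multicast URL.
cap = cv2.VideoCapture('udp://239.0.0.1:1234')
print(cap.isOpened())

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # frame is a BGR numpy array ready for image processing
    cv2.imshow('stream', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()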

Related

How to fix "VIDEOIO ERROR: V4L: can't find camera device" when opening a camera with Python OpenCV?

I am trying to use Python OpenCV on a Windows Ubuntu terminal for a research project. However, when I try to open my camera to capture video I get VIDEOIO ERROR: V4L: can't find camera device. How can I access my camera using OpenCV?
I tried to use v4l2-ctl --list-devices and I get Failed to open /dev/video0: No such file or directory. The camera on my computer works when using the camera application.
I am using the following cv2 command to grab the video capture device.
import cv2
stream = cv2.VideoCapture(-1)
Ubuntu for Windows is based on the Windows Subsystem for Linux (WSL). WSL does not provide hardware support, so it cannot give you a driver for your camera. See this question as well.

Stream Webcam Feed to Server - webRTC

I have set up a Raspberry Pi to stream an H.264 video feed from a camera to a remote host running Red5 Pro. The program is written in Python and uses ffmpeg to transmit the camera feed. On the backend, I use OpenCV's cv::VideoCapture to read the stream and process it. I'm currently using RTMP and seeing a delay of about one second. Along with RTMP, Red5 Pro also supports WebRTC, which has lower latency.
Can anyone explain how to send video from a camera to a remote host using WebRTC? All the information I have found covers streaming video to a local browser.
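
For context, the backend read side with OpenCV in Python looks roughly like the sketch below; the RTMP URL is a placeholder, not the actual Red5 Pro address:

import cv2

# Sketch of the existing RTMP read path; the stream URL is hypothetical.
cap = cv2.VideoCapture('rtmp://red5pro-host/live/stream')
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # ... per-frame processing goes here ...
cap.release()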

OpenCV VideoCapture can't open/read URL

My current OpenCV version is 2.4.13, my Python version is 2.7.11, and my operating system is OS X Yosemite 10.10.5. I've been trying to use VideoCapture() to read a URL but keep getting this:
WARNING: Couldn't read movie file
VideoCapture() works fine with my laptop's webcam and with local files. For example, I can open and play a video like this:
cv2.VideoCapture("sg04video.mov")
But I am trying to access my IP camera with OpenCV. At first I thought I had the wrong URL or that it was an authentication problem. I disabled authentication, tried several camera URLs (both RTSP and HTTP), and could open them in VLC player. VLC player can also open other resources, like this one: random rtsp video link
I even managed to open the URL for capturing a camera view image in my browser.
But cv2.VideoCapture can't open any of those URLs, video or image, whether they are for my own camera or from the internet. I'm sure the problem is not with the IP camera.
I am relatively new to programming and I installed OpenCV following this guide:
Installing OpenCV 2.4.9 on Mac OSX with Python Support
Please help!! Thanks!
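
For illustration, a minimal sketch of the kind of URL-based capture being attempted; the RTSP address and credentials are placeholders:

import cv2

# The address and credentials below are hypothetical placeholders for an IP camera stream.
cap = cv2.VideoCapture('rtsp://user:password@192.168.0.10:554/stream1')
print(cap.isOpened())  # prints False when the URL cannot be opened at all
if cap.isOpened():
    ok, frame = cap.read()
    print(ok, frame.shape if ok else None)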

Pipe streaming video from FFMPEG into OpenCV

I am using a Logitech C920 webcam with a BeagleBone Black running Debian. I have successfully written an OpenCV Python script that allows me to get video from the camera and track objects. Unfortunately, the resolution, FPS, and overall video quality are poor with this method. I have found code that lets me use the webcam's H.264 feature outside of OpenCV.
I've been trying to figure out how to pipe the output from ffmpeg INTO OpenCV so that I can use the camera's H.264 stream as the video input for my OpenCV script. My commands look like this:
./capture -F -o -c0|avconv -re -i - -vcodec copy -f rtp rtp://192.168.7.1:1234/|python camera.py
The first part of the pipe uses the Linux V4L2 capture program that exposes my camera's H.264 capability. Its output is then fed to ffmpeg and streamed over RTP. I want to read the RTP stream in my OpenCV script but haven't had any success getting it to work. Would I initialize something like the following in my script:
capture = cv2.VideoCapture("rtp://192.168.7.1:1234/")
Please let me know if I am even going about this the right way. Is there a way to capture the stream directly from ffmpeg instead of streaming to RTP? Any suggestions are greatly appreciated. Thank you!
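
One alternative to streaming over RTP is to read raw frames directly from an ffmpeg pipe. A minimal sketch, assuming a 1280x720 capture from /dev/video0 (both the device path and the frame size are assumptions; adjust them for the actual C920 setup):

import subprocess
import numpy as np
import cv2

WIDTH, HEIGHT = 1280, 720  # assumed frame size

# Ask ffmpeg to decode the camera and emit raw BGR frames on stdout.
cmd = [
    'ffmpeg',
    '-i', '/dev/video0',   # assumed V4L2 device
    '-f', 'rawvideo',
    '-pix_fmt', 'bgr24',   # OpenCV's native pixel order
    '-',
]
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)

frame_bytes = WIDTH * HEIGHT * 3
while True:
    raw = proc.stdout.read(frame_bytes)
    if len(raw) < frame_bytes:
        break  # stream ended
    frame = np.frombuffer(raw, dtype=np.uint8).reshape((HEIGHT, WIDTH, 3))
    cv2.imshow('camera', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

proc.terminate()
cv2.destroyAllWindows()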

Stream mp3 file from Ubuntu source using python program (ideally) to Raspberry Pi

I am a newcomer to Linux and Python, and I am trying to stream an audio file (preferably MP3) from an Ubuntu source computer to a Raspberry Pi running Raspbian for immediate playback.
I have explored various options such as GStreamer, Live555, VLC player, and PulseAudio. I have also investigated TCP, RTP, and RTSP. However, I am struggling to get anything working properly. It seems as though I need to set up an RTSP server on the computer with the Raspberry Pi as an RTSP client, and I am not sure how to do this.
I'm wondering if anyone has any simple instructions or guides as to how to set up even a basic version of this with a specific MP3 file?
You can use netcat and mplayer for this.
On the Raspberry Pi:
sudo apt-get install mplayer
nc -l -p 1234 | mplayer -cache 8192 -
On your PC:
cat your.mp3 | nc [RASPI IP] 1234
It's very crude, but it's very easy. Just note that you'll need to relaunch netcat (nc) on each side every time you want to play a new MP3.
Cheers
Source: https://www.linuxquestions.org/questions/slackware-14/send-audio-over-network-888169/
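
If the sending side should be a Python program rather than cat | nc, here is a minimal sketch of the same idea over a plain TCP socket. It assumes the Pi keeps running the nc | mplayer command above; the IP address and file name are placeholders:

import socket

RASPI_IP = '192.168.1.50'  # placeholder for the Pi's address
PORT = 1234

# Stream the MP3 bytes to the Pi, which pipes them into mplayer.
with socket.create_connection((RASPI_IP, PORT)) as sock, open('your.mp3', 'rb') as mp3:
    while True:
        chunk = mp3.read(4096)
        if not chunk:
            break
        sock.sendall(chunk)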
