I have an IP camera that supports ONVIF.
I can read video from it in Python using cv2, but I don't know how to read the audio.
Any help would be appreciated
Thanks
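OpenCV's VideoCapture only handles video, so one common workaround is to pull the camera's RTSP stream with ffmpeg and grab just the audio track. A minimal sketch, assuming a hypothetical RTSP URL and credentials (check your camera's ONVIF/RTSP settings for the real one):

```python
import subprocess

def build_audio_cmd(rtsp_url, out_wav, seconds=10):
    """Build an ffmpeg command that records only the audio track
    of an RTSP stream into a WAV file."""
    return [
        "ffmpeg", "-y",
        "-rtsp_transport", "tcp",   # TCP is more reliable than UDP for many cameras
        "-i", rtsp_url,
        "-vn",                      # drop the video track
        "-acodec", "pcm_s16le",     # decode the audio to plain PCM
        "-t", str(seconds),         # record for a fixed number of seconds
        out_wav,
    ]

# hypothetical URL; substitute your camera's real RTSP endpoint
cmd = build_audio_cmd("rtsp://user:pass@192.168.1.64:554/stream1", "audio.wav")
# subprocess.run(cmd, check=True)  # uncomment once the URL is real
```

Once the WAV file exists you can read it with the standard `wave` module; for continuous audio you could point ffmpeg's output at a pipe instead of a file.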
I have a problem with OpenCV real-time video capture from a camera. I have tried many of the solutions/suggestions available out there, but none of them work for me. The problems are as follows:
cap = cv2.VideoCapture(0)
When I try to get video with this, the following error comes up:
[ WARN:0] VIDEOIO ERROR: V4L: can't open camera by index 0
Could not open video device
<VideoCapture 0x7fd99e6b5330>
even though it still prints the VideoCapture object reference.
Following a suggestion from https://github.com/skvark/opencv-python/issues/124, I tried
cap = cv2.VideoCapture(-1)
but this did not work for me either; it gives the error:
Video device not found
Running the sudo modprobe bcm2835-v4l2 command gave me the error modprobe: FATAL: Module bcm2835-v4l2 not found in directory /lib/modules/4.15.0-66-generic. I can't find a way out of this.
It works fine locally on my MacBook, but when I deploy to an Ubuntu server I hit these errors in a loop. Can anyone help me with it? Thanks in advance.
Can you open your webcam through other apps?
If not, first try reinstalling the webcam driver.
If you can, then either some other app is using the webcam, so cv2 can't get access to it, or you did not install the third-party libraries needed to work with cameras/videos. You need to install them properly before building the OpenCV Python bindings.
(You can find some info here: https://github.com/opencv/opencv/issues/8471)
Here is the response from the hosting provider:
"The directory video0 would not exist because our droplets do not have any peripherals attached to them. The function cv2.VideoCapture() attempts to obtain the video capture from a webcam which a droplet would not have.
If you are wanting to stream a remote feed you should be able to do this via RTSP:
https://stackoverflow.com/questions/29099839/opencv-stream-from-a-camera-connected-to-a-remote-machine"
The problem is with the droplet, which doesn't have the camera hardware needed for what we are trying to achieve.
Thanks to all for your love, support and help. Really appreciated. It might help someone else as well.
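To confirm the droplet explanation before digging further, it helps to check for V4L2 device nodes first; a minimal sketch:

```python
import glob

def list_video_devices():
    """Return the V4L2 device nodes on this machine.  A droplet has no
    peripherals, so this list is empty there, which is exactly why
    cv2.VideoCapture(0) cannot open anything."""
    return sorted(glob.glob("/dev/video*"))

devices = list_video_devices()
if not devices:
    print("No /dev/video* nodes: there is no camera for VideoCapture to open")
else:
    try:
        import cv2
    except ImportError:
        cv2 = None
    if cv2 is not None:
        cap = cv2.VideoCapture(0)
        # The repr <VideoCapture 0x...> prints whether or not the open
        # succeeded; isOpened() is the real success test.
        print("opened:", cap.isOpened())
        cap.release()
```

On a headless server the empty list immediately tells you the failure is hardware, not OpenCV, and that an RTSP/remote feed is the right approach.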
I have set up a Raspberry Pi to stream an H264 video feed from a camera to a remote host running Red5 Pro. The program is written in Python and uses ffmpeg to transmit the camera feed. On the backend, I use OpenCV's cv::VideoCapture to read the stream and process it. I'm currently using RTMP and seeing an ~1 sec delay. Along with RTMP, Red5 Pro also supports WebRTC, which has lower latency.
Can anyone explain how to send video from a camera to a remote host using WebRTC? All the information I have found has been about streaming video to a local browser.
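For what it's worth, here is a minimal publishing-side sketch using the aiortc library (my own choice of library, not anything Red5 Pro specific; Red5 Pro defines its own WebRTC signaling, so the signaling_send/signaling_recv callables below are placeholders for whatever channel carries the SDP between the peers):

```python
import asyncio

try:
    from aiortc import RTCPeerConnection
    from aiortc.contrib.media import MediaPlayer
    HAVE_AIORTC = True
except ImportError:          # aiortc not installed; the sketch still parses
    HAVE_AIORTC = False

async def publish(signaling_send, signaling_recv, device="/dev/video0"):
    """Offer the camera feed to a remote peer over WebRTC.
    signaling_send/signaling_recv are hypothetical stand-ins for the
    real signaling channel the server expects."""
    pc = RTCPeerConnection()
    player = MediaPlayer(device, format="v4l2")  # camera as a media source
    pc.addTrack(player.video)                    # publish its video track
    offer = await pc.createOffer()
    await pc.setLocalDescription(offer)
    await signaling_send(pc.localDescription)    # send our SDP offer out
    answer = await signaling_recv()              # remote peer's SDP answer
    await pc.setRemoteDescription(answer)
```

The browser-facing examples you found have the same shape; the only real difference when the remote end is a server is who runs the signaling exchange.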
My current OpenCV version is 2.4.13. My Python version is 2.7.11. My operating system is OS X Yosemite 10.10.5. I've been trying to use VideoCapture() to read a URL but keep getting this:
WARNING: Couldn't read movie file
VideoCapture() works fine with my laptop's webcam and with local files. For example, I can open and play a video like this:
cv2.VideoCapture("sg04video.mov")
But I am trying to access my IP camera with OpenCV. At first I thought I had the wrong URL or that it was some authentication problem. I disabled authentication and tried several camera URLs (both RTSP and HTTP), and could open all of them in VLC player. VLC can also open other resources, like this one: random rtsp video link
I even managed to open the camera's snapshot URL in my browser.
But cv2.VideoCapture can't open any of those URLs, video or image, whether for my own camera or from the internet. I'm sure the problem is not with the IP camera.
I am relatively new to programming and I installed OpenCV following this guide:
Installing OpenCV 2.4.9 on Mac OSX with Python Support
Please help!! Thanks!
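One likely culprit, as far as I know, is an OpenCV build without FFMPEG support: on OS X, local files and the webcam go through the native backend, but network URLs need FFMPEG. A quick check (the parsing helper here is my own):

```python
def ffmpeg_support_line(build_info):
    """Return the FFMPEG line from cv2.getBuildInformation(), or None.
    A 'NO' on that line means network URLs (rtsp://, http://) cannot
    be opened by VideoCapture."""
    for line in build_info.splitlines():
        if "FFMPEG" in line:
            return line.strip()
    return None

try:
    import cv2
    print(ffmpeg_support_line(cv2.getBuildInformation()))
except ImportError:
    pass  # cv2 not installed in this environment
```

If the line says NO, rebuilding OpenCV with FFMPEG enabled (the guide you followed has an optional step for this) should make the RTSP/HTTP URLs open.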
I've been trying to capture an H264 video file and a live video stream, accessible via HTTP or RTP, using Python and the picamera module.
Currently the example I am following is http://blog.miguelgrinberg.com/post/video-streaming-with-flask; however, it works only for JPG.
Is there a similar approach for serving a live H264 video stream over an HTTP connection?
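One route, based on the network-streaming recipe in the picamera docs: record the camera's hardware-encoded H264 straight into a TCP socket instead of sending per-frame JPEGs. A sketch, with host/port and the stream_h264 helper as my own placeholders:

```python
import socket

try:
    import picamera               # only present on a Raspberry Pi
except ImportError:
    picamera = None

def stream_h264(host, port, seconds=30):
    """Record the Pi camera's hardware-encoded H264 into a TCP socket.
    The Flask example serves one standalone JPEG per frame, which is why
    it cannot carry H264: H264 is a continuous bitstream, not a sequence
    of independent images."""
    with socket.socket() as sock:
        sock.connect((host, port))
        conn = sock.makefile("wb")       # file-like object for picamera
        with picamera.PiCamera(resolution=(1280, 720), framerate=30) as cam:
            cam.start_recording(conn, format="h264")
            cam.wait_recording(seconds)  # keep recording, raise on errors
            cam.stop_recording()
```

On the receiving side, something like `nc -l 8000 | mplayer -fps 30 -cache 1024 -` (also from the picamera recipes) can play the raw stream; for plain HTTP you would additionally need to wrap the bitstream in a container such as MPEG-TS, e.g. with ffmpeg.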
I am using a Logitech C920 webcam with a BeagleBone Black running Debian. I have successfully written an OpenCV Python script that lets me get video from the camera and track objects. Unfortunately, the resolution, FPS, and overall video quality are poor with this method. I have found code that lets me use the webcam's H264 feature outside of OpenCV.
I've been trying to figure out how to pipe the output from ffmpeg into OpenCV so that I can use the camera's H264 capabilities as the video input for my OpenCV script. My command looks like this:
./capture -F -o -c0|avconv -re -i - -vcodec copy -f rtp rtp://192.168.7.1:1234/|python camera.py
The first part of the pipe uses the Linux v4l2 capture program that lets me use my camera's H264 capabilities. Its output is then fed to avconv/ffmpeg and streamed over RTP. I want to read the RTP stream in my OpenCV script, but I haven't had any success getting it to work. Would I initialize something like the following in my script?
capture = cv2.VideoCapture("rtp://192.168.7.1:1234/")
Please let me know if I'm even going about this the right way. Is there a way to capture the stream directly from ffmpeg instead of streaming over RTP? Any suggestions are greatly appreciated. Thank you!
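To capture directly from ffmpeg rather than going through RTP, one option is to have avconv/ffmpeg decode to raw BGR frames on stdout and read fixed-size chunks in Python. A sketch, assuming a 1920x1080 mode (adjust W and H to whatever ./capture is set to); note that, as far as I know, receiving bare RTP with ffmpeg normally requires an .sdp file rather than an rtp:// URL:

```python
import subprocess
import numpy as np

W, H = 1920, 1080   # must match the mode the camera is capturing in

def ffmpeg_pipe_cmd(source):
    """ffmpeg command that decodes `source` (an .sdp file, RTSP URL, or
    '-' for stdin) into raw BGR frames on stdout, the pixel layout
    OpenCV uses natively."""
    return [
        "ffmpeg", "-i", source,
        "-f", "rawvideo", "-pix_fmt", "bgr24", "-",
    ]

def read_frames(source):
    """Yield H x W x 3 uint8 arrays, one per decoded frame."""
    proc = subprocess.Popen(ffmpeg_pipe_cmd(source), stdout=subprocess.PIPE)
    frame_bytes = W * H * 3          # bgr24 = 3 bytes per pixel
    while True:
        buf = proc.stdout.read(frame_bytes)
        if len(buf) < frame_bytes:   # stream ended or short read
            break
        yield np.frombuffer(buf, np.uint8).reshape(H, W, 3)

cmd = ffmpeg_pipe_cmd("stream.sdp")  # hypothetical SDP file for the RTP feed
```

Each yielded array is a ready-made OpenCV frame, so the tracking code can stay unchanged. You could even skip the network entirely by piping ./capture's output into ffmpeg's stdin and reading frames from the same process.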