As I am trying to connect the VLC Python bindings with ffmpeg (see Exchange data between ffmpeg and video player), I thought it would be a good idea to make ffmpeg output the RTSP stream to STDOUT, "catch" it with a Python script, and serve it over HTTP. So I made a tiny HTTP server using SimpleHTTPServer that reads FFmpeg's output from STDIN and "outputs" it to the web.
This is the syntax I am using:
ffmpeg.exe -y -i rtsp://fms30.mediadirect.ro/live/utv/utv?tcp -acodec copy -vcodec copy -f flv - | \Python27\python.exe -u stdin2http.py
This seems to work: I can access the stream, but neither video nor audio plays. I tried with VLC on Windows, and with VLC and MPlayer on Linux, with no success. Simply running
ffmpeg.exe -y -i rtsp://fms30.mediadirect.ro/live/utv/utv?tcp -acodec copy -vcodec copy -f flv - | vlc.exe -
works perfectly. So the problem seems to be in how I write the data from stdin to the web server. What am I doing wrong?
I played around with your stdin2http.py script. First, I was able to stream a media file (an H.264 .mp4 file in my case) from a simple web server (webfsd) to VLC via HTTP. Then I tried the same thing using 'stdin2http.py < h264-file.mp4'. That didn't work.
I used the 'ngrep' utility to study the difference in the network conversations between the two servers. I think if you want to make this approach work, you will need to make stdin2http.py smarter and have it handle HTTP content ranges (which might involve buffering some stdin data in your script in order to deal with possible jumps in the stream).
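Since the script itself isn't posted, here is a minimal sketch of the direction I would take it (the class and function names are mine and the port is arbitrary): send the stream with a video Content-Type and deliberately no Content-Length, so the player treats the response as live data rather than a seekable file.

```python
import sys
from http.server import BaseHTTPRequestHandler, HTTPServer

CHUNK = 64 * 1024  # read stdin in 64 KiB pieces

class StreamHandler(BaseHTTPRequestHandler):
    """Relay whatever arrives on stdin to an HTTP client."""

    def do_GET(self):
        # Deliberately no Content-Length header: the player then reads
        # until the connection closes, which is what a live FLV needs.
        self.send_response(200)
        self.send_header("Content-Type", "video/x-flv")
        self.send_header("Connection", "close")
        self.end_headers()
        src = sys.stdin.buffer  # binary stdin (Python 3)
        while True:
            data = src.read(CHUNK)
            if not data:
                break
            self.wfile.write(data)

def main():
    HTTPServer(("", 8080), StreamHandler).serve_forever()
```

This only handles a single live client and no range requests, but it is enough to check whether the FLV bytes arriving on stdin play at all in VLC.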
I am trying to build a service/application/script on a MacBook that can help me share (mirror) my desktop screen over a local IP address.
Alternatively, I want to display it on a webpage using the Flask framework in Python.
I just want to see my desktop screen, along with the mouse movements, in a web browser, just like a remote desktop.
I did try the following command, but when I open my browser and go to the IP address, it just shows a blank screen:
sudo ffmpeg -f avfoundation -framerate 30 -pix_fmt uyvy422 -i "0" -listen 1 -f mp4 -movflags frag_keyframe+empty_moov -preset ultrafast -tune zerolatency "http://localhost:8000"
Please help!
use WebRTC?
No, use WebRTCCamera; only use the data channel.
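If WebRTC feels heavy, one plain-HTTP route is to have ffmpeg emit MJPEG on stdout and relay the individual JPEG frames as a multipart/x-mixed-replace response, which browsers render natively. This is only a sketch under assumptions: the avfoundation device index "1:none" is a guess (list yours with `ffmpeg -f avfoundation -list_devices true -i ""`), and all names here are mine.

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# Capture the screen as MJPEG on stdout; device index is an assumption.
FFMPEG_CMD = [
    "ffmpeg", "-f", "avfoundation", "-framerate", "15", "-i", "1:none",
    "-f", "mjpeg", "-q:v", "5", "pipe:1",
]

def split_jpeg(buf):
    """Return (frame, rest) when buf holds one complete JPEG, else (None, buf)."""
    start = buf.find(b"\xff\xd8")           # JPEG start-of-image marker
    end = buf.find(b"\xff\xd9", start + 2)  # JPEG end-of-image marker
    if start == -1 or end == -1:
        return None, buf
    return buf[start:end + 2], buf[end + 2:]

class MJPEGHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        proc = subprocess.Popen(FFMPEG_CMD, stdout=subprocess.PIPE)
        buf = b""
        while True:
            chunk = proc.stdout.read(4096)
            if not chunk:
                break
            buf += chunk
            frame, buf = split_jpeg(buf)
            if frame:
                self.wfile.write(b"--frame\r\nContent-Type: image/jpeg\r\n\r\n")
                self.wfile.write(frame + b"\r\n")

def main():
    HTTPServer(("", 8000), MJPEGHandler).serve_forever()
```

The same frame generator can be wrapped in a Flask Response with the same mimetype if you prefer Flask over the stdlib server.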
I was trying to test a stream in a Docker container. It's a pretty common workflow:
docker pull ubuntu
docker run -it ubuntu /bin/sh
apt-get install -y python python3.6 vlc curl
curl https://bootstrap.pypa.io/get-pip.py > get-pip.py
python get-pip.py
pip install streamlink
useradd vlcuser
su vlcuser
pip install vlc
streamlink https:www//myurl worst
and then it will print something like:
$ streamlink https:www//myurl worst
[cli][info] Found matching plugin twitch for URL https:www//myurl
[cli][info] Available streams: audio_only, 160p (worst), 360p, 480p, 720p (best)
[cli][info] Opening stream: 160p (hls)
[cli][info] Starting player: /usr/bin/vlc
[cli][info] Player closed
[cli][info] Stream ended
[cli][info] Closing currently open stream...
but I can't figure out why the player immediately closes. Is there a way to keep it open?
I was originally having issues with VLC, but running it as non-root got me to this point. I'm just not sure why the stream fails to stay open. As of right now I am not authenticated with Twitch etc.; I was trying to set it up to be user-agnostic, as it is just a public stream I wanted to look at.
It seems like the trick is to not use VLC at all.
Inside streamlink there is a parameter called --player-external-http, which won't open a player but essentially sets up a local HTTP endpoint that forwards the stream through.
This keeps the stream open, and VLC never gets a chance to close. I'm not sure if it has the same effect as running VLC; I figure syncing onto a stream would count as a view.
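To make that concrete, here is roughly how I drive it from Python (the helper names and port are my own; --player-external-http and --player-external-http-port are real streamlink flags):

```python
from urllib.request import urlopen

def streamlink_http_cmd(url, quality="worst", port=8080):
    """Build a streamlink invocation that serves the stream over local
    HTTP instead of launching a player."""
    return ["streamlink",
            "--player-external-http",
            "--player-external-http-port", str(port),
            url, quality]

def probe_stream(port=8080, nbytes=4096):
    """Read a few bytes from the local endpoint to confirm data is flowing."""
    with urlopen("http://127.0.0.1:%d/" % port, timeout=10) as resp:
        return resp.read(nbytes)
```

Run the command list with subprocess.Popen, give streamlink a moment to start, then call probe_stream(); a non-empty result means the forwarder is serving data without any player involved.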
I am using a Logitech C920 webcam with a BeagleBone Black running Debian. I have successfully written an OpenCV Python script that lets me get video from the camera and track objects. Unfortunately, the resolution, FPS, and overall video quality are poor with this method. I have found code that allows me to use the webcam's H.264 feature outside of OpenCV.
I've been trying to figure out how to pipe the output from ffmpeg into OpenCV so that I can use the camera's H.264 capabilities as the video input for my OpenCV script. My commands look like this:
./capture -F -o -c0 | avconv -re -i - -vcodec copy -f rtp rtp://192.168.7.1:1234/ | python camera.py
The first part of the pipe is using the linux v4l2 program that allows me to utilize my camera's H264 capabilities. The output from that is then fed to ffmpeg and then streamed to RTP. I want to retrieve the RTP stream in my OpenCV script but I haven't had success in getting it to work. Would I initialize something like the following in my script:
capture = cv2.VideoCapture("rtp://192.168.7.1:1234/")
Please let me know if I am even going about this the right way. Is there a way to capture the stream directly from ffmpeg instead of streaming to RTP? Any suggestions are greatly appreciated. Thank you!
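One way to skip RTP entirely is to have avconv/ffmpeg decode the H.264 bytes to raw frames on stdout and read them in Python yourself. A sketch under assumptions: the resolution constants must match what your capture program actually emits, and the function names are mine.

```python
import subprocess
import numpy as np

WIDTH, HEIGHT = 1280, 720          # must match the camera's H.264 output
FRAME_BYTES = WIDTH * HEIGHT * 3   # bgr24 is 3 bytes per pixel

def frame_from_bytes(raw, width=WIDTH, height=HEIGHT):
    """Reshape one raw bgr24 frame into the ndarray cv2 functions expect."""
    return np.frombuffer(raw, dtype=np.uint8).reshape((height, width, 3))

def open_decoder(source):
    """source is a file object delivering H.264 bytes, e.g. sys.stdin.buffer
    when run as:  ./capture -F -o -c0 | python camera.py
    Swap "ffmpeg" for "avconv" if that is what you have installed."""
    cmd = ["ffmpeg", "-i", "pipe:0",
           "-f", "rawvideo", "-pix_fmt", "bgr24", "pipe:1"]
    return subprocess.Popen(cmd, stdin=source, stdout=subprocess.PIPE)

def read_frame(proc):
    """Read exactly one decoded frame, or None when the stream ends."""
    raw = proc.stdout.read(FRAME_BYTES)
    if len(raw) < FRAME_BYTES:
        return None
    return frame_from_bytes(raw)
```

Each array returned by read_frame can go straight into your existing tracking code in place of the frames cv2.VideoCapture used to give you.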
I have a Raspberry Pi Model B and a Raspberry Pi camera module.
I also have either a 3 TB external hard drive or an Apple Time Capsule.
What I want is to be able to start recording a video remotely (issuing commands via ssh), have it record for an unlimited time until I issue a command to stop, and have the video streamed and saved directly to the Time Capsule if possible.
So, an easy way of explaining what I want:
I plug in the Raspberry Pi and connect to it via ssh.
Tell the Raspberry Pi to start recording video at 1080p at 30 fps.
While the video is being recorded, it is saved directly onto the Time Capsule.
Have a live preview on my Mac as the video is being recorded, so I can see if I need to adjust anything.
Issue a stop command to end the recording.
Storage space is not really an issue for me.
This is what I have to work with:
Raspberry Pi model B
8Gb SD card
something similar to this (I don't know if it's exactly the same one): http://www.amazon.co.uk/Time-Capsule-500GB-Hard-Drive/dp/B00132B0MG
A network card: Edimax EW-7811UN 150Mbps Wireless Nano USB Adapter
Mac or PC
This is my first real question and I've been searching for an answer, so please excuse me if I have done something wrong or haven't put in enough detail.
The Raspberry Pi Forums have some info on how it could be done (note: all examples here are run on the Pi, assuming the correct software is installed, etc.).
You could stream the video with the following command to get a live stream, and use a script on your Mac to pull in and save the data:
raspivid -t 0 -w 1280 -h 720 -fps 25 -b 2000000 -o - | ffmpeg -i - -vcodec copy -an -f flv -metadata streamName=myStream tcp://0.0.0.0:6666
Some investigation into the "tee" command will get the camera piping to a file as well as the stream. This question has an answer which explains tee thusly:
echo "foo bar" | sudo tee -a /path/to/some/file
So, combining the two, this may work for you:
raspivid -t 0 -w 1280 -h 720 -fps 25 -b 2000000 -o - | tee -a /home/pi/my_big_hard_drive/my_video.h264 | ffmpeg -i - -vcodec copy -an -f flv -metadata streamName=myStream tcp://0.0.0.0:6666
Now, you wrap that line up in a script, so that you can start it remotely, like this (easier if you transfer your ssh keys over first, so you don't have to enter passwords):
ssh -f pi@my_pi /home/pi/bin/my_awesome_streamer.sh
Another script can then be used to kill raspivid as and when necessary; something simple such as
sudo killall -QUIT raspivid
should kill the program.
If you want to play with the stream directly on the Mac, you can poke around the ssh man page and find the cryptic flag combination that will let you stream the data directly through the ssh connection to the Mac as well.
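To tie the start and stop ends together, a small Python wrapper on the Pi can own the whole pipeline. This is a sketch using the paths and stream name from the commands above; the process-group trick means one signal stops raspivid, tee and ffmpeg together instead of needing killall.

```python
import os
import signal
import subprocess

PIPELINE = (
    "raspivid -t 0 -w 1280 -h 720 -fps 25 -b 2000000 -o - "
    "| tee -a /home/pi/my_big_hard_drive/my_video.h264 "
    "| ffmpeg -i - -vcodec copy -an -f flv "
    "-metadata streamName=myStream tcp://0.0.0.0:6666"
)

def start():
    # start_new_session puts the shell and every stage of the pipeline
    # into one process group we can later signal as a unit
    return subprocess.Popen(PIPELINE, shell=True, start_new_session=True)

def stop(proc):
    # equivalent of `killall -QUIT raspivid`, but scoped to this
    # pipeline only, and it also takes tee and ffmpeg down with it
    os.killpg(os.getpgid(proc.pid), signal.SIGQUIT)
```

You would call start() from the ssh session (or wrap it in a tiny CLI), note the Popen handle or its pid somewhere, and call stop() when you are done recording.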
I am a newcomer to Linux and Python, and I am trying to stream an audio file (preferably MP3) from an Ubuntu source computer to a Raspberry Pi running Raspbian for immediate playback.
I have explored various options such as GStreamer, Live555, VLC and PulseAudio. I have also investigated TCP, RTP and RTSP. However, I am struggling to get anything working properly. It seems as though I need to set up an RTSP server on the computer with the Raspberry Pi as an RTSP client, and I am not sure how to do this.
I'm wondering if anyone has any simple instructions or guides as to how to set up even a basic version of this with a specific MP3 file?
You can use netcat and mplayer for this.
On the Raspberry Pi:
sudo apt-get install mplayer
nc -l -p 1234 | mplayer -cache 8192 -
On your PC:
cat your.mp3 | nc [RASPI IP] 1234
It's very crude, but it's very easy. Just be aware that you'll need to relaunch netcat (nc) on each side every time you want to play a new MP3.
Cheers
Source: https://www.linuxquestions.org/questions/slackware-14/send-audio-over-network-888169/
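The same idea can be scripted in Python instead of typing nc on both ends. A sketch (function names are mine; the mplayer flags are the ones from the commands above):

```python
import socket
import subprocess

def serve_to_player(port=1234):
    """Run on the Pi: accept one connection and pipe the bytes to mplayer."""
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", port))
    srv.listen(1)
    conn, _ = srv.accept()
    player = subprocess.Popen(["mplayer", "-cache", "8192", "-"],
                              stdin=subprocess.PIPE)
    while True:
        data = conn.recv(65536)
        if not data:          # sender closed the connection
            break
        player.stdin.write(data)
    player.stdin.close()
    player.wait()

def send_file(path, host, port=1234):
    """Run on the PC: stream the MP3 to the Pi."""
    with socket.create_connection((host, port)) as s, open(path, "rb") as f:
        while True:
            chunk = f.read(65536)
            if not chunk:
                break
            s.sendall(chunk)
```

Like the nc version, this handles one file per connection; wrapping serve_to_player in a loop would let you send MP3s one after another without restarting anything on the Pi.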