I am trying to create a program that streams music (MP3 files) over a UDP connection. So far I have created a program that sends the entire MP3 file over and writes it to a file on the client's machine. The client then plays the file with pygame.mixer.
This obviously is not streaming. I cannot for the life of me figure out how to stream the music over UDP to the client.
If someone could point me in the right direction, that would be great.
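In case it helps frame the problem: a minimal, hypothetical sketch of the sender side. The idea is to send the file in small datagrams paced at roughly the playback bitrate instead of all at once; the chunk size, byte rate, and address are my own placeholder choices, not tested values.

```python
import socket
import time

CHUNK = 1024  # bytes per datagram; safely below typical UDP payload limits

def file_chunks(path, size=CHUNK):
    """Yield the MP3 file in small pieces instead of reading it whole."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(size)
            if not chunk:
                break
            yield chunk

def stream_file(path, addr, byte_rate=16000):
    """Send each chunk as one UDP datagram, paced at roughly the
    playback bitrate (16000 B/s is about 128 kbit/s) so the client
    is not flooded faster than it can play."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for chunk in file_chunks(path):
            sock.sendto(chunk, addr)
            time.sleep(len(chunk) / byte_rate)
    finally:
        sock.close()
```

On the client, the received chunks go into a buffer (a temporary file or BytesIO) and playback starts once a few seconds are buffered. Playing from a still-growing buffer is exactly the hard part, which is one reason real systems use RTP/RTSP instead of raw datagrams.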
Live streaming over UDP would mean something like RTSP streaming. Take a look at live555 if you want to do some of that. There is a server available within it (live555MediaServer, or some name like that) which you can use for RTSP streaming.
GStreamer can also let you do a basic stream using just pure RTP. Pipelines like the following can do it:
gst-launch filesrc location=<yourfile> ! mp3parse ! rtpmpapay <someoptions> ! udpsink port=<someport>
and you could receive it, dejitter it, depayload it, then decode and play it:
gst-launch udpsrc port=<the-port-used-in-the-sender> caps="application/x-rtp,media=audio,clock-rate=90000,encoding-name=MPA" ! gstrtpjitterbuffer ! rtpmpadepay ! decodebin2 ! queue ! autoaudiosink
Or you could use ffserver to do the streaming. A bit of googling to understand RTP/RTSP will help you understand this stuff. There are plenty of servers already available to send the data out (Darwin Streaming Server, live555).
There are other forms of streaming too: RTMP (which needs FLV files), Smooth Streaming, and HLS. But RTSP is the true live-streaming protocol.
Related
To keep it simple, I'm trying to copy the functionality of this project for use in my own: https://github.com/deepch/RTSPtoWebRTC
Written in Go, this program takes an RTSP stream and redirects the raw stream data to a port;
it then generates an SDP packet that a WebRTC client consumes through a POST request to grab the video. This results in instant streaming, with no noticeable latency.
Trying to implement this in Python has proven to be very difficult. So far the only way I can see to interact with RTSP is through OpenCV, and I can't figure out a way to get the raw stream data from it.
I could use some tips on libraries that do exactly that (take an RTSP stream and prepare it for WebRTC transport), really anything.
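For what it's worth, one way to get at the raw H.264 bitstream from Python without going through OpenCV is to let ffmpeg handle the RTSP session and read its stdout. This is only a sketch: it assumes ffmpeg is on the PATH, and the chunk size is an arbitrary choice.

```python
import subprocess

def open_h264_stream(rtsp_url):
    """Spawn ffmpeg to pull an RTSP stream and pipe the raw H.264
    elementary stream (Annex B NAL units) to stdout, no re-encoding."""
    cmd = [
        "ffmpeg",
        "-rtsp_transport", "tcp",  # often more robust than RTP over UDP
        "-i", rtsp_url,
        "-c:v", "copy",            # pass the H.264 bitstream through untouched
        "-an",                     # drop audio
        "-f", "h264",              # raw Annex B byte stream
        "pipe:1",
    ]
    return subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.DEVNULL)

def bitstream_chunks(proc, size=4096):
    """Yield raw bitstream chunks; these are what you would hand to a
    WebRTC packetizer."""
    while True:
        data = proc.stdout.read(size)
        if not data:
            break
        yield data
```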
This is my first question here (as far as I remember), so if I'm doing something wrong with the process, please let me know.
Here is the situation:
I have a computer with access to an IP camera streaming H.264-encoded video over RTSP. This computer is on a local network and not accessible from outside (nor is the IP camera). I have a server with a public IP that is accessible from the Internet. What I want is to forward the RTSP video stream to that server, so it can access the stream just as the local computer can.
I tried doing that with the ffmpeg CLI, but I got better results with the GStreamer CLI. Here are the commands I have so far:
For the sender:
gst-launch-1.0 rtspsrc location=rtsp://urlToCamera ! udpsink host=127.0.0.1 port=5000
For the receiver:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
For now this was using the same computer as sender and receiver, but I tried changing the host IP to another computer of mine and it works. The problem is that when the video's image changes, it looks pixelated, as if some data was lost. I don't know if that is because of udpsink. I tried forwarding over TCP, but then it is so slow that the video looks paused, though of course I may not have done that the correct way.
I'm totally new to this topic of video conversion and transport, so my questions are:
Is there a way to improve this and fix the pixelated frames, perhaps by adjusting some value? I saw in other posts that I could change the I-frame interval, but I can't find that property in GStreamer or where to set it.
If you have a different solution that does this with Python scripts and accesses the forwarded stream with cv2 on the server side, that would be great, because that is the exact functionality I'm trying to reach. (Sending frame by frame with the usual cv2 imencode/imdecode functions uses a lot of bandwidth, which is why we want to forward the video in the H.264 codec.)
If you could give me a direct example, or just a pointer toward a possible solution, I would appreciate it very much. By the way, the camera's resolution is 640x480, with bitrate 1024, at 30 FPS.
Thanks in advance.
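On the pixelation: with UDP, lost or reordered packets corrupt the picture until the next I-frame arrives, so a jitter buffer on the receiver helps, and the I-frame interval itself is a property of the camera's encoder (or of x264enc's key-int-max if you re-encode), not of the forwarding pipeline. Since the end goal is reading the stream with cv2, here is a hedged sketch of a receiver pipeline handed to OpenCV; it assumes OpenCV was built with GStreamer support, and the latency and payload values are guesses to tune.

```python
def receiver_pipeline(port=5000, latency=50):
    """GStreamer description for cv2.VideoCapture: dejitter the RTP,
    depayload/parse/decode the H.264, and hand raw frames to appsink."""
    return (
        f"udpsrc port={port} "
        "caps=application/x-rtp,encoding-name=H264,payload=96 "
        f"! rtpjitterbuffer latency={latency} "
        "! rtph264depay ! h264parse ! avdec_h264 "
        "! videoconvert ! appsink"
    )

def open_capture(port=5000):
    """Needs OpenCV built with GStreamer support (check the output of
    cv2.getBuildInformation() for 'GStreamer: YES')."""
    import cv2  # imported lazily so the sketch stays importable without it
    return cv2.VideoCapture(receiver_pipeline(port), cv2.CAP_GSTREAMER)
```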
I have a project that involves video processing. I have many clients that send live video and one service that receives and processes the videos. Each client is a Raspberry Pi that captures video and sends it to the service. The service is a Python module that receives the video from all clients and processes it. Let's suppose a client sends video at 24 FPS and the service can process only 8 FPS. The service would then need to take the newest frame and drop the others; in this case, it would keep 1 frame and drop 2. In principle, audio is not required.
So, I want to know if there is a Python library to transmit video live streaming using UDP protocol.
Thanks!
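Whatever transport ends up carrying the video, the take-the-newest-frame-and-drop-the-rest behaviour described above is easy to build with a one-slot buffer. A minimal sketch using plain Python threading (no particular library assumed):

```python
import threading

class LatestFrame:
    """One-slot frame buffer: writers overwrite, so a slow consumer
    always sees the newest frame and older ones are silently dropped."""

    def __init__(self):
        self._cond = threading.Condition()
        self._frame = None
        self._seq = 0  # counts every frame ever put, for freshness checks

    def put(self, frame):
        with self._cond:
            self._frame = frame        # overwrite: older frames are dropped
            self._seq += 1
            self._cond.notify()

    def get(self, last_seq):
        """Block until a frame newer than last_seq arrives, then return
        (frame, seq); pass seq back in on the next call."""
        with self._cond:
            self._cond.wait_for(lambda: self._seq > last_seq)
            return self._frame, self._seq
```

A receiver thread calls put() at 24 FPS while the processing loop calls get() at whatever rate it can sustain; the buffer drops the intermediate frames automatically.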
The question is not crystal clear, but I deduce that:
You have several RTSP cameras sending, say, H.264 video.
On your Raspberry Pi you have processes that receive and decode the video. In addition, you want to do some image analysis, etc., on the decoded (YUV or RGB) bitmaps.
Your Raspberry Pi can't keep up with the pace, so the processes analyzing the video will miss some frames.
Right?
There's a Python library that can stream video from several sources, decode the video, and distribute it among Python multiprocesses. Check out the tutorial at
http://www.valkka.fi
The learning curve might be a bit steep, but if you follow the tutorial from lesson 1, you'll be fine.
(disclaimer: I did that)
I am trying to show a live webcam video stream on a webpage, and I have a working draft. However, I am not satisfied with the performance and am looking for a better way to do the job.
I have a webcam connected to a Raspberry Pi and a web server, which is a simple Python Flask server. Webcam images are captured using OpenCV and encoded as JPEG. Those JPEGs are then sent to one of the server's UDP ports. What I have built up to this point is something like homemade MJPEG (Motion JPEG) streaming.
On the server side, I have a simple Python script that continuously reads the UDP port and puts the JPEG images into an HTML5 canvas. That is fast enough to create the perception of a live stream.
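Since a JPEG frame can easily exceed one UDP datagram, a homemade MJPEG scheme like this usually needs a tiny framing header so the receiver can reassemble frames and discard incomplete ones. A hypothetical sketch of such framing (the header layout and 1400-byte payload size are my own choices, not from the original setup):

```python
import struct

MAX_PAYLOAD = 1400  # keep each datagram under a typical Ethernet MTU

def packetize(frame_id, jpeg_bytes):
    """Split one JPEG frame into UDP-sized datagrams. Each datagram
    carries (frame_id, index, total) so the receiver can reassemble
    the frame and drop incomplete ones."""
    chunks = [jpeg_bytes[i:i + MAX_PAYLOAD]
              for i in range(0, len(jpeg_bytes), MAX_PAYLOAD)]
    total = len(chunks)
    return [struct.pack("!IHH", frame_id, idx, total) + c
            for idx, c in enumerate(chunks)]

def reassemble(datagrams):
    """Rebuild a frame from its datagrams; returns None if any are missing,
    so a lost packet costs one frame rather than corrupting the stream."""
    parts = {}
    total = None
    for d in datagrams:
        frame_id, idx, total = struct.unpack("!IHH", d[:8])
        parts[idx] = d[8:]
    if total is None or len(parts) != total:
        return None
    return b"".join(parts[i] for i in range(total))
```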
Problems:
This compresses the video very little. Actually, it does not compress the video at all; it only decreases the size of each frame by encoding it as JPEG.
FPS is low and also quality of the stream is not that good.
It is not a major point for now but UDP is not a secure way to stream video.
The server is busy picking images from UDP and needs a threaded design.
Alternatives:
I have used FFmpeg before to convert video formats and to stream pre-recorded video. I guess it is possible to encode (say, H.264) and stream webcam live video using ffmpeg or avconv. (Encoding)
Is this applicable on the Raspberry Pi?
VLC is able to play live video streamed over the network. (Stream)
Is there any media player that can be embedded in HTML/JavaScript to handle a network stream the way VLC does?
I have read about HLS (HTTP Live Streaming) and MPEG-DASH.
Do these apply to this case? If so, how should I use them?
Is there any other way to show live stream on webpage ?
RTSP is a secure protocol.
What is the best practice for the transport-layer protocol in video streaming?
I will try to answer as many of your listed "Problems" as I can:
FPS is low and also quality of the stream is not that good.
Since you are using Flask, you can use the immensely powerful url_for function to display your streamed frames directly in an img tag. Here is how you can achieve this:
On the webpage, add <img src="{{ url_for('video_feed') }}"> where you want your feed to be streamed.
At the backend, do the following:
def gen(camera):
    while True:  # keep yielding parts: one multipart chunk per frame
        frame = camera.read()  # JPEG-encoded bytes for one frame
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n\r\n')

@app.route('/video_feed')
def video_feed():
    return Response(gen(camera),  # `camera` is your capture object
                    mimetype='multipart/x-mixed-replace; boundary=frame')
Now the img tag will display the image served at the video_feed endpoint, which is a multipart response, so the browser will keep asking for newer parts (newer frames, in your case).
I have found this approach to be quite fast!
Credits: Miguel Grinberg's blog post
It is not a major point for now but UDP is not a secure way to stream video.
Most video streaming devices don’t encrypt the video they stream, because doing so is computationally expensive. So while you might connect to an embedded web server on the device via HTTPS, and you probably have to log into the device to control it, all that security is limited to the “control plane.” The video itself will almost certainly be transmitted unencrypted. If it complies with an open standard, it is probably being sent over RTP/RTSP or over HTTP Live Streaming (HLS).
Source: Thoughts on Streaming Videos Securely
Server is busy with image picking from UDP. Needs threaded server design.
Using the above approach, I was able to interact with the server even while streaming the video. With the Flask development web server, you can add threaded=True to your app.run() call, or --with-threads if you use the Flask CLI.
You can also use Gunicorn in combination with gevent or eventlet to have further control over your threads.
You could use FFmpeg to encode the video stream to H.264 and mux it into an MP4 container, which can then be used directly in an HTML5 video element.
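As a sketch of that suggestion: for live playback the MP4 has to be fragmented, because in a regular MP4 the moov atom is only written when the file is finished. Something like the following argv, assuming ffmpeg with libx264 is installed (the input name and preset are placeholders):

```python
def ffmpeg_h264_mp4(src, dest):
    """Build an ffmpeg argv that encodes `src` to H.264 in a fragmented
    MP4, which an HTML5 <video> element can play while it is still
    being written."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-preset", "veryfast",
        "-c:a", "aac",
        # fragmented MP4: moov written up front so playback can start early
        "-movflags", "frag_keyframe+empty_moov",
        "-f", "mp4", dest,
    ]
```

Run it with subprocess.run(ffmpeg_h264_mp4("input", "output.mp4")), or use "pipe:1" as the destination to stream the result over HTTP.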
I am working on a media project (more like video conferencing).
The problem is that although I am able to send text/string data from one peer to another, I am still not sure how to handle video files.
Using GStreamer, I am able to capture the video stream from my webcam, encode it (H.264), and write it into an actual MP4 container directly using a file sink.
Now my problem is that I am not sure how to read the video file, since it contains both audio and video streams, and convert it into a transport stream for transmission in packets
(I am able to send a very small JPEG file, though).
I am using the socket module and implementing UDP.
If you want to send video (with audio) to a peer on the network, you would be better off using RTP (Real-time Transport Protocol), which works on top of UDP. RTP provides timestamps and payload profiles, which help you synchronize the audio and video sent through two ports.
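For illustration, the fixed 12-byte RTP header from RFC 3550 is easy to build with struct. The payload type 96 here is just a common dynamic-range placeholder; a real implementation also needs the correct timestamp clock per media type (e.g. 90 kHz for video).

```python
import struct

def rtp_packet(payload, seq, timestamp, ssrc, payload_type=96, marker=0):
    """Prepend a minimal 12-byte RTP header (RFC 3550) to a payload:
    version=2, no padding, no extension, no CSRC entries."""
    b0 = 2 << 6                        # version 2 in the top two bits
    b1 = (marker << 7) | payload_type  # marker bit + 7-bit payload type
    header = struct.pack("!BBHII",
                         b0, b1,
                         seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF,
                         ssrc)
    return header + payload
```

The sequence number lets the receiver detect loss and reordering, and the timestamp (in the media clock) is what lets you line up the audio and video streams mentioned above.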