I would like to make a streaming server with Python/Twisted, which receives a WebRTC video stream and then applies some OpenCV algorithms to it.
However, I cannot find a Python module for WebRTC. How can I send and receive a WebRTC video stream with Python/Twisted?
Thanks!
I have started putting together the basic blocks needed to create a Python WebRTC endpoint.
One is an asyncio-based Interactive Connectivity Establishment module:
https://github.com/jlaine/aioice
Another one is a Python binding for libsrtp:
https://github.com/jlaine/pylibsrtp
We also need SRTP keying support in the OpenSSL bindings:
https://github.com/pyca/cryptography/pull/4099
On top of this, we can then build an asyncio-based WebRTC implementation:
https://github.com/jlaine/aiortc
I have been able to get both Chrome and Firefox to establish an audio and video stream to a Python-based server.
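To give a rough idea of how the pieces fit together, here is a minimal sketch (not a complete server) of an aiortc-based receiver that hands decoded video frames to OpenCV. The signalling transport (how the browser's offer reaches this coroutine) is left out, and the Canny call is just a placeholder for whatever processing you need.

```python
# Sketch only: assumes aiortc and opencv-python, plus a signalling channel of your own.
import asyncio
import cv2
from aiortc import RTCPeerConnection, RTCSessionDescription

async def handle_offer(offer_sdp, offer_type):
    pc = RTCPeerConnection()

    async def consume(track):
        while True:
            frame = await track.recv()                 # an av.VideoFrame
            img = frame.to_ndarray(format="bgr24")     # numpy array for OpenCV
            edges = cv2.Canny(img, 100, 200)           # placeholder processing
            # ... do something with `edges` ...

    @pc.on("track")
    def on_track(track):
        if track.kind == "video":
            asyncio.ensure_future(consume(track))

    await pc.setRemoteDescription(RTCSessionDescription(sdp=offer_sdp, type=offer_type))
    answer = await pc.createAnswer()
    await pc.setLocalDescription(answer)
    return pc.localDescription                         # send this back as the answer
```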
What you can do is take screenshots continuously and push them to a WebSocket, allowing your Twisted server to take a gander at each one as it comes in.
I have modified some common recorders, and my version takes JPEG images and pushes them over a WebSocket. Feel free to use and modify it however you want so that it fits your needs. Source code here. The example I use pushes down to a libwebsocket server built in C, but the same JavaScript could be used to send to any WebSocket server.
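On the receiving end, a rough sketch of the Twisted side could look like the following, assuming Autobahn on the server and that each binary WebSocket message carries exactly one JPEG frame:

```python
# Sketch only: assumes autobahn[twisted], numpy and opencv-python are installed.
import cv2
import numpy as np
from autobahn.twisted.websocket import WebSocketServerProtocol, WebSocketServerFactory
from twisted.internet import reactor

class JpegFrameProtocol(WebSocketServerProtocol):
    def onMessage(self, payload, isBinary):
        if not isBinary:
            return
        # Decode the JPEG bytes into an OpenCV (BGR) image
        img = cv2.imdecode(np.frombuffer(payload, dtype=np.uint8), cv2.IMREAD_COLOR)
        if img is not None:
            pass  # run your OpenCV algorithm on `img` here

if __name__ == "__main__":
    factory = WebSocketServerFactory("ws://127.0.0.1:9000")
    factory.protocol = JpegFrameProtocol
    reactor.listenTCP(9000, factory)
    reactor.run()
```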
I've had a similar issue and ended up creating a server that launches a headless Chrome instance, from which I can access the WebRTC streams, record chunks with a MediaRecorder, and finally forward those chunks on via a WebSocket.
I'd love a Python-based solution so I wouldn't need the intermediary server launching headless Chrome instances, but I haven't been able to find one.
I've been using Node.js and Puppeteer, but one could launch the browser instances from your Python server and then send the decoded data back via plain old sockets or whatever else tickles your fancy.
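As a very rough sketch of driving the browser from Python instead of Node.js, something like pyppeteer can launch the headless Chrome instance. The recorder page and its URL below are hypothetical; the in-page JavaScript would still be responsible for joining the call, running the MediaRecorder and forwarding the chunks to your server.

```python
# Sketch only: assumes the pyppeteer package and a recorder page you host yourself.
import asyncio
from pyppeteer import launch

RECORDER_URL = "https://example.com/recorder.html"   # hypothetical page with the WebRTC/MediaRecorder JS

async def main():
    browser = await launch(headless=True,
                           args=["--use-fake-ui-for-media-stream"])  # auto-accept cam/mic prompts
    page = await browser.newPage()
    await page.goto(RECORDER_URL)
    # Keep the browser alive while the in-page script records and forwards chunks.
    await asyncio.sleep(3600)
    await browser.close()

asyncio.get_event_loop().run_until_complete(main())
```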
Recently I have been working on a project to stream HoloLens video and audio to a Python server.
That is, sending HoloLens video frames to a Python server and using Python to do the image processing.
On the HoloLens I use Unity as the development platform, and I want to use WebRTC (based on this repo: Mixed-Reality WebRTC).
I think I know the concept of how WebRTC works, but when it comes to the code, I just don't have an idea of where to start implementing this technique.
I hope someone can give me some hints.
I am not familiar with Python, but your Python server, as the video receiver, should follow the WebRTC specification. I believe there are reliable Python libraries related to WebRTC on GitHub.
For the HoloLens side, since you mentioned that you are using Unity for development, this step-by-step guide shows you how to implement Mixed-Reality WebRTC in Unity.
In short, add a PeerConnection component to a new GameObject in your scene. Then add the NodeDssSignaler component and configure it to connect to a Node-DSS server. Next, create a new game object and add a WebcamSource component; this component generates video frames from a local video capture device (webcam), and it needs to be configured. Finally, assign the Video Sender property of the PeerConnection component to the WebcamSource component created previously.
I've built server-client programs before (both sides were built in Python so far).
Recently I started building an app using Swift, and my goal is to add a backend to my apps using Python (my app is a chat app).
I searched the Internet for tutorials on how to do this, and I only saw two options for communicating between the server side and a mobile application. The first one is to create a REST API (request - response) - I can't use this solution because I want a real-time chat.
And the second option was WebSockets (Socket.IO).
So, my question is: why not use plain socket technology (like I used to when it was only a Python server side talking to a Python client side -> import socket) - no sockets over the web?
You will get the following features if you use Socket.IO or socketcluster.io (which is developed on top of Socket.IO):
Scalability: it will scale horizontally by adding more nodes (scale-out) and linearly (scale-up).
Reduced payload size, as the message payload is compressed.
Authorisation via middleware functions.
Automatic reconnection if the connection drops.
If you want to use your own implementation, then you have to take care of the above features yourself - they are the solutions to problems which arise when the user base increases.
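For reference, a minimal Socket.IO chat backend in Python could look roughly like this, assuming the python-socketio package served through an ASGI server such as uvicorn (the event names are just examples):

```python
# Sketch only: run with `uvicorn chat:app` (assuming this file is chat.py).
import socketio

sio = socketio.AsyncServer(async_mode="asgi", cors_allowed_origins="*")
app = socketio.ASGIApp(sio)

@sio.event
async def connect(sid, environ):
    print("client connected:", sid)

@sio.event
async def chat_message(sid, data):
    # Relay the message to everyone except the sender
    await sio.emit("chat_message", data, skip_sid=sid)

@sio.event
async def disconnect(sid):
    print("client disconnected:", sid)
```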
My understanding is that Socket.IO isn't necessary anymore, because all the browsers worth supporting now keep pace with each other. Socket.IO was for when browsers and servers didn't all support the same technology. These days, everything is pretty much supported, and plain WebSockets are perfectly safe to stick with without Socket.IO. More of a breakdown here - https://codeburst.io/why-you-don-t-need-socket-io-6848f1c871cd
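And the plain-WebSocket route is not much code either. A minimal relay sketch, assuming a recent version of the `websockets` package:

```python
# Sketch only: every message from one client is relayed to all other connected clients.
import asyncio
import websockets

CLIENTS = set()

async def handler(ws):
    CLIENTS.add(ws)
    try:
        async for message in ws:
            for other in list(CLIENTS):
                if other is not ws:
                    await other.send(message)
    finally:
        CLIENTS.discard(ws)

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever

asyncio.run(main())
```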
I've built a little device based on the Raspberry Pi. Now I want to configure it using my web server. The idea is that I enter all the details on my Django web page and then the device just pulls them off the server.
But there are two problems I'm not sure how to solve:
I have multiple devices for multiple users, so some kind of login must be provided.
The device also sends pictures from time to time. Right now it's using FTP with a general login, but I want to personalize that too for every device. The uploads will need a resume function, so HTTP is out!
So the basic question is: should I get started with sockets, or is there a better and safer way to do it? Maybe there is some kind of open-source library that's been tested a lot?
Instead of hand-coding sockets, I would suggest using HTTP with BASIC authentication to communicate between the device and the web server. You can assign a unique id/password to each device, and BASIC authentication is well supported by all web servers and client-side libraries.
There are some security concerns with BASIC authentication even if you use HTTPS, but it may be acceptable in your particular case.
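On the device side this stays very small. A rough sketch with the `requests` library and hypothetical endpoints on your Django server:

```python
# Sketch only: the /api/... endpoints and the credentials are placeholders.
import requests

DEVICE_ID = "device-0001"   # unique id assigned to this device
DEVICE_PW = "secret"

# Pull this device's configuration
resp = requests.get("https://example.com/api/config/",
                    auth=(DEVICE_ID, DEVICE_PW), timeout=10)
resp.raise_for_status()
settings = resp.json()

# Push a picture taken by the device
with open("snapshot.jpg", "rb") as f:
    resp = requests.post("https://example.com/api/upload/",
                         auth=(DEVICE_ID, DEVICE_PW),
                         files={"image": f}, timeout=30)
resp.raise_for_status()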
Maybe you could use SSH, with Fabric for instance. Here is an example.
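(The linked example is not reproduced here; just as a rough illustration, an upload over SSH with the Fabric 2.x API could look like this - host, user and paths are placeholders:)

```python
# Sketch only: assumes Fabric 2.x and SSH key access from the device to the server.
from fabric import Connection

conn = Connection(host="server.example.com", user="pi")
conn.put("snapshot.jpg", remote="/srv/uploads/device-0001/snapshot.jpg")  # upload a picture
conn.run("ls -l /srv/uploads/device-0001/")                               # verify it arrived
```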
I have this AutobahnPython server up and running fine.
https://github.com/tavendo/AutobahnPython/blob/master/examples/websocket/streaming/streaming_server.py
I want to attach an HTML5 front end for capturing webcam video and audio.
How do I get the HTML5 Blob to send through the WebSocket I just created in HTML5 to the Python socket server I also have running?
Is it sendMessage?
https://autobahnpython.readthedocs.org/en/latest/websocketbase.html#autobahn.websocket.WebSocketProtocol.sendMessage
Be prepared: doing what you want, and doing it right (which means flow control), is an advanced topic. I'll try to give you a couple of hints. You might also be interested in reading this.
WebSocketProtocol.sendMessage is part of the AutobahnPython API. To be precise, it is part of the message-based basic API. Whereas the streaming server above uses the advanced API for receiving, it uses the basic API for sending (since the sent data is small, and there is no need for flow control).
Now, in your case, the webcam is the "mass data" producer. You will want to flow-control the sending from the JS to the server, since if you just send out WebSocket messages from JS as fast as you get data from the cam, your upstream connection might not keep up and the browser's memory will just run away. Read about bufferedAmount, which is part of the JS WebSocket API.
If you just want to consume data as it flows into your server, the AutobahnPython streaming server example above is a good starting point, since you can process WebSocket data as it comes in. Other WebSocket frameworks will first buffer up a complete message before they hand it to you.
If you want to redistribute the data received by your server to other connected clients, you will want flow control on the server's outgoing leg too, and then you will need the advanced API for sending as well. See the reference or the streaming (producer) client examples - you can adapt that code to run inside your server.
Now, if all of the above does not make sense to you... it's a non-trivial thing. Try reading the first link to the Autobahn forum, and read more about flow control. It is also non-trivial because the JS WebSocket API has only limited machinery for this kind of flow control without falling back to inventing your own scheme at the app level. Well. Anyway, hope that helps a little.
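To directly answer the "Is it sendMessage?" part: yes, on the Python side the basic API pairs onMessage (receive) with sendMessage (send). A minimal, non-flow-controlled sketch that appends each binary chunk from the browser to a file and acknowledges it (wire the protocol into a WebSocketServerFactory exactly as in the linked streaming_server.py):

```python
# Sketch only: basic (message-based) API, without the flow control discussed above.
from autobahn.twisted.websocket import WebSocketServerProtocol

class CamUploadProtocol(WebSocketServerProtocol):
    def onMessage(self, payload, isBinary):
        if isBinary:
            # Placeholder sink: append each recorded chunk from the browser to a file
            with open("upload.bin", "ab") as f:
                f.write(payload)
            # Acknowledge receipt via the basic sending API
            self.sendMessage("got {} bytes".format(len(payload)).encode("utf8"), isBinary=False)
```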
I have a big problem, and I am having a hard time solving it. I have a custom-made game controller which outputs some data from its sensors via serial communication and is connected to the PC via a serial port. I do the calculation of the current controller position in a Matlab script.

I am building a web application that will display the data (position) of the device in a web browser, but I can't seem to work out how to connect my device to the browser. The Matlab script sends all the position data to a UDP port with a sampling frequency of 100 Hz (100 samples per second). I need to make a persistent connection between the web browser and my Matlab script so I will be able to display the data.

I am thinking about using the WebSockets API, but it does not "speak" UDP. So my idea was to somehow read the data from the UDP port with a custom Python server, then create a WebSocket on that Python server and send the data received via UDP to the web browser. Oh, and it would be nice if I could communicate in both directions. Will this work? Any ideas on how to do it? How is this usually done - I mean, how can one connect, let's say, some temperature sensor to a web browser to display its data in real time?
Any answer will be gladly appreciated.
Thanks,
Leon
Note that although the WebSockets protocol is built on top of TCP sockets, it is not raw TCP. A WebSockets connection has an HTTP-friendly handshake (with some CORS functionality built in). WebSockets are also message-based (rather than streaming like TCP), so each message carries a couple of bytes of framing headers.
You might look at websockify (disclaimer: I made websockify). Websockify is a python server that bridges/proxies between WebSockets and plain TCP sockets. I don't think it would be particularly difficult to adapt it to handle UDP sockets on the backend.
WebSockify (designed to be used together with the included include/websock.js front-end library) supports binary data even over the older Hixie versions of the protocol. This allows it to work with iOS (iPhone, iPad) devices, which still only support the older version of the protocol.
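If you end up rolling your own bridge rather than adapting websockify, the core of it is small. A rough sketch with asyncio and a recent version of the `websockets` package (ports and addresses are placeholders), which forwards every UDP datagram from the Matlab script to all connected browsers:

```python
# Sketch only: one-way UDP -> WebSocket relay; messages from the browser could be
# relayed back out over UDP in ws_handler for bidirectional communication.
import asyncio
import websockets

UDP_PORT = 9999   # where the Matlab script sends its datagrams
WS_PORT = 8765    # where the browser connects
clients = set()

class UdpRelay(asyncio.DatagramProtocol):
    def datagram_received(self, data, addr):
        # Forward each datagram (e.g. one position sample) to every connected browser
        for ws in list(clients):
            asyncio.ensure_future(ws.send(data))

async def ws_handler(ws):
    clients.add(ws)
    try:
        async for message in ws:
            pass  # browser-to-controller traffic could be handled here
    finally:
        clients.discard(ws)

async def main():
    loop = asyncio.get_running_loop()
    await loop.create_datagram_endpoint(UdpRelay, local_addr=("127.0.0.1", UDP_PORT))
    async with websockets.serve(ws_handler, "0.0.0.0", WS_PORT):
        await asyncio.Future()  # run forever

asyncio.run(main())
```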