I think Faust has a topology like Kafka Streams.
I'm confused because I use Faust. What do you think?
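For context, this is roughly how I wire things up in Faust (a minimal sketch; the app and topic names are made up):

import faust

# One app = one topology; topics act as sources, agents as processing nodes.
app = faust.App("demo-app", broker="kafka://localhost:9092")
events = app.topic("events", value_type=bytes)

@app.agent(events)
async def process(stream):
    async for event in stream:
        print(event)  # each agent consumes its topic like a stream processor

A worker runs this with something like "faust -A <module> worker", and each agent behaves like a node in the processing graph.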
thank you for your time
I want to broadcast a video call to everyone who can see the person on the call, the same way Instagram does it.
I have done WebRTC with WebSocket in Python and it works like a charm. I have also done RTMP live streaming from OBS to HLS, DASH, and RTMP URLs, all on a live server.
It would be appreciated if you could guide me on how to broadcast that live video call to everyone via RTMP.
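One pattern that might work, given that the WebRTC side already delivers decoded frames, is to pipe those frames into an ffmpeg process that publishes RTMP. This is a sketch only: the ingest URL, resolution, and frame source are all assumptions, and it needs ffmpeg on the path.

import subprocess

WIDTH, HEIGHT, FPS = 1280, 720, 30              # assumed call resolution
RTMP_URL = "rtmp://example.com/live/streamkey"  # hypothetical ingest URL

# ffmpeg reads raw BGR frames on stdin, encodes H.264, and pushes RTMP/FLV.
ffmpeg = subprocess.Popen([
    "ffmpeg",
    "-f", "rawvideo", "-pix_fmt", "bgr24",
    "-s", "{}x{}".format(WIDTH, HEIGHT), "-r", str(FPS),
    "-i", "-",                                  # frames arrive on stdin
    "-c:v", "libx264", "-preset", "veryfast", "-pix_fmt", "yuv420p",
    "-f", "flv", RTMP_URL,
], stdin=subprocess.PIPE)

# Inside the WebRTC receive loop, write each decoded frame:
#   ffmpeg.stdin.write(frame_bgr.tobytes())

Each participant's decoded frames (or a composited mix of them) would be written to ffmpeg's stdin, and viewers would then watch the resulting RTMP/HLS stream like any other live stream.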
I am trying to read data from an LE-01MR electricity meter. According to its documentation (https://www.fif.com.pl/en/usage-electric-power-meters/517-electricity-consumption-meter-le-01mr.html), I have to send command 0x03 to be able to read data from it. However, I am not able to do it: my result is always None, which, according to the MicroPython documentation, means the read timed out (https://docs.micropython.org/en/latest/library/machine.UART.html#machine-uart). I tested the wiring with another computer connected to the same wires and was able to read data from it, so the wiring shouldn't be the problem. I connected the tx/rx pins of the ESP8266 to tx/rx respectively. Here is my code; any help would be appreciated:
import utime
from machine import UART
from machine import Pin
import uos

def test():
    print("modbus")
    uos.dupterm(None, 1)  # detach the REPL from UART0 so the meter can use it
    modbus = UART(0)
    modbus.init(9600, parity=None, stop=1, timeout=500, timeout_char=2, tx=Pin(1), rx=Pin(3))
    print("Reading from modbus: {}".format(modbus))
    print("Can read: {}".format(modbus.any()))
    while True:
        modbus.write(b'\x03')  # send a single 0x03 byte
        result = modbus.read(4)
        print("Value of reading: {}, type of {}".format(result, type(result)))
        utime.sleep(0.5)

uart = UART(0, 115200)  # reattach the REPL afterwards
uos.dupterm(uart, 1)
Thank you once again!
I think your understanding of the task is a bit off the mark.
Modbus is a full-blown protocol. What you need to do is send Modbus command 03 (read holding registers); what you are doing is just sending the raw hex byte 0x03.
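To make that concrete: a function 03 request is an 8-byte frame with a CRC, not a single byte. A sketch of building one (the slave address and register offsets below are made up; take the real ones from the meter's manual):

import struct

def crc16_modbus(frame):
    # Modbus RTU CRC-16: init 0xFFFF, reflected polynomial 0xA001
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers(slave, start, count):
    # address, function 0x03, start register, register count (big-endian)...
    frame = struct.pack(">BBHH", slave, 0x03, start, count)
    # ...followed by the CRC, which goes on the wire low byte first
    return frame + struct.pack("<H", crc16_modbus(frame))

request = read_holding_registers(1, 0x0000, 2)   # assumed slave 1, 2 registers
# modbus.write(request); reply = modbus.read(9)  # 9-byte reply for 2 registers

One more thing to check: if the meter's port is RS-485 (see the manual), the ESP8266 UART also needs an RS-485 transceiver in between, not a direct TX/RX hookup.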
You should probably start by finding a good Modbus tutorial on YouTube or Google.
After you do that, you will probably realize you need a Modbus library for your microcontroller. You might want to start here. I cannot guarantee it will work with your setup, but it looks promising. You can, of course, write your own implementation of the Modbus protocol if you feel like it.
If you need more details after you start playing with the library feel free to edit your question.
Good luck and have fun with your project.
I used Wowza Streaming Engine to stream my drone's camera over the RTMP protocol, and I converted RTMP to WebRTC with Wowza Streaming Engine (over UDP) to reduce latency. This step works perfectly, as you can see in this image.
Now I want to display this WebRTC video in Python using OpenCV, like this:
cap = cv2.VideoCapture('the URL of WebRTC')
How can I get the URL of the WebRTC stream?
Please, I need your help.
I would recommend taking a look at the aiortc library. You'll need something that provides Python bindings to WebRTC unless you want to write that yourself. I have not used this library myself, but it appears to give you what you are looking for; many other libraries only cover audio, not video.
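Note that there is no plain URL you can hand to cv2.VideoCapture: a WebRTC session is negotiated, not fetched. To give you an idea (a sketch only; I have not run this, and the SDP signaling with Wowza is left out), receiving frames with aiortc and handing them to OpenCV looks roughly like this:

import asyncio
import cv2
from aiortc import RTCPeerConnection

pc = RTCPeerConnection()

@pc.on("track")
def on_track(track):
    if track.kind != "video":
        return

    async def display():
        while True:
            frame = await track.recv()               # an av.VideoFrame
            img = frame.to_ndarray(format="bgr24")   # OpenCV-friendly array
            cv2.imshow("WebRTC", img)
            cv2.waitKey(1)

    asyncio.ensure_future(display())

# The missing piece is signaling: you still have to exchange the SDP
# offer/answer with Wowza and call pc.setLocalDescription() /
# pc.setRemoteDescription() before any track arrives.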
This is my first question here (as far as I remember), so if I'm doing something wrong with the process, please let me know.
Here is the situation:
I have a computer with access to an IP camera streaming H.264-encoded video over RTSP. This computer is on a local network and is not accessible from outside (neither is the IP camera). I have a server with a public IP that is accessible from the Internet. What I want is to forward the RTSP video stream to that server, so it can access the stream just as the local computer can.
I tried doing this with the ffmpeg CLI, but I got better results with the GStreamer CLI. Here are the commands I have so far:
For the sender:
gst-launch-1.0 rtspsrc location=rtsp://urlToCamera ! udpsink host=127.0.0.1 port=5000
For the receiver:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
For now I was using the same computer as sender and receiver, but I also changed the host IP to another computer of mine and it works. The problem is that when the image changes, the video looks pixelated, as if some data was lost. I don't know if that is because of udpsink; I tried forwarding over TCP instead, but then the video is so slow it looks paused, though maybe I didn't do that the correct way, of course.
I'm totally new to this topic of video conversion and transport, so my questions are:
Is there a way of improving this to fix the pixelated frames, like adjusting some value? I saw in other posts that I could change the I-frame interval, but I can't find that property in GStreamer or where to put it.
If you have a different solution that uses Python scripts and lets me access the forwarded stream with cv2 on the server side, that would be nice, because that is exactly the functionality I'm after. (Using the usual cv2 imencode/imdecode functions to send frame by frame uses a lot of bandwidth, which is why we want to forward the video in the H.264 codec.)
If you could give me a direct example, or just a guide towards a possible solution, I would appreciate it very much. By the way, the camera's resolution is 640x480 at 30 FPS with a bitrate of 1024.
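In case it helps to know what I am aiming for, this is roughly the server-side reader I would like to end up with (a sketch; it assumes OpenCV was built with GStreamer support and reuses port 5000 from the pipelines above):

import cv2

# The udpsrc caps must match the sender's RTP payloading; rtpjitterbuffer
# reorders late packets, which might also help with the pixelation.
pipeline = (
    'udpsrc port=5000 caps="application/x-rtp,media=video,'
    'encoding-name=H264,payload=96" '
    "! rtpjitterbuffer ! rtph264depay ! h264parse ! avdec_h264 "
    "! videoconvert ! appsink"
)
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()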
Thanks in advance.
I am trying to create a program that streams music (MP3 files) over a UDP connection. So far I have created a program that sends the entire MP3 file over and writes it to a file on the client's machine. The client then plays the file with pygame.mixer.
This obviously is not streaming. I cannot for the life of me figure out how to stream the music to the client.
If someone could point me in the right direction, that would be great.
Live streaming with UDP would mean something like RTSP streaming. Take a look at live555 if you want to do some of that. There is a server available within it (live555MediaServer, or some name like that) which you can use for RTSP streaming.
GStreamer can also let you do a basic stream using just pure RTP. Pipelines like the following can do it:
gst-launch filesrc location=<yourfile> ! mp3parse ! rtpmpapay <someoptions> ! udpsink port=<someport>
and you could receive it, dejitter it, depayload it, then decode and play it:
gst-launch udpsrc port=<the-some-port-in-the-sender> ! gstrtpjitterbuffer ! rtpmpadepay ! decodebin2 ! queue ! autoaudiosink
Or you could use ffserver to do the streaming. A bit of googling about RTP/RTSP will help you understand this stuff. There are plenty of servers already available to send the data out (Darwin, live555).
There are other forms of streaming too (RTMP, which needs FLV files; smooth streaming; HLS), but RTSP is the true live streaming protocol.
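Just for intuition, the crudest form of streaming is pacing small chunks over UDP instead of copying the whole file at once. A toy sketch: no RTP, no loss or reorder handling, and the bitrate and destination are assumed.

import socket
import time

CHUNK = 1024                 # bytes per datagram
BITRATE = 128_000            # assumed MP3 bitrate in bits per second
DEST = ("127.0.0.1", 5005)   # hypothetical client address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
delay = CHUNK * 8 / BITRATE  # send at roughly real-time speed

with open("song.mp3", "rb") as f:
    while True:
        chunk = f.read(CHUNK)
        if not chunk:
            break
        sock.sendto(chunk, DEST)
        time.sleep(delay)    # without pacing this is just a fast file copy

The client would feed chunks into a decoder as they arrive instead of waiting for the whole file; RTP adds the sequencing and timing information this sketch lacks.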