I'm currently building a Twitch desktop client and I'm trying to match Twitch's website latency as closely as possible. I'm currently about 5-7 seconds behind Twitch. I have the following settings enabled in Streamlink.
import streamlink

session = streamlink.Streamlink()

# Enable Twitch's low-latency HLS handling
session.set_option("twitch-low-latency", True)
# Start playback two segments away from the live edge
session.set_option("hls-live-edge", 2)
# Write segment data to the output while it is still downloading
session.set_option("hls-segment-stream-data", True)
# Reload the playlist as fast as the live edge allows
session.set_option("hls-playlist-reload-time", "live-edge")
# Download up to two segments in parallel
session.set_option("stream-segment-threads", 2)
# Hand playback to VLC with minimal buffering
session.set_option("player", "vlc")
session.set_option("player-args", "--network-caching=200 --live-caching=200 --sout-mux-caching=200")
I'm wondering if anyone more familiar with this can help me reduce the difference even more. Thanks.
References:
https://streamlink.github.io/cli/plugins/twitch.html#low-latency-streaming
https://wiki.videolan.org/VLC_command-line_help
I apologize ahead of time, as this may seem like a ridiculous question. I have used Python for regex, automation, and pandas at work. I would now like to run a web scraping script (nothing special, under 50 lines) 24 hours a day for personal use. I think I will need server access to do so, since I can't run it on my laptop while the lid is shut and I'm sleeping, but I have no idea how to go about this.
Would I need to rent a virtual machine, or physical server space? My intent is to run the script and send an email with the results; it's all very low-tech and low-capacity, more of a notification. How does one modify their code to do something like this, and what kind of server access would I need? Any suggestions would be helpful!
A UNIX-style approach to the problem:
Rent a server (VPS) - you can find something for $2.5-$5 per month
Create and upload your scraping script to the server
Create a crontab entry (cron is the tool responsible for executing your script on a regular basis - say, once per hour or whatever you want)
Use Python's built-in smtplib together with the email package ( https://docs.python.org/3/library/email.html ) for sending emails from Python
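The steps above can be sketched roughly as follows. The SMTP host, port, and addresses are placeholders - substitute your own provider's settings - and the crontab line at the bottom is just an illustration:

```python
import smtplib
from email.message import EmailMessage

# Hypothetical settings -- replace with your mail provider's values.
SMTP_HOST = "smtp.example.com"
SMTP_PORT = 587

def build_report(results: str) -> EmailMessage:
    """Wrap the scraper's results in an email message."""
    msg = EmailMessage()
    msg["Subject"] = "Scrape results"
    msg["From"] = "scraper@example.com"
    msg["To"] = "me@example.com"
    msg.set_content(results)
    return msg

def send_report(results: str, user: str, password: str) -> None:
    """Send the results via an authenticated TLS SMTP connection."""
    with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as smtp:
        smtp.starttls()
        smtp.login(user, password)
        smtp.send_message(build_report(results))

# Example crontab entry to run the scraper hourly (add with `crontab -e`):
#   0 * * * * /usr/bin/python3 /home/you/scraper.py
```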
I am looking for a better solution than the one I have right now; maybe the smart people around StackOverflow can help me :)
So let's start with the problem:
I have a live RTSP video stream from my IP camera
I want to blur faces on that stream and output it for mobile usage (HLS, H.264, whatever)
All of this should happen in near real-time with the minimum of resources consumed
I plan to deploy this to some cloud later, so the less money I spend on resources, the better
Currently my solution works like this:
I capture video using OpenCV
I apply a Gaussian blur to every frame and save it to some folder
After some number of frames I create an MP4 / AVI / whatever video and make it accessible via an HTTP URL
All of it runs on Django for now
I know I am doing something wrong.
I would appreciate any comments from developers. Looking forward to your responses, thank you!
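One way to avoid the save-frames-then-mux-a-file detour described above is to pipe the processed frames directly into an ffmpeg process that writes an HLS playlist. A minimal sketch, assuming ffmpeg is installed; the face detection and blur step is only indicated by a comment, and the frame size, rate, and output name are made-up values:

```python
import subprocess
import numpy as np

WIDTH, HEIGHT, FPS = 640, 480, 25  # assumed camera parameters

def ffmpeg_hls_cmd(width: int, height: int, fps: int, out: str) -> list:
    """Build an ffmpeg command that reads raw BGR frames on stdin
    and writes an H.264 HLS playlist."""
    return [
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "bgr24",
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",  # raw frames arrive on stdin
        "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
        "-f", "hls", "-hls_time", "2", "-hls_list_size", "5",
        out,
    ]

def stream(frames):
    """Pipe an iterable of HxWx3 uint8 numpy frames into ffmpeg."""
    proc = subprocess.Popen(
        ffmpeg_hls_cmd(WIDTH, HEIGHT, FPS, "stream.m3u8"),
        stdin=subprocess.PIPE,
    )
    for frame in frames:
        # In the real pipeline: detect faces here and blur only those
        # regions (e.g. a Gaussian blur on each bounding box) before
        # writing the frame out -- nothing ever touches the disk.
        proc.stdin.write(frame.tobytes())
    proc.stdin.close()
    proc.wait()
```

The point of the design is that encoding and segmenting stay in one long-lived process instead of repeatedly re-muxing batches of image files.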
I am building an application that needs an exact timestamp for multiple devices at the same time. Device time cannot be used, because the devices' clocks are not in sync.
Asking the server for the time is OK, but it is slow and depends on connection speed; and if I make this serverless/region-less, can a server-side timestamp still be used, given the different time zones?
In a demo application this works fine with one backend in a fixed region, but it still needs to respond faster to clients.
Here is a simple diagram to show this another way; it omits IDs etc., but the main idea is there.
I am planning to build this on a Node.js server and later, when there are more timing calculations, port it to Python/Django...
Thank you
Using server time is perfectly fine, and time zones are not really an obstacle: servers normally work in UTC, and any remaining adjustment for display can easily be done in code. If you are using a cloud provider, they typically group their services by region rather than distributing them worldwide, so this shouldn't be a problem. In any case, check the docs for the cloud service you are planning to use; they usually clearly document the geographic distribution.
Alternatively, you could consider implementing (or using) the Network Time Protocol (NTP) on your devices. It's a fairly simple protocol for synchronizing clocks; if you need a source code example, take a look at the source for node-ntp-client, a JavaScript implementation.
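For reference, the heart of NTP-style synchronization is estimating the offset between the device clock and the server clock from the four timestamps of one request/response round trip. A minimal sketch of just that calculation (the network transport is left out, and the numbers in the example are invented):

```python
def clock_offset(t0: float, t1: float, t2: float, t3: float) -> float:
    """NTP-style clock offset estimate.

    t0: client time when the request was sent
    t1: server time when the request arrived
    t2: server time when the response was sent
    t3: client time when the response arrived

    Assumes the network delay is roughly symmetric in both directions.
    A positive result means the client clock is behind the server's.
    """
    return ((t1 - t0) + (t2 - t3)) / 2.0

# Example: the client clock runs 5 s behind the server, with a
# 0.1 s one-way network delay in each direction.
t0 = 100.0  # client sends
t1 = 105.1  # server receives (server clock = client clock + 5)
t2 = 105.2  # server replies
t3 = 100.3  # client receives
offset = clock_offset(t0, t1, t2, t3)  # -> 5.0
```

Once the device has an offset, it can timestamp events locally (`local_time + offset`) without a round trip to the server for every reading, which addresses the latency concern in the question.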
I have an Interactive Brokers [IB] account and am using the IB API to make an automated trading system in Python. Version 1.0 is nearing the test stage.
I am thinking about creating a GUI for it so that I can watch various custom indicators in real time and adjust trading parameters. It is all (IB TWS/IB Gateway and my app) running on my local Windows 10 PC (I could run it on Ubuntu if that made it easier), with startup config files presently being the only way to adjust parameters; I then watch the results scroll by in the console window.
Eventually I would like to run both IB TWS/IB Gateway and the app on Amazon EC2/AWS and access it from anywhere. I only mention this as it may be a consideration in how to set up the GUI now, to avoid having to redo it later.
I am not going to write this myself and will contract someone else to do it. After spending 30+ hours researching this, I still really have no idea what the best way would be to implement it (browser based, standalone app, etc.) or what skills the programmer would need, so that I can describe the job.
An estimate of how long it would take to get a bare-bones GUI displaying data from my app in real time and sending inputs back to it would additionally be helpful.
The simplest and quickest way will probably be to add a GUI directly to your Python app. If you don't need it to be pretty or to run on mobile, I'd say go with Tkinter for simplicity. Then connect to wherever the app is located and control it remotely.
Adding another component that communicates with your Python app introduces a higher level of complexity, which I think is redundant in this case.
You didn't specify in detail what kind of data you will require the app to display. If this includes any form of charting, I'd use existing charting software such as NinjaTrader / MultiCharts / Sierra Chart to run my indicators and see the position status, and restrict the GUI of the Python app to adjusting the trading parameters and reporting numerical stats.
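As a rough illustration of the Tkinter route: the sketch below shows a single hypothetical "max position" trading parameter with an entry box, validation, and a label standing in for a live stat. The hook into the actual trading engine is just a callback stub; everything here is an assumption about what the real app would expose:

```python
import tkinter as tk

def parse_max_position(text: str):
    """Validate a hypothetical 'max position size' parameter.
    Returns a positive int, or None if the input is invalid."""
    try:
        value = int(text)
    except ValueError:
        return None
    return value if value > 0 else None

def build_gui(on_update):
    """Build a bare-bones parameter window.
    on_update is called with each newly validated value."""
    root = tk.Tk()
    root.title("Trading parameters")
    tk.Label(root, text="Max position:").grid(row=0, column=0)
    entry = tk.Entry(root)
    entry.grid(row=0, column=1)
    status = tk.Label(root, text="last stat: n/a")  # stand-in for a live indicator
    status.grid(row=1, column=0, columnspan=2)

    def apply():
        value = parse_max_position(entry.get())
        if value is not None:
            on_update(value)
            status.config(text=f"max position set to {value}")

    tk.Button(root, text="Apply", command=apply).grid(row=0, column=2)
    return root

if __name__ == "__main__":
    # In the real app, on_update would push the value to the trading engine.
    build_gui(lambda v: print("new max position:", v)).mainloop()
```

Live indicator updates would typically be driven by `root.after(...)` polling the app's state, which keeps everything in one process - the main attraction of this approach.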
I'm doing a small web application which might eventually need to scale somewhat, and am curious about Google App Engine. However, I am experiencing a problem with the development server (dev_appserver.py):
Seemingly at random, requests will take 20-30 seconds to complete, even if there is no heavy computation or data usage. One request might be really quick, even after changing a script or static file, but the next might be very slow. It seems to occur more systematically if the box has been left idle for a while, but not always.
CPU and disk usage are low during these periods. There is not a lot of data in my application either.
Does anyone know what could cause such random slowdowns? I've Googled and searched here, but need some pointers. I've also tried --clear_datastore and --use_sqlite, but the latter gives an error: DatabaseError('file is encrypted or is not a database'). Looking for the file, it does not seem to exist.
I am on Windows 8, Python 2.7, and the most recent version of the App Engine SDK.
Don't worry about it. The dev server (IIRC) keeps the whole DB (datastore) in memory using an "emulation" of the real thing. There are lots of other quirks like this that you won't see when deployed.
I'd suggest that your hard drive is spinning down, and the delay you see is it taking a few seconds to wake back up.
If this becomes a problem, develop against the deployed version. It's not so different.
Does this happen in all web browsers? I had issues like this when viewing a local App Engine dev site in several browsers at the same time for cross-browser testing. IE would then struggle, with requests taking about as long as you describe.
If this is the issue, I found the problems didn't occur with IETester.
Sorry if it's not related, but I thought this was worth mentioning just in case.