Running multiple Node.js or Python scripts - python

I was wondering if there is a way to easily use Windows Task Scheduler to run multiple scripts at the same time. I want to host multiple Discord bots on a spare PC, each with its own bot key for a different Discord server. My current understanding is that you cannot easily run multiple Node.js bots like this (currently I have one in the scheduler and the other I have to run manually), but I was wondering if this is something that can be done in Python, or if I can make it happen with Node.

I am not really into Python, don't fall for the name hehe.
Anyway, logically speaking this is a threading problem. Did you try enabling multithreading in the program you are using, or dynamically allowing the program/code to create multiple threads to handle the tasks related to the Node.js files you are using?
Best Regards
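For what it's worth, each bot is a separate OS process rather than a thread, so another route is a single launcher script that Task Scheduler starts once and that spawns every bot itself. A minimal sketch in Python, with illustrative script paths:

```python
# launcher.py: a minimal sketch; the bot paths are illustrative, not real.
import subprocess

BOTS = [
    ["node", r"C:\bots\bot_one\index.js"],
    ["node", r"C:\bots\bot_two\index.js"],
]

# Start every bot as its own child process.
processes = [subprocess.Popen(cmd) for cmd in BOTS]

# Block until the bots exit so the scheduled task stays alive.
for p in processes:
    p.wait()
```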

Related

Execute several scripts at the same time in the background

I have a page where the user selects a Python script, and then this script executes.
My issue is that some scripts take a while to execute (up to 30 minutes), so I'd like to run them in the background while the user can still navigate the website.
I tried to use Celery, but as I'm on Windows I couldn't do better than using --pool=solo, which, while allowing the user to do something else, can only serve one user at a time.
I also saw this thread while searching for a solution, but didn't manage to really understand how it worked or how to implement it, nor to determine whether it really answered my problem...
So here is my question: how can I have multiple threads/multiple processes with Celery on Windows? Or, if there's another way, how can I execute several tasks simultaneously in the background?
Have you identified whether your slow scripts are CPU-bound or I/O-bound?
If they're I/O-bound, you can use eventlet or gevent, following Strategy 1 in the blog from distributedpython.com.
If they're CPU-bound, you may have to consider something like a dedicated Celery Windows box (or a Windows Docker container) and work around Celery's billiard issue on Windows by setting the environment variable FORKED_BY_MULTIPROCESSING=1, following Strategy 2 in the blog from distributedpython.com. A sketch of the latter is shown below.
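To make Strategy 2 concrete, here is a minimal sketch of a Celery app module. The module name, broker URL, and task body are illustrative assumptions, and the environment variable has to be set before the worker creates its pool:

```python
# tasks.py: a minimal sketch; module name, broker URL, and task body are
# illustrative, not a known-good production setup.
import os

# Workaround for the billiard/Windows issue (Strategy 2); must be set
# before the worker process creates its pool.
os.environ.setdefault("FORKED_BY_MULTIPROCESSING", "1")

from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def run_script(script_path):
    # the long-running work (up to 30 minutes) goes here
    ...
```

For the I/O-bound case (Strategy 1), you would instead install eventlet and start the worker with an eventlet pool, e.g. celery -A tasks worker --pool=eventlet --concurrency=100.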

Send values from one python script to another one already running

I'm trying to understand how I can develop a system made of two Python scripts running on a Linux box.
The first one starts on system boot and always runs; it basically connects to an MQTT server and waits for the other Python script. The second one is called from the command line, does some work, then passes some data (basically three strings) to the first one and exits.
Which is the "correct" way to pass data from script two to script one in this situation?
There are multiple ways to accomplish this (blocking and non-blocking). Depending on your MQTT server, you could use it in a pub/sub approach to pass the data from the temporary script to the running one.
If that is not possible, another pub/sub server could be used, e.g. redis. The pub/sub functionality of redis is especially useful for this, and redis is well supported in Python (see the sketch below).
Another, more lightweight possibility is to use a First In First Out (FIFO) queue; cf. this article on using FIFOs in Python, or blocking vs. non-blocking FIFOs.
FIFOs are easy to use if both processes run on the same computer; redis would be preferable if the scripts run on different computers.
There are more complex packages for inter-process communication around, e.g. RabbitMQ, ZeroMQ, ... but they might be overkill for your use case.
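To make the redis route concrete, here is a minimal sketch of the long-running side, assuming a local Redis server, the redis-py package, and an illustrative channel name:

```python
# script_one.py: the always-running listener (a sketch, not the asker's code).
import redis

r = redis.Redis(host="localhost", port=6379)
pubsub = r.pubsub()
pubsub.subscribe("script-data")

for message in pubsub.listen():
    if message["type"] == "message":
        # the three strings, joined with "|" by the sender
        a, b, c = message["data"].decode().split("|")
        print("received:", a, b, c)
```

The second script then just publishes its data and exits, e.g. redis.Redis(host="localhost", port=6379).publish("script-data", "one|two|three").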

Simple websocket server in Python for publishing

I have a running CLI application in Python that uses threads to execute some workers. Now I am writing a GUI using electron for this application. For simple requests/responses I am using gRPC to communicate between the Python application and the GUI.
I am, however, struggling to find a proper publishing mechanism to push data to the GUI: gRPC's integrated streaming won't work since it uses generators; as already mentioned, my longer, blocking tasks are executed using threads (subclasses of threading.Thread). I'd also like to emit certain events (e.g., the progress) from within those threads.
Then I found Flask's SocketIO implementation, which is, however, a blocking execution and thus not really suited to what I have in mind; I'd again have to run two processes (Flask and my CLI application)...
Another package I've found is websockets, but I can't get my head around how to implement the producer() function they mention in the patterns.
My last idea would be to deploy a broker-based message system like Redis, or simply fall back to the brokerless zmq, which is a bit of a hassle to set up for the GUI application.
So the simple question:
Is there any easy framework that lets me create a server "task" in Python that I can pass messages to for publishing?
For anyone struggling with concurrency in Python:
No, there isn't any simple framework. IMHO Python's concurrency handling is a bit of a mess (compared to other languages like golang, where concurrency is built in). There are multiple major packages implementing it, one of them asyncio, but most of them are incompatible with each other. I've ended up using a solution similar to the one proposed in this question.
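That said, the producer() pattern from the websockets documentation can be bridged to plain threads with a thread-safe queue. A minimal sketch, assuming a recent websockets release (10.1 or later, for module-level broadcast() and single-argument handlers); the port and messages are illustrative:

```python
# A sketch of a publish-only websocket server fed from worker threads.
import asyncio
import queue
import threading

import websockets

events = queue.Queue()   # worker threads put messages here
clients = set()          # currently connected GUI sockets

async def handler(websocket):
    clients.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        clients.discard(websocket)

async def producer():
    loop = asyncio.get_running_loop()
    while True:
        # run_in_executor keeps the blocking Queue.get() off the event loop
        msg = await loop.run_in_executor(None, events.get)
        websockets.broadcast(clients, msg)

async def main():
    async with websockets.serve(handler, "localhost", 8765):
        await producer()

# A worker thread (e.g. a threading.Thread subclass) just calls events.put().
threading.Thread(target=lambda: events.put("progress: 42%"), daemon=True).start()
asyncio.run(main())
```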

Is there a way to keep Telegram bot running when closing Python? [duplicate]

This question already has answers here:
How to do parallel programming in Python?
(10 answers)
Closed 5 years ago.
I've built a very simple Telegram bot by following this tutorial. So I have a file containing Python code, and when I run that code, the bot will echo what I say.
Is it true that the bot will only work while I have Python open and the code running? Would this mean that I cannot run any other Python script at the same time, nor close Python down, if I want my bot to keep working?
Is there any way to get around this, so that the bot will always be 'on'?
A Telegram bot is a Python program. When you run it, it does what it is supposed to do; if you then stop the program, the bot stops working. This is common to all programs, particularly on a server. Think about Nginx, Apache, ssh, etc. They are all programs, and they all stop doing their job when they are closed.
If you want to make sure your bot always runs, you have to daemonize it. There are a lot of solutions.
You could turn your script into a daemon, so that when you launch it, it goes directly to the background and continues to run until the server is shut down (or the program crashes). But in that case, will your bot run again if you (or somebody else) restarts the computer (server)? There are some Python libraries for this purpose, like daemonize, as sketched below.
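For illustration, a minimal sketch with the daemonize package; the app name, pid file path, and the bot-starting code are illustrative:

```python
# bot_daemon.py: a sketch using daemonize (pip install daemonize).
from daemonize import Daemonize

def main():
    # start the bot's polling loop here (whatever your tutorial's code runs)
    ...

daemon = Daemonize(app="telegram-bot", pid="/tmp/telegram-bot.pid", action=main)
daemon.start()  # detaches from the terminal and runs main() in the background
```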
Another common solution is to run your bot under a process manager. You can check supervisorctl, for example, or you could create a script to run your program from System V, Upstart or systemd... This assumes you want to deploy your bot on a dedicated server or a VPS. It will be covered in part 3 of the tutorial you followed:
The next and final part of this series will [...] be demonstrating how to deploy the Bot to a VPS.
You could also consider packaging your bot as an image or a container (Docker, etc.) to run it on a compatible platform.
You should not have a problem running two Python consoles, at least on your own computer. And yes, your code only runs while Python is open on your computer. As Eli correctly pointed out, a daemon would be suitable if you wanted to host locally.
However, what gets difficult is keeping it running continuously online. For example, with Reddit bots that search for and post comments on posts, you need to host them through some cloud-based service. I suggest Amazon Web Services, which has a free trial giving you more than enough for basic Python needs. Some people also use Heroku. Essentially, you can keep your current Python session alive so that it runs constantly.
I'd check out this post to see how to set up "screen" on AWS.

What are some good examples of processes to automate for a python beginner to start with?

Trying to get some initial bearings on useful processes that a basic working knowledge of Python can assist with or make less tedious, specifically processes that can be executed on the command line in a Linux environment. An example or two of a tedious process, plus sample code to use as a starting point, would be greatly appreciated.
What you want to automate depends on what you do manually and on your role. If you are a system administrator, say, and you have shell scripts written to automate some of your tasks (like server management, user account creation, etc.), you can port them to Python.
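For example, here is a chore often done with find and gzip in a shell script, ported to Python as a minimal sketch (the log directory is illustrative): compress log files older than seven days.

```python
# compress_old_logs.py: a sketch; adjust LOG_DIR to a real path.
import gzip
import shutil
import time
from pathlib import Path

LOG_DIR = Path("/var/log/myapp")
CUTOFF = time.time() - 7 * 24 * 3600  # seven days ago

for log in LOG_DIR.glob("*.log"):
    if log.stat().st_mtime < CUTOFF:
        # write a gzip-compressed copy, then remove the original
        with open(log, "rb") as src, gzip.open(f"{log}.gz", "wb") as dst:
            shutil.copyfileobj(src, dst)
        log.unlink()
```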
