Run python script via a different server

I have two servers A and B.
On server A, multiple Python scripts are running.
I want certain operations in those scripts to be performed from server B's IP.
Those operations use requests and urllib.request.
I don't want to rewrite the whole application so that it runs on server B.
Is it possible to keep everything running on server A but have some of the scripts make their requests via server B? What technologies should I look into?

This looks like a Celery use case. Celery is a library for executing Python code on remote workers through a broker such as Redis or RabbitMQ; it's pretty useful when you want to execute some function elsewhere or in the background.
https://github.com/celery/celery
Does this solution fit your problem?
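For example, a minimal sketch of the idea (the broker URL is an assumption; any broker Celery supports would do):

```python
# tasks.py -- shared by server A and by the worker running on server B.
from celery import Celery
import requests

app = Celery("proxy_tasks",
             broker="redis://broker-host:6379/0",
             backend="redis://broker-host:6379/0")

@app.task
def fetch(url):
    # This body runs on whichever worker picks the task up; if the only
    # worker lives on server B, the HTTP request leaves from B's IP.
    return requests.get(url, timeout=10).text
```

On server B you would start a worker with `celery -A tasks worker`, and on server A the existing scripts would call `fetch.delay(url).get()` instead of calling requests directly.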

Related

Proper architecture/pattern to use redis-rq as remote task runner

I am doing research that requires me to run multiple experiments with many permutations of parameters. My main issue is that I would like to free up my main machine for my daily tasks and offload my experiments to other machines (I have two extra laptops).
Currently I am using redis-rq to queue and run tasks on those "remote servers". Once Redis is running on my main machine and my remote servers, I simply run my code, which queues the tasks to the specific redis-rq port on my remote (via SSH). This seems to work fine, except that I need to make sure to push my code to my remote server first; otherwise the task will fail, since the remote will essentially have old code on it.
I have two questions:
Does this pattern make sense or is there a better way for me to offload tasks to remote servers?
Is there a way I can ensure the remote always has up-to-date code when I start queueing tasks? (currently thinking of including a function in my code that will copy the current directory to the remote via scp)
Thanks for your help with this,
NB: my code is all in Python and I would prefer to keep it that way.
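For reference, the queueing side of what I do now looks roughly like this (the host/port and the `experiments.run` task name are placeholders):

```python
# enqueue.py -- runs on my main machine; Redis on the remote laptop is
# reached through an SSH tunnel, so it appears as localhost here.
from redis import Redis
from rq import Queue

q = Queue(connection=Redis(host="localhost", port=6379))

# The dotted path is resolved on the *worker*, which is exactly why the
# remote must already have an up-to-date copy of the code.
job = q.enqueue("experiments.run", n_trials=100)
print(job.id)
```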

How to start asyncio server on remote server with Python?

I have a virtual server available which runs Linux, with 8 cores, 32 GB RAM, and 1 TB of storage. It is meant to be a development environment (same setup for test and prod); this is what I could get from IT. The server can only be accessed via so-called jump servers, by PuTTY or direct TCP/IP ports (SSH is a must).
The application I am working on starts several processes via multiprocessing. In every process an asyncio event loop is started, and in some cases an asyncio socket server as well. Basically it is a low-level data streaming and processing application (unfortunately no Kafka or similar technology is available yet). The live application runs forever, with no or limited user interaction (it reads/processes/writes data).
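Roughly, each worker process does something like this (a simplified sketch; the ports and the echo handler stand in for the real processing):

```python
import asyncio
from multiprocessing import Process

async def handle(reader, writer):
    data = await reader.read(4096)   # read a chunk from the peer
    writer.write(data)               # echo it back as a stand-in for real work
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def serve(port):
    server = await asyncio.start_server(handle, "0.0.0.0", port)
    async with server:
        await server.serve_forever()

def worker(port):
    asyncio.run(serve(port))         # one event loop per process

if __name__ == "__main__":
    for port in (9001, 9002):
        Process(target=worker, args=(port,)).start()
```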
I assume IPython is an option for this, but (and maybe I am wrong) I think it starts a new kernel per client request, whereas I need to start new processes from the main code without user interaction. If so, it could be an option for monitoring the application, gathering data from it, and sending new user commands to the main module, but I am not sure how to run processes and asyncio servers remotely.
I would like to understand how these things can be done in the given environment. I do not know where to start or what the alternatives are. And I do not understand IPython properly; its page is not obvious to me yet.
Please help me out! Thank you in advance!
After lots of research and learning I arrived at a possible solution in our "sandbox" environment. First, I had to split the problem into several sub-problems:
"remote" development
parallelization
scheduling and executing parallel code
data sharing between these "engines"
controlling these "engines"
Let's see them in detail:
Remote development means you want to write your code on your laptop, but the code must be executed on a remote server. The easy answer is Jupyter Notebook (or an equivalent solution); although it has several trade-offs and other solutions are available, it was the fastest to deploy and use and had the fewest dependencies, least maintenance, etc.
parallelization: I had several challenges with the IPython kernel when working with multiprocessing, so every piece of code that must run in parallel is written in a separate Jupyter Notebook. Within a single notebook I can still use an event loop to get async behaviour.
executing parallel code: there are several options I will use:
ipyparallel - a "workaround" for multiprocessing
papermill - execute notebooks with parameters from the command line (optional)
the %%writefile magic command in Jupyter Notebook - to create importable modules
an OS task scheduler like cron
async with event loops
Not an option yet: Docker, multiprocessing, multithreading, cloud (AWS, Azure, Google, ...).
data sharing: I selected ZeroMQ; it took time to learn but was simpler and easier than writing everything on raw sockets. There are alternatives, but they come with extra dependencies along with some very useful benefits (I will check them later): RabbitMQ, Redis as a message broker, etc. The reasons for preferring ZMQ: fast, simple, elegant, and just a library. (Known risk: our IT will prefer RabbitMQ, but that problem comes later :-) )
controlling the engines: now this answer is obvious: a separate Python script (it can be prototyped as notebook code but is easy to turn into a pure .py file and schedule). It communicates with the other modules via ZMQ sockets: health checks, sending new parameters, commands, etc., as in the sketch below.
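A minimal sketch of such a control channel with pyzmq (the port and the message format are my own choices, not requirements):

```python
# engine.py -- each "engine" answers healthcheck/control messages.
import zmq

ctx = zmq.Context()
sock = ctx.socket(zmq.REP)
sock.bind("tcp://*:5555")

while True:
    msg = sock.recv_json()
    if msg.get("cmd") == "ping":
        sock.send_json({"status": "ok"})       # healthcheck reply
    elif msg.get("cmd") == "stop":
        sock.send_json({"status": "stopping"})
        break
    else:
        sock.send_json({"error": "unknown command"})

sock.close()
ctx.term()
```

The controller is the mirror image: a zmq.REQ socket that connects to each engine and alternates send_json/recv_json.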

Get nodejs server to trigger a python script

I have a node.js server running on a Raspberry Pi 3 B+. (I'm using node because I need the capabilities of a bluetooth library that works well).
Once the node server picks up a message from a bluetooth device, I want it to fire off an event/command/call to a different python script running on the same device.
What is the best way to do this? I've looked into spawning child processes and running the script in them, but that seems messy... Additionally, should I set up a socket between them and stream data through it? I imagine this is done often; what is the consensus solution?
Running a child process is how you would run a Python script from nodejs or any other program (besides another Python program).
There are dozens of options for communicating between the Python script and the nodejs program. The simplest would be stdin/stdout, which are automatically set up for you when you create the child process, but you could also give the nodejs app a local HTTP server that the Python script could communicate with, or vice versa.
Or, set up a regular socket between the two.
If, as you now indicate in a comment, your Python script is already running, then you may want to put a local HTTP server in the nodejs app; the Python script can then send an HTTP request to that server whenever it has data to pass to the nodejs app. Or, if data primarily flows in the opposite direction, you can put the HTTP server in the Python app and have the nodejs server send data to it.
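For that variant, the Python side can be as small as this (the port and the /event path are assumptions, not anything your setup dictates):

```python
import requests

def notify_node(payload):
    # The node server listens on localhost only, so nothing leaves the Pi.
    requests.post("http://127.0.0.1:3000/event", json=payload, timeout=2)

notify_node({"device": "sensor-1", "value": 42})
```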
If you want good bidirectional capabilities, then you could also set up a socket.io connection between the two and then you can easily send messages either way at any time.

Is there a way to run a python script on another server?

I have a Django web application on IIS 6 on one server. Is there a way to call, from this website, a Python script that lives on another server, so that the script runs there?
Calling or running the script in the usual ways the internet suggests is not working.
I always get an error from os.getcwd(), and it also doesn't allow changing that directory.
I just want to run that Python script on the other server, triggered from this server.
Can anyone help?
Normally, I would recommend using a framework like Fabric or winrm if you want to run a Python script on another server. Those frameworks use SSH or Windows remoting functionality, respectively, to allow a Python program to execute other commands (including Python scripts) on other systems. If the target machine is a Windows machine, be forewarned that you can run into all sorts of UAC issues doing normal operations.
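A minimal sketch with Fabric (2.x API); the host, user, and paths are placeholders:

```python
from fabric import Connection

with Connection("server-b.example.com", user="deploy") as conn:
    # cd into the script's own directory first, so os.getcwd() inside the
    # script points where it expects -- this sidesteps the getcwd errors.
    with conn.cd("/opt/scripts"):
        result = conn.run("python myscript.py", hide=True)

print(result.stdout)
```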

How to check if the same python script is running on another computer in a small network?

I'm going to distribute my "handy" Python script to my co-workers in the same department. The script works with a MySQL database, and they are all on the same network as me.
But I don't want several copies of the script to run at the same time, because that will cause problems with the database.
So I decided that the distributed script should first check whether the same script is running on any other computer in my department. If not, it continues to run; if another copy is already running on someone else's computer, it should not run.
Look into Python's socket module. You can send network messages with it.
Consider using a mutex on the database if it should only be handled by a single instance of the script at a time.
That would save you trawling the network constantly looking for a running instance of the script.
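One way to implement that mutex is MySQL's named locks, since every copy of the script already talks to the same database (the connection details and lock name below are placeholders):

```python
import mysql.connector

conn = mysql.connector.connect(host="db-host", user="user",
                               password="secret", database="mydb")
cur = conn.cursor()

# GET_LOCK returns 1 if the named lock was acquired, 0 if another session
# (i.e. another running copy of the script) already holds it.
cur.execute("SELECT GET_LOCK('handy_script', 0)")
if cur.fetchone()[0] != 1:
    raise SystemExit("Another copy of the script is already running.")

try:
    ...  # the actual work against the database goes here
finally:
    cur.execute("SELECT RELEASE_LOCK('handy_script')")
    conn.close()
```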
