Linux/pm2 is killing my Flask service that uses Python's multiprocessing library

I have a Flask service running on a particular port xxxx. Inside this Flask service is an endpoint:
/buildGlobalIdsPool
This endpoint uses Python's multiprocessing library's Pool object to run parallel processes of a function:
from multiprocessing import Pool

with Pool() as p:
    p.starmap(api.build_global_ids_with_recordlinkage, args)
We use the pm2 process manager on a Linux server to manage our services. I am hitting the endpoint from Postman, and everything works fine up until the code above is reached. As soon as the processes are supposed to spawn, pm2 kills the main Flask process, but the spawned processes persist (checking with lsof -i:xxxx, I see multiple python3 processes running on this port). This happens whether I run the service using pm2 or simply run python3 app.py. The program works on my local Windows 10 machine.
What could I be missing, native to Linux or pm2, that is killing the main process or not allowing multiple processes on the same port, when my local machine handles the program just fine?
Thanks!
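A hedged note, not from the original post: on Linux, multiprocessing defaults to the fork start method, so each Pool worker is a copy of the Flask process and inherits its listening socket, which would explain why lsof -i:xxxx shows several python3 processes on the port even after the parent dies. A minimal sketch of one thing worth trying, forcing the spawn start method for this pool (api and args are the names from the question; everything else is an assumption):

import multiprocessing

# A 'spawn' context starts each worker as a fresh interpreter instead of
# forking the Flask process, so workers do not inherit its listening socket.
ctx = multiprocessing.get_context("spawn")

with ctx.Pool() as p:
    p.starmap(api.build_global_ids_with_recordlinkage, args)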

Related

fork process under uwsgi, django

I need to execute some slow tasks upon receiving a POST request.
My server runs under uWSGI, which behaves in a weird manner.
Localhost (python manage.py runserver):
When receiving a request from the browser, I do p = Process(target=workload); p.start(); return redirect(...). The browser immediately follows the redirect, and the worker process starts in the background.
uWSGI (2 workers):
The background process starts, but the browser doesn't get redirected. It waits until the child exits.
Note: I have added the close-on-exec=true parameter to the uWSGI configuration (as advised in the documentation and in Running a subprocess in uwsgi application), but it has no visible effect; the application still waits for the child to exit.
I imagine Python gets confused because the interpreter that multiprocessing.Process() defaults to is the uwsgi binary, not the regular Python interpreter.
Additionally, you may be using the fork start method (depending on the OS), and forking a uWSGI worker isn't a great idea.
You will likely need to call multiprocessing.set_executable() and multiprocessing.set_start_method() when you're running under uWSGI, with something like this in e.g. your Django settings.py:
import multiprocessing
import os
import sys

try:
    import uwsgi  # magic module only available when running under uwsgi
except ImportError:
    uwsgi = None

if uwsgi:
    # Spawn fresh interpreters (the real Python binary) instead of forking
    # the uWSGI worker process.
    multiprocessing.set_start_method('spawn')
    multiprocessing.set_executable(os.path.join(sys.exec_prefix, 'python'))
However, you may want to look into using e.g. uWSGI's spooler system or some other work queue/background task system such as Huey, rq, Minique, Celery, etc.
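As a rough illustration of the work-queue route (a sketch only; the local Redis defaults are an assumption, and workload is the function from the question), enqueueing the slow task keeps it out of the uWSGI worker entirely, and a separate rq worker process executes it:

from redis import Redis
from rq import Queue

# Hand the slow job to a queue; a separately started `rq worker` process
# picks it up, so the request handler can return the redirect immediately.
queue = Queue(connection=Redis())
queue.enqueue(workload)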

Killing a Python script from another script spawned from it

I have two Python scripts on a Linux system. Let's call them service and killer. Service is run as a systemd service and killer as a script. Killer exists to perform certain tasks that can't be executed while service is running, due to limited hardware resources and the desire to keep the code simple.
What I need is to be able to start killer from service and then have killer kill service without dying itself (as a child process). How can I do that?
This is what I have tried so far (without success):
# Attempt 1
import subprocess
subprocess.call("killer.py")

# Attempt 2
import subprocess
subprocess.Popen(["killer.py"])

# Attempt 3
import sh
killer = sh.Command("killer.py")
killer()
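One common pattern for this (a sketch under assumptions: the systemd unit name and the use of systemctl are not from the original post) is to start killer in its own session so it does not die with service's process group, and then have killer stop the service explicitly:

import subprocess
import sys

# In service: launch killer detached in its own session, so it survives
# when service (and its process group) is terminated.
subprocess.Popen([sys.executable, "killer.py"], start_new_session=True)

# In killer: stop the systemd unit that runs service.
subprocess.call(["systemctl", "stop", "service.service"])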

Is the python subprocess blocking the IO?

I am using CherryPy as a web server. After a request comes in, it may kick off a very long-running process. I don't want the web server to stay busy handling that process, so I moved the execution into a separate script and use a subprocess to call this script. But it seems that the subprocess call waits for the process to finish. Can I make the subprocess run in the background on its own after it has been launched? Thanks.
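A minimal sketch of the usual non-blocking route (long_task.py is a placeholder name, not from the original post): subprocess.call and subprocess.run wait for the child to exit, while subprocess.Popen returns immediately:

import subprocess

# Popen starts the child and returns right away, so the request handler
# does not wait for the long-running script to finish.
subprocess.Popen(
    ["python3", "long_task.py"],
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)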

Run rq as daemon on server

In Python, I am using rq for background processing. Since I am running the same thing on my server as well, I want it to run as a daemon. Apart from plain Unix commands, does rq provide something to make it a daemon? In Ruby we have a gem called sidekiq, which provides options for the running environment, log files, and daemonizing.
I tried the Unix command rqworker & but it doesn't seem to be working properly.
In your case nohup rq worker & will fit your needs (it will keep working on the server even if you close the SSH connection).
But if you really want to run your program as a daemon, have a look at:
https://pypi.python.org/pypi/python-daemon/
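A rough sketch of that route (assumptions: the 'default' queue name and a local Redis instance), wrapping an rq worker in python-daemon's DaemonContext so it detaches from the terminal on its own:

import daemon
from redis import Redis
from rq import Worker

# DaemonContext forks the process into the background and closes the
# controlling terminal; the rq worker loop then runs as a daemon.
with daemon.DaemonContext():
    worker = Worker(["default"], connection=Redis())
    worker.work()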

Starting a python script on a remote machine which starts a bash script

I have what I believe to be a fairly unique problem for a script I use to stand up webservers on remote machines.
I have a controller script which after checking a ledger initiates a "builder" script on a remote machine. Part of this builder script calls a bash script which starts a process I want to continue running after both scripts are finished.
My only problem is that the builder script seems to finish (gets to the last line) but doesn't seem to return control to the controller script.
For the record, I am using subprocess.call in the controller script (to initiate an ssh call) to start the builder script on the remote machine. I have toyed with various ways of initiating the bash script from the builder script, but it seems the builder won't return control to the controller until I kill the processes spawned by the bash script.
Things I have tried:
pid=os.spawnl(os.P_NOWAIT,dest+'/start_background_script.sh')
pid=subprocess.Popen([dest+'/start_background_script.sh'])
os.system(dest+'/start_background_script.sh &')
The bash script is written so that when you execute it, it backgrounds two processes and then returns control.
Any recommendations?
Sounds like a job for Fabric to me.
Fabric wraps the handling of shell calls on remote (and also local) machines for you.
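A rough sketch of that idea using Fabric's Connection.run (the host name, the dest value, and the nohup/redirection wrapper are assumptions, not from the original answer); redirecting the script's output and backgrounding it is also what usually lets a plain ssh call return instead of hanging on the spawned processes:

from fabric import Connection

dest = "/opt/webserver"            # hypothetical install directory on the remote host
conn = Connection("builder-host")  # hypothetical remote host

# nohup plus the redirections make sure the backgrounded processes do not
# hold the SSH channel open, so control returns to the controller script.
conn.run(f"nohup {dest}/start_background_script.sh > /dev/null 2>&1 &")
conn.close()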
