I am working on a rather complex Python multiprocessing codebase. It is an IoT-type problem where multiple processes need to be active simultaneously to receive data. There is no fixed kill condition (elapsed time, job count, etc.). Instead, shutdown is accomplished by setting a flag referenced by all processes, which interrupts their run loops.
The issue I am having is that I am nesting multiple packages, and some contain their own run loops which never terminate and so block the check of the termination flag. Correcting this may require restructuring the codebase.
What I am currently looking for is an external (outside of the program) way to see which processes are running and failing to shut down. If the tool can also show why, all the better. I welcome any bash tricks or other methods people know for debugging Python multiprocessing.
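For illustration, something like the following separate script is the kind of external view I am after. It assumes the third-party psutil package is installed, and the "python" name filter is just an example:

# inspect_procs.py -- run from another shell, outside the program
import psutil

# List every Python process with its parent and scheduler status.
# A child stuck in a nested run loop will keep showing up here long
# after the kill flag has been set.
for proc in psutil.process_iter(["pid", "ppid", "name", "status", "cmdline"]):
    info = proc.info
    if info["name"] and "python" in info["name"].lower():
        print(info["pid"], info["ppid"], info["status"],
              " ".join(info["cmdline"] or []))

From bash, pstree -p gives the same parent/child picture, and py-spy's dump command can print the Python stack of a stuck process, which often shows why it is not reaching the flag check.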
Related
I have simple Python code using two processes: one is the main process and the other is created by the multiprocessing module. Both processes run in infinite loops. I want my Python code to never crash/hang/freeze. I've already handled most of the errors/exceptions. FYI, it's an IoT project and I'm running this code as a launcher from /etc/rc.local. I tried using the pid module from Python as given here.
According to the link given, the pid module works as below.
from pid import PidFile

with PidFile():
    do_something()
My question is: does the above logic meet my requirements, or do I need to add more logic, such as checking the existence of the pid file and then deciding to kill/stop/restart the process (or the code itself) if either of the two processes freezes due to a bug in the code?
Please suggest whether there is any other way to achieve this, if the pid module is not suitable for my requirement.
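For concreteness, the extra logic I have in mind would look roughly like this; the exception name is taken from the pid package's documentation and should be checked against the installed version:

from pid import PidFile, PidFileError

try:
    with PidFile("my_launcher"):  # pid file name is illustrative
        do_something()
except PidFileError:
    # Another instance already holds the pid file; here I would
    # decide whether to exit, wait, or kill and restart it.
    pass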
I resolved this issue by creating separate Python scripts for the two tasks rather than using multiprocessing features such as Queue. I suggest not using a multiprocessing queue inside infinite loops, as it froze the process(es) after some time in my case.
I want a Python script that terminates one application when I run another program.
Example: if I run VLC media player, it should close kodi.exe.
Or are there any other options to do this?
It seems to me that what you have to do is enumerate processes by name, determine which to kill, and then kill it (the last part may be a bit cumbersome).
Did you have a chance to look at the following questions and their answers (they illustrate how to handle all of the steps above)?
Determining running programs in Python
List running processes on 64-bit Windows
Is it possible to kill a process on Windows from within Python?
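If a third-party dependency is acceptable, psutil covers all three steps in a few lines; the process names below are taken from your example, and the polling loop is just a sketch:

import time
import psutil

def kill_by_name(target):
    # Steps 2 and 3: find the process by name and terminate it.
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == target.lower():
            proc.terminate()  # proc.kill() for a hard stop

while True:
    # Step 1: enumerate what is currently running.
    running = {(p.info["name"] or "").lower()
               for p in psutil.process_iter(["name"])}
    if "vlc.exe" in running:
        kill_by_name("kodi.exe")
    time.sleep(2)  # polling interval; adjust as needed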
Sometimes I run many instances of a Python script simultaneously. To manage this conveniently I use tmux (a terminal multiplexer), and when I feel I'm done, or when I have to fix something, I kill the tmux session instead of exiting each of the (up to 100) scripts manually.
Killing the tmux session actually kills the bash processes that are the parents of the Python processes executed from them. If I understand correctly, this means a SIGHUP signal is sent to all of the Python processes.
It cleans everything up quite quickly: memory is freed (it seems), CPU is freed, sockets are closed, and apparently ports are freed. The advantage is that it is a much quicker and simpler task than exiting each of the scripts.
My question is: are there any possible consequences of such a habit? If I don't care about the output of the script itself, can it cause any other damage, such as leaving the OS in a dirtier or heavier state? Is there a better practice?
The SIGHUP handler is called. If no SIGHUP handler is installed, then the default action shown in the signal(7) man page is invoked; for SIGHUP, that is terminating the process.
To be certain that your scripts close all files, release all resources, etc., install a SIGHUP handler that performs the appropriate actions.
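A minimal sketch of such a handler (Unix-only; the cleanup contents are placeholders for whatever your scripts actually hold open):

import signal
import sys

def on_sighup(signum, frame):
    # Close files/sockets, flush buffers, persist state, then exit.
    sys.stdout.flush()
    sys.exit(0)  # raises SystemExit, so finally blocks and atexit run

signal.signal(signal.SIGHUP, on_sighup)

while True:
    do_work()  # the script's normal loop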
How do I start an always-on Python interpreter on a server?
If bash starts multiple Python programs, how can I run them on just one interpreter?
And how can I start a new interpreter after tracking the number of bash requests? Say, after X requests to Python programs, a new interpreter should start.
EDIT: Not a copy of https://stackoverflow.com/questions/16372590/should-i-run-1000-python-scripts-at-once?rq=1
Requests may come pouring in sequentially
You cannot have new Python programs started through bash run on the same interpreter; each program will always have its own. If you want to limit the number of Python programs running, the best approach would be a Python daemon process running on your server: instead of creating a new program through bash on each request, you would signal the daemon process to create a thread to handle the task.
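A minimal sketch of that daemon pattern using the standard socketserver module; the port and the one-line task protocol are illustrative, not a fixed API:

import socketserver

class TaskHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Each connection runs in its own thread inside the one
        # long-lived interpreter; the request line names the task.
        task = self.rfile.readline().decode().strip()
        self.wfile.write(("handling " + task + "\n").encode())

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("127.0.0.1", 9999), TaskHandler) as srv:
        srv.serve_forever()

From bash, each request then becomes something like echo mytask | nc 127.0.0.1 9999 instead of launching a fresh interpreter, and the daemon can count requests itself to decide when to start another worker.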
To run a program forever in Python:

while True:
    do_work()
You could look at spawning threads for incoming requests; see the threading.Thread class.
from threading import Thread

task = Thread(target=do_work, args=())
task.start()
You probably want to take a look at http://docs.python.org/3/library/threading.html and http://docs.python.org/3/library/multiprocessing.html. Threading is more lightweight, but the global interpreter lock only allows one thread to execute Python code at a time, meaning it won't take advantage of multicore/hyperthreaded systems. Multiprocessing allows true simultaneous execution, but can be less lightweight than threading if you're on a system that doesn't use lightweight subprocesses, and may not be necessary if the threads/processes spend most of their time doing I/O.
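For comparison with the Thread example above, a multiprocessing version has nearly the same shape; do_work stands in for the real task as before:

from multiprocessing import Process

def do_work():
    pass  # placeholder for the actual task

if __name__ == "__main__":  # required where processes are spawned, e.g. Windows
    task = Process(target=do_work, args=())
    task.start()
    task.join()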
I'm hosting a Python script with Python for Delphi components inside my Delphi application. I'd like the script to create background tasks which keep running.
Is it possible to create threads which keep running even if the script execution ends (but not the host process, which keeps going)? I've noticed that the program gets stuck if the executing script ends while a thread is still running. However, if I wait until the thread is finished, everything goes fine.
I'm trying to use the standard "threading" module for the threads.
Python has its own threading module in the standard library, if that helps. You can create thread objects using the threading module.
threading Documentation
thread Documentation
The low-level thread module (renamed _thread in Python 3) offers simple Lock objects for threading and synchronization.
Again, not sure if this helps since you're using Python under a Delphi environment.
If a process dies, all its threads die with it, so a solution might be a separate process.
See if creating an XML-RPC server might help you; it is a simple solution for interprocess communication.
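A minimal sketch of such a server, living in the separate process and using the standard xmlrpc.server module; the function name and port are made up for illustration:

from xmlrpc.server import SimpleXMLRPCServer

def background_status():
    return "running"  # placeholder for real state reporting

server = SimpleXMLRPCServer(("127.0.0.1", 8000), allow_none=True)
server.register_function(background_status)
server.serve_forever()  # runs independently of the Delphi host

The hosted script can then talk to it with xmlrpc.client.ServerProxy("http://127.0.0.1:8000") and call proxy.background_status().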
Threads by definition are part of the same process. If you want them to keep running, they need to be forked off into a new process; see os.fork() and friends.
You'll probably want the original side of the fork to return immediately after spawning the new process, and the child to end (via os._exit() or the like) when its work is done.
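A minimal sketch of that fork-and-return idea (Unix-only, since os.fork is not available on Windows; the background task is a placeholder):

import os
import time

def background_task():
    time.sleep(60)  # placeholder for work that outlives the script

pid = os.fork()
if pid == 0:
    # Child: no longer tied to the script's lifetime.
    background_task()
    os._exit(0)  # leave the child without touching the host's state
else:
    # Parent: the hosted script falls through and returns right away,
    # so the Delphi host is not left waiting on a running thread.
    pass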