I have a Flask application in which I have to run a bash script. On visiting a particular link (let's say localhost:5000/B) I execute the script using Python's subprocess library, then call wait() on the subprocess so that the script finishes before the other tasks that depend on it run. After finishing those remaining tasks I return the results in the response by rendering a template.
Sometimes I might go back from the page or press the cancel button (at the top of the browser). In that case I want to terminate the script even if it has not completed. I have added JavaScript to the page so that when I navigate away it makes a GET request to the server at localhost:5000/C, and in the function handling that request I terminate the subprocess.
But for some reason this does not work, even after using the kill() or terminate() method.
Can we terminate a subprocess that we have called wait() on, or not?
If there is a better way of doing this, kindly let me know.
Thanks
It turned out the subprocess was creating another subprocess during execution, and that was what caused the problem.
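In case it helps others: proc.terminate() and proc.kill() only signal the direct child, so any grandchildren it started keep running. One way around that on Linux is to start the command in its own process group and signal the whole group; a rough sketch, with a sleep command standing in for the real bash script:

import os
import signal
import subprocess

# start_new_session=True puts the script (and anything it spawns) into its own
# process group, so the whole tree can be signalled at once.
proc = subprocess.Popen(["bash", "-c", "sleep 60 & wait"], start_new_session=True)

# ... later, e.g. in the handler for localhost:5000/C ...
os.killpg(os.getpgid(proc.pid), signal.SIGTERM)  # signals the whole group, not just bash
proc.wait()                                      # reap the child so it does not linger as a zombie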
Related
Is there a way to write a Python script that spawns a subprocess which makes a call and then waits for a server response before running shell commands?
I need to log into a server on one terminal, then interact with another server in a separate window, and in that separate window I need to wait for a response.
I know I can run a separate file with a subprocess, but for the assignment I'm doing I need to use only one file.
You can create some WAIT_KEY flag which indicates that you are waiting for the server's response, and run a while loop with time.sleep to save resources. It should also work well if you do the processing in a separate thread.
import time

while WAIT_KEY:
    time.sleep(0.1)
P.S. Instead of WAIT_KEY you should call the method that returns the state of the server's response.
P.P.S. The smaller the sleep time you pick, the more resources will be wasted in the while loop.
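A minimal, runnable sketch of that pattern, using a separate thread that stands in for the real server interaction and clears the flag when the response arrives (fetch_response here is only a placeholder for your own call):

import threading
import time

WAIT_KEY = True            # True while we are still waiting for the server
response = None

def fetch_response():
    # Stand-in for the real server interaction; replace with your own logic.
    global WAIT_KEY, response
    time.sleep(2)                      # pretend the server takes 2 seconds to answer
    response = "server says hello"
    WAIT_KEY = False

threading.Thread(target=fetch_response, daemon=True).start()

# Poll cheaply until the response arrives, then run the shell commands.
while WAIT_KEY:
    time.sleep(0.1)

print("got:", response)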
I am creating a program in Python that listens to various user interactions and logs them. I have these requirements/restrictions:
I need a separate process that sends those logs to a remote database every hour
I can't do it in the current process because it blocks the UI.
If the main process stops, the background process should also stop.
I've been reading about subprocess, but I can't seem to find anything on how to stop both simultaneously. I need the equivalent of spawn_link, if anybody knows some Erlang/Elixir.
Thanks!
To answer the question in the title (for visitors from Google): there are robust solutions on Linux and Windows using OS-specific APIs, and less robust but more portable psutil-based solutions.
To fix your specific problem (it is an XY problem): use a daemon thread instead of a process.
A thread would allow you to perform I/O without blocking the GUI, even if the GUI toolkit you've chosen doesn't provide an async I/O API such as tkinter's createfilehandler() or gtk's io_add_watch(); a code example follows.
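A minimal sketch of that idea, assuming a hypothetical send_logs_to_db() function that uploads whatever has been collected so far:

import threading
import time

def send_logs_to_db():
    # Hypothetical placeholder for the actual upload to the remote database.
    print("uploading collected logs...")

def uploader():
    while True:
        time.sleep(3600)          # wait an hour between uploads
        send_logs_to_db()

# daemon=True means the thread is killed automatically when the main process
# exits, which gives the spawn_link-like "both stop together" behaviour.
threading.Thread(target=uploader, daemon=True).start()

# ... run the UI / event loop in the main thread here ...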
I have an app that embeds Python scripting.
I'm adding calls to Python from C, and my problem is that I need to suspend the script execution, let the app run, and then resume the execution from where it was suspended.
The idea is that Python would call, say, a "WaitForData" function; at that point the script must suspend (pause) and the call must bail out so the app's event loop can continue. When the necessary data is present, I would like to resume the execution of the script, as if the Python call had returned at that point.
I'm running single-threaded Python.
Any ideas how I can do this, or something similar, where the app's event loop runs before the Python call returns?
Well, the only way I could come up with is to run the Python engine on a separate thread. The main thread is then blocked while the Python thread is running.
When I need to suspend, I block the Python thread and let the main thread run. When necessary, in the main thread's OnIdle handler, I block it and let the Python thread continue.
It seems to be working fine.
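As a rough pure-Python sketch of that handshake (the real implementation lives in C, so this only illustrates the two-thread ping-pong, with made-up names):

import threading

script_waiting = threading.Event()   # set by the script thread when it wants to pause
data_ready = threading.Event()       # set by the app thread once the data has arrived

def wait_for_data():
    # What a "WaitForData" call would do: hand control back and block until data is ready.
    script_waiting.set()
    data_ready.wait()

def script():
    print("script: doing some work, then waiting for data")
    wait_for_data()
    print("script: resumed now that the data is available")

t = threading.Thread(target=script)
t.start()

# "App event loop": once the script signals that it is waiting, do app work,
# produce the data, and then let the script continue.
script_waiting.wait()
print("app: event loop ran, data is now available")
data_ready.set()
t.join()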
Using subprocess.Popen(), I'm launching a process that is supposed to take a long time. However, there is a chance that the process will fail shortly after it launches (producing a return code of 1). If that happens, I want to intercept the failure and present an explanatory message to the user. Is there a way to "listen" to the process and respond if it fails? I can't just use Popen.wait() because my Python program has to keep running.
The hack I have in place right now is to time.sleep() my Python program for 0.5 seconds (which should be enough time for the subprocess to fail if it's going to do so). After the Python program resumes, it polls the subprocess to determine whether it has failed.
I imagine that a better solution might use threading and Popen.wait(), but I'm a relative beginner to Python.
Edit:
The subprocess is a Java daemon that I'm launching. If another instance of the daemon is already running on the system, the Java subprocess will exit with a return code of 1, and I want to intercept the messy Java exception stack trace and present an understandable error message to the user.
Two approaches:
Call Popen.wait() on a thread as you suggested yourself, then call an error handler function if the exit code is non-zero. Make sure the error handler is thread-safe, preferably by dispatching the error message to the main thread if your application has an event loop. A sketch of this approach follows below.
Rewrite your application to use an event loop that already supports monitoring child processes, such as pyev. If you just want to monitor one subprocess, this is probably overkill.
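A minimal sketch of the first approach, assuming a hypothetical on_daemon_failed() callback that shows the friendly message (how you dispatch it to your UI thread depends on your toolkit), and a placeholder command line:

import subprocess
import threading

def on_daemon_failed(returncode):
    # Hypothetical callback: show a friendly message instead of the Java stack trace.
    print("The daemon could not start (exit code %d); is another instance already running?" % returncode)

def watch(proc):
    if proc.wait() != 0:      # blocks only this watcher thread, not the main program
        on_daemon_failed(proc.returncode)

proc = subprocess.Popen(["java", "-jar", "daemon.jar"])   # placeholder command line
threading.Thread(target=watch, args=(proc,), daemon=True).start()

# The main program keeps running here while the watcher waits in the background.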
I used the spawn function from the following post:
Indefinite daemonized process spawning in Python
I'm writing a CGI script that takes in inputs, manipulates them, and then outputs a success page. One of the manipulation functions calls an executable that takes a little while to finish. As a result, when a user submits a request, the browser simply hangs on the HTML page until the work completes.
In my main() function, I do the following:
def main():
    <call a bunch of little functions here>
    print <All the success information here>
    <spawn the daemon process here>
The issue is that with that ordering, it prints the success information 3 times, probably because of the forking (but the executable is running in the background as it should).
If I put the daemon spawn before the HTML printing, it hangs as it used to, defeating the purpose of spawning the background process.
Does anyone have any ideas?
Also, a quick theory question about forking: when fork is called, does it re-run the entire function it was called from? So if I spawn the daemon process, will the forked processes spawn the daemon again?
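For what it's worth, fork() does not re-run the function; both the parent and the child simply continue from the fork() call. The repeated success page is usually output still sitting in the stdout buffer at fork time, which each process then flushes once, so flushing before the fork (and exiting the child with os._exit) tends to avoid the duplicates. A rough sketch of that behaviour, not your actual script:

import os
import sys

print("Content-type: text/html\n")            # CGI header plus blank line
print("<html><body>Success!</body></html>")   # the success page
sys.stdout.flush()   # flush BEFORE forking, or the buffered page can be printed again

pid = os.fork()
if pid == 0:
    # Child: execution continues here, right after fork(); main() is NOT re-run.
    # This is where the slow executable would be launched.
    os._exit(0)      # exit without flushing the inherited stdio buffers again
else:
    # Parent: also continues from here and can finish the CGI response immediately.
    pass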