I would like to give the user the ability to have a function executed every 30 minutes, and also to stop it whenever they want. (The user interacts with my application through a web frontend.)
How would one do so with Python?
What I thought of
One possibility I thought of is subprocess + infinite loop with time.sleep:
The Python function gets its own script whatever.py, which takes a command-line parameter stop_filename.
As soon as the user wants to start this "cron" job, subprocess spawns a new process of whatever.py with stop_filename = "kill_job/{}".format(uuid.uuid4()).
When the user wants to stop the process, the file stop_filename is created. The process checks whether this file exists before the function is executed, and terminates if it does.
I store the generated stop_filename in the database for each process, so that the user only needs to know which "cron job" to kill.
Although this will work, there are a couple of things I don't like about it:
Killing the process might take up to 30 minutes, since the stop file is only checked once per cycle.
The process could already be dead, and I don't know how to check for that.
It seems too complicated.
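For reference, the loop in whatever.py could be sketched like this (a sketch only; the helper names are illustrative). Sleeping in one-second slices, an adjustment to the plan above, would at least address the first concern, since the stop file is then noticed within about a second rather than after a full 30-minute sleep:

```python
# Sketch of whatever.py as described above (names are illustrative).
import os
import sys
import time

def do_work():
    # placeholder for the real periodic function
    print("running the periodic job")

def main(stop_filename, interval=30 * 60):
    while not os.path.exists(stop_filename):
        do_work()
        # sleep in short slices so a stop request is noticed quickly,
        # instead of only once per 30-minute cycle
        slept = 0
        while slept < interval and not os.path.exists(stop_filename):
            time.sleep(1)
            slept += 1

if __name__ == "__main__":
    if len(sys.argv) > 1:
        main(sys.argv[1])
```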
Related
I'm trying to make a program (with a GUI) that shows some information.
I would like to update this information from outside events.
e.g.
I have another script that does its task once a day (once a day a NEW process runs and does its job). After this job completes, I would like to update the information in my GUI, which is always displayed on my monitor, without closing and reopening it.
Is there a way to reach the GUI process from outside and run functions inside it? (Or a better way, if you can help me.)
I don't even know where to start. I feel it's something to do with threading, but I don't know how to do it properly.
I have a program that produces a CSV file, and right at the end I use os.startfile(fileName), but because the program then finishes execution, the opened file closes as well. The same happens if I add a sleep afterwards: the file loads up, then once the sleep ends it closes again.
Any help would be appreciated.
From the documentation for os.startfile:
startfile() returns as soon as the associated application is launched. There is no option to wait for the application to close, and no way to retrieve the application’s exit status.
When using this function, there is no way to make your script wait for the program to complete, because you have no way of knowing when it is complete. Because the program is launched as a subprocess of your Python script, it will exit when the Python script exits.
Since you don't say in your question exactly what the desired behavior is, I'm going to guess that you want the Python script to block until the program finishes execution (as opposed to detaching the subprocess). There are multiple ways to do this.
Use the subprocess module
The subprocess module allows you to make a call that does not return until the subprocess completes. The exact call you make to launch the subprocess depends heavily on your specific situation, but this is a starting point (on Windows, start /wait makes the shell wait until the launched application closes):
subprocess.run('start /wait "" "{}"'.format(fileName), shell=True)
Use input to allow user to close script
You can have your script block until the user tells the python script that the external program has closed. This probably requires the least modification to your code, but I don't think it's a good solution, as it depends on user input.
os.startfile(fileName)
input('Press enter when external program has completed...')
I have a program that constantly runs: if it receives an input, it does a task and then goes right back to awaiting input. I'm attempting to add a feature that pings a gaming server every 5 minutes and notifies me if the results ever change. The problem is, if I attempt to implement this, the program halts at this function and won't go on to the part where I can input. I believe I need multithreading/multiprocessing, but I have no experience with that, and after almost 2 hours of researching and wrestling with it, I haven't been able to figure it out.
I have tried to adapt the recursive program I found here, but haven't been able to do so properly; I feel this is where I was closest. I believe I could run this as two separate scripts, but then I would have to pipe the data around and it would get messier. It would be best for the rest of the program to keep everything in one script.
def regular_ping(IP):
    last_status = None
    while True:
        present_status = ping_status(IP)  # ping_status(IP) is another function that returns the info I need
        if present_status != last_status:
            notify_output(present_status)  # notify_output(msg) is a function that notifies me of a change
            last_status = present_status
        time.sleep(300)
I would like this bit of code to run on its own, notifying me of a change (if there is one) every 5 minutes, while the rest of my program also runs and accepts inputs. Instead, the program stops at this function and won't run past it. Any help would be much appreciated, thanks!
You can use a thread or a process for this. But since this is not a CPU-bound operation, the overhead of a dedicated process is not worth it, so a thread is enough. You can implement it as follows:
import threading
thread = threading.Thread(target=regular_ping, args=(ip,))
thread.start()
# Rest of the program
thread.join()
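Note that with the while True loop above, join() would block forever. If the pinging thread should also stop cleanly on demand, one option (a sketch, with regular_ping adapted to take a stop event; ping_status and notify_output are stubbed here as stand-ins for the question's helpers) is a daemon thread plus threading.Event:

```python
import threading

# Placeholder stand-ins for the helpers from the question (assumptions):
def ping_status(ip):
    return "online"

def notify_output(msg):
    print("status changed:", msg)

def regular_ping(ip, stop_event, interval=300):
    last_status = None
    while not stop_event.is_set():
        present_status = ping_status(ip)
        if present_status != last_status:
            notify_output(present_status)
            last_status = present_status
        # Event.wait() sleeps like time.sleep(), but returns early once the
        # event is set, so shutdown does not wait out the full interval.
        stop_event.wait(interval)

stop_event = threading.Event()
thread = threading.Thread(target=regular_ping,
                          args=("127.0.0.1", stop_event),
                          daemon=True)
thread.start()
# ... rest of the program runs and accepts input here ...
stop_event.set()   # request shutdown; wait() wakes up immediately
thread.join()
```

Making the thread a daemon also means it cannot keep the interpreter alive if the main program exits without calling set().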
I am trying to build a Node app which calls a Python script that takes a long time to run. The user chooses parameters and clicks Run, which triggers the handler in socket.on('python-event') and runs the Python script. I am using socket.io to send real-time data to the user about the status of the Python program, using the stdout stream I get from Python. The problem I am facing is that if the user clicks the Run button twice, the event handler is triggered twice and runs two instances of the Python script, which corrupts stdout. How can I ensure that only one event trigger happens at a time, and that if a new trigger arrives, it kills the previous instance (and its stdout stream) and then runs a new instance of the Python script with the updated parameters? I tried socket.once(), but it only allows the event to trigger once per connection.
I would use a job queue for this kind of task: store each job's info in a queue, so you can cancel it and query its status. You can use a Node module like kue.
First of all, a short overview of my current goal:
I want to use a scheduler to execute a simple Python program every second. This program reads some data and enters the results into a database. Because the scheduled task will run over several days on a Raspberry Pi, the process should run in the background. Therefore I want to create a Python file that can start, stop, and get the current status of the background job. Furthermore, it should be possible to exit and re-enter the control file without stopping the background job.
Currently I have tried apscheduler to execute the Python file every second. The actual problem is that I can't access the running Python process, to control its status, from another external file. Overall, I have found no real solution for how to control a subprocess from an external file, and how to find the same subprocess again after restarting the controlling Python file.
EDIT:
So overall, as far as I've got it now, I'm able to find the current process by its PID. With that I'm able to send a terminate signal to that process. Inside my scheduled file I'm able to catch the signal and shut down the program cleanly.
To control (start, restart, stop, schedule) the background process, use subprocess. Here is an example of subprocess.Popen with a timeout.
To pass data between the scheduler and the background job, use one of the IPC mechanisms, for example sockets.
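The PID-file approach from the edit above could be sketched like this (POSIX-only sketch; the file name job.pid and the function names are illustrative). The controller records the worker's PID on disk, so the control script can exit and a later invocation can still find and stop the same background job; the worker can install a SIGTERM handler via signal.signal to shut down cleanly:

```python
# Sketch of a PID-file based controller for the background job (POSIX).
import os
import signal
import subprocess

PID_FILE = "job.pid"  # illustrative: where the controller remembers the worker's PID

def start(cmd):
    """Launch the background job and record its PID on disk."""
    proc = subprocess.Popen(cmd)
    with open(PID_FILE, "w") as f:
        f.write(str(proc.pid))
    return proc.pid

def status():
    """Return the recorded PID if the job still exists, else None."""
    try:
        with open(PID_FILE) as f:
            pid = int(f.read())
    except (FileNotFoundError, ValueError):
        return None
    try:
        os.kill(pid, 0)  # signal 0 checks existence without sending anything (POSIX)
        return pid
    except OSError:
        return None

def stop():
    """Send SIGTERM; the worker can catch it and shut down cleanly."""
    pid = status()
    if pid is not None:
        os.kill(pid, signal.SIGTERM)
    if os.path.exists(PID_FILE):
        os.remove(PID_FILE)
```

Because the PID is persisted in job.pid rather than held in memory, re-running the control file finds the same background process again.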