I'm writing a small application that uses an "index file" to open folders in Explorer with just a few button presses. I would like to update that index file in a background process every time the application shuts down. Updating the index file means scanning through our network, and for some remote users it can take a few minutes. That's why I would like to hide the console during the scanning process, to keep the user from aborting it.
I tried several things similar to:
#these are just dummy lines
import subprocess
import multiprocessing

path = get_user_input()
subprocess.Popen(r'explorer "%s"' % path)

#Here I start my update process
multiprocessing.Process(target=update_index).start()
#end of script; now I want that process to continue until finished while the main console closes

I only seem to get one or the other.
I also tried using:
DETACHED_PROCESS = 0x00000008
CREATE_NO_WINDOW = 0x08000000
subprocess.Popen(command, shell=True, stdin=None, stdout=None,
                 stderr=None,
                 creationflags=DETACHED_PROCESS | CREATE_NO_WINDOW)
and managed to get a separate console window, but still no way of preventing the user from closing down the process.
Also keep in mind that I would like to distribute this script with something like py2exe later, to make it accessible for those without Python, so I guess using pythonw.exe is out of the question. Or is it?
That's not really the answer you're looking for, but you could redesign your system architecture: write your index updater as a server process that communicates with your actual application over sockets. Then you just have the index updater server process run continuously (maybe even on another machine) and do all the time-consuming work.
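A minimal sketch of that idea, assuming a simple line-based protocol; update_index() is a placeholder for the long-running network scan, and the port is arbitrary:

import socketserver

class IndexRequestHandler(socketserver.StreamRequestHandler):
    def handle(self):
        command = self.rfile.readline().strip()
        if command == b'update':
            update_index()  # hypothetical: the long-running network scan
            self.wfile.write(b'done\n')

if __name__ == '__main__':
    # The application connects to port 9000 and sends 'update';
    # the console that launched the app can close without killing this server.
    server = socketserver.TCPServer(('localhost', 9000), IndexRequestHandler)
    server.serve_forever()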
If you just want to perform background tasks at certain intervals, use cron. If you want to run a command in the background and keep it running even if you log out of the console, use nohup.
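For example (update_index.py and its path are placeholder names, not from the question):

# crontab entry: run the updater every hour on the hour
0 * * * * python /home/user/update_index.py
# one-off run that survives logout
nohup python /home/user/update_index.py &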
OS: Windows 10
Python: 3.5.2
I am trying to open calc.exe, do some actions and then close it.
Here is my code sample
import subprocess, time
p = subprocess.Popen('calc.exe')
#Some actions
time.sleep(2)
p.kill()
So this is not working for calc.exe: it just opens the calculator, but does not close it. The same code works fine for notepad.exe.
I am guessing that there is a bug in the subprocess lib's kill method: the notepad.exe process name in Task Manager is notepad.exe, but the calc.exe process name is Calculator.exe, so perhaps it is trying to kill by name and does not find it.
There's no bug in subprocess's kill. If you're really worried about that, just check the source, which is linked from the docs. The kill method just calls send_signal, which just calls os.kill unless the process is already done, and you can see the Windows implementation for that function. In short: subprocess.Popen.kill doesn't care what name the process has in the kernel's process table (or the Task Manager); it remembers the PID (process ID) of the process it started, and kills it that way.
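You can see this for yourself with a process that doesn't play launcher games (a minimal sketch; notepad.exe is used here only because it is a plain, single-process app):

import subprocess, time

p = subprocess.Popen('notepad.exe')
print(p.pid)   # the PID recorded at launch; kill() targets exactly this process
time.sleep(2)
p.kill()       # terminates that PID, regardless of the executable's name
p.wait()
print(p.returncode)  # non-None: the process really was killed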
The most likely problem is that, like many Windows apps, calc.exe has some special "single instance" code: when you launch it, if there's already a copy of calc.exe running in your session, it just tells that copy to come to the foreground (and open a window, if it doesn't have one), and then exits. So, by the time you try to kill it 2 seconds later, the process has already exited.
And if the actual running process is calculator.exe, that means calc.exe is just a launcher for the real program, so it always tells calculator.exe to come to the foreground, launching it if necessary, and then exits.
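You can verify that the launcher exits almost immediately (a sketch; the exact return code may vary by Windows version):

import subprocess, time

p = subprocess.Popen('calc.exe')
time.sleep(2)
print(p.poll())  # likely an exit code, not None: the launcher has already exited,
                 # so a later kill() has nothing left to kill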
So, how can you kill the new calculator you started? Well, you can't, because you didn't start a new one. You can kill all calc.exe and/or calculator.exe processes (the easiest way to do this is with a third-party library like psutil—see the examples on filtering and then kill the process once you've found it), but that will kill any existing calculator process you had open before running your program, not just the new one you started. Since calc.exe makes it impossible to tell if you've started a new process or not, there's really no way around that.
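Roughly like this with psutil (a sketch; it kills every matching process, including any calculator the user already had open):

import psutil

for proc in psutil.process_iter(['name']):
    name = (proc.info['name'] or '').lower()
    if name in ('calc.exe', 'calculator.exe'):
        try:
            proc.kill()  # kills by PID under the hood, same as subprocess
        except psutil.NoSuchProcess:
            pass  # it exited between enumeration and kill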
This is one way to kill it, but it will close every open calculator.
It calls a command prompt with no window and gives it the command to kill the Calculator.exe process.
import subprocess, time
p = subprocess.Popen('calc.exe')
print(p)
#Some actions
time.sleep(2)
CREATE_NO_WINDOW = 0x08000000
subprocess.call('taskkill /F /IM Calculator.exe', creationflags=CREATE_NO_WINDOW)
I'm trying to build a todo manager in Python where I want to continuously run a process in the background that will alert the user with a popup when the specified time comes. I'm wondering how I can achieve that.
I've looked at some of the answers on StackOverflow and on other sites but none of them really helped.
So, what I want to achieve is to start a background process once the user enters a task and keep it running until the task's time comes. At the same time there might be other threads running for other tasks as well, each ending at its own end time.
So far, I've tried this:
from threading import Thread

t = Thread(target=bg_runner, kwargs={'task': task, 'lock_file': lock_file_path})
t.setName("Get Done " + task)
t.start()
t.join()
With this the thread runs continuously, but it runs in the foreground, and the script only exits when the execution is done.
If I add t.daemon = True in the above code, the main thread immediately exits after start() and it looks like the daemon is also getting killed then.
Please let me know how this can be solved.
I'm guessing that you just don't want to see the terminal window after you launch the script. In this case, it is a matter of how you execute the script.
Try these things.
If you are using a windows computer you can try using pythonw.exe:
pythonw.exe example_script.py
If you are using Linux (or maybe OS X) you may want to use nohup in the terminal:
nohup python example_script.py
More or less, the reason you have to do this comes down to how the operating system handles processes. I am not an expert on this subject matter, but generally, if you launch a script from a terminal, that script becomes a child process of the terminal. So if you exit that terminal, it will also terminate any child processes. The only way to get around that is to detach the process from the terminal with something like nohup.
Now if you end up adding the #!/usr/bin/env python shebang line, your OS could possibly just run the script without a terminal window when you double-click the script. YMMV (again, this depends on how your OS works).
The first thing you need to do is prevent your script from exiting by adding a while loop in the main thread:
import time
from threading import Thread

t = Thread(target=bg_runner, kwargs={'task': task, 'lock_file': lock_file_path})
t.setName("Get Done " + task)
t.start()
# Note: no t.join() here; join would block until the thread finishes,
# which defeats the purpose of the keep-alive loop below.
while True:
    time.sleep(1.0)
Then you need to put it in the background:
$ nohup python alert_popup.py >> /dev/null 2>&1 &
You can get more information on controlling a background process at this answer.
Is there a way to restart another script in another shell?
I have a script that sometimes gets stuck waiting to read email from Gmail over IMAP. From another script I would like to restart the main one, but without stopping the execution of the second.
i have tried:
os.system(r"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py")
process = subprocess.Popen(["python", r"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py"])
but both run the main script in the second one's terminal.
EDIT:
sorry, by shell I mean terminal window
After your last comment, and as the syntax shows that you are using Windows, I assume that you want to launch a Python script in another console. The magic word here is START if you want the launching script to run in parallel with the new one, or START /W if you want to wait for the end of the subprocess.
In your case, you could use:
subprocess.call(["cmd.exe", "/c", "START", r"C:\Path\To\PYTHON.EXE",
                 r"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py"])
Subprocess has an option called shell, which is what you want. os.system calls are blocking, which means that only after the command has completed will the interpreter move to the next line. Subprocess Popens, on the other hand, are non-blocking. However, both of these commands spawn child processes of the process running this code. If you want to run in a shell and get access to shell features (such as START) to execute this, try shell=True in subprocess.
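A sketch of that idea: with shell=True on Windows the command string is handed to cmd.exe, so the START built-in is available (this assumes python is on the PATH):

import subprocess

# the empty "" after START is the window-title argument
subprocess.Popen(r'START "" python C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py',
                 shell=True)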
I could try to explain everything you need, but I think this video will do it better: YouTube video about multithreading
This will allow you to run two things at once, for example:
have one thread checking email and another handling inputs, so it won't block at those moments, making multiple parallel 'shells' possible.
If you really want to have a different window for this, I am sorry and I cannot help.
Hope this was what you were looking for.
First of all, a short overview of my current goal:
I want to use a scheduler to execute a simple Python program every second. This program reads some data and enters the results into a database. Because the scheduled task will operate over several days on a Raspberry Pi, the process should start in the background. Therefore I want to create a Python file which can start, stop and get the current status of the background job. Furthermore it should be possible to exit and re-enter the control file without stopping the background job.
Currently I tried apscheduler to execute the Python file every second. The actual problem is that I can't access the currently running Python file to control its status from another, external file. Overall I found no real solution for how to control a subprocess from an external file and how to find the same subprocess again after restarting the controlling Python file.
EDIT:
So overall, as far as I've got it now, I'm able to find the current process by its PID. With that I'm able to send a terminate signal to the process. Inside my scheduled file I'm able to catch these signals and shut down the program in a clean way.
To control (start, restart, stop, schedule) the background process, use subprocess. Here is an example of subprocess's Popen with a timeout.
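A minimal sketch of that pattern (job.py is a placeholder for your scheduled file):

import subprocess

p = subprocess.Popen(['python', 'job.py'])  # start the background job
try:
    p.wait(timeout=5)                       # give it five seconds to finish
except subprocess.TimeoutExpired:
    p.terminate()                           # still running: ask it to stop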
To pass data between the scheduler and the background job, use one of the IPC mechanisms, for example sockets.
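For the PID-plus-signal approach described in the edit, a sketch could look like this (the PID file path and function names are assumptions, not from the question):

import os
import signal
import sys

PID_FILE = '/tmp/scheduled_job.pid'  # hypothetical location

# --- inside the scheduled job: record the PID and install a clean shutdown ---
def install_handler():
    with open(PID_FILE, 'w') as f:
        f.write(str(os.getpid()))

    def shutdown(signum, frame):
        # close database connections etc., then exit normally
        sys.exit(0)

    signal.signal(signal.SIGTERM, shutdown)

# --- inside the control file: find the job again and stop it ---
def stop_job():
    with open(PID_FILE) as f:
        pid = int(f.read())
    os.kill(pid, signal.SIGTERM)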
I have reports that I am sending to a system that requires the reports to be in a readable PDF format. I tried all of the free libraries and applications, and the only one that I found worked was Adobe's Acrobat family.
I wrote a quick script in Python that uses the win32api to print a PDF to my printer with the default registered application (Acrobat Reader 9), and then kills the task upon completion, since Acrobat likes to leave the window open when called from the command line.
I compiled it into an executable and pass in the values through the command line (for example printer.exe %OUTFILE% %PRINTER%); this is then called from within a batch file.
import os, sys, win32api, win32print, time
# Command Line Arguments.
pdf = sys.argv[1]
tempprinter = sys.argv[2]
# Get Current Default Printer.
currentprinter = win32print.GetDefaultPrinter()
# Set Default printer to printer passed through command line.
win32print.SetDefaultPrinter(tempprinter)
# Print PDF using default application, AcroRd32.exe
win32api.ShellExecute(0, "print", pdf, None, ".", 0)
# Reset Default Printer to saved value
win32print.SetDefaultPrinter(currentprinter)
# Timer for application close
time.sleep(2)
# Kill application and exit script
os.system("taskkill /im AcroRd32.exe /f")
This seemed to work well for a large volume, ~2000 reports in a 3-4 hour period, but I have some that drop off, and I'm not sure if the script is getting overwhelmed or if I should look into multithreading or something else.
The fact that it handles such a large volume mostly without problems leads me to believe that the issue is not with the script, but I'm not sure if it's an issue with the host system, Adobe Reader, or something else.
Any suggestions or opinions would be greatly appreciated.
Based on your feedback (win32api.ShellExecute() is probably not synchronous), your problem is the timeout: If your computer or the print queue is busy, the kill command can arrive too early.
If your script runs concurrently (i.e. you print all documents at once instead of one after the other), the kill command could even kill the wrong process (i.e. an acrobat process started by another invocation of the script).
So what you need is better synchronization. There are a couple of things you can try:
Convert this into a server script which starts Acrobat once, then sends many print commands to the same process and terminates afterwards.
Use a global lock to make sure that only a single instance of the script is ever running. I suggest creating a folder somewhere; this is an atomic operation on every file system. If the folder exists, the script is active somewhere; see the sketch below.
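A minimal sketch of the folder lock (the lock path is an assumption):

import os

LOCK_DIR = r'C:\Temp\pdf_printer.lock'  # hypothetical lock location

try:
    os.mkdir(LOCK_DIR)  # atomic: exactly one process can succeed
except FileExistsError:
    raise SystemExit('Another instance is already printing.')

try:
    pass  # ... do the printing work here ...
finally:
    os.rmdir(LOCK_DIR)  # release the lock even on errors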
On top of that, you need to know when the job is finished. Use win32print.EnumJobs() for this.
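Roughly like this (a sketch; it polls until the queue for the given printer is empty, which assumes no other jobs are being sent to that printer at the same time):

import time
import win32print

def wait_for_queue(printer_name):
    handle = win32print.OpenPrinter(printer_name)
    try:
        while True:
            jobs = win32print.EnumJobs(handle, 0, -1, 1)
            if not jobs:
                return  # queue drained; safe to kill AcroRd32.exe now
            time.sleep(1)
    finally:
        win32print.ClosePrinter(handle)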
If that fails, another solution could be to install a Linux server somewhere. You can run a Python server on this box which accepts print jobs that you send with the help of a small Python script on your client machine. The server can then print the PDFs for you in the background.
This approach allows you to add any kind of monitoring you like (sending mails if something fails, or sending a status mail after all jobs have finished).