I'm trying to kill a secondary task of a process using PowerShell, batch, Python... anything I can save as a script and run remotely. Task Manager screenshot as follows:
I'd like to kill the one with the longer title, leaving "SAP Logon 740" open. Every task in the tree has the same PID, so I can't just kill the process.
I guess this is possible, because I can do it manually by going to Task Manager, expanding the process and ending that specific task, but everything I've found consists of killing the whole process, which isn't possible in my case.
So far I've tried tasklist/taskkill and PowerShell (Get-Process, Get-WmiObject Win32_Process...), but I haven't been able to find out how.
Here is the output of tasklist (Status=Running):
Only one of the tasks (the one in front) shows up there.
As you have used the powershell tag, and even ran your tasklist command via powershell.exe, I have decided to provide examples using it.
If your criterion is to stop the process named saplogon with the longest window title string:
GPs saplogon | Sort {$_.MainWindowTitle.Length} | Select -Last 1 | SpPs -Wh
If your criterion is to stop all processes named saplogon except the one with the shortest window title string:
GPs saplogon | Sort {$_.MainWindowTitle.Length} | Select -Skip 1 | SpPs -Wh
If you're happy with the output, you can remove -Wh (-WhatIf) to actually perform the task. If needed, you can replace it with -F (-Force).
I am creating a subprocess using this line of code:
p = subprocess.Popen(["doesItemExist.exe", id], shell=False)
and when I run the script while I have the Task Manager open, I can see that it creates two processes and not one. The issue is that when I go to kill it, it kills one (using p.kill()), but not the other. I've tried looking online but the only examples I find are about shell=True and their solutions don't work for me. I've confirmed that that line only gets called once.
What can I do? Popen is only giving me back the one pid so I don't understand how to get the other so I can kill both.
I ended up being able to deal with this issue by creating a clean-up function which just runs the following:
subprocess.run(["taskkill", "/IM", "doesItemExist.exe", "/F"], shell=True)
This will kill any leftover tasks. If anyone uses this, be careful that your exe has a unique name to prevent you from killing anything you don't mean to. If you want to hide the output/errors, just set the stdout and stderr to subprocess.PIPE.
Also, if there is no process to kill it will report that as an error.
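A minimal sketch of that clean-up helper (the function names are made up for illustration). One small variation on the answer: passing subprocess.DEVNULL instead of subprocess.PIPE discards the output without you having to read it, which also silences the "process not found" error mentioned above:

```python
import subprocess

def taskkill_cmd(image_name):
    # /IM selects processes by image name, /F force-terminates them
    return ["taskkill", "/IM", image_name, "/F"]

def kill_leftover(image_name="doesItemExist.exe"):
    # DEVNULL hides both the normal output and the "not found" error;
    # a return code of 0 means at least one process was terminated
    result = subprocess.run(taskkill_cmd(image_name),
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.DEVNULL)
    return result.returncode == 0
```

Calling `kill_leftover()` in your clean-up path then quietly sweeps up any leftover processes.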
I'm writing a small application that uses an "index file" to open folders in Explorer with just a few button presses. I would like to update that index file in a background process every time the application shuts down. Updating the index file means scanning through our network, and for some remote users it can take a few minutes. That's why I would like to hide the console during the scanning process, to avoid the process being aborted by the user.
I tried several things similar to:
#these are just dummy lines
path = get_user_input()
subprocess.Popen(r'explorer "%s"' % path)
#Here I start my update process
multiprocessing.Process(target=update_index).start()
# end of script; now I want update_index to continue until finished while the main console closes
I only seem to get one or the other.
I also tried using:
DETACHED_PROCESS = 0x00000008
CREATE_NO_WINDOW = 0x08000000
subprocess.Popen(command, shell=True, stdin=None, stdout=None, stderr=None,
                 creationflags=DETACHED_PROCESS | CREATE_NO_WINDOW)
and managed to get a separate console window, but still no way of preventing the user from closing down the process.
Also keep in mind that I would like to distribute this script with something like py2exe later, to make it accessible to those without Python, so I guess using pythonw.exe is out of the question. Or is it?
That's not really the answer you're looking for, but you could redesign your system architecture: write your index updater as a server process that communicates with your actual application over sockets. The index updater server process then runs continuously (maybe even on another machine) and does all the time-consuming work.
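A minimal sketch of that client/server split, assuming a made-up one-shot protocol: the client sends b"update" and the server replies once the (simulated) index refresh is finished. The real version would loop over accept() and rebuild the index file where the comment is:

```python
import socket
import threading

def run_updater_server(srv):
    # Serve a single request on an already-bound, listening socket.
    conn, _ = srv.accept()
    try:
        if conn.recv(1024) == b"update":
            # ... rebuild the index file here ...
            conn.sendall(b"index updated")
    finally:
        conn.close()

def request_update(host, port):
    # Client side, called from the main application on shutdown.
    with socket.create_connection((host, port)) as c:
        c.sendall(b"update")
        return c.recv(1024)
```

Because the server is a separate long-lived process, closing the application's console no longer aborts the scan.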
If you just want to perform background tasks that happen at certain intervals, then use cron. If you want to run a command in the background and keep it running even if you logout of the console, use nohup.
Is there a way to restart another script in another shell?
I have a script that sometimes gets stuck waiting to read email from Gmail over IMAP. From another script I would like to restart the main one, but without stopping the execution of the second.
I have tried:
os.system(r"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py")
process = subprocess.Popen(["python", r"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py"])
but both run the main script in the second one's shell.
EDIT:
Sorry, by shell I mean terminal window.
After your last comment, and as the syntax shows that you are using Windows, I assume that you want to launch a Python script in another console. The magic word here is START if you want the launching script to continue in parallel with the new one, or START /W if you want to wait for the end of the subprocess.
In your case, you could use:
subprocess.call(["cmd.exe", "/c", "START", r"C:\Path\To\PYTHON.EXE",
                 r"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py"])
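One quirk worth knowing: START treats its first quoted argument as the new window's title, so it is common to pass an empty "" placeholder first. A sketch of a helper that builds the command (the helper name and the "python" default are illustrative):

```python
import subprocess

def start_in_new_console(script_path, python_exe="python"):
    # The empty "" is START's window-title placeholder; without it,
    # a quoted path would be mistaken for the title.
    return ["cmd.exe", "/c", "START", "", python_exe, script_path]

# On Windows you would then run, e.g.:
# subprocess.call(start_in_new_console(
#     r"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py"))
```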
subprocess has an option called shell, which is what you want. os.system calls are blocking, which means the interpreter only moves to the next line after the command has completed. subprocess.Popen calls, on the other hand, are non-blocking. Both will spawn a child process of the process running this code. If you want to run the command in a shell and get access to shell features, try shell=True in subprocess.
I could try to explain everything you need, but I think this video will do it better: YouTube video about multithreading
This will allow you to run two things at once, for example:
have one thread checking email and the other handling inputs, so it won't stop at those moments; running them in parallel makes multiple "shells" possible.
If you really want a separate window for this, I'm sorry, I cannot help.
Hope this was what you were looking for.
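A minimal sketch of the threading idea (all names here are illustrative): the blocking mailbox read runs in its own thread and hands messages to the main loop through a queue, so the IMAP wait no longer freezes everything else.

```python
import queue
import threading

def poll_mailbox(out_q, messages):
    # Stand-in for the blocking IMAP read; the real thread would
    # block on the mail server instead of iterating over a list.
    for msg in messages:
        out_q.put(msg)

inbox = queue.Queue()
worker = threading.Thread(target=poll_mailbox,
                          args=(inbox, ["new mail"]),
                          daemon=True)
worker.start()
worker.join()
# The main loop can now consume inbox.get() without blocking on IMAP.
```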
Inside Task Manager [Windows 8+], the "Processes" tab lists all the processes currently running. If we open 2 windows of MS Word, it only appears once; however, this is actually a group that can be expanded to see and end both tasks separately.
This is great; however, it DOES NOT carry over to the "Details" tab, where WINWORD.EXE is listed only once, and thus with only 1 PID! Sharing a PID is an issue because an attempt to close it results in the entire thing being closed.
I want to kill only a specific Word window, not ALL Word windows, which is what has been happening when I try to kill windows programmatically (currently I'm using taskkill through import os in Python; any other way to do it without additional modules would be fine as well).
Right now when I run taskkill.... "WordDoc.docx" it kills every open Word document, which is extremely annoying and can lose data. Is there a way to kill "processes" the way it is done in Task Manager?
Thank you
PS: I am not using /T, so that is not the issue.
When closing a single window of a process on the Processes tab, Task Manager does not kill the process the window belongs to, but just sends a WM_CLOSE message to that window. You will notice that the Word window is not "killed": you will still get a prompt to save any unsaved changes in your Word document.
You can do the same as Task Manager using the following code, which enumerates all top-level windows, and then sends WM_CLOSE if the window title matches a desired value:
import win32con
import win32gui

def enumHandler(hwnd, lParam):
    if win32gui.IsWindowVisible(hwnd):
        if 'My Word Document' in win32gui.GetWindowText(hwnd):
            win32gui.PostMessage(hwnd, win32con.WM_CLOSE, 0, 0)

win32gui.EnumWindows(enumHandler, None)
First of all, a short overview of my current goal:
I want to use a scheduler to execute a simple Python program every second. This program reads some data and enters the results into a database. Because the scheduled task will run for several days on a Raspberry Pi, the process should start in the background. Therefore I want to create a Python file which can start, stop, and get the current status of the background job. Furthermore, it should be possible to exit and re-enter the control file without stopping the background job.
Currently I have tried apscheduler to execute the Python file every second. The actual problem is that I can't access the currently running Python file from another, external file to control its status. Overall I found no real solution for how to control a subprocess from an external file, and how to find the same subprocess again after restarting the controlling Python file.
EDIT:
So overall, as far as I have got it now, I'm able to find the current process by its PID. With that I'm able to send a terminate signal to the process. Inside my scheduled file I'm able to catch these signals and shut the program down in a normal way.
To control (start, restart, stop, schedule) the background process, use subprocess. Here is an example of subprocess' Popen with a timeout.
To pass some data between the scheduler and the background job use one of IPC mechanisms, for example sockets.