terminating application through python script - python

How can I close (terminate) Windows applications using a Python script? When I switch on my PC, I find many applications running, like MSOSYNC.exe, ONENOTEM.exe, and others, which are not very useful. I want to close them. I tried the "subprocess" module and some others, but they are not working. Which method should I use?

You're already using the Popen class to construct a new object in your example. It has methods to deal with this; read the documentation:
import subprocess
proc = subprocess.Popen(['c:\\windows\\system32\\notepad.exe','C:\\file1.txt'])
proc.terminate()
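terminate() only works for processes your script started itself. For applications that were already running when your script starts (like the MSOSYNC.exe and ONENOTEM.exe processes from the question), one option on Windows is to call the built-in taskkill utility through subprocess. A rough sketch, assuming you know the image names of the processes you want closed:
import subprocess

# Processes from the question; adjust the list to whatever you want closed.
for name in ['MSOSYNC.exe', 'ONENOTEM.exe']:
    # /IM selects the process by image name, /F forces termination.
    subprocess.call(['taskkill', '/IM', name, '/F'])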

Related

how can I run python file from another file, then have the new file restart the first file?

So far I don't think this is actually possible, but basically what I am trying to do is have one python program call another and run it, like how you would use import.
But then I need to be able to go from the second file back to the beginning of the first.
Doing this with import doesn't work because the first program never closes and is still running, so running it again only returns to where it left off when it called the second file.
Without knowing a bit more about what you want to do, I would suggest looking into the threading or multiprocessing libraries. These should allow you to create multiple instances of a program or function.
This is vague and I'm not quite sure what you're trying to do, but you can also explore the Subprocess module for Python. It will allow you to spawn new processes similarly to if you were starting them from the command-line, and your processes will also be able to talk to the child processes via stdin and stdout.
If you don't want to import any modules:
exec("file.py")
Otherwise:
import os
os.system('python file.py')
Or:
import subprocess
subprocess.call(['python', 'file.py'])
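Building on the subprocess suggestion above, one way to let the second script hand control back to the beginning of the first is to have it relaunch the first script and then exit. A minimal sketch, with first.py and second.py as hypothetical file names:
# second.py (hypothetical name): do its own work, then start first.py
# again in a fresh interpreter so it begins from the top.
import subprocess
import sys

def do_work():
    print("second.py doing its work")

if __name__ == '__main__':
    do_work()
    subprocess.Popen([sys.executable, 'first.py'])  # first.py is a hypothetical name
first.py would do the same in reverse: launch second.py with subprocess and then exit, so only one of the two scripts is active at a time.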

Python: Use multiple python windows for the same program

Is there a way to have a Python program open and manage multiple terminal windows? Like, in my script, making two windows open, and when I write something in the first one, for example with
d = input()
it prints it in the second one? I'd like to avoid using sockets if possible, and avoid using python GUI libraries like Tkinter... But if there's no other way it's okay, just avoid it if possible.
Thanks.
Yes you can!!
1) You can save your input/output data in a file and read it from another Python script (in real time); see the sketch after this list.
2) You can use the multiprocessing module for handling multiple processes; read more at: https://pymotw.com/2/multiprocessing/basics.html
3) You can use the threading module for handling multiple threads; read more at: https://www.tutorialspoint.com/python/python_multithreading.htm
4) You can use the sys and subprocess modules to talk to another process over a pipe.
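A rough sketch of point 1 combined with a second console window, assuming Windows and a hypothetical helper script display.py that tails the shared file and prints any new lines it finds:
import subprocess
import sys

# Open display.py in its own console window (Windows only). With no handles
# redirected, everything display.py prints goes to that new window.
subprocess.Popen([sys.executable, 'display.py', 'shared.txt'],
                 creationflags=subprocess.CREATE_NEW_CONSOLE)

# In the first window, append whatever the user types to the shared file;
# display.py picks the lines up and shows them in the second window.
while True:
    d = input()
    if d == 'quit':
        break
    with open('shared.txt', 'a') as f:
        f.write(d + '\n')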

Can I control a powershell "session" from Python

I want to initialise a powershell subprocess and be able to send commands to it and then receive its output.
Slightly different from the usual subprocess.Popen(['powershell', '...']) use case because ideally I would like my powershell session to persist whilst my script is running.
This is because I might need to run 100+ commands (Get-ADUser, etc..) and the startup costs of loading the powershell console and loading the Active directory module is quite significant.
Roughly speaking I was imagining something like the following (though I'm aware this does not work):
class ActiveDirectory:
    def __init__(self):
        self.proc = subprocess.Popen(['powershell'], stdin.., stdout..)

    def run_command(self, command):
        output = self.proc.communicate(command)
        return output
I can't find an obvious way to do this using the subprocess module.
Does anybody have any idea how subprocess can be manipulated to achieve this? Or is there any other tool that provide a richer interface into sub processes? Possibly twisted.reactor?
Thanks,
O.
You can try combining these 2 posts; hope it helps:
Keep a subprocess alive and keep giving it commands? Python
Run PowerShell function from Python script
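Roughly, the approach from those two posts can be combined into a class like the one below. This is only a sketch, assuming powershell.exe is on PATH; the sentinel string marking the end of each command's output is an arbitrary choice, and depending on your PowerShell version you may need to tweak how commands are framed:
import subprocess

class PowerShellSession:
    def __init__(self):
        # '-Command -' makes PowerShell read commands from stdin, so the
        # process (and any modules it has loaded) stays alive between commands.
        self.proc = subprocess.Popen(
            ['powershell', '-NoLogo', '-NonInteractive', '-Command', '-'],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            universal_newlines=True,
        )

    def run_command(self, command, sentinel='__END_OF_COMMAND__'):
        # Send the command, then echo a sentinel so we know where its output ends.
        self.proc.stdin.write(command + '\n')
        self.proc.stdin.write("Write-Output '%s'\n" % sentinel)
        self.proc.stdin.flush()
        lines = []
        while True:
            line = self.proc.stdout.readline()
            if not line or line.strip() == sentinel:
                break
            lines.append(line)
        return ''.join(lines)

    def close(self):
        self.proc.stdin.close()
        self.proc.wait()
Usage would look something like ps = PowerShellSession(); ps.run_command('Import-Module ActiveDirectory'); users = ps.run_command('Get-ADUser -Filter *'); ps.close(), so the startup and module-loading cost is paid only once for the whole batch of commands.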

Feasibility of using pipe for ruby-python communication

Currently, I have two programs, one running in Ruby and the other in Python. I need to read a file in Ruby, but I first need a library written in Python to parse the file. Currently, I use XMLRPC to have the two programs communicate. Porting the Python library to Ruby is out of the question. However, I have read that using XMLRPC has some performance overhead. Recently, I read that another solution for the Ruby-Python conundrum is the use of pipes. So I tried to experiment with that. For example, I wrote this master script in Ruby:
(0..2).each do
  slave = IO.popen(['python','slave.py'], mode='r+')
  slave.write "master"
  slave.close_write
  line = slave.readline
  while line do
    sleep 1
    p eval line
    break if slave.eof
    line = slave.readline
  end
end
The following is the Python slave:
import sys
cmd = sys.stdin.read()
while cmd:
    x = cmd
    for i in range(0, 5):
        print "{'%i'=>'%s'}" % (i, x)
    sys.stdout.flush()
    cmd = sys.stdin.read()
Everything seems to work fine:
~$ ruby master.rb
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
My question is, is it really feasible to implement the use of pipes for working with objects between Ruby and Python? One consideration is that there may be multiple instances of master.rb running. Will concurrency be an issue? Can pipes handle extensive operations and objects to be passed in between? If so, would it be a better alternative for RPC?
Yes. No. If you implement it, yes. It depends on what your application needs.
Basically, if all you need is simple data passing, pipes are fine; if you need to be constantly calling functions on objects in your remote process, then you'll probably be better off using some form of existing RPC instead of reinventing the wheel. Whether that should be XMLRPC or something else is another matter.
Note that RPC will have to use some underlying IPC mechanism, which could well be pipes, but might also be sockets, message queues, shared memory, whatever.
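If you do stay with plain pipes, one refinement worth considering is to exchange a language-neutral format such as JSON instead of eval-ing Ruby hash literals. A sketch of the slave side under that assumption (the Ruby master would then parse each line with its JSON library rather than eval):
import json
import sys

cmd = sys.stdin.read()
while cmd:
    for i in range(5):
        # One JSON object per line is easy to parse safely on the Ruby side.
        sys.stdout.write(json.dumps({str(i): cmd}) + '\n')
    sys.stdout.flush()
    cmd = sys.stdin.read()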

Monitor Process in Python?

I think this is a pretty basic question, but here it is anyway.
I need to write a python script that checks to make sure a process, say notepad.exe, is running. If the process is running, do nothing. If it is not, start it. How would this be done?
I am using Python 2.6 on Windows XP
The process creation functions of the os module are apparently deprecated in Python 2.6 and later, with the subprocess module being the module of choice now, so...
import subprocess

if 'notepad.exe' not in subprocess.Popen('tasklist', stdout=subprocess.PIPE).communicate()[0]:
    subprocess.Popen('notepad.exe')
Note that in Python 3, the string being checked will need to be a bytes object, so it'd be
if b'notepad.exe' not in [blah]:
    subprocess.Popen('notepad.exe')
(The name of the file/process to start does not need to be a bytes object.)
There are a couple of options:
1: The cruder but more obvious one would be to do some text processing against:
os.popen('tasklist').read()
2: A more involved option would be to use pywin32 and research the win32 APIs to figure out what processes are running.
3: WMI (I found this just now); here is a vbscript example of how to query the machine for processes through WMI (a rough Python sketch using WMI follows below).
Python library for Linux process management
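For option 3, the same WMI query can be made directly from Python with the third-party wmi package. This is just a sketch and assumes the package is installed (e.g. pip install wmi):
import subprocess
import wmi

c = wmi.WMI()
# Win32_Process lists every running process; compare by image name.
running = [p.Name for p in c.Win32_Process()]
if 'notepad.exe' not in running:
    subprocess.Popen('notepad.exe')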
