How to find process ids of running applications using python3 [closed] - python

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
Let's suppose 3 applications are open on Windows 7.
First, I want to print the process IDs of the running applications.
Second, I want to kill the selected application.
How can this be done using Python?
The purpose is to make an application that kills the selected process.

This will find all the Chrome PIDs and kill them; it is cross-platform:
import psutil

for p in psutil.process_iter():
    if p.name() == "chrome":  # name() is a method in current psutil
        print(p.pid)
        p.kill()
There are lots of examples here
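If you cannot install psutil, killing a process by PID can also be done with the standard library alone. A minimal stdlib-only sketch, using a sleeping dummy child process to stand in for the application you want to terminate:

```python
import os
import signal
import subprocess
import sys

# Spawn a dummy child process to act as the "application", print its PID,
# then kill it by PID.
child = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(60)"])
print("child pid:", child.pid)

# SIGTERM works on POSIX; on Windows, os.kill with a non-signal value
# falls back to TerminateProcess, so the child is killed either way.
os.kill(child.pid, signal.SIGTERM)
child.wait()
print("child terminated, return code:", child.returncode)
```

Note that this only covers killing by a known PID; enumerating all running applications by name still needs something like psutil.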

Related

How do I prevent users's shutdown in WINDOWS? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 days ago.
I want to design a program that can stop the user from shutting down.
At first I tried to create an unsaved text document, but the user still had the option to force a shutdown. I also know that the shutdown -a command can abort a shutdown, so I wrote the following simple Python program:
from os import system

while True:
    system('shutdown -a')
But when it is running, the system can still shut down successfully. Why is this? I hope this shutdown restriction can be turned on or off at any time. It would be nice if you could do this in Python, thank you!

Python thread management modules [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
What is the best module in Python for multithread management?
1. Create N threads
2. Assign work to a thread if it is free
3. Get the status of a thread
4. Kill a thread
You could use the threading module.
https://docs.python.org/3/library/threading.html#module-threading
You could also use concurrent.futures
https://docs.python.org/3/library/concurrent.futures.html
Keep in mind that Python doesn't support truly parallel threads because of the global interpreter lock (GIL). As a result, threading is typically used for I/O-bound tasks.
https://wiki.python.org/moin/GlobalInterpreterLock
If you could be a little more specific about what you are trying to accomplish in points 1-4, I could give an applicable example.
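Points 1-3 can be sketched with concurrent.futures (the work function here is a made-up placeholder):

```python
from concurrent.futures import ThreadPoolExecutor

def work(n):                       # placeholder task
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:          # 1. create N threads
    futures = [pool.submit(work, n) for n in range(8)]   # 2. assign work
    results = [f.result() for f in futures]              # block until all finish
    print([f.done() for f in futures])                   # 3. check status
print(results)
```

Point 4 has no safe equivalent: Future.cancel() only works before a task starts running, and Python threads cannot be force-killed from outside.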

Run command in background with Fabric [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I want to run a background script with Fabric using Bash's '&' operator. Is there a reason the following doesn't work? I executed this command on the server itself and it runs fine.
@task
def run_script():
    sudo('sh /home/ubuntu/wlmngcntl.sh start &', user='myuser')
I don't want to use something heavy like Celery to do this simple thing. I don't need to capture the output at all, all I want is for the task to execute this and return after.
This isn't a Fabric thing, but a Linux thing. When you close a session, the processes connected to that session are terminated.
This question has a lot of info... https://askubuntu.com/questions/8653/how-to-keep-processes-running-after-ending-ssh-session
You could use the following (from that answer):
sudo('nohup sh /home/ubuntu/wlmngcntl.sh start &', user='myuser')

Does anybody know how to run parallel a Python application? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I would like to use Python for my project but I need to distribute the computation on a set of resources.
You can try pyCOMPSs, which is the Python version of COMP Superscalar.
More info here
With this programming model you define which methods are candidates for remote execution by declaring the direction of the method parameters (IN, OUT, or INOUT). The main code is then written in a sequential fashion, and the runtime analyses dependencies between these methods to detect which can be executed in parallel. The runtime also spawns the execution transparently on the remote hosts and handles all data transfers.

Python workers in Rails [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I'm building a Rails application. I have workers from another project; they are called IronWorker and are written in Python. Is it possible to use these workers in a Rails application?
As one possible solution, I'm going to use another worker, Resque, but can I execute Python scripts from it?
Thanks for the help; I have no idea where I should start.
As I understand from the IronWorker docs, Python workers can be run from the command line:
exec 'hello_worker.py'
This post explains how you can do it:
Calling shell commands from Ruby.
For example, you can call Python workers in your Resque worker:
class ImageConversionJob
  def work
    system("exec 'hello_worker.py'")
  end
end
