Looking at the task manager - python

I'm writing a program that is supposed to synchronize several different parts,
including hardware. This is done by using a Python script that communicates with
other programs.
I've found out that something I need for synchronization is for the main script
to be able to tell if another particular program is running, or if it stops.
I imagine it would look something like:
# checking if a program runs
if is_running(program):
    statements
# Waiting for a program to stop
while is_running(program):
    pass
Does anyone know? I'm using Python 2.7 on Windows 7.

This question is pretty similar to your situation, and suggests using WMI, which runs on Python 2.4 to 3.2 and Windows 7, or using Windows' built-in wmic command to get the list of processes.
If you care about making the code cross platform, you could also use psutil, which works on "Linux, Windows, OSX, FreeBSD and Sun Solaris, both 32-bit and 64-bit architectures, with Python versions from 2.4 to 3.4."
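For example, with psutil installed (and assuming its 2.x API, where name() is a method), an is_running() helper might look roughly like this; "notepad.exe" is just a placeholder process name:

import time
import psutil  # third-party: pip install psutil

def is_running(program):
    # Return True if any running process has the given executable name.
    for proc in psutil.process_iter():
        try:
            if proc.name() == program:
                return True
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass  # process vanished or is off-limits; skip it
    return False

# checking if a program runs
if is_running("notepad.exe"):
    print("notepad is running")

# waiting for a program to stop
while is_running("notepad.exe"):
    time.sleep(1)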

Related

Testing Windows' implementation of Python multiprocessing on Linux

CPython's multiprocessing package is implemented fairly differently on Windows and on Linux, as a Windows implementation cannot rely on fork(2). However, it seems to me that the Windows implementation of multiprocessing (spawning a separate process and sending it the required state by serializing it) should work on Linux (or am I wrong?).
While I work on Linux, I would like to make sure that the code I write also works on Windows (e.g., not accidentally have unpicklable arguments, etc.). Is there a way I can force CPython to use the Windows implementation of multiprocessing on Linux?
Thanks.
Hum, in fact this has just become possible very recently: http://bugs.python.org/issue8713.
Now I just have to run 3.4alpha2 :)
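That issue added multiprocessing.set_start_method(), so on Python 3.4 or later a sketch like this forces the Windows-style "spawn" behaviour on Linux:

# Python 3.4+: force the Windows-style "spawn" start method on Linux
import multiprocessing as mp

def work(x):
    return x * x

if __name__ == "__main__":
    mp.set_start_method("spawn")      # children are spawned, not forked
    with mp.Pool(2) as pool:          # arguments/results must be picklable
        print(pool.map(work, range(8)))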

When Python is Updated Will I Have to Update My Program?

I'm working on my first Python-based program. I would like it to be as maintenance-free as possible in the future, and I was wondering whether that could be a problem as Python is updated. I'm using 2.7.2 currently, but when Python 3 becomes standard, what could happen to my program? Will it likely stop working on a system with Python 3 installed, and would it be impractical to have the user install an older version of Python? I assume 2.7.2 won't be maintained indefinitely, and I wouldn't expect newer versions of Python to run my program successfully. Sorry if this seems like a newb question; I'm used to working with compiled languages.
Not to stray too far off topic, but would it be better to use Lua in this case?
Will it stop working? That depends on how the program is being run. If the system has both versions installed and you ask to run against Python 2, then it'll continue to work. If you don't explicitly ask to run against a certain version, and that version isn't there, then it'll probably fail.
Lua offers you no solutions here - if you rely on a system-installed Lua and that Lua becomes incompatible in the future, you're stuck. Think of your scripting language as a dynamic library: if the user has the right version, they're ok, and if not, they aren't, just like with C/C++ apps.
If you're deploying to Unix-like platforms, I expect they will support Python 2 for at least another 5 years, maybe 10.
If you're deploying to a Windows platform, you usually package up the relevant version of Python as part of your app.
So the problem is unlikely to be as significant as you fear.
It will take a very, very long time for Python 2 to die.
Python 2 code will most likely require modifications to run under Python 3 (there's the 2to3 tool to help with migrating, though), but with all the libraries out there that only support Python 2, it will take years before Python 2 dies - so you don't really have to worry about it right now. Besides that, I believe that as long as enough people are using Python 2, a version will be kept up to date with fixes.
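If you want a clear error instead of a confusing failure when the script is started with the wrong interpreter, one common pattern (just a sketch) is a version guard at the top of the entry script:

import sys

# refuse to run under a Python major version the program was not written for
if sys.version_info[0] != 2:
    sys.stderr.write("This program requires Python 2.x\n")
    sys.exit(1)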

Python: could a script compiled with py2exe freeze the operating system?

I am using py2exe to compile Python scripts into executable files on Windows XP/7/2000.
I am wondering whether such executables could freeze the operating system, so that I have to reboot Windows.
I suppose such problems could occur if I try to manage driver libraries.
What do you think?
Theoretically, yes. Windows is not the most stable OS out there, and programs sometimes "freeze" it even without mucking with drivers and kernel-mode code. Python programs aren't any different in this respect, whether packaged with py2exe or not, since Python programs on Windows easily have access to the same Windows APIs any other program can access.
However, I have a feeling you're not "just asking": if you have a specific application that freezes the system, that should be addressed for the specific case at hand. Unless the application does something really crazy, it's probably a bug in it that can be solved.
A Python program - regardless of whether it is interpreted by the Python executable or packaged in py2exe form - can do the same as any other program. That means it should not be able to freeze a modern operating system unless it is run with superuser rights. However, programs (especially malicious and badly written ones) can significantly degrade the user experience, for example by going fullscreen and refusing to show the desktop, or by starting lots of threads and processes.

Problems with process created using 32bit python on Debian 5.0 IA-64

So, the problem is with creating processes - all of them immediately return with exit code -9.
I tried using the subprocess.Popen module, os.popen2, os.system, and some others. Nothing helps.
I'm using Python 2.5.
I have a Python binary executable built on a 32-bit platform with 32-bit Python using PyInstaller. This is very important, and I know that the best way to make it work is to create an IA-64 executable. But currently I'm looking for a more general solution. One probably exists, because it worked on several other IA-64 machines.
Has anybody faced a similar problem?
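For what it's worth, on POSIX a negative return code from subprocess means the child was killed by that signal, so -9 is SIGKILL. A minimal check (just a sketch; /bin/true is a placeholder command) looks like this:

import signal
import subprocess

proc = subprocess.Popen(["/bin/true"])   # /bin/true is just a placeholder command
proc.wait()
if proc.returncode < 0:
    # On POSIX, a negative return code means the child was killed by that signal.
    print("child killed by signal %d (SIGKILL is %d)" % (-proc.returncode, signal.SIGKILL))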

Running Python code in different processors

In order to do quality assurance on a critical multicore (8-core) workstation, I want to run the same code on different processors, but not in parallel or concurrently.
I need to run it 8 times, one run per processor.
What I don't know is how to select the processor I want.
How can this be accomplished in Python?
In Linux with schedutils, I believe you'd use taskset -c X python foo.py to run that specific Python process on CPU X (exactly how you identify your CPUs may vary, but I believe numbers such as 1, 2, 3, ... should work anywhere). I'm sure Windows, BSD versions, etc, have similar commands to support direct processor assignment, but I don't know them.
Which process goes on which core is normally decided by your OS. On Linux, there is taskset from the schedutils package to explicitly run a program on a specific processor.
Python 2.6 has a multiprocessing module that takes Python functions and runs them in separate processes, probably moving each new process to a different core.
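Putting the two together, a sketch of driving the 8 pinned runs from Python (assuming a Linux machine with taskset installed; foo.py stands in for the real script):

import subprocess

# Run foo.py once per core, pinned to that core, one run at a time (not in parallel).
for cpu in range(8):
    subprocess.check_call(["taskset", "-c", str(cpu), "python", "foo.py"])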
