calling a process from python - python

I am calling a Perl script from Python. The Perl script retrieves a large data set in batches from a webserver, which takes time, and it is executed in a loop. It does the job fairly well, but during the last iteration of the loop, while the script is still downloading, Python goes on and executes the rest of the code.
I want to know the best way to call another program from Python so that, while the Perl script is running, the Python process waits for it to finish, since the rest of the Python code processes the downloaded data. I have read about threading but am not sure how to implement it in my case.
The code is:
for expr in names_dict[keys]:
    subprocess.call(["./test.pl", expr, absFilePath])
Any help will be appreciated.
Many Thanks,

Use subprocess.Popen(). Check out this blog post:
http://trifoliummedium.blogspot.com/2010/12/running-command-line-with-python-and.html
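A minimal sketch of that pattern, reusing the names from the question's loop - Popen() starts the script and wait() blocks until it exits:

import subprocess

for expr in names_dict[keys]:
    p = subprocess.Popen(["./test.pl", expr, absFilePath])
    p.wait()  # block here until this batch has finished downloading

Note that subprocess.call() is already equivalent to Popen(...).wait(), so if Python really runs ahead, the Perl script itself may be backgrounding its download.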

Related

Control executed program with Python

I want to kill a test run executed via bash if the test takes too much time. So far, I have found some good solutions here, but since the kill command does not work properly for me (even when I use it correctly, it claims it is being used incorrectly), I decided to solve the problem with Python. This is the execution call I want to monitor:
EXE="C:/program.exe"
FILE="file.tpt"
HOME_DIR="C:/Home"
"$EXE" -vm-Xmx4096M --run build "$HOME_DIR/test/$FILE" "Auslieferung (ML) Execute"
(The opened *.exe starts a test run which includes some Simulink simulation runs - sometimes there are Simulink errors, in which case the tests take too long and I want to restart the entire process.)
First, I came up with the idea, calling a shell script containing these lines within a subprocess from python:
import subprocess
import time
process = subprocess.Popen('subprocess.sh', shell=True)
time.sleep(10)
process.terminate()
But when I use this, *.terminate() or *.kill() does not close the program I started with the subprocess call.
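With shell=True, terminate() and kill() typically reach only the shell, not the program it launched. A sketch of one workaround, assuming a POSIX system (on Windows, taskkill /F /T /PID <pid> is the rough equivalent): start the child in its own process group and signal the whole group.

import os
import signal
import subprocess
import time

# os.setsid puts the shell and everything it starts into a new process group
process = subprocess.Popen('subprocess.sh', shell=True, preexec_fn=os.setsid)
time.sleep(10)
os.killpg(os.getpgid(process.pid), signal.SIGTERM)  # signal the whole group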
That's why I am now trying to implement the entire call in Python. I have the following so far:
import subprocess
file = "somePath/file.tpt"
p = subprocess.Popen(["C:/program.exe", file])
Now I need to know how to implement the second part of the bash call, the argument "Auslieferung (ML) Execute", which starts an internal test run of that name. Any ideas? Or is it better to choose one of the other ways? Or can I get the bash "kill" option to work somewhere, somehow?
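As a sketch, not from the thread: each quoted token of the bash command becomes one element of the argument list, so "Auslieferung (ML) Execute" is simply appended as a final argument. On Python 3.5+, subprocess.run() can also enforce the timeout (the one-hour value below is an assumption):

import subprocess

exe = "C:/program.exe"
home_dir = "C:/Home"
tpt_file = "file.tpt"

# one list element per quoted token of the original bash command
cmd = [exe, "-vm-Xmx4096M", "--run", "build",
       "%s/test/%s" % (home_dir, tpt_file),
       "Auslieferung (ML) Execute"]

try:
    subprocess.run(cmd, timeout=3600)  # kills the child and raises on timeout
except subprocess.TimeoutExpired:
    # program.exe's own children may survive; "taskkill /F /T" kills the tree
    print("Test run took too long - restart it here")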

Simultaneous Python and C++ run with read and write files

So this one is a doozie, and a little too specific to find an answer online.
I am writing to a file in C++ and reading that file in Python at the same time to move a robot. Or trying to.
When I try running both programs at the same time, the C++ one runs first and then the Python one runs.
Here's the command I use:
./ColorFollow & python fileToHex.py
This happens even if I switch the order of commands.
Even if I run them in different terminals (which is the same thing, just covering all bases).
Both the Python and C++ code read / write in 'infinite' loops, so these two should run until I say stop.
The code works fine; when the Python script finally runs the robot moves as intended. It's just that the code doesn't run at the same time.
Is there a way to make this happen, or is this impossible?
If you need more information, lemme know, but the code is pretty much what you'd expect it to be.
If you are using Linux, & releases the bash session, and in this case ColorFollow and fileToHex.py will run in different bash sessions.
At the same time, the composition ./ColorFollow | python fileToHex.py looks interesting, because it redirects the stdout of ColorFollow to the stdin of fileToHex.py - this can synchronize the scripts: ColorFollow prints some code string upon exit, and fileToHex.py reads it and exits as well.
I would create an empty flag file like /var/run/ColorFollow.flag and write 1 to it when one of the processes exits. Not a pipe, because we do not care which process starts first. So, if the next loop iteration of ColorFollow sees 1 in the file, it deletes the file and exits (meaning fileToHex already exited). The same goes for fileToHex: check the flag file on each loop iteration and, if it exists, delete it and exit.
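On the Python side, the flag-file idea might look like this sketch (the flag path is the one suggested above; the loop body is assumed):

import os

FLAG = "/var/run/ColorFollow.flag"

try:
    while True:
        if os.path.exists(FLAG):
            os.remove(FLAG)  # the C++ side exited first: clean up and stop
            break
        # ... read the shared file and move the robot ...
except KeyboardInterrupt:
    # we are stopping first: leave the flag for the C++ side to find
    with open(FLAG, "w") as f:
        f.write("1")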

Python and Scheduling Computation

I wish to schedule a computation to occur after my current computation in Python is finished. Note that my Python interpreter is running through Emacs.
For example I am currently running:
>>> for i in range(2, 5):
...     tn.TweetNetwork.create_subnetworks(i)
...
I made a simple mistake and meant to type range(1,5). This has been running for at least 4 hours and should run for another few hours. That being said I do not want to re-execute the loop with the correction and lose all that has been computed.
As I am not by the computer 24/7, how can I schedule Python to execute the function `tn.TweetNetwork.create_subnetworks(1)`?
I use Emacs 24.3 and Ubuntu 12.04 LTS; let me know if you need more information. All help is greatly appreciated!
EDIT: I like the answer posted; however, I do not know how to find the PID. I am running a Python interpreter through Emacs, so how would I find that out?
This was too long for a comment, but it isn't a complete reply.
To get a process started by Emacs:
M-x list-processes,
identify the process you want the id of, then
M-: (process-id (get-process "name-of-the-process")).
But this will give you the process id of the interpreter, not of any other process started from it.
If you then need to get all processes spawned through that process, you can do:
$ pstree PID
Where PID is the one you obtained earlier from Emacs.
I think the easiest way is to write another script that waits until your process finishes and then runs tn.TweetNetwork.create_subnetworks(1). This will work only if your create_subnetworks does not access any global variables and writes all its results to a database/file/etc.
# Write a script similar to this
import os, time

print "Waiting until the old script completes..."
while os.path.exists("/proc/SCRIPT_PID"):  # substitute the real process id
    time.sleep(1)

print "Executing create_subnetworks..."
tn = ...  # set up the TweetNetwork object here
tn.TweetNetwork.create_subnetworks(1)
Connect to your computer by SSH, get the process id with ps aux | grep script_name, and run this new script.
If Tyler's comment does not help, you may eval the following piece of code:
(defun foo (ignored)
  (remove-hook 'comint-output-filter-functions 'foo)
  (run-with-timer 1 nil
                  (lambda ()
                    (goto-char (point-max))
                    (insert "tn.TweetNetwork.create_subnetworks(1)")
                    (comint-send-input))))

(add-hook 'comint-output-filter-functions 'foo)
It defines a function that will insert the command you need into the inferior Python buffer, one second after that function is invoked (the delay is to avoid recursive loops).
Then it arranges for that function to be invoked whenever the inferior process (Python, in your case) writes anything. In your case, that would be the ">>>" prompt that Python writes when ready. If your code is generating output, this approach won't work.
If you are using comint in other buffers (shell, sql, ...), you will need to make the variable comint-output-filter-functions local to your interactive Python buffer (with make-variable-buffer-local).

Execute script from within python

I have a playgame.cmd file I would like to execute from within my Python code.
It is a genetic algorithm that runs the game (the input is an individual), waits for the game to finish with that individual, then parses the game log to output the fitness of that individual.
Inside the .cmd file (it shouldn't matter, I don't think):
python tools/playgame.py "python MyBot.py" "python tools/sample_bots/python/HunterBot.py"
--map_file tools/maps/example/tutorial1.map --log_dir game_logs --turns 60 --scenario
--food none --player_seed 7 --verbose -e
(This is for the ants AI challenge if you were wondering)
That is all detail, though. My question is that of the title:
How do I start the script partway through my Python code, wait for it to finish, then resume the Python execution? The script file is in the same folder as the AntEvolver.py file.
If you want to launch a .cmd file from within a Python script, which then launches two more copies of Python from inside the .cmd, I think you need to slow down, take a step back, and think about how to get all of this running within one Python interpreter. But the direct answer to your question is to use os.system() (or the subprocess module, which is also mentioned here):
http://docs.python.org/library/os.html#os.system
A small snippet:
import subprocess
# do your stuff with sys.argv
subprocess.Popen("python MyBot.py", shell=True).communicate()
# script executed and finished, you can continue...
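Applied to the question's playgame.cmd, a sketch (assuming the .cmd sits in the working directory) would be:

import subprocess

# blocks until playgame.cmd and everything it launches has exited
retcode = subprocess.call("playgame.cmd", shell=True)
# ... parse the game log and compute the individual's fitness here ...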

Python subprocess problems - order of events

I'm writing some code that takes a bunch of text files, runs OpinionFinder on them, then analyses the results. OpinionFinder is a Python program that calls a Java program to manage various other programs.
I have:

import shlex, subprocess

# some code (pull data off the web, write text files)
args = shlex.split('python opinionfinder.py -f doclist')
optout = subprocess.Popen(args)
retcode = optout.wait()

# some more code to analyse OpinionFinder's text files
When I didn't have the optout.wait() part, the subprocess would get executed after the rest of the script had finished, i.e. the file-analysis part ran before OpinionFinder. When I added optout.wait(), OpinionFinder didn't run properly - I think because it couldn't find the files written by the first part of the script - i.e. the order was wrong again.
What am I doing wrong?
What's the best way to run some script, execute an external process, then run the rest of the script?
Thanks.
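One common cause that fits these symptoms is that the text files from the first part are still buffered when OpinionFinder starts. A sketch of that fix, where the doclist contents and paths_to_text_files are assumptions:

import shlex
import subprocess

# finish writing the input files; the with-block closes (and flushes) them
with open("doclist", "w") as f:
    f.write("\n".join(paths_to_text_files))  # hypothetical list of file paths

# call() blocks until OpinionFinder exits, so the analysis runs afterwards
retcode = subprocess.call(shlex.split("python opinionfinder.py -f doclist"))

# ... analyse OpinionFinder's output files here ...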
