I'm writing some code that takes a bunch of text files, runs OpinionFinder on them, then analyses the results. OpinionFinder is a Python program that calls a Java program to manage various other programs.
I have:
some code (pull data off the web, write text files)
import shlex
import subprocess

args = shlex.split('python opinionfinder.py -f doclist')
optout = subprocess.Popen(args)
retcode = optout.wait()
some more code to analyse OpinionFinder's text files.
When I didn't have the optout.wait() call, the subprocess only got executed after the rest of the script had finished, i.e. the file analysis part ran before OpinionFinder had produced anything. When I added optout.wait(), OpinionFinder didn't run properly - I think because it couldn't find the files written by the first part of the script - i.e. the order is wrong again.
What am I doing wrong?
What's the best way to run some script, execute an external process, then run the rest of the script?
Thanks.
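A minimal sketch of the intended ordering, assuming the real problem is that the text files (or doclist) are not yet fully written and closed when OpinionFinder starts; text_file_paths is a placeholder for whatever the first part of the script produces:

import shlex
import subprocess

# 1) write the text files and the doclist, and make sure they are closed
#    (and therefore flushed to disk) before OpinionFinder is started
with open('doclist', 'w') as f:
    f.write('\n'.join(text_file_paths))

# 2) run OpinionFinder and block until it has finished
args = shlex.split('python opinionfinder.py -f doclist')
retcode = subprocess.Popen(args).wait()

# 3) only now analyse the files OpinionFinder produced

If OpinionFinder is started from a different directory than the one the files were written to, passing cwd=... to Popen is also worth checking.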
I have a Python script (on Windows Server) that requests information through an API, downloads data into several CSV files and at the end zips them all into a single .zip file. The idea is to access that zip file through a URL, using PHP.
But the main issue here is that PHP makes the zip available even before the file has been completely compressed.
Using the 'sleep' function will, I believe, halt the execution of the PHP script, and maybe even the Python script - am I right?
My current setup uses a task in Windows Task Scheduler to trigger the Python script, download the necessary files and only then make the zip file available with PHP.
Is there any way of doing the whole process within the PHP script, without using Windows Task Scheduler?
<?php
include_once $_SERVER['DOCUMENT_ROOT'] . "/#classes/simple_html_dom.php";
#header("Content-type: text/html");
header("Content-Disposition: attachment; filename=api-week-values.zip");
unlink("api-week-values.zip");
$command = escapeshellcmd('python api_weekly.py');
$output = shell_exec($command);
sleep(15);
?>
You can see the code I am using for the whole process.
To clarify the process:
An application needs to access the file I am downloading with the Python script
I feed the application the URL from the Apache server
The application goes to the link (through index.php) and triggers the Python script (which downloads the zip file)
PHP should pop up a save-file dialog WHEN the zip file has been completely downloaded
Maybe you can do something like this:
At the beginning, the Python script writes a file somewhere with information such as the start time, and adds the end time once it has finished.
The PHP script loops, re-reading that file while the end time is not set, and afterwards deletes the file. Only then does it take the zip.
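On the Python side, the idea could look roughly like this (the marker file name and csv_files are assumptions; the PHP side would keep re-reading the marker until the end time appears, delete it, and only then send the zip):

# sketch of the end of api_weekly.py, once the CSV files have been downloaded
import json
import time
import zipfile

MARKER = "api-week-values.status"      # assumed marker file polled by PHP

start_time = time.time()
with open(MARKER, "w") as f:
    json.dump({"start": start_time, "end": None}, f)

with zipfile.ZipFile("api-week-values.zip", "w") as z:
    for name in csv_files:             # csv_files: list built earlier in the script
        z.write(name)

# only once the archive is completely written, record the end time
with open(MARKER, "w") as f:
    json.dump({"start": start_time, "end": time.time()}, f)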
Why not call the PHP from Python at the end of the py script? Link
import subprocess
# if you don't need the script's output
subprocess.call("php /path/to/your/script.php", shell=True)
# if you want the output
proc = subprocess.Popen("php /path/to/your/script.php", shell=True, stdout=subprocess.PIPE)
script_response = proc.stdout.read()
I want to abort a test run started via bash if the test takes too much time. So far, I have found some good solutions here, but since the kill command does not work properly (when I use it correctly, it tells me it is not being used correctly), I decided to solve this problem using Python. This is the execution call I want to monitor:
EXE="C:/program.exe"
FILE="file.tpt"
HOME_DIR="C:/Home"
"$EXE" -vm-Xmx4096M --run build "$HOME_DIR/test/$FILE" "Auslieferung (ML) Execute"
(The launched *.exe starts a test run which includes some Simulink simulation runs - sometimes there are Simulink errors, and in that case the tests take too long and I want to restart the entire process.)
First, I came up with the idea of calling a shell script containing these lines as a subprocess from Python:
import subprocess
import time
process = subprocess.Popen('subprocess.sh', shell = True)
time.sleep(10)
process.terminate()
But when I use this, *.terminate() or *.kill() does not close the program I started with the subprocess call.
That's why I am now trying to implement the entire call in Python. This is what I have so far:
import subprocess
file = "somePath/file.tpt"
p = subprocess.Popen(["C:/program.exe", file])
Now I need to know how to pass the final argument of the bash call, "Auslieferung (ML) Execute". This argument starts an internal test run named "Auslieferung (ML) Execute". Any ideas? Or is it better to choose one of the other approaches? Or can I get the "kill" option for bash from somewhere, somehow?
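For what it's worth, a minimal sketch of how the whole bash line could be expressed in Python, with a time limit after which the process is killed (this assumes Python 3.3+ for the timeout parameter; the 600-second limit is a placeholder, and kill() only stops program.exe itself, not any further processes it may have spawned):

import subprocess

exe = "C:/program.exe"
tpt_file = "C:/Home/test/file.tpt"

# each token of the bash line becomes one list element;
# "Auslieferung (ML) Execute" is simply passed as the last argument
cmd = [exe, "-vm-Xmx4096M", "--run", "build", tpt_file, "Auslieferung (ML) Execute"]

p = subprocess.Popen(cmd)
try:
    p.wait(timeout=600)                # placeholder limit: 10 minutes
except subprocess.TimeoutExpired:
    p.kill()                           # the run took too long: kill it
    p.wait()
    # ... restart the whole thing here if desired ...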
So this one is a doozie, and a little too specific to find an answer online.
I am writing to a file in C++ and reading that file in Python at the same time to move a robot. Or trying to.
When I try running both programs at the same time, the C++ one runs first and then the Python one runs.
Here's the command I use:
./ColorFollow & python fileToHex.py
This happens even if I switch the order of commands.
Even if I run them in different terminals (which is the same thing, just covering all bases).
Both the Python and C++ code read / write in 'infinite' loops, so these two should run until I say stop.
The code works fine; when the Python script finally runs the robot moves as intended. It's just that the code doesn't run at the same time.
Is there a way to make this happen, or is this impossible?
If you need more information, lemme know, but the code is pretty much what you'd expect it to be.
If you are using Linux, & will release the bash session, and in this case ColorFollow and fileToHex.py will run in different bash sessions.
At the same time, the composition ./ColorFollow | python fileToHex.py looks interesting, because you redirect the stdout of ColorFollow to the stdin of fileToHex.py - this can synchronize the scripts: ColorFollow prints some code string upon exit, which fileToHex.py then reads before exiting as well.
I would create an empty file like /var/run/ColorFollow.flag and write 1 to it when one of the processes exits. Not a pipe - because we do not care which process starts first. So if the next loop step of ColorFollow sees 1 in the file, it deletes the file and exits (meaning fileToHex has already exited). The same goes for fileToHex - check the flag file on each loop step and exit if it exists, after deleting the flag file.
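A minimal sketch of what the flag-file check could look like on the Python side (the flag path and the polling interval are assumptions; the C++ loop would do the equivalent):

import os
import time

FLAG = "/var/run/ColorFollow.flag"     # assumed location of the shared flag file

while True:
    if os.path.exists(FLAG):
        # ColorFollow has already exited and written the flag:
        # remove it and stop this script as well
        os.remove(FLAG)
        break
    # ... read the file written by ColorFollow and move the robot ...
    time.sleep(0.1)                    # assumed polling interval

# If fileToHex.py were the one to stop first, it would instead write the flag
# before exiting, so ColorFollow's loop can see it and shut down:
#     with open(FLAG, "w") as f:
#         f.write("1")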
I have a playgame.cmd file I would like to execute from within my Python code.
It is a genetic algorithm that runs the game (the input is an individual), waits for the game to run with that individual, then parses data from the game log to output the fitness of that individual.
Inside the .cmd file (shouldn't matter I don't think):
python tools/playgame.py "python MyBot.py" "python tools/sample_bots/python/HunterBot.py"
--map_file tools/maps/example/tutorial1.map --log_dir game_logs --turns 60 --scenario
--food none --player_seed 7 --verbose -e
(This is for the ants AI challenge if you were wondering)
These are all just details, though. My question is the one in the title:
How do I start the script partway through my Python code, wait for the script to finish, then resume the Python execution? The script file is in the same folder as the AntEvolver.py file.
If you want to launch a .cmd file from within a Python script which then launches two more copies of Python within the .cmd, I think you need to slow down, take a step back, and think about how to just get all this stuff to run within one Python interpreter. But, the direct answer to your question is to use os.system() (or the subprocess module, which is also mentioned here):
http://docs.python.org/library/os.html#os.system
A very small snippet:
import subprocess
# do your stuff with sys.argv
subprocess.Popen("python MyBot.py", shell=True).communicate()
# script executed and finished, you can continue...
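Since the question is specifically about playgame.cmd, here is a minimal sketch of that case (everything around the call is an assumption):

import subprocess

# ... prepare the individual, write out whatever MyBot.py needs ...

# run the batch file and block until it, and the game it launches, has finished;
# .cmd files are executed by the Windows shell, hence shell=True
retcode = subprocess.call("playgame.cmd", shell=True)

# ... parse game_logs/ and compute the individual's fitness ...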
I am calling a Perl script from Python. The Perl script retrieves a large data set in batches from a webserver, which takes time. The Perl script is executed in a loop. It does the job fairly well, but during the last run of the loop, while the script is still downloading, Python goes on to execute the rest of the code.
I want to know the best way to call another program from Python so that, while the Perl script is running, the Python process waits for it to finish, since the rest of the Python code processes the downloaded data. I have read about threading but am not sure how to implement it in my case.
The code is:
import subprocess

for expr in names_dict[keys]:
    subprocess.call(["./test.pl", expr, absFilePath])
Any help will be appreciated.
Many Thanks,
Use subprocess.Popen(). Check out this blog post:
http://trifoliummedium.blogspot.com/2010/12/running-command-line-with-python-and.html
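A minimal sketch of what that could look like with Popen (test.pl, expr and absFilePath come from the question):

import subprocess

for expr in names_dict[keys]:
    proc = subprocess.Popen(["./test.pl", expr, absFilePath])
    proc.wait()                        # block until this run of the Perl script exits

# only reached after the last Perl run has finished
# ... process the downloaded data ...

Note that subprocess.call() already waits for the command to return, so if the download carries on past this point, the Perl script itself is probably handing the work off to a background process.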