When I execute the script in R from the command line, I use:
$ R --vanilla --args test_matrix.csv < hierarchical_clustering.R > out.txt
In Python, it works if I use:
process = subprocess.call("R --vanilla --args "+output_filename+"_DM_Instances_R.csv < /home/kevin/AV-labels/Results/R/hierarchical_clustering.R > "+output_filename+"_out.txt", shell=True)
But this approach doesn't give me a Popen object, so I can't use process.wait().
So I would like to use subprocess.Popen instead. I tried:
process = subprocess.Popen(['R', '--vanilla', '--args', "\'"+output_filename+"_DM_Instances_R.csv\'", '<', '/home/kevin/AV-labels/Results/R/hierarchical_clustering.R'])
But it didn't work: Python just opened R but didn't execute my script.
Instead of 'R', give it the path to Rscript. I had the same problem: R opens up but the script isn't executed. You need to call Rscript (instead of R) to actually execute the script.
retcode = subprocess.call("/Pathto/Rscript --vanilla /Pathto/test.R", shell=True)
This works for me.
Cheers!
I've solved this problem by putting the whole command, redirections included, into a single string inside the brackets:
process = subprocess.Popen(["R --vanilla --args "+output_filename+"_DM_Instances_R.csv < /home/kevin/AV-labels/Results/R/hierarchical_clustering.R > "+output_filename+"_out.txt"], shell=True)
process.wait()
A couple of ideas:
You might want to consider using the Rscript frontend, which makes
running scripts easier; you can pass the script filename directly
as a parameter, and do not need to read the script in through standard input.
You don't need the shell for just redirecting standard output to a file, you can
do that directly with subprocess.Popen.
Example:
import subprocess
output_name = 'something'
script_filename = 'hierarchical_clustering.R'
param_filename = '%s_DM_Instances_R.csv' % output_name
result_filename = '%s_out.txt' % output_name
with open(result_filename, 'wb') as result:
    process = subprocess.Popen(['Rscript', script_filename, param_filename],
                               stdout=result)
    process.wait()
You never actually execute it fully ^^ Try the following:
process = subprocess.Popen("R --vanilla --args '%s_DM_Instances_R.csv' < /home/kevin/AV-labels/Results/R/hierarchical_clustering.R" % output_filename, stdout=subprocess.PIPE, shell=True)
process.communicate()  # [0] is stdout
Kevin's solution works for my requirement. Just to give another example of Kevin's approach: you can pass more parameters to the R script with a Python-style format string:
import subprocess
process = subprocess.Popen(["R --vanilla --args %s %d %.2f < /path/to/your/rscript/transformMatrixToSparseMatrix.R" % ("sparse", 11, 0.98) ], shell=True)
process.wait()
Also, to make things easier you could turn the script into an R executable file. For this you just need to add this as the first line of the script:
#! /usr/bin/Rscript --vanilla --default-packages=utils
Reference: Using R as a scripting language with Rscript
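For instance, once the script has that shebang and has been made executable (chmod +x hierarchical_clustering.R), a minimal sketch of calling it from Python could be:
import subprocess
# Assumes hierarchical_clustering.R starts with the Rscript shebang above and
# has execute permission; the CSV name is just the one from the question.
retcode = subprocess.call(['./hierarchical_clustering.R', 'test_matrix.csv'])
print(retcode)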
Related
I have been looking for a way to execute a Java jar file from Python, and after looking at:
Execute .jar from Python
How can I get my python (version 2.5) script to run a jar file inside a folder instead of from command line?
How to run Python egg files directly without installing them?
I tried to do the following (both my jar and python file are in the same directory):
import os
if __name__ == "__main__":
os.system("java -jar Blender.jar")
and
import subprocess
subprocess.call(['(path)Blender.jar'])
Neither has worked. So I was thinking that I should use Jython instead, but I think there must be an easier way to execute jar files through Python.
Do you have any idea what I may be doing wrong? Or is there another site where I can read more about my problem?
I would use subprocess this way:
import subprocess
subprocess.call(['java', '-jar', 'Blender.jar'])
But, if you have a properly configured /proc/sys/fs/binfmt_misc/jar you should be able to run the jar directly, as you wrote.
So, what exactly is the error you are getting?
Please post all the output you are getting from the failed execution.
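In the meantime, here is a rough sketch of the direct-execution idea (assuming the binfmt_misc entry is actually registered under the name jar on your system, which may differ):
import os
import subprocess

jar = 'Blender.jar'
# Run the jar directly only if the kernel has a binfmt_misc handler for jars
# and the file is executable; otherwise fall back to an explicit java -jar call.
if os.path.exists('/proc/sys/fs/binfmt_misc/jar') and os.access(jar, os.X_OK):
    retcode = subprocess.call(['./' + jar])
else:
    retcode = subprocess.call(['java', '-jar', jar])
print(retcode)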
This always works for me:
from subprocess import *
def jarWrapper(*args):
    # Run the jar and collect every line it writes to stdout/stderr.
    process = Popen(['java', '-jar'] + list(args), stdout=PIPE, stderr=PIPE)
    ret = []
    while process.poll() is None:
        # Read stdout line by line while the process is still running.
        line = process.stdout.readline()
        if line != '' and line.endswith('\n'):
            ret.append(line[:-1])
    # Pick up whatever is left in the buffers once the process has exited.
    stdout, stderr = process.communicate()
    ret += stdout.split('\n')
    if stderr != '':
        ret += stderr.split('\n')
    ret.remove('')  # drop the empty string left by the trailing newline
    return ret
args = ['myJarFile.jar', 'arg1', 'arg2', 'argN'] # Any number of args to be passed to the jar file
result = jarWrapper(*args)
print result
I used the following approach to run the Tika jar and extract the content of a Word document. It worked and I got the output as well. The command I'm running is "java -jar tika-app-1.24.1.jar -t 42250_EN_Upload.docx"
from subprocess import PIPE, Popen
process = Popen(['java', '-jar', 'tika-app-1.24.1.jar', '-t', '42250_EN_Upload.docx'], stdout=PIPE, stderr=PIPE)
result = process.communicate()
print(result[0].decode('utf-8'))
Here communicate() returns the result as a tuple of (stdout, stderr), hence "result[0]". Also, the output is a bytes object (b-string); to convert it into a normal string we need to decode it with 'utf-8'.
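As a small variant (a sketch, not required for the above to work): passing universal_newlines=True (aliased as text=True on Python 3.7+) makes communicate() return str instead of bytes, so no decode() is needed.
from subprocess import PIPE, Popen
# universal_newlines=True makes stdout/stderr come back as str rather than bytes
process = Popen(['java', '-jar', 'tika-app-1.24.1.jar', '-t', '42250_EN_Upload.docx'],
                stdout=PIPE, stderr=PIPE, universal_newlines=True)
out, err = process.communicate()
print(out)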
With args: a concrete example using the Closure Compiler (https://developers.google.com/closure/) from Python
import os
import re
src = 'test.js'
os.execlp("java", 'blablabla', "-jar", './closure_compiler.jar', '--js', src, '--js_output_file', '{}'.format(re.sub('.js$', '.comp.js', src)))
(also see here When using os.execlp, why `python` needs `python` as argv[0])
How about using os.system() like:
os.system('java -jar blabla...')
os.system(command)
Execute the command (a string) in a subshell. This is implemented by calling the Standard C function system(), and has the same limitations. Changes to sys.stdin, etc. are not reflected in the environment of the executed command.
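For example (a sketch reusing the jar name from the question): os.system only gives you the exit status; the command's output goes straight to the terminal and cannot be captured this way.
import os
# Returns the shell's exit status (a wait-status encoding on POSIX);
# the jar's stdout is printed directly to the terminal, not returned here.
status = os.system('java -jar Blender.jar')
print('exit status:', status)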
I have a Python script that calls a shell script, which in turn calls a .exe called iv4_console. I need to print the stdout of iv4_console for debugging purposes. I used this:
Python:
import sys
import subprocess
var="rW015005000000"
proc = subprocess.Popen(["c.sh", var], shell=True, stdout=subprocess.PIPE)
output = ''
for line in iter(proc.stdout.readline, ""):
    print line
    output += line
Shell:
start_dir=$PWD
release=$1
echo Release inside shell: $release
echo Directory: $start_dir
cd $start_dir
cd ../../iv_system4/ports/visualC12/Debug
echo Debug dir: $PWD
./iv4_console.exe ../embedded/LUA/analysis/verbose-udp-toxml.lua ../../../../../logs/$release/VASP_DUN722_20160307_Krk_Krk_113048_092_1_$release.dvl &>../../../../FCW/ObjectDetectionTest/VASP_DUN722_20160307_Krk_Krk_113048_092_1_$release.xml
./iv4_console.exe ../embedded/LUA/analysis/verbose-udp-toxml.lua ../../../../../logs/$release/VASP_FL140_20170104_C60_Checkout_afterIC_162557_001_$release.dvl &>../../../../FCW/ObjectDetectionTest/VASP_FL140_20170104_C60_Checkout_afterIC_162557_001_$release.xml
exit
But this didn't work; it prints nothing. What do you think?
See my comment; the best approach (IMO) would be to use Python only.
However, in answer of your question, try:
import sys
import subprocess
var="rW015005000000"
proc = subprocess.Popen(["/bin/bash", "/full/path/to/c.sh"], stdout=subprocess.PIPE)
# Best to always avoid shell=True because of security vulnerabilities.
proc.wait() # To make sure the shell script does not continue running indefinitely in the background
output, errors = proc.communicate()
print(output.decode())
# Since subprocess.communicate() returns a bytes-string, you can use .decode() to print the actual output as a string.
You can use
import subprocess
subprocess.call(['./c.sh'])
to call the shell script from the Python file,
or
import subprocess
import shlex
subprocess.call(shlex.split('./c.sh var'))
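shlex.split simply turns the command line into the argument list subprocess expects; a quick check:
import shlex
print(shlex.split('./c.sh var'))   # ['./c.sh', 'var']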
I am new to Python and am trying to make a script which checks whether a specified host, for example sensu-client, exists. I use deployment software called NSO; running nso status shows me this information:
nagios-client host nagios-client down
test host test down
Is there any way to write a script that checks whether, for example, nagios-client exists?
In shell I do it by:
nso status | awk '{ print $1 }'
In this case I would suggest using subprocess's check_output function. check_output returns the output of a command as a string. So you would have something like this:
import subprocess
foo = subprocess.check_output("nso status | awk '{ print $1 }'", shell=True)
# Thanks bereal for shell=True
print foo
Of course, if you're only targeting Linux, you could use the much easier sh module. It allows you to import programs as if they were libraries.
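For example, a rough sketch with the third-party sh package (assuming it is installed and an nso executable is on your PATH):
import sh
# sh expresses pipes by nesting commands: awk reads the output of `nso status`
first_columns = sh.awk(sh.nso('status'), '{ print $1 }')
print(first_columns)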
You can use subprocess to run this command and parse the output:
import subprocess
command = "nso status | awk '{ print $1 }'"
p1 = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
You don't have to run awk, since you're already in Python:
import subprocess
proc = subprocess.Popen(['nso', 'status'], stdout=subprocess.PIPE,
                        universal_newlines=True)  # so out is a str on Python 3 as well
# get stdout as an EOL-separated string, ignore stderr for now
out, _ = proc.communicate()
# parse the output; line.split()[0] is awk's $1, skipping blank lines
items = [line.split()[0] for line in out.split('\n') if line.strip()]
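From there, checking whether a particular host exists is just a membership test (a sketch, assuming the host name shows up in the first column, as in your example output):
if 'nagios-client' in items:
    print('nagios-client exists')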
If possible I would like not to use subprocess.Popen. The reason I want to capture the stdout of the process started by the child is that I need to save the output of the child in a variable to display it back later. However I have yet to find a way to do so anywhere. I also need to activate multiple programs without necessarily closing the one that's active. I also need to be controlling the child process with the parent process.
I'm launching a subprocess like this
listProgram = ["./perroquet.py"]
listOutput = ["","",""]
pipePerroquet = os.pipe()
pipeMain = os.pipe()
pipeAge = os.pipe()
pipeSavoir = os.pipe()
pid = os.fork()
process = 1
if pid == 0:
    os.close(pipePerroquet[1])
    os.dup2(pipePerroquet[0], 0)
    sys.stdout = os.fdopen(pipeMain[1], 'w')
    os.execvp("./perroquet.py", listProgram)
Now as you can see, I'm launching the program with os.execvp and using os.dup2() to redirect the stdout of the child. However, I'm not sure about what I've done in the code and want to know the correct way to redirect stdout with os.dup2 and then be able to read it in the parent process.
Thank you for your help.
I cannot understand why you do not want to use the excellent subprocess module, which could save you a lot of boilerplate code (and just as many opportunities for errors ...). Anyway, I assume perroquet.py is a Python script, not an executable program. The shell knows how to find the correct interpreter for scripts, but the exec family are low-level functions that expect a real executable program.
You should at least have something like :
listProgram = [ "python", "./perroquet.py","",""]
...
os.execvp("python", listProgram)
But I'd rather use :
prog = subprocess.Popen(("python", "./perroquet.py", "", ""), stdout=subprocess.PIPE)
or even, as you are already in Python, import it and call its functions directly from there.
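If you do want to stay with the low-level os calls, here is a rough sketch of the dup2 pattern (assuming perroquet.py is run through the interpreter) that lets the parent read the child's stdout:
import os

r_fd, w_fd = os.pipe()
pid = os.fork()
if pid == 0:                          # child
    os.close(r_fd)
    os.dup2(w_fd, 1)                  # point fd 1 (stdout) at the write end of the pipe
    os.close(w_fd)
    os.execvp("python", ["python", "./perroquet.py"])
else:                                 # parent
    os.close(w_fd)
    with os.fdopen(r_fd) as child_out:
        output = child_out.read()     # the child's stdout accumulates here
    os.waitpid(pid, 0)
    print(output)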
EDIT :
It looks like what you really want is:
user gives you a command (can be almost anything)
[ you validate that the command is safe ] - unsure if you intend to do it but you should ...
you make the shell execute the command and get its output - you may want to read stderr too and control exit code
You should try something like
import subprocess

while True:
    cmd = raw_input("commande :")  # use input() with Python 3
    if cmd.strip().lower() == 'exit':
        break
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, shell=True)
    out, err = proc.communicate()
    code = proc.returncode
    print("OUT", out, "ERR", err, "CODE", code)
It is absolutely unsafe, since this code executes any command just as the underlying shell would (including rm -rf *, rd /s/q ., ...), but it gives you the output, the errors and the return code of the command, and it can be used in a loop. The only limitation is that as you use a different shell for each command, you cannot use commands that change the shell environment - they will be executed but will have no effect.
Here's a solution if you need to extract any changes to the environment
from subprocess import Popen, PIPE
import os
def execute_and_get_env(cmd, initial_env=None):
    if initial_env is None:
        initial_env = os.environ
    r_fd, w_fd = os.pipe()
    write_env = "; env >&{}".format(w_fd)
    p = Popen(cmd + write_env, shell=True, env=initial_env, pass_fds=[w_fd],
              stdout=PIPE, stderr=PIPE)
    output, error = p.communicate()
    # this will cause problems if the environment gets very large as
    # writing to the pipe will hang because it gets full and we only
    # read from the pipe when the process is over
    os.close(w_fd)
    with open(r_fd) as f:
        env = dict(line[:-1].split("=", 1) for line in f)
    return output, error, env
export_cmd = "export my_var='hello world'"
echo_cmd = "echo $my_var"
out, err, env = execute_and_get_env(export_cmd)
out, err, env = execute_and_get_env(echo_cmd, env)
print(out)
I'm really stuck on a problem I'm hoping someone can help me with. I'm trying to create a wrapper in Python 3.1 for a command-line program called spooky. I can successfully run this program on the command line like this:
$ spooky -a 4 -b .97
My first Python wrapper attempt for spooky looked like this:
import subprocess
start = "4"
end = ".97"
spooky_path = '/Users/path/to/spooky'
cmd = [spooky_path, '-a', start, '-b', end]
process = subprocess.Popen(cmd, stdout=subprocess.PIPE)
process.wait()
print('Done')
The above code prints Done, but does not execute the program spooky.
Next I tried to just execute the program on the command line like this:
$ /Users/path/to/spooky -a 4 -b .97
The above command also fails, and provides no helpful errors.
My question is: How can I get Python to run this program by sending spooky -a 4 -b .97 to the command line? I would VERY much appreciate any help you can provide. Thanks in advance.
The stdout=subprocess.PIPE is what hides the output: it disconnects the stdout of your process from Python's stdout and makes it retrievable through the Popen.communicate() function instead, like so:
import subprocess
spooky_path = 'ls'
cmd = [spooky_path, '-l']
process = subprocess.Popen(cmd, stdout=subprocess.PIPE)
output = process.communicate()[0]
print "Output:", output
process.wait()
print('Done')
To make it print directly you can use it without the stdout argument:
process = subprocess.Popen(cmd)
Or you can use the call function:
retcode = subprocess.call(cmd)
Try making your command into a single string:
cmd = '%s -a %s -b %s' % (spooky_path, start, end)
process = subprocess.Popen(cmd, shell=True)