Forwarding command line arguments to a process in Python - python

I'm using a crude IDE (Microchip MPLAB) with the C30 toolchain on Windows XP.
The C compiler produces very noisy output that I'm unable to control, and it's very hard to spot actual warnings and errors in the output window.
I want to write a Python script that receives the arguments intended for the compiler, calls the compiler with the same arguments, filters the results, and writes them to stdout. Then I can replace the compiler executable with my script in the toolchain settings, so the IDE calls my script and receives the filtered compiler output.
My code for executing the compiler looks like this:
arguments = ' '.join(sys.argv[1:])
cmd = '%s %s' % (compiler_path, arguments)
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
The problem is that the quotes in the arguments are consumed on script execution, so if the IDE calls my script with the following arguments:
main.c -o"main.o"
the value of arguments is
main.c -omain.o
The most obvious solution is to put the whole argument list in quotes, but this would require modifying the compiler-calling code in the IDE. I also tried using a batch file, but it can only accept nine parameters (%1 to %9), and the compiler is called with 15+ parameters.
Is there a way to forward exactly the same arguments to a process from a script?

Give the command arguments to Popen as a list:
arguments = sys.argv[1:]
cmd = [compiler_path] + arguments
process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
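For what it's worth, the whole wrapper could then look something like this; the compiler path and the warning/error keywords are placeholders you would adapt to the real C30 output:
import sys
import subprocess

# Placeholder: point this at the real C30 compiler executable.
compiler_path = r'C:\path\to\pic30-gcc.exe'

# Forward the arguments exactly as received by passing them as a list.
process = subprocess.Popen([compiler_path] + sys.argv[1:],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT,
                           universal_newlines=True)
output, _ = process.communicate()

for line in output.splitlines():
    # Crude filter: only pass through lines that look like warnings or errors.
    if 'warning' in line.lower() or 'error' in line.lower():
        sys.stdout.write(line + '\n')

# Preserve the compiler's exit code so the IDE still sees build failures.
sys.exit(process.returncode)
Forwarding the exit code matters too, so the IDE can still tell whether the build failed.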

As ChristopheD said, the shell removes the quotes.
But you don't need to build the command string yourself when using Popen; it can handle that for you automatically. You can do this instead:
import sys, subprocess
process = subprocess.Popen(sys.argv, executable=compiler_path, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
The subprocess module will then pass the arguments through to the compiler correctly for you. (Note that args[0] is still used as the child's argv[0], which is why sys.argv is passed whole here while executable= points at the real compiler.)

Your shell is eating the quotes (the Python script never even receives them), so I suppose it's not very easy to get them 'unaltered'.
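You can see this for yourself with a tiny script that just echoes whatever it receives:
# show_args.py -- print every argument exactly as this script receives it
import sys

for arg in sys.argv[1:]:
    print(repr(arg))
Calling it as python show_args.py main.c -o"main.o" prints 'main.c' and '-omain.o': the quotes never make it into sys.argv. Passing the arguments on as a list (as in the answers above) re-quotes them correctly for the child process.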

Related

Popen prepending path to executable

I am using Popen to run the following command on a Windows VM:
'tf changeset ...'
However, when I run it using:
commandLine = 'tf changeset /noprompt /latest /loginType:OAuth /login:.,***'
process = Popen(commandLine, shell=True, stdout=PIPE, stderr=PIPE)
I see the following being executed in the logs
'C:\Azure\Agent-1/externals/tf/tf changeset ...'
Meaning that 'C:\Azure\Agent-1/externals/tf/' has been prepended to my command. I was just expecting to see
'tf changeset ...'
Unfortunately, adding the path to the execution breaks the command. Is there any way to stop Python from doing this?
Try passing the commandLine to Popen as a list of arguments:
commandLine = ["tf", "changeset", "/noprompt", "/latest", "/loginType:OAuth", "/login:.,***'"]
process = Popen(commandLine, stdout=PIPE, stderr=PIPE)
Python by itself does no such thing. Perhaps shell=True is doing more than you hoped or bargained for? But we would need access to your shell's configuration to get beyond mere speculation here.
Calling Popen on the result from Popen is obviously not well-defined, but perhaps this is just an error in your transcription of your real code?
Removing the first process = Popen( would fix this with minimal changes. As per the above, I would also remove shell=True as at least superfluous and at worst directly harmful.
commandLine = 'tf changeset /noprompt /latest /loginType:OAuth /login:.,***'
process = Popen(commandLine, stdout=PIPE, stderr=PIPE)
As the subprocess documentation tells you, shell=True is only useful on Windows when the command you want to run is a cmd built-in.
For proper portability, you should break the command into tokens, either manually or by way of shlex.split() if you are lazy or need the user to pass in a string to execute.
commandLine = ['tf', 'changeset', '/noprompt', '/latest', '/loginType:OAuth', '/login:.,***']
process = Popen(commandLine, stdout=PIPE, stderr=PIPE)
This avoids the other dangers of shell=True and will be portable to non-Windows platforms (assuming of course that the command you are trying to run is available on the target platform).
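If the command really does arrive as one string (say, read from a config file), a quick sketch with shlex.split() might look like this; the command string here is just the one from the question:
import shlex
from subprocess import Popen, PIPE

command = 'tf changeset /noprompt /latest /loginType:OAuth /login:.,***'

# shlex.split turns the string into the same token list shown above.
commandLine = shlex.split(command)   # ['tf', 'changeset', '/noprompt', ...]
process = Popen(commandLine, stdout=PIPE, stderr=PIPE)
out, err = process.communicate()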

Python subprocess.popen fails when interacting with the subprocess

I have a Python build script for a Xamarin application that I need to compile into different ipa's and apk's based on locale.
The script manipulates the necessary values in Info.plist and the Android manifest and then builds each of the versions using subprocess.Popen to call xbuild. Or at least that's how it's supposed to work.
The problem is that the build fails whenever I interact with the subprocess in any way (and I basically need to wait until it's done before I start changing values for the next version).
This works:
build_path = os.path.dirname(os.path.realpath(__file__))
ipa_path = "/path/to/my.ipa"
cmd = '/Library/Frameworks/Mono.framework/Versions/4.6.2/Commands/xbuild /p:Configuration="Release" /p:Platform="iPhone" /p:IpaPackageDir="%s" /t:Build %s/MyApp/iOS/MyApp.iOS.csproj' % (ipa_path, build_path)
subprocess.Popen(cmd, env=os.environ, shell=True)
However it will result in the python script continuing in parallel with the build.
If I do this:
subprocess.Popen(cmd, env=os.environ, shell=True).wait()
xbuild fails with the following error message:
Build FAILED.
Errors:
/Users/sune/dev/MyApp/iOS/MyApp.iOS.csproj: error :
/Users/sune/dev/MyApp/iOS/MyApp.iOS.csproj: There is an unclosed literal string.
Line 2434, position 56.
It fails within milliseconds of being called, whereas normally the build process takes several minutes
Other shorthand methods of the subprocess module such as .call and .check_call, as well as the underlying operations subprocess.Popen.poll and subprocess.Popen.communicate, cause the same error.
What's really strange is that even calling time.sleep can provoke the same error:
subprocess.Popen(cmd, env=os.environ, shell=True)
time.sleep(2)
Which I don't get because as I understand it I should also be able to do something like this:
shell = subprocess.Popen(cmd, env=os.environ, shell=True)
while shell.poll() is None:
    time.sleep(2)
print "done"
To essentially achieve the same as calling shell.wait()
Edit: Using a command list instead of a string
If I use a command list and shell=False, like this:
cmd = [
    '/Library/Frameworks/Mono.framework/Versions/4.6.2/Commands/xbuild',
    '/p:Configuration="Release"',
    '/p:Platform="iPhone"',
    '/p:IpaPackageDir="%s' % ipa_path,
    '/t:Build %s/MyApp/iOS/MyApp.iOS.csproj' % build_path
]
subprocess.Popen(cmd, env=os.environ, shell=False)
Then this is the result:
MSBUILD: error MSBUILD0003: Please specify the project or solution file to build, as none was found in the current directory.
Any input is much appreciated. I'm banging my head against the wall here.
I firmly believe that this is not possible. It must be a shortcoming of the way the subprocess module is implemented.
xbuild spawns multiple subprocesses during the build, and if polled for status, the Python subprocess will discover that one of these had a non-zero return status and stop the execution of one or more of the xbuild subprocesses, causing the build to fail as described.
I ended up using a bash script to do the compiling and using Python to manipulate the XML files etc.
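For completeness, if someone wants to retry the list form from the edit, the conventional MSBuild-style tokenization keeps each switch as one list item with no embedded quotes and passes the project file as its own argument. A sketch (untested with xbuild, reusing ipa_path and build_path from the question):
import os
import subprocess

cmd = [
    '/Library/Frameworks/Mono.framework/Versions/4.6.2/Commands/xbuild',
    '/p:Configuration=Release',                     # value attached to its switch, no extra quotes
    '/p:Platform=iPhone',
    '/p:IpaPackageDir=%s' % ipa_path,
    '/t:Build',                                     # target switch on its own
    '%s/MyApp/iOS/MyApp.iOS.csproj' % build_path,   # project file as a separate argument
]
process = subprocess.Popen(cmd, env=os.environ)
process.wait()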

Checking Subprocesses in python

I'm trying to run one python program from another using subprocess. Here's the function I've got so far:
def runProcess(exe):
    p = subprocess.Popen(exe, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    while True:
        retcode = p.poll()  # returns None while the subprocess is running
        line = p.stdout.readline()
        yield line
        if retcode is not None:
            break
Then I run:
for line in runProcess('python myotherprogram.py'): print line
but I get an OSError: no such file, and it doesn't tell me which file doesn't exist. It's baffling. Any suggestions? I can use the runProcess function for normal terminal commands, such as ls.
What doesn't exist is a single executable named python myotherprogram.py. To specify arguments, you need to provide a list consisting of the command and its arguments, such as runProcess(["python", "myotherprogram.py"]), or specify shell=True to the Popen constructor.
The relevant quote from the documentation:
args should be a sequence of program arguments or else a single string. By default, the program to execute is the first item in args if args is a sequence. If args is a string, the interpretation is platform-dependent and described below. See the shell and executable arguments for additional differences from the default behavior. Unless otherwise stated, it is recommended to pass args as a sequence.
On Unix, if args is a string, the string is interpreted as the name or path of the program to execute. However, this can only be done if not passing arguments to the program.
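So, reusing the runProcess generator from the question, a call along these lines should avoid the error (sys.executable simply reuses the interpreter that is already running):
import sys

# Program and argument as separate list items; no shell parsing involved.
for line in runProcess([sys.executable, 'myotherprogram.py']):
    sys.stdout.write(line)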

Python3 subprocess module: Passing an empty variable as a flag, is it possible?

I am trying to use subprocess.call() to execute a command-line program in Python 3. I can get it to work fine; the following example executes with no problems:
subprocess.call(['add_phenotype.py', '-t', threads, '-s'])
However, I want to parse a file, and then based on what I find, run the command with different flags. I can't figure out how to do this.
For example:
if zeroed_out_file:
    args = '-z'
else:
    args = ''
subprocess.call(['add_phenotype.py', '-t', threads, '-s', args])
fails if zeroed_out_file is False. The add_phenotype.py script exits immediately, claiming that it doesn't recognize the arguments.
The first argument takes a list; just build that list dynamically:
args = ['add_phenotype.py', '-t', threads, '-s']
if zeroed_out_file:
    args.append('-z')
subprocess.call(args)
Appending additional command line switches is just a question of appending more values to args.
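The same pattern extends to switches that take a value; here -p and phenotype_file are made-up names just to illustrate:
import subprocess

args = ['add_phenotype.py', '-t', threads, '-s']
if zeroed_out_file:
    args.append('-z')
if phenotype_file:                       # hypothetical optional flag with a value
    args.extend(['-p', phenotype_file])  # flag and its value as two separate items
subprocess.call(args)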

Python: executing a shell script with arguments (variables), but the arguments are not read in the shell script

I am trying to execute a shell script (not a command) from Python:
main.py
-------
from subprocess import Popen
Process=Popen(['./childdir/execute.sh',str(var1),str(var2)],shell=True)
execute.sh
----------
echo $1  # does not print anything
echo $2  # does not print anything
var1 and var2 are some strings that I am using as input to the shell script. Am I missing something, or is there another way to do it?
Referred: How to use subprocess popen Python
The problem is with shell=True. Either remove that argument, or pass all arguments as a string, as follows:
Process = Popen('./childdir/execute.sh %s %s' % (str(var1), str(var2)), shell=True)
With shell=True, the shell does the argument parsing itself, so only the first item of the list (the command string) is actually used; the remaining list items never reach your script as arguments.
See a similar question answered here. What actually happens is your shell script gets no arguments, so $1 and $2 are empty.
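The first option (keeping the argument list and simply dropping shell=True) would look like this:
from subprocess import Popen

# Each list item reaches execute.sh as its own argument ($1, $2, ...).
Process = Popen(['./childdir/execute.sh', str(var1), str(var2)])
Process.wait()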
Popen will inherit stdout and stderr from the Python script, so usually there's no need to provide the stdout= and stderr= arguments to Popen (unless you run the script with output redirection, such as >). You should do this only if you need to read the output inside the Python script and manipulate it somehow.
If all you need is to get the output (and you don't mind running synchronously), I'd recommend using check_output, as it is easier for getting the output than Popen:
output = subprocess.check_output(['./childdir/execute.sh',str(var1),str(var2)])
print(output)
Notice that check_output and check_call have the same rules for the shell= argument as Popen.
You actually are sending the arguments ... if your shell script wrote to a file instead of printing, you would see it. You need to call communicate() to see the printed output from the script:
from subprocess import Popen, PIPE
Process = Popen(['./childdir/execute.sh', str(var1), str(var2)], shell=True, stdout=PIPE, stderr=PIPE)
print Process.communicate()  # now you should see your output
If you want to pass arguments to a shell script from a Python script in a simple way, you can use the os module:
import os
os.system('/path/shellscriptfile.sh {} {}'.format(str(var1), str(var2)))
If you have more arguments, add more placeholder braces and pass the extra values to format().
In the shell script, this will read the arguments and you can execute the commands accordingly.
