subprocess Popen command arguments do not work [duplicate] - python

I've been trying to pass a command that works only with literal double quotes on the command line around the "concat:file1|file2" argument for ffmpeg.
I can't, however, make this work from Python with subprocess.Popen(). Does anyone have an idea how to pass quotes into subprocess.Popen?
Here is the code:
command = "ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4"
output, error = subprocess.Popen(command, universal_newlines=True,
                                 stdout=subprocess.PIPE,
                                 stderr=subprocess.PIPE).communicate()
When I do this, ffmpeg won't accept it any way other than with quotes around the concat segment. Is there a way to successfully pass this command line to subprocess.Popen?

I'd suggest using the list form of invocation rather than the quoted string version:
command = ["ffmpeg", "-i", "concat:1.ts|2.ts", "-vcodec", "copy",
"-acodec", "copy", "temp.mp4"]
output,error = subprocess.Popen(
command, universal_newlines=True,
stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
This more accurately represents the exact set of parameters that are going to be passed to the end process and eliminates the need to mess around with shell quoting.
That said, if you absolutely want to use the plain string version, just use different quotes (and shell=True):
command = 'ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4'
output, error = subprocess.Popen(
    command, universal_newlines=True, shell=True,
    stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()

Either use single quotes 'around the "whole pattern"' to automatically escape the doubles or explicitly "escape the \"double quotes\"". Your problem has nothing to do with Popen as such.
Just for the record, I had a problem on Windows where a list-based command passed to Popen (i.e. what the accepted answer suggests) would not preserve the double quotes around a glob pattern. Joining the list into a string with ' '.join(cmd) before passing it to Popen solved the problem, as sketched below.
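A minimal sketch of that workaround, assuming a hypothetical some_tool command and glob pattern standing in for the real case:
import subprocess

# Hypothetical list-based command whose quoted glob pattern was being
# mangled on Windows.
cmd = ['some_tool', '--pattern', '"*.ts"']

# Joining into one string before handing it to Popen preserved the quotes;
# this works on Windows, where Popen passes the string to CreateProcess as-is.
p = subprocess.Popen(' '.join(cmd), stdout=subprocess.PIPE,
                     universal_newlines=True)
out, _ = p.communicate()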

This works with Python 2.7.3. The way to pipe stderr to stdout has changed since older versions of Python:
Put this in a file called test.py:
#!/usr/bin/python
import subprocess
command = 'php -r "echo gethostname();"'
p = subprocess.Popen(command, universal_newlines=True, shell=True,
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
text = p.stdout.read()
retcode = p.wait()
print text
Invoke it:
python test.py
It prints my hostname, which is apollo:
apollo
Read up on the manual for subprocess: http://docs.python.org/2/library/subprocess.html
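For anyone on Python 3, a rough equivalent of the snippet above using subprocess.run (available since 3.5); the php command is unchanged from the answer:
import subprocess

command = 'php -r "echo gethostname();"'

# stderr is merged into stdout, as in the Python 2 version above.
result = subprocess.run(command, shell=True, universal_newlines=True,
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
print(result.stdout)
print(result.returncode)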

I have been dealing with a similar issue while running a relatively complex
command over ssh. It also had multiple double quotes and single quotes, because
I was piping the command through Python, ssh, PowerShell, etc.
If you can instead just convert the command into a shell script, and run the
shell script through subprocess.call/Popen/run, these issues will go away.
So, depending on whether you are on Windows, Linux, or macOS, put the
following in a shell script (script.sh or script.bat):
ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4
Then you can run
import subprocess; subprocess.call("./script.sh", shell=True)
without having to worry about single quotes, etc.
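As a side note, on Linux/macOS you could also skip shell=True, assuming the script is executable (chmod +x script.sh) and has a shebang line:
import subprocess

# Assumes ./script.sh is executable and starts with e.g. #!/bin/sh
subprocess.call(["./script.sh"])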

This line of code in your question isn't valid Python syntax:
command = "ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4"
If you had a Python file with just this line in it, you would get a syntax error. A string literal surrounded with double quotes can't contain double quotes unless they are escaped with a backslash. So you could fix that line by replacing it with:
command = "ffmpeg -i \"concat:1.ts|2.ts\" -vcodec copy -acodec copy temp.mp4"
Another way to fix this line is to use single quotes for the string literal in Python, that way Python is not confused when the string itself contains a double quote:
command = 'ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4'
Once you have fixed the syntax error, you can then tackle the issue with using subprocess, as explained in this answer. I also wrote this answer to explain a helpful mental model for subprocess in general.
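If you prefer to keep the command as one string, another option (my addition, not from the answer) is shlex.split, which applies shell-like quoting rules and produces the argument list subprocess expects:
import shlex
import subprocess

command = 'ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4'

# The quoted concat argument stays as a single element, quotes removed.
args = shlex.split(command)
# ['ffmpeg', '-i', 'concat:1.ts|2.ts', '-vcodec', 'copy', '-acodec', 'copy', 'temp.mp4']

output, error = subprocess.Popen(args, universal_newlines=True,
                                 stdout=subprocess.PIPE,
                                 stderr=subprocess.PIPE).communicate()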

I was also struggling with a string argument containing spaces, while not wanting to use shell=True.
The solution was to use double quotes for the inside strings.
import subprocess
import sys

args = ['salt', '-G', 'environment:DEV', 'grains.setvals',
        '{"man_version": "man-dev-2.3"}']
try:
    p = subprocess.Popen(args, stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
    # communicate() returns (stdout, stderr)
    (stdout, stderr) = p.communicate()
except (subprocess.CalledProcessError, OSError) as err:
    print(err)
    sys.exit(1)
if p.returncode != 0:
    print("Failure in returncode of command:")

For anybody suffering from this pain: joining the params into a single string and passing shell=True also works, even when the params contain quotation marks.
params = ["ls", "-la"]
subprocess.check_output(" ".join(params), shell=True)
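If any of the params could contain spaces or shell metacharacters, a plain " ".join is fragile; shlex can quote each element for you (shlex.join is Python 3.8+, shlex.quote works on older 3.x):
import shlex
import subprocess

params = ["ls", "-la", "some dir with spaces"]

# shlex.join quotes each element as needed before it reaches the shell.
output = subprocess.check_output(shlex.join(params), shell=True)

# On older Python 3: " ".join(shlex.quote(p) for p in params)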

Related

Python and Bash - execution of specific command doesn't work from Python, manually in Bash it works [duplicate]

Sed working from the command line but not within a Python script

I'm trying to run the following command in a Python script:
sudo sed -i 's/auth-user-pass/auth-user-pass \/etc\/openvpn\/credentials/g' /etc/openvpn/US-East.ovpn
The command above runs fine in a terminal.
My Python script looks like this:
import subprocess
subprocess.Popen(["sudo", "sed", "-i", "'s/auth-user-pass/auth-user-pass", "\/etc\/openvpn\/credentials/g'", "/etc/openvpn/US-East.ovpn"], stdout=subprocess.PIPE, text=True)
But I get the following error when I run the script:
sed: -e expression #1, char 1: unknown command: `''
I thought some characters (like the single quotes) might need escaping, but nothing I've tried works.
I'm quite lost; can anyone help?
Since your command line is:
sudo sed -i 's/auth-user-pass/auth-user-pass \/etc\/openvpn\/credentials/g' /etc/openvpn/US-East.ovpn
You need to remove the single quotes (the shell does that for you) and you need to keep the whole of the single-quoted argument as one argument, not splitting it at spaces (the shell doesn't split at spaces inside a quoted string):
subprocess.Popen(["sudo", "sed", "-i", "s/auth-user-pass/auth-user-pass \/etc\/openvpn\/credentials/g", "/etc/openvpn/US-East.ovpn"], stdout=subprocess.PIPE, text=True)
The sed command does not expect to see the single quotes.
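If you would rather not work out the list by hand, shlex.split (my suggestion, not part of the answer) applies the same quoting rules the shell would:
import shlex
import subprocess

cmdline = (r"sudo sed -i "
           r"'s/auth-user-pass/auth-user-pass \/etc\/openvpn\/credentials/g' "
           r"/etc/openvpn/US-East.ovpn")

# The single-quoted sed expression becomes one list element, with the
# surrounding quotes removed (exactly what sed expects).
args = shlex.split(cmdline)

subprocess.Popen(args, stdout=subprocess.PIPE, text=True)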
Can you test this without the commas in the sed command parameter?
import subprocess
subprocess.Popen(["sudo", "sed", "-i", "'s/auth-user-pass/auth-user-pass \/etc\/openvpn\/credentials/g'", "/etc/openvpn/US-East.ovpn"], stdout=subprocess.PIPE, text=True)

Terminal command escaping issue

I am trying to escape the following, so I can grab the version of iDevice attached via USB:
system_profiler SPUSBDataType | sed -n -e 's/ */ /g' -e '/iPad/,/Version/p' -e '/iPhone/,/Version/p' | grep 'iPad\|iPhone\|Version' | awk 'NR%2{printf $0;next;}1'
So I can run it via Popen; however, every time I get an issue with iPad\|iPhone\|Version. My code is the following, in an attempt to escape the single quotes:
cmd1 = Popen([r'system_profiler', 'SPUSBDataType'], stdout=subprocess.PIPE)
cmd2 = Popen([r'sed','-n','-e','\'s/ */ /g\'','-e','\'/iPad/,/Version/p\'', '-e', '\'/iPhone/,/Version/p\''], stdin=cmd1.stdout, stdout=subprocess.PIPE)
cmd3 = Popen([r'grep', '\'iPad\|iPhone\|Version\''], stdin=cmd2.stdout, stdout=subprocess.PIPE)
cmd4 = Popen([r'awk', '\'NR%2{printf $0;next;}1\''], stdin=cmd3.stdout, stdout=subprocess.PIPE)
cmd1.stdout.close()
ver = cmd4.communicate()[0]
Use a raw string literal (or double the backslashes) so it is unambiguous that the backslashes are part of the value passed to grep. You don't need those embedded quotes either; the shell would have removed them, but Popen passes them to grep literally:
cmd3 = Popen([r'grep', r"iPad\|iPhone\|Version"], stdin=cmd2.stdout, stdout=subprocess.PIPE)
It'd be much easier to apply the string filtering and replacements in Python code, in my opinion.
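For instance, a rough sketch (my own illustration, not from the answer) of doing the filtering in Python instead of the sed/grep/awk pipeline; the exact system_profiler output format is an assumption:
import re
import subprocess

# Grab the raw USB report and filter it in Python.
out = subprocess.check_output(['system_profiler', 'SPUSBDataType'],
                              universal_newlines=True)

# Collapse runs of spaces and keep only the lines of interest.
lines = [re.sub(r' +', ' ', line).strip() for line in out.splitlines()]
devices = [line for line in lines
           if line.startswith(('iPad', 'iPhone', 'Version'))]
print(devices)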
I played around with grep and managed to extract what I needed from system_profiler. However, Martijn's answer is more suitable if you cannot grep for the necessary string.
prof = Popen(['system_profiler', 'SPUSBDataType'], stdout=subprocess.PIPE)
grep1 = Popen(['grep','-e','iPhone','-e','iPad','-e','iPod', '-A', '4'], stdin=prof.stdout, stdout=subprocess.PIPE)
grep2 = Popen(['grep', 'Version'], stdin=grep1.stdout, stdout=subprocess.PIPE)
prof.stdout.close()  # Allow prof to receive a SIGPIPE if grep1 exits.
stdoutver = grep2.communicate()[0]

Running bash commands in python with quotes

I am trying to run a bash command to start up a stream using MJPG-streamer in Python. I know the general process is to put the command in as a string, split the string, then pass the split string to Popen. The issue I'm having is that the command requires double quotes, and .split() doesn't treat them specially, so I get errors stating that the -d flag is an unrecognised option. The command runs fine if I run it directly, but I can't seem to get it running from Python (2.7).
from subprocess import Popen

def start_stream(device):
    stream_start_cmd = """
    sudo /usr/local/bin/mjpg_streamer -i
    "/usr/local/lib/input_uvc.so -d /dev/video{0} -y"
    -o "/usr/local/lib/output_http.so -w
    /usr/local/www -p {1}"
    """.format(device,
               '80' if device == 0 else '443 &')
    Popen(stream_start_cmd.split())

if __name__ == '__main__':
    start_stream(0)
Also, as a side note, is there a better way to format this mess?
The Python documentation says:
args should be a sequence of program arguments or else a single string.
Based on the command you provided, once split, we have
['sudo', '/usr/local/bin/mjpg_streamer', '-i', '"/usr/local/lib/input_uvc.so', '-d', '/dev/video{0}', '-y"', '-o', '"/usr/local/lib/output_http.so', '-w', '/usr/local/www', '-p', '{1}"']
You can see there's a double quote in front of /usr/local/lib/input_uvc.so and another after -y. Those stray double quotes make the args incorrect; see the sketch below for a list-based alternative.
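A list-based sketch along the lines of the accepted answer above; paths and flags are taken from the question, and the trailing '&' is dropped because Popen does not block:
from subprocess import Popen

def start_stream(device):
    port = '80' if device == 0 else '443'
    cmd = [
        'sudo', '/usr/local/bin/mjpg_streamer',
        # Each plugin string is a single argument, so no inner quotes are needed.
        '-i', '/usr/local/lib/input_uvc.so -d /dev/video{0} -y'.format(device),
        '-o', '/usr/local/lib/output_http.so -w /usr/local/www -p {0}'.format(port),
    ]
    # No '&' needed: Popen returns immediately and the stream keeps running.
    return Popen(cmd)

if __name__ == '__main__':
    start_stream(0)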

Running bash command on server

I am trying to run the bash command pdfcrack in Python on a remote server. This is my code:
bashCommand = "pdfcrack -f pdf123.pdf > myoutput.txt"
import subprocess
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE)
output, error = process.communicate()
I, however, get the following error message:
Non-option argument myoutput2.txt
Error: file > not found
Can anybody see my mistake?
The first argument to Popen is a list containing the command name and its arguments. > is not an argument to the command, though; it is shell syntax. You could simply pass the entire line to Popen and instruct it to use the shell to execute it:
process = subprocess.Popen(bashCommand, shell=True)
(Note that since you are redirecting the output of the command to a file, though, there is no reason to set its standard output to a pipe, because there will be nothing to read.)
A better solution, though, is to let Python handle the redirection.
process = subprocess.Popen(['pdfcrack', '-f', 'pdf123.pdf'], stdout=subprocess.PIPE)
with open('myoutput.txt', 'w') as fh:
    for line in process.stdout:
        fh.write(line)
        # Do whatever else you want with line
Also, don't use str.split as a replacement for the shell's word splitting. A valid command line like pdfcrack -f "foo bar.pdf" would be split into the incorrect list ['pdfcrack', '-f', '"foo', 'bar.pdf"'], rather than the correct list ['pdfcrack', '-f', 'foo bar.pdf'].
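If you do want to start from a single string, shlex.split performs shell-like word splitting for you; a quick illustration:
import shlex

print(shlex.split('pdfcrack -f "foo bar.pdf"'))
# ['pdfcrack', '-f', 'foo bar.pdf']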
> is interpreted by the shell, but is not valid otherwise.
So, that would work (don't split, use as-is):
process = subprocess.Popen(bashCommand, shell=True)
(and stdout=subprocess.PIPE isn't useful since all output is redirected to the output file)
But it would be better to use native Python for the redirection to the output file and to pass the arguments as a list (which handles quoting if needed):
with open("myoutput.txt","w") as f:
process = subprocess.Popen(["pdfcrack","-f","pdf123.pdf"], stdout=subprocess.PIPE)
f.write(process.read())
process.wait()
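An alternative sketch (my suggestion, not part of the answer) is to hand the open file directly to Popen as stdout, so Python never copies the data itself:
import subprocess

# The child process writes straight into the file; no PIPE needed.
with open("myoutput.txt", "w") as f:
    process = subprocess.Popen(["pdfcrack", "-f", "pdf123.pdf"], stdout=f)
    process.wait()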
Your mistake is the > in the command.
It isn't treated as redirection to a file, because normally bash handles that, and here you are running the command without bash.
Try shell=True if you want to use bash. Then you don't have to split the command into a list:
subprocess.Popen("pdfcrack -f pdf123.pdf > myoutput.txt", shell=True)
