Using Python 2.4, I want to capture the output of a mysql command. One caveat is that I need to pipe the SQL statement in using an echo:
echo 'SELECT user FROM mysql.user;' | mysql
I have seen examples using call, os.system, and popen, but which is best for my version of Python, and for capturing the output in a tuple?
Thanks
The subprocess module is the most flexible tool for running commands and controlling the input and output. The following runs a command and captures the output as a list of lines:
import subprocess
p = subprocess.Popen(['/bin/bash', '-c', "echo 'select user from mysql.user;' | mysql"],
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
lines = [line for line in p.stdout]
On Windows, bash -c would be replaced with cmd /c.
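If you want the output in a tuple, as the question asks, here is a minimal variant of the same approach (assuming Python 2.4, where subprocess first appeared, and that mysql can authenticate without prompting):
import subprocess

p = subprocess.Popen(['/bin/bash', '-c', "echo 'select user from mysql.user;' | mysql"],
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, _ = p.communicate()          # waits for the command and drains the pipe safely
lines = tuple(output.splitlines())   # a tuple of output lines, as requested
communicate is generally preferable to iterating over p.stdout here, since it also waits for the process to exit.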
I am using speedtest-cli to get my internet speed in a python script. I would like to run this command in shell via subprocess.Popen.
Here is the command in terminal:
`speedtest-cli --share > output.log`
speedtest-cli runs the test, whilst --share provides me with an additional link in the output, pointing to an image of the speedtest result. Here is the content of output.log:
Retrieving speedtest.net configuration...
Testing from M-net (xxx.xxx.xxx.xxx)...
Retrieving speedtest.net server list...
Selecting best server based on ping...
Hosted by XXXXXXXXXXXXXXXXXXXXXX [16.12 km]: 20.902 ms
Testing download speed................................................................................
Download: 48.32 Mbit/s
Testing upload speed......................................................................................................
Upload: 12.49 Mbit/s
Share results: http://www.speedtest.net/result/670483456.png
If I run the command in terminal, I get all the test results as well as the link in the target file as expected. I confirmed it is all stdout and not another channel by using this grep trick: command | grep .
I am trying to run it in Python as follows:
subprocess.Popen(['speedtest-cli', '--share', '>', 'output.log'],
                 stdout=subprocess.PIPE, shell=True)
...and I also tried putting the output into the file directly via python:
with open('output.log', 'w') as f:
    Popen(['speedtest-cli', '--json', '--share'], stdout=f, shell=True)
Neither of these works. I get a nice file created with the latter approach, BUT the link is not included! (the last line in the output above).
Against all the warnings about deprecation and the better safety of the subprocess module, I became desperate and tried os.system():
os.system('speedtest-cli --share > output.log')
Annoyingly, this works... the full output along with the link is captured in the file.
What is going on here? How do I get the link to be captured using Popen?
I'm using Python 3.5
When using shell=True, your argument to Popen needs to be a string, not a list:
subprocess.Popen('speedtest-cli --json --share > output.log',
                 stdout=subprocess.PIPE, shell=True)
Compare:
>>> subprocess.Popen('echo hello', shell=True)
>>> hello
And:
>>> subprocess.Popen(['echo', 'hello'], shell=True)
>>>
When you pass a list with shell=True, only the first item is treated as the command; the remaining items are passed as arguments to the shell itself, not to your command, so they are effectively ignored.
If you want to collect the output yourself, consider subprocess.check_output:
>>> output = subprocess.check_output(['echo', 'hello'])
>>> output
b'hello\n'
Or:
>>> output = subprocess.check_output('echo hello', shell=True)
The check_output function works in both Python 2 (2.7 and later) and Python 3. In Python 3.5 and later, the run function is also available.
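For example, a minimal sketch for Python 3.5 (the asker's version) that captures the speedtest output and writes it to the file from Python rather than relying on shell redirection:
import subprocess

result = subprocess.run('speedtest-cli --share', shell=True,
                        stdout=subprocess.PIPE)   # capture_output= only exists in 3.7+
with open('output.log', 'wb') as f:
    f.write(result.stdout)                        # bytes, including the share-link line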
cmd= ["sudo", "cat", "{filepath}".format(filepath=filepath), "|","egrep", "-v","\'{filteruser}\'".format(filteruser=filteruser)]
filepath and filteruser can be blank too.
Config file below:
[plugin_haproxy]
user= tara
host=localhost
filepath=/etc/haproxy/haproxy.global.inc
filter=
This is the command I want to run via subprocess. Checking the above variable values with pdb shows the value below, which looks right:
['sudo', 'cat', '/etc/haproxy/haproxy.global.inc', '|', 'egrep', '-v', "''"]
Manually running the command sudo cat /etc/haproxy/haproxy.global.inc | egrep -v "''" on the terminal works fine.
Why is subprocess not able to process it?
"cat: |: No such file or directory\ncat: egrep: No such file or
directory\ncat:
Your shell will take a command such as a | b | c and turn it into three separate processes (a, b, c), attaching the output of one to the input of the next in the chain.
['sudo', 'cat', '/etc/haproxy/haproxy.global.inc', '|', 'egrep', '-v', "''"]
The above will execute the sudo command (which will in turn fork the cat command and pass all the remaining args to cat). It doesn't understand the "|" symbol; it just treats it as another arg. It is cat that is complaining that it cannot open "|", "egrep", etc., as it is treating these as file names.
You can try Popen with shell=True. I have not tried that, and I'm not sure whether it will handle pipes.
The other option is to use Popen to execute the sudo command only (no filter) and then use Popen.communicate to read the output from the command in python and do the empty line filtering in python.
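A minimal sketch of that second option (the filteruser value here is illustrative, not from the original post):
import subprocess

# Run only the sudo/cat part; with no shell involved, no pipe syntax is needed.
p = subprocess.Popen(['sudo', 'cat', '/etc/haproxy/haproxy.global.inc'],
                     stdout=subprocess.PIPE)
out, _ = p.communicate()
filteruser = 'tara'   # illustrative value
lines = [line for line in out.decode().splitlines()
         if not filteruser or filteruser not in line]   # keep everything if the filter is blank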
EDIT:
I did a quick check of the shell=True option. Here is the script:
#!/usr/bin/env python
import subprocess
subprocess.call(["ls", "-l", "|", "cat"], shell=False)
With shell=False, I get the following error: ls: cat: No such file or directory and ls: |: No such file or directory. This is expected as ls is trying to list the contents of | and cat.
With shell=True, I get the desired output. That is, the output of ls is piped to cat. The shell is processing the |.
You can either use shell=True as was already suggested, or use subprocess to pipe:
p1 = subprocess.Popen(['sudo', 'cat', '/etc/haproxy/haproxy.global.inc'], stdout=subprocess.PIPE)
p2 = subprocess.Popen(['egrep', '-v', "''"], stdin=p1.stdout, stdout=subprocess.PIPE)
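To finish that pipeline, close p1's stdout in the parent and read from p2; this is the pattern recommended in the subprocess documentation:
p1.stdout.close()               # lets p1 receive SIGPIPE if p2 exits first
output, err = p2.communicate()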
Command and filter above
command = ["sudo", "cat", "{filepath}".format(filepath=filepath)]
userfilter = ["egrep", "-v", "\"{filteruser}\"".format(filteruser=filteruser)***]
Subprocess
cmdout = subprocess.Popen(command, stdout=subprocess.PIPE)
filtration = subprocess.Popen(userfilter, stdin=cmdout.stdout, stdout=subprocess.PIPE)
output, err = filtration.communicate()
Yes, I used a two-step solution; I could not find a single-step one that works better. Refer here: Python subprocess command with pipe.
I am trying to run a bash command in Python (2.7) to start up a stream using MJPG-streamer. I know the general process is to put the command in a string, split the string, and pass the result to Popen. The issue I'm having is that the command requires double quotes, and .split() does not respect them, so I get errors stating that the -d flag is an unrecognised option. The command runs fine if I run it directly, but I can't seem to get it running from Python.
from subprocess import Popen
def start_stream(device):
    stream_start_cmd = """
    sudo /usr/local/bin/mjpg_streamer -i
    "/usr/local/lib/input_uvc.so -d /dev/video{0} -y"
    -o "/usr/local/lib/output_http.so -w
    /usr/local/www -p {1}"
    """.format(device,
               '80' if device == 0 else '443 &')
    Popen(stream_start_cmd.split())

if __name__ == '__main__':
    start_stream(0)
Also side note, is there any better way to format this mess?
The Python documentation says:
args should be a sequence of program arguments or else a single string.
Based on the command you provided, once split, we have
['sudo', '/usr/local/bin/mjpg_streamer', '-i', '"/usr/local/lib/input_uvc.so', '-d', '/dev/video{0}', '-y"', '-o', '"/usr/local/lib/output_http.so', '-w', '/usr/local/www', '-p', '{1}"']
You can see there is a double quote in front of /usr/local/lib/input_uvc.so and another after -y. Those stray double quotes, and the quoted groups broken across list items, make the args incorrect.
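One way around this, not spelled out above, is shlex.split, which understands shell-style quoting and keeps each quoted plugin string together as a single argument:
import shlex

cmd = ('sudo /usr/local/bin/mjpg_streamer '
       '-i "/usr/local/lib/input_uvc.so -d /dev/video0 -y" '
       '-o "/usr/local/lib/output_http.so -w /usr/local/www -p 80"')
args = shlex.split(cmd)
# Each plugin string survives as one element, with the quotes stripped:
# [..., '-i', '/usr/local/lib/input_uvc.so -d /dev/video0 -y', '-o', ...]
Note that the trailing & in the original command is also shell syntax; Popen does not block while the child runs, so it can simply be dropped.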
I am trying to run the bash command pdfcrack in Python on a remote server. This is my code:
bashCommand = "pdfcrack -f pdf123.pdf > myoutput.txt"
import subprocess
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE)
output, error = process.communicate()
I, however, get the following error message:
Non-option argument myoutput2.txt
Error: file > not found
Can anybody see my mistake?
The first argument to Popen is a list containing the command name and its arguments. > is not an argument to the command, though; it is shell syntax. You could simply pass the entire line to Popen and instruct it to use the shell to execute it:
process = subprocess.Popen(bashCommand, shell=True)
(Note that since you are redirecting the output of the command to a file, though, there is no reason to set its standard output to a pipe, because there will be nothing to read.)
A better solution, though, is to let Python handle the redirection.
process = subprocess.Popen(['pdfcrack', '-f', 'pdf123.pdf'], stdout=subprocess.PIPE)
with open('myoutput.txt', 'wb') as fh:   # 'wb': the pipe yields bytes on Python 3
    for line in process.stdout:
        fh.write(line)
        # Do whatever else you want with line
Also, don't use str.split as a replacement for the shell's word splitting. A valid command line like pdfcrack -f "foo bar.pdf" would be split into the incorrect list ['pdfcrack', '-f', '"foo', 'bar.pdf"'], rather than the correct list ['pdfcrack', '-f', 'foo bar.pdf'].
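If you do need to split a shell-style command line in Python, shlex.split handles the quoting correctly (a side note, not required for the fix above):
import shlex
shlex.split('pdfcrack -f "foo bar.pdf"')   # ['pdfcrack', '-f', 'foo bar.pdf']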
> is interpreted by the shell, but is not valid otherwise.
So, that would work (don't split, use as-is):
process = subprocess.Popen(bashCommand, shell=True)
(and stdout=subprocess.PIPE isn't useful since all output is redirected to the output file)
But it is better to do the redirection in native Python, opening the output file yourself and passing the arguments as a list (which handles quoting if needed):
with open("myoutput.txt","w") as f:
process = subprocess.Popen(["pdfcrack","-f","pdf123.pdf"], stdout=subprocess.PIPE)
f.write(process.read())
process.wait()
Your mistake is the > in the command.
It is not treated as a redirection to a file because normally bash handles that, and here you are running the command without bash.
Try shell=True if you want to use bash; then you don't have to split the command into a list.
subprocess.Popen("pdfcrack -f pdf123.pdf > myoutput.txt", shell=True)
I'm trying to run a Perl script from Python. I know that if I run the Perl script in a terminal and want its output written to a file, I need to add > results.txt after perl myCode.pl. This works fine in the terminal, but when I try to do the same from Python it doesn't work.
This the code:
import shlex
import subprocess
args_str = "perl myCode.pl > results.txt"
args = shlex.split(args_str)
subprocess.call(args)
Despite the > results.txt, the output does not go to that file; it appears on the command line instead.
subprocess.call("perl myCode.pl >results.txt", shell=True)
or
subprocess.call(["sh", "-c", "perl myCode.pl >results.txt"])
or
with open('results.txt', 'wb', 0) as file:
    subprocess.call(["perl", "myCode.pl"], stdout=file)
The first two invoke a shell to execute the shell command perl myCode.pl > results.txt. The last one executes perl directly by having call do the redirection itself. This is the more reliable solution.
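As a small follow-up, not part of the original answer: call returns the process's exit status, so you can also verify that perl succeeded:
import subprocess

with open('results.txt', 'wb', 0) as file:
    status = subprocess.call(["perl", "myCode.pl"], stdout=file)
if status != 0:
    raise RuntimeError("perl exited with status %d" % status)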