Problems running terminal command via Python

I'm working on a small project where I need to control a console player via python. This example command works perfectly on the Linux terminal:
mplayer -loop 0 -playlist <(find "/mnt/music/soundtrack" -type f | egrep -i '(\.mp3|\.wav|\.flac|\.ogg|\.avi|\.flv|\.mpeg|\.mpg)'| sort)
In Python I'm doing the following:
command = """mplayer -loop 0 -playlist <(find "/mnt/music/soundtrack" -type f | egrep -i '(\.mp3|\.wav|\.flac|\.ogg|\.avi|\.flv|\.mpeg|\.mpg)'| sort)"""
os.system(command)
The problem is that when I run it from Python, it gives me this error:
sh: 1: Syntax error: "(" unexpected
I'm really confused here because it is the exact same string. Why doesn't the second method work?
Thanks.

Your default user shell is probably bash, which understands the <(...) process-substitution syntax. Python's os.system command calls sh by default on Linux, and sh does not.
A workaround is to use subprocess.check_call() with shell=True and executable='/bin/bash', so the command is handed to bash instead of sh:
import subprocess
command = """mplayer -loop 0 -playlist <(find "/mnt/music/soundtrack" -type f | egrep -i '(\.mp3|\.wav|\.flac|\.ogg|\.avi|\.flv|\.mpeg|\.mpg)'| sort)"""
subprocess.check_call(command, shell=True, executable='/bin/bash')

Your os.system() call is probably just using a different shell than the one you're using in the terminal:
os.system() execute command under which linux shell?
The shell spawned by os.system may not support the <(...) process-substitution syntax.
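If you want to stay with os.system(), one workaround is to hand the whole command to bash explicitly. A minimal sketch (the playlist command is simplified here and assumes it contains no single quotes, so it can be embedded directly):
import os

# Wrap the command in bash -c so the <(...) syntax is handled by bash, not sh.
# Sketch only: the embedded command must not itself contain single quotes.
command = 'mplayer -loop 0 -playlist <(find /mnt/music/soundtrack -type f | sort)'
os.system("/bin/bash -c '" + command + "'")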

import subprocess
# 'command' here is a placeholder for the shell command string you want to run
COMND = subprocess.Popen('command', shell=True, stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
COMND_bytes = COMND.stdout.read() + COMND.stderr.read()
print(COMND_bytes)
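Be aware that reading stdout and then stderr one after the other can deadlock if the child writes a lot to the second pipe while you are still blocked on the first; communicate() reads both streams safely. A minimal sketch with a placeholder command:
import subprocess

# 'echo hello' is just a placeholder; substitute your real command string
proc = subprocess.Popen('echo hello', shell=True,
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()  # reads both pipes without deadlocking
print(out + err)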

Related

Can we use a variable containing a Linux command result in Python code

I work with shell scripting and am trying to learn Python scripting; any suggestions are welcome.
I want to achieve something like below:
Usage1:
ps_result=`ps -eaf|grep -v "grep" | grep "programmanager"`
Can we then use the ps_result variable directly in Python code? If yes, how?
Usage2:
matched_line=`cat file_name |grep "some string"`
Can we use the matched_line variable in Python code as a list? If yes, how?
PS: If possible, assume I am writing the bash and Python code in one file; if that is not possible, please suggest a way. TIA
Yes, you can do it via environment variables.
First, define the environment variable for the shell using export:
$ export ps_result=`ps -eaf|grep -v "grep" | grep "programmanager"`
Then, import the os module in Python and read the environment variable:
$ python -c 'import os; ps_result=os.environ.get("ps_result"); print(ps_result)'
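For the second usage the same trick works, and splitting the variable on newlines gives you a Python list. A minimal sketch, reusing the file_name and search string from the question:
$ export matched_line=`cat file_name | grep "some string"`
$ python -c 'import os; lines = os.environ.get("matched_line", "").splitlines(); print(lines)'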
Answering your second question first: if you run python -, Python will run the script piped to it on stdin. There are several functions in Python's subprocess module that let you run other programs. So you could write:
test.sh
#!/bin/sh
python - << endofprogram
import subprocess as subp
result = subp.run('ps -eaf | grep -v "grep" | grep "python"',
                  shell=True, capture_output=True)
if result.returncode == 0:
    matched_lines = result.stdout.decode().split("\n")
    print(matched_lines)
endofprogram
In this example we pipe via the shell, but Python can also chain stdout/stdin itself, albeit more verbosely.
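For completeness, here is a rough sketch of the same pipeline chained without the shell, connecting ps directly to grep through a pipe (it assumes both binaries are on the PATH):
import subprocess as subp

# ps -eaf | grep programmanager, built from two processes joined by a pipe
ps = subp.Popen(['ps', '-eaf'], stdout=subp.PIPE)
grep = subp.Popen(['grep', 'programmanager'], stdin=ps.stdout, stdout=subp.PIPE)
ps.stdout.close()  # allow ps to receive SIGPIPE if grep exits first
matched_lines = grep.communicate()[0].decode().splitlines()
print(matched_lines)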

Running bash command with redirection in python

I want to redirect the output of a bash process substitution into e.g. grep, and call it from Python.
Even though I am using shell=True in subprocess, this does not seem to work.
Example that works in bash:
grep foo <(echo "fooline")
Example that gives the error /bin/sh: 1: Syntax error: "(" unexpected in python 3.7.3, Ubuntu 19.04:
#!/usr/bin/env python3
import subprocess
subprocess.call('grep foo <(echo "fooline")', shell=True)
According to answers like these, redirection should work with shell=True (and it does for redirecting actual files, but not return values).
EDIT:
Added shebang and python version.
The inconsistency that you observe is due to the fact that shell=True gives you a sh shell, not bash.
The following is valid in bash but not in sh.
grep foo <(echo "fooline")
Example output:
sh-3.2$ grep foo <(echo "fooline")
sh: syntax error near unexpected token `('
If you use a valid sh expression your approach will work. Alternatively you can specify which shell to use with executable='/bin/bash'. You can also use something like:
subprocess.Popen(['/bin/bash', '-c', cmd])
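For instance, a minimal sketch of the executable option mentioned above, applied to the failing command:
import subprocess

# shell=True still goes through a shell, but executable= picks bash instead of sh
subprocess.call('grep foo <(echo "fooline")', shell=True, executable='/bin/bash')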
The error is a shell error and has nothing to do with Python: sh chokes on the parentheses of the process-substitution syntax.
/bin/sh: 1: Syntax error: "(" unexpected
Let's add a shebang (reference: Should I put #! (shebang) in Python scripts, and what form should it take?)
Also, stop using shell=True altogether. Use real pipes and argument lists (note: this has been tested and works on Windows with a native grep command, so it is portable):
#!/usr/bin/env python3
import subprocess
p = subprocess.Popen(['grep', 'foo'], stdin=subprocess.PIPE)
p.stdin.write(b"fooline\n")
p.stdin.close()
p.wait()
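On Python 3.5 and later the same idea can be written more compactly with subprocess.run and its input parameter; a minimal sketch:
#!/usr/bin/env python3
import subprocess

# input= feeds grep's stdin; the return code is 0 when a match was found
result = subprocess.run(['grep', 'foo'], input=b"fooline\n")
print(result.returncode)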

How to force os.system() to use bash instead of shell

I've tried what's told in How to force /bin/bash interpreter for oneliners
By doing
os.system('GREPDB="my command"')
os.system('/bin/bash -c \'$GREPDB\'')
However, no luck. Unfortunately I need to run this command with bash, and subprocess isn't an option in this environment; I'm limited to Python 2.4. Any suggestions to get me in the right direction?
Both commands are executed in different subshells.
Setting variables in the first system call does not affect the second system call.
You need to put both commands in one string (combining them with ;).
>>> import os
>>> os.system('GREPDB="echo 123"; /bin/bash -c "$GREPDB"')
123
0
NOTE You need to use "$GREPDB" (double quotes) instead of '$GREPDB' (single quotes). Otherwise it is interpreted literally instead of being expanded.
If you can use subprocess:
>>> import subprocess
>>> subprocess.call('/bin/bash -c "$GREPDB"', shell=True,
... env={'GREPDB': 'echo 123'})
123
0
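One caveat: env= replaces the child's entire environment, so commands that rely on PATH or similar variables may break; merging with os.environ is usually safer. A minimal sketch:
import os
import subprocess

# keep the existing environment and add GREPDB on top of it
env = dict(os.environ, GREPDB='echo 123')
subprocess.call('/bin/bash -c "$GREPDB"', shell=True, env=env)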
The solution below still initially invokes a shell, but it switches to bash for the command you are trying to execute:
os.system('/bin/bash -c "echo hello world"')
I use this:
subprocess.call(["bash", "-c", cmd])
(OK, ignore this: I had not noticed that subprocess is not an option here.)
subprocess.Popen(cmd, shell=True, executable='/bin/bash')
I searched for this for days and finally found code that really works:
import subprocess
def bash_command(cmd):
    subprocess.Popen(['/bin/bash', '-c', cmd])

code = "abcde"
# you can use echo options such as -e
bash_command('echo -ne "' + code + '"')
Output:
abcde

python 2.3 how to run a piped command

I want to run a command like this
grep -w 1 pattern <(tail -f mylogfile.log)
basically from a python script i want to monitor a log file for a specific string and continue with the python script as soon as i found that.
I am using os.system(), but that is hanging. The same command in bash works good.
I have a very old version of python (v2.3) and so don't have sub-process module.
Do we have a way to achieve this?
In Python 2.3, you need to install the subprocess module from SVN (it only became part of the standard library in Python 2.4):
import subprocess
import shlex
subprocess.call(shlex.split("/bin/bash -c 'grep -w 1 pattern <(tail -f mylogfile.log)'"))
To be explicit, you need to install it from the SVN link above.
You need to call this with /bin/bash -c because of the <(...) process substitution you're using.
EDIT
If you want to solve this with os.system(), just wrap the command in /bin/bash -c since you're using shell redirection...
os.system("/bin/bash -c 'grep -w 1 pattern <(tail -f mylogfile.log)'")
First of all, the command I think you should be using is grep -w -m 1 'pattern' <(tail -f mylogfile.log)
For executing commands in Python, use the Popen constructor from the subprocess module. Read more at
http://docs.python.org/library/subprocess.html
If I understand correctly, you want to send the output to Python like this:
tail -f mylogfile.log | grep -w 1 pattern | python yourscript.py
i.e., read all updates to the log file, and send matching lines to your script.
To read from standard input, you can use the file-like object: sys.stdin.
So your script would look like
import sys
for line in sys.stdin.readlines():
    # process each line here
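One caveat: readlines() waits for end-of-file, which never arrives when the input comes from tail -f. Iterating sys.stdin directly and breaking on the first line lets the script continue as soon as the pattern appears; a minimal sketch:
import sys

for line in sys.stdin:
    # grep has already filtered, so the first line is the match we care about
    break
# ... continue with the rest of the script here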

Python Popen grep

I'd like Popen to execute:
grep -i --line-buffered "grave" data/*.txt
When run from the shell, this gives me the wanted result. If I start a Python REPL in the very same directory where I tested grep and follow the instructions from the docs, I obtain what should be the proper argument list to feed Popen with:
['grep', '-i', '--line-buffered', 'grave', 'data/*.txt']
The result of p = subprocess.Popen(args) is
grep: data/*.txt: No such file or directory
and if I try p = subprocess.Popen(args, shell=True), I get:
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
Any help on how to get the wanted result? I'm on MacOS Lion.
If you type * in bash the shell expands it to the files in the given directory before executing the command. Python's Popen does no such thing, so what you're doing when you call Popen like that is telling grep there is a file called *.txt in the data directory, instead of all the .txt files in the data directory. That file doesn't exist and you get the expected error.
To solve this you can tell python to run the command through the shell by passing shell=True to Popen:
subprocess.Popen('grep -i --line-buffered grave data/*.txt', shell=True)
Which gets translated to:
subprocess.Popen(['/bin/sh', '-c', 'grep -i --line-buffered "grave" data/*.txt'])
As explained in the documentation of Popen.
You have to use a string instead of a list here, because you want to execute /bin/sh -c "grep -i --line-buffered "grave" data/*.txt" (N.B. the quotes around the command make it a single argument to sh). If you use a list, this command is run instead: /bin/sh -c grep -i --line-buffered "grave" data/*.txt, where only grep is treated as the command, so you get the output of running grep with no arguments.
The problem is that the shell does the file globbing for you (data/*.txt).
You will need to do it yourself, for example by using the glob module:
import glob
import subprocess
cmd_line = ['grep', '-i', '--line-buffered', 'grave'] + glob.glob('data/*.txt')
subprocess.Popen(cmd_line)
