I want to run a command like this:
grep -w 1 pattern <(tail -f mylogfile.log)
Basically, from a Python script I want to monitor a log file for a specific string and continue with the Python script as soon as I find it.
I am using os.system(), but it hangs. The same command works fine in bash.
I have a very old version of Python (v2.3), so I don't have the subprocess module.
Is there a way to achieve this?
In Python 2.3, you need to use subprocess from SVN
import subprocess
import shlex
subprocess.call(shlex.split("/bin/bash -c 'grep -w 1 pattern <(tail -f mylogfile.log)'"))
To be explicit, you need to install it from the SVN link above.
You need to call this with /bin/bash -c because of the <(...) process substitution you're using.
EDIT
If you want to solve this with os.system(), just wrap the command in /bin/bash -c, since you're using process substitution...
os.system("/bin/bash -c 'grep -w 1 pattern <(tail -f mylogfile.log)'")
First of all, the command I think you should be using is grep -w -m 1 'pattern' <(tail -f mylogfile.log)
For executing commands from Python, use the Popen constructor from the subprocess module. Read more at
http://docs.python.org/library/subprocess.html
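For example, a rough sketch with Popen, assuming the subprocess backport is installed; bash is required because of the <(...) process substitution:
import subprocess

cmd = "grep -w -m 1 'pattern' <(tail -f mylogfile.log)"
p = subprocess.Popen(["/bin/bash", "-c", cmd])
p.wait()  # returns once grep has seen the first match and exited
# ... continue with the rest of the script here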
If I understand correctly, you want to send the output to Python like this:
tail -f mylogfile.log | grep -w 1 pattern | python yourscript.py
i.e., read all updates to the log file, and send matching lines to your script.
To read from standard input, you can use the file-like object: sys.stdin.
So your script would look like:
import sys
# readlines() would block until EOF, which never arrives with tail -f,
# so iterate over stdin and handle lines as they come in
for line in sys.stdin:
    # process each line here
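If, as in the question, the goal is to continue the script as soon as the pattern shows up, a hedged variant breaks out of the loop on the first match ("pattern" is a placeholder for the string you watch for):
import sys

for line in sys.stdin:
    if "pattern" in line:
        break
# ... the rest of the script runs once the pattern has been seen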
Related
I work with shell scripting and am trying to learn Python scripting; any suggestions are welcome.
I want to achieve something like below:
Usage1:
ps_result=`ps -eaf|grep -v "grep" | grep "programmanager"`
Can we then use the ps_result variable straight away in Python code? If yes, how?
Usage2:
matched_line=`cat file_name |grep "some string"`
Can we use the matched_line variable in Python code as a list? If yes, how?
PS: If possible, assume I am writing the bash and Python code in one file; if that is not possible, please suggest a way. TIA
Yes, you can do it via environment variables.
First, define the environment variable for the shell using export:
$ export ps_result=`ps -eaf|grep -v "grep" | grep "programmanager"`
Then, import the os module in Python and read the environment variable:
$ python -c 'import os; ps_result=os.environ.get("ps_result"); print(ps_result)'
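For the second usage, a rough sketch along the same lines, assuming matched_line has been exported with export just like ps_result above; splitlines() turns the multi-line grep output into a Python list:
import os

matched_line = os.environ.get("matched_line", "")
matched_lines = matched_line.splitlines()  # one list entry per matching line
print(matched_lines)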
Second question first: if you run python -, Python will run the script piped to it on stdin. There are several functions in Python's subprocess module that let you run other programs. So you could write
test.sh
#!/bin/sh
python - << endofprogram
import subprocess as subp
result = subp.run('ps -eaf | grep -v "grep" | grep "python"',
                  shell=True, capture_output=True)
if result.returncode == 0:
    matched_lines = result.stdout.decode().split("\n")
    print(matched_lines)
endofprogram
In this example we pipe via the shell, but Python can also chain stdout/stdin itself, albeit more verbosely.
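For instance, a sketch of the same pipeline chained entirely in Python; the process name comes from the question, everything else is illustrative:
import subprocess as subp

# ps -eaf | grep programmanager, wired together via Popen pipes
ps = subp.Popen(["ps", "-eaf"], stdout=subp.PIPE)
grep = subp.Popen(["grep", "programmanager"], stdin=ps.stdout, stdout=subp.PIPE)
ps.stdout.close()  # let ps receive SIGPIPE if grep exits first
out, _ = grep.communicate()
matched_lines = out.decode().splitlines()
print(matched_lines)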
In Jupyter, this can be achieved with the %run line magic, like this:
%run -i somescript.py -f -b
But that magic does not work from within a Python script file. I tried this:
os.system("python3 -i somescript.py -f -b")
A further complication is that I want to use a variable, like this:
%run -i somescript.py -f $LINE -b
Have you tried using subprocess?
That is probably not the best answer, but you can call bash from a Python script, which in turn calls another Python script:
import subprocess
subprocess.check_call(["echo", "Hello world!"])
Is there a way to create a Python script which wraps an entire bash command, including the pipes?
For example, if I have the following simple script
import sys
print sys.argv
and call it like so (from bash or IPython), I get the expected outcome:
[pkerp#pendari trell]$ python test.py ls
['test.py', 'ls']
If I add a pipe, however, the output of the script gets redirected to the pipe sink:
[pkerp#pendari trell]$ python test.py ls > out.txt
And the > out.txt portion is not in sys.argv. I understand that the shell automatically processes this redirection, but I'm curious if there's a way to force the shell to ignore it and pass it to the process being called.
The point of this is to create something like a wrapper for the shell. I'd like to run the commands as usual, but keep track of the strace output for each command (including the pipes). Ideally I'd like to keep all of the bash features, such as tab-completion, up/down arrows, and history search, and then just pass the completed command through a Python script which invokes a subprocess to handle it.
Is this possible, or would I have to write my own shell to do this?
Edit
It appears I'm asking the exact same thing as this question.
The only thing you can do is pass the entire shell command as a string, then let Python pass it back to a shell for execution.
$ python test.py "ls > out.txt"
Inside test.py, use something like:
subprocess.call("strace " + sys.argv[1], shell=True, executable="/bin/bash")
to ensure the entire string is passed to the shell (and bash, specifically).
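Putting it together, a minimal sketch of test.py under those assumptions (the whole pipeline arrives as one quoted argument, e.g. python test.py "ls > out.txt"):
import subprocess
import sys

command = sys.argv[1]
# run the command under strace, letting bash handle pipes and redirections
subprocess.call("strace " + command, shell=True, executable="/bin/bash")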
Well, I don't quite see what you are trying to do. The general approach would be to give the desired output destination to the script using command line options: python test.py ls --output=out.txt. Incidentally, strace writes to stderr. You could capture everything using strace python test.py > out 2> err if you want to save everything...
Edit: If your script writes to stderr as well you could use strace -o strace_out python test.py > script_out 2> script_err
Edit2: Okay, I understand better what you want. My suggestion is this: Write a bash helper:
function process_and_evaluate()
{
    strace -o /tmp/output/strace_output "$@"
    /path/to/script.py /tmp/output/strace_output
}
Put this in a file like ~/helper.sh. Then open a bash shell and source it using . ~/helper.sh.
Now you can run it like this: process_and_evaluate ls -lA.
Edit3:
To capture output/error you could extend the function like this:
function process_and_evaluate()
{
    out=$1
    err=$2
    shift 2
    strace -o /tmp/output/strace_output "$@" > "$out" 2> "$err"
    /path/to/script.py /tmp/output/strace_output
}
You would have to use the (less obvious) process_and_evaluate out.txt err.txt ls -lA.
This is the best that I can come up with...
At least in your simple example, you could just run the Python script as an argument to echo, e.g.
$ echo $(python test.py ls) > test.txt
$ more test.txt
['test.py','ls']
Enclosing a command in parentheses preceded by a dollar sign first executes the contents, then passes the output as arguments to echo.
I'm working on a small project where I need to control a console player via Python. This example command works perfectly in a Linux terminal:
mplayer -loop 0 -playlist <(find "/mnt/music/soundtrack" -type f | egrep -i '(\.mp3|\.wav|\.flac|\.ogg|\.avi|\.flv|\.mpeg|\.mpg)'| sort)
In Python I'm doing the following:
command = """mplayer -loop 0 -playlist <(find "/mnt/music/soundtrack" -type f | egrep -i '(\.mp3|\.wav|\.flac|\.ogg|\.avi|\.flv|\.mpeg|\.mpg)'| sort)"""
os.system(command)
The problem is that when I run it from Python it gives me an error:
sh: 1: Syntax error: "(" unexpected
I'm really confused here because it is the exact same string. Why doesn't the second method work?
Thanks.
Your default user shell is probably bash, but Python's os.system() runs the command with /bin/sh on Linux, and sh does not understand the <(...) process substitution.
A workaround is to use subprocess.check_call() with shell=True and executable="/bin/bash", so that the command is executed by bash:
import subprocess
command = """mplayer -loop 0 -playlist <(find "/mnt/music/soundtrack" -type f | egrep -i '(\.mp3|\.wav|\.flac|\.ogg|\.avi|\.flv|\.mpeg|\.mpg)'| sort)"""
subprocess.check_call(command, shell=True, executable="/bin/bash")
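Alternatively, a hedged sketch that sidesteps the shell entirely by building the playlist in Python and writing it to a temporary file; the directory and extensions come from the question, the rest is an assumption:
import os
import re
import subprocess
import tempfile

# collect matching files, roughly what find | egrep | sort did
pattern = re.compile(r'\.(mp3|wav|flac|ogg|avi|flv|mpeg|mpg)$', re.IGNORECASE)
files = []
for root, dirs, names in os.walk("/mnt/music/soundtrack"):
    for name in names:
        if pattern.search(name):
            files.append(os.path.join(root, name))
files.sort()

# write the playlist to a temporary file and hand its path to mplayer
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as playlist:
    playlist.write("\n".join(files))
subprocess.check_call(["mplayer", "-loop", "0", "-playlist", playlist.name])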
Your Python call os.system() is probably just using a different shell than the one you're using in the terminal:
os.system() execute command under which linux shell?
The shell spawned by os.system may not support the <(...) process substitution. With subprocess you can pass executable='/bin/bash' to make sure bash runs the command:
import subprocess

COMND = subprocess.Popen('command', shell=True, executable='/bin/bash',
                         stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
COMND_bytes = COMND.stdout.read() + COMND.stderr.read()
print(COMND_bytes)
I'd like Popen to execute:
grep -i --line-buffered "grave" data/*.txt
When run from the shell, this gives me the desired result. If I start a Python REPL in the very same directory where I tested grep and follow the instructions from the docs, I obtain what should be the proper argument list to feed Popen with:
['grep', '-i', '--line-buffered', 'grave', 'data/*.txt']
The result of p = subprocess.Popen(args) is
grep: data/*.txt: No such file or directory
and if I try p = subprocess.Popen(args, shell=True), I get:
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
Any help on how to get the desired result? I'm on Mac OS X Lion.
If you type * in bash, the shell expands it to the matching files in the given directory before executing the command. Python's Popen does no such thing, so calling Popen like that tells grep there is a file literally called *.txt in the data directory, instead of all the .txt files in that directory. That file doesn't exist, so you get the expected error.
To solve this you can tell python to run the command through the shell by passing shell=True to Popen:
subprocess.Popen('grep -i --line-buffered grave data/*.txt', shell=True)
Which gets translated to:
subprocess.Popen(['/bin/sh', '-c', 'grep -i --line-buffered "grave" data/*.txt'])
As explained in the documentation of Popen.
You have to use a string instead of a list here, because you want to execute /bin/sh -c "grep -i --line-buffered grave data/*.txt" (note the quotes around the whole command, making it a single argument to sh). If you use a list, this command is run instead: /bin/sh -c grep -i --line-buffered grave data/*.txt, where only grep reaches sh -c and the remaining items become sh's positional parameters, so grep runs with no arguments and prints its usage message.
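A hedged illustration of that difference:
import subprocess

# string + shell=True: the whole line goes to /bin/sh -c, so the glob expands
subprocess.Popen('grep -i --line-buffered grave data/*.txt', shell=True)

# list + shell=True: only 'grep' reaches sh -c; the rest become sh's positional
# parameters, so grep runs without arguments and prints its usage message
subprocess.Popen(['grep', '-i', '--line-buffered', 'grave', 'data/*.txt'], shell=True)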
The problem is that the shell does the file globbing (data/*.txt) for you.
You will need to do it yourself, for example by using the glob module:
import glob
cmd_line = ['grep', '-i', '--line-buffered', 'grave'] + glob.glob('data/*.txt')
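A hedged completion of that snippet, running grep on the expanded file list and collecting its output:
import glob
import subprocess

cmd_line = ['grep', '-i', '--line-buffered', 'grave'] + glob.glob('data/*.txt')
p = subprocess.Popen(cmd_line, stdout=subprocess.PIPE)
out, _ = p.communicate()
print(out.decode())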