os.system giving weird output in linux - python

I am running this Python function on SUSE Linux to grep the IP of a node from /etc/hosts:
def mm_node():
    import os
    node_name = os.system("`cat /etc/hosts | egrep -i mm | grep om | awk '{print $1}'`")
    return node_name

mm_node()
As a result, it shows this weird output:
sh: 192.168.10.10: command not found
instead of
192.168.10.10
If I run the shell command
(cat /etc/hosts | egrep -i mm | grep om | awk '{print $1}')
directly at the Linux command prompt, it gives the output as
192.168.10.10

Backquotes tell the shell to capture the output and use it on the command line, typically as an argument to another command, as in grep `whoami` /etc/passwd. In your case the command line consists only of a backquoted pipeline, so the shell interprets the output of the pipeline as the command to execute. That is why it complains that the IP address is "not found".
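To illustrate (a minimal sketch, not part of the original answer): any command wrapped in backquotes has its output executed as a command by the shell.

import os

# The backquoted `echo hello` prints "hello", and the shell then tries to
# run "hello" as a command, so it fails with "hello: command not found",
# which is exactly the same failure mode as with the IP address above.
os.system("`echo hello`")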
If your intention is to capture the output of the pipeline to use in your Python code, you should use the subprocess module, which is the modern alternative to os.system that allows easy capturing of the output. For example:
import subprocess

def mm_node():
    output = subprocess.run(
        "cat /etc/hosts | egrep -i mm | grep om | awk '{print $1}'",
        shell=True,
        capture_output=True,
        text=True,  # return stdout as a string instead of bytes
    ).stdout
    return output.strip()

print(mm_node())

def mm_node():
    import os
    # Without the backquotes the pipeline's output is printed to the terminal,
    # but os.system() still returns the exit status, not the IP address.
    node_name = os.system("cat /etc/hosts | egrep -i mm | grep om | awk '{print $1}'")
    return node_name

mm_node()

Related

Why do my shell statements cause blocking? bash -c "ps ax | grep 'tcpdump' | grep -v grep | kill -9 $(awk '{print $1}')"

I am rewriting a function to replace os.system() in Python so that the output of the process is logged.
def log_system(self, cmd: str):
    cmd = cmd.rstrip(' ')
    if cmd[-1] == '&' and cmd[-2] != '&':
        cmd = cmd[:-1]
        # Redirects standard error to standard output for logging
        executing = os.popen('bash -c ' + '"' + cmd + '"' + '>/dev/stdout 2>&1 &')
    else:
        executing = os.popen('bash -c ' + '"' + cmd + '"' + '>/dev/stdout 2>&1')
    res = executing.readlines()
    ret = executing.close()
    if len(res) > 0:  # print a message if there is some content
        msg = ''.join(res)
        if ret is None:  # None means the process returned successfully
            log.i(msg)  # print stdout
        else:
            log.e(f'cmd execute failed: {cmd}\n' + msg)  # print stderr
The original statement works well; it is:
os.system("ps ax | grep 'tcpdump' | grep -v grep | kill `awk '{print $1}'` ")
After I use log_system(), the shell script could be:
bash -c "ps ax | grep 'tcpdump' | grep -v grep | kill `awk '{print $1}'`"
But when I execute the code, it blocks, and I don't know why.
When I retry in a terminal, this script does not block:
ps ax | grep 'tcpdump' | grep -v grep | kill `awk '{print $1}'`
and these also block:
bash -c "ps ax | grep 'tcpdump' | grep -v grep | kill -9 $(awk '{print $1}')"
bash -c "ps ax | grep 'tcpdump' | grep -v grep | kill `awk '{print $1}'`"
The kill command does not take process IDs on standard input; they must be supplied as arguments.
Specifically, your current command snippet (regardless of what arrives on kill's standard input) will hang forever because the awk inside the backquotes is waiting on its own standard input (the terminal); effectively you are running:
kill `awk '{print $1}'`
Hence the command would need to be something like:
kill $(ps ax | grep 'tcpdump' | grep -v grep | awk '{print $1}')
with $() capturing the standard output of its contents and passing them as command-line arguments to kill.
However, you may also want to look at one of my earlier answers. There are better tools available for finding and/or killing processes, such as pgrep and pkill. If you have those available to you, they're usually a better option than trying to do it with ps/grep/awk pipelines.
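For example, a minimal sketch of the two working variants, run from Python with subprocess (pkill is assumed to be installed; tcpdump is the target name from the question):

import subprocess

# Variant 1: capture the PIDs with $(...) so they become arguments to kill.
subprocess.run(
    "kill $(ps ax | grep 'tcpdump' | grep -v grep | awk '{print $1}')",
    shell=True,
)

# Variant 2: let pkill find and signal the matching processes in one step.
subprocess.run(["pkill", "tcpdump"])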

Handling Quotation Syntax error when killing a process remotely (using SSH) through a script written in Python

I have a Python script, foo.py, which has a function to kill a process remotely.
Code Snippet
I use the following command:
import os

def exit_app():
    os.system("ssh -tt nvidia@10.x.x.xxx "kill -9 \$(ps -aux | grep 'start_with_nn_obj_detection.bash' | awk '{print \$2}')"")
    exit()
I receive the error that the syntax is invalid. It has something to do with the placement of the single and double quotation marks.
Error:
File "foo.py", line 2
os.system('ssh nvidia@10.x.x.xxx "kill -9 \$(ps -aux | grep 'start_with_nn_obj_detection.bash' | awk '{print \$2}')"')
^
SyntaxError: invalid syntax
Things I have tried:
Verifying that the command:
ssh -tt nvidia@10.x.x.xxx "kill -9 \$(ps -aux | grep 'start_with_nn_obj_detection.bash' | awk '{print \$2}')"
works perfectly when used in the terminal.
Tried the '"..."' approach, yet I receive the error:
os.system("ssh -tt nvidia@10.5.3.157 '""kill -9 \$(ps -aux | grep 'start_with_nn_obj_detection.bash' | awk '{print \$2}')"'"")
syntax error near unexpected token `('
How should I fix this line then?
os.system("ssh -tt nvidia#10.x.x.xxx "kill -9 \$(ps -aux | grep 'start_with_nn_obj_detection.bash' | awk '{print \$2}')"")
Avoid os.system; use subprocess instead, and eliminate the (or rather, one) shell from the equation.
import subprocess

def exit_app():
    subprocess.call(
        ['ssh',
         '-tt',
         'nvidia@10.x.x.xxx',
         "kill $(ps -aux | awk '/start_with_nn_obj_detection.bash/ {print $2}')"
        ])
    exit()
If pkill is available on the remote host, though, you can simply run something like
subprocess.call(['ssh', '-tt', 'nvidia@10.x.x.xxx', 'pkill "start_with_nn_obj_detection.bash"'])

Killing a process within a python script running on linux

I have Raspbian as the linux distro running on my RPI. I've setup a small socket server using twisted and it receives certain commands from an iOS app. These commands are strings. I started a process when I received "st" and now I want to kill it when i get "sp". This is the way I tried:
Imported os
Used os.system("...") to start the process
os.system("...") to kill the process
Let's say the service is named xyz.
This is the exact way I tried to kill it:
os.system('ps axf | grep xyz | grep -v grep | awk '{print "kill " $1 }' | sh')
But I got a syntax error. That line runs perfectly when I try it in a terminal separately. Is this the wrong way to do it in a Python script? How do I fix it?
You will need to escape the quotes in your string:
os.system('ps axf | grep xyz | grep -v grep | awk \'{print "kill " $1 }\' | sh')
Or use a triple quote:
os.system('''ps axf | grep xyz | grep -v grep | awk '{print "kill " $1 }' | sh''')
Alternatively, open the process with Popen(...).pid and then use os.kill()
from subprocess import Popen
import os, signal

my_pid = Popen('/home/rolf/test1.sh').pid
os.kill(int(my_pid), signal.SIGKILL)
Remember to include a shebang in your script (#!/bin/sh)
Edit:
On second thoughts, perhaps
os.kill(int(my_pid), signal.SIGTERM)
is probably a better way to end the process; it at least gives the process a chance to close down gracefully.
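A self-contained sketch of that idea (the script path is just the placeholder from the answer above): send SIGTERM first, wait briefly, and only fall back to SIGKILL if the process ignores it.

import os
import signal
from subprocess import Popen, TimeoutExpired

proc = Popen('/home/rolf/test1.sh')  # placeholder script path

# Ask the process to shut down gracefully first.
os.kill(proc.pid, signal.SIGTERM)

try:
    proc.wait(timeout=5)  # give it a few seconds to exit cleanly
except TimeoutExpired:
    os.kill(proc.pid, signal.SIGKILL)  # force it if it will not stop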

Execute complex bash script within Python

I have a bash script which I run on my .csv file, and then I run a Python script on the output of the bash script. I would like to combine everything into a single script, but the bash script is quite complex and I couldn't find a way to use it from Python.
grep "$(grep -E "tcp|udp" results.csv | grep -E "Critical|High|Medium" | awk -F "\"*,\"*" '{print $8}')" results.csv | sort -t',' -k4,4 -k8,8 | awk -F "\"*,\"*" '{print $5,"port",$7"/"$6,$8}' | sed '/tcp\|udp/!d' | awk '!a[$0]++' | sed '/,port,\/,/d' > out
I tried this both as a string and as a parametrized command with subprocess; however, there just seem to be too many special characters for everything to work.
Is there a simpler way to run this command in Python?
P.S. I know there are multiple questions & answers regarding this same topic, but none of them worked for me.
Could you please escape all the double quotes with \? Please try it out and let us know if it worked:
os.system(" grep \"$(grep -E \"tcp|udp\" results.csv | grep -E \"Critical|High|Medium\" | awk -F \"\\\"*,\\\"*\" '{print $8}')\" results.csv | sort -t',' -k4,4 -k8,8 | awk -F \"\\\"*,\\\"*\" '{print $5,\"port\",$7\"/\"$6,$8}' | sed '/tcp\|udp/!d' | awk '!a[$0]++' | sed '/,port,\/,/d' > out ")
The whole command can then be put inside one pair of double quotes, with the inner double quotes escaped as \".
Have a nice day
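As an alternative sketch (not from the original answer): a raw triple-quoted string lets you keep the pipeline exactly as it is written for the shell, so none of the inner quotes or backslashes need manual escaping.

import subprocess

# r'''...''' preserves every quote and backslash literally, so the pipeline
# is passed to the shell exactly as it appears in the original bash one-liner.
cmd = r'''grep "$(grep -E "tcp|udp" results.csv | grep -E "Critical|High|Medium" | awk -F "\"*,\"*" '{print $8}')" results.csv | sort -t',' -k4,4 -k8,8 | awk -F "\"*,\"*" '{print $5,"port",$7"/"$6,$8}' | sed '/tcp\|udp/!d' | awk '!a[$0]++' | sed '/,port,\/,/d' > out'''

subprocess.run(cmd, shell=True, check=True)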

How can I make a BASH script run as a separate process, so that even if the Python script is killed the BASH script keeps running?

I need to launch and track a few BASH scripts as processes (e.g. in case they crash for some reason). So I was trying the code below, but it is not working:
def ps(self, command):
    process = subprocess.Popen(['/bin/bash'], shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    process.stdin.write(command + '\n')
    process.stdout.readline()

ps("/var/tmp/KernelbootRun.sh")
ps("ps aux | grep processCreator.py | awk '{print $2}' | xargs kill -9")
Neither is working.
How about running it through a subshell with disown:
import os

def ps(self, command):
    os.system(command + " & disown")

ps("/var/tmp/KernelbootRun.sh")
ps("ps aux | grep processCreator.py | awk '{print $2}' | xargs kill -9")
Note that sometimes you have to use a null input and output to keep your process active when the terminal is closed:
ps("</dev/null /var/tmp/KernelbootRun.sh >/dev/null 2>&1")
ps("</dev/null ps aux | grep processCreator.py | awk '{print $2}' | xargs kill -9 >/dev/null 2>&1")
Or perhaps define another function:
def psn(self, command):
os.system("</dev/null " + command + " >/dev/null 2>&1 & disown")
psn("/var/tmp/KernelbootRun.sh")
psn("ps aux | grep processCreator.py | awk '{print $2}' | xargs kill -9")
This works great as a stand-alone process for my movie.
p.py:
import subprocess
subprocess.Popen("/var/tmp/runme.sh", shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
runme.sh:
#!/bin/bash
export DISPLAY=:0.0
vlc /var/tmp/Terminator3.Movie.mp4
