Getting user input for os.system - python

I only recently started learning Python, so please excuse the noob question (I might be doing this completely wrong to begin with).
I'm trying to get user input to use with os.system
import os
account = input("Account name: ")
os.system("op get item + account + | awk -F':' '{print $24}' | cut -d '}' -f 1")
The shell command below takes the information from 1Password and outputs just the password (for Facebook in this example). It works on its own, but as soon as I try to add the variable in Python it breaks:
op get item Facebook | awk -F':' '{print $24}' | cut -d '}' -f 1
I hope someone here can help me out or point me in the right direction, thanks!

I think you need to work with the output of these commands, so you can use the subprocess module instead; see "Replacing Shell Pipeline" in the docs.
For example:
from subprocess import check_output
check_output(f"op get item {account} | awk -F':' '{{print $24}}' | cut -d '}}' -f 1", shell=True)
Be aware that you should sanitize the account variable before passing it to any system call. :)
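For example, with shell=True a hostile account name can inject arbitrary commands; shlex.quote (shown here as a sketch, with a made-up malicious value) escapes it into a single safe token:

```python
import shlex

# Hypothetical hostile input: without quoting, the shell would run "rm -rf ~".
account = "Face; rm -rf ~"
safe = shlex.quote(account)    # -> "'Face; rm -rf ~'" (one literal token)
cmd = f"op get item {safe}"
```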

You just need string substitution, e.g. with Python 3.6+, the elegant:
os.system(f"op get item {account} | awk -F':'" + " '{print $24}' | cut -d '}' -f 1")
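The attempt in the question broke because the + signs stayed inside the string literal, so the shell received the text "+ account +" instead of the value. A quick sketch of the fixed string construction (with a sample account value):

```python
account = "Facebook"  # sample value

# The f-string interpolates the variable; the awk braces are kept outside
# the f-string so they need no doubling.
cmd = f"op get item {account} | awk -F':'" + " '{print $24}' | cut -d '}' -f 1"
```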

Related

Executing awk in Python shell

I have a shell command that parses certain content and gives the required output. I need to implement this in Python, but the shell command has a newline character "\n" that is not getting executed when run through a Python command.
Of the many lines in the output log, the required line looks like - configurationFile=/app/log/conf/the_jvm_name.4021.logback.xml
I would only need the_jvm_name from the above. The syntax will always be the same. The shell command works fine.
Shell Command -
ps -ef | grep 12345 | tr " " "\n" | grep logback.configurationFile | awk -F"/" '{print $NF}'| cut -d. -f1
Python (escaped all the required double quotes) -
import subprocess
pid_arr = "12345"
sh_command = "ps -ef | grep "+pid_arr+" | tr \" \" \"\n\" | grep configurationFile | awk -F \"/\" '{print $NF}' | cut -d. -f1"
outpt = subprocess.Popen(sh_command , shell=True,stdout=subprocess.PIPE).communicate()[0].decode('utf-8').strip()
With Python, I'm not getting the desired output; it just prints configurationFile as it appears in the command.
What am I missing here? Is there a better way to get these details?
You can achieve what you want using a regex substitution in Python:
output = subprocess.check_output(["ps", "-ef"])
for line in output.splitlines():
    if re.search("12345", line):
        output = re.sub(r".*configurationFile=.*/([^.]+).*", r"\1", line)
This captures the part after the last / in the configuration file path, up to the next ..
You could make it slightly more robust by checking only the second column (the PID) for 12345, either by splitting each line on white space:
cols = re.split("\s+", line)
if len(cols) > 1 and cols[1] == "12345":
or by using a better regex, like:
if re.match(r"\S+\s+12345\s", line):
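As a sanity check on a couple of made-up ps -ef lines, the anchored pattern matches only when 12345 sits in the PID column:

```python
import re

# Hypothetical ps -ef output: in the second line 12345 is the PPID, not the PID.
lines = [
    "root     12345     1  0 10:00 ?  00:00:01 /usr/bin/foo",
    "root       999 12345  0 10:00 ?  00:00:01 /usr/bin/bar",
]
matched = [line for line in lines if re.match(r"\S+\s+12345\s", line)]
# Only the first line survives the filter.
```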
Note that you could also shorten your pipe considerably by just doing something like:
ps -ef | sed -nE '/12345/ { s/.*configurationFile=.*\/([^.]*).*/\1/; p }'
Your shell command works, but it has to deal with too many lines of output and too many fields per line. An easier solution is to tell the ps command to just give you 1 line and on that line, just one field that you care about. For example, on my system:
ps -o cmd h 979
will output:
/usr/bin/dbus-daemon --config-file=/usr/share/defaults/at-spi2/accessibility.conf --nofork --print-address 3
The -o cmd flag outputs only the CMD column, the h tells ps to omit the header, and 979 is the process ID, so ps prints information for just that process.
This output is not exactly what you have in your problem, but similar enough. Once we have limited the output, we eliminate the need for other commands such as grep, awk, and so on. At this point, we can use a regular expression to extract what we want:
from __future__ import print_function
import re
import subprocess

pid = '979'
command = ['ps', '-o', 'cmd', 'h', pid]
output = subprocess.check_output(command)
pattern = re.compile(r"""
    config-file=   # Literal string search
    .+\/           # Everything up to the last forward slash
    ([^.]+)        # Non-dot chars, this is what we want
    """, re.VERBOSE)
matched = pattern.search(output)
if matched:
    print(matched.group(1))
Notes
For the regular expression, I am using the verbose form, which lets me annotate the pattern with comments; I like this style because regular expressions can be difficult to read.
On your system, please adjust the "configuration-file" part to match your output.

awk and echo commands in python

I need to use the echo and awk commands in a Python script. Can you help me?
I have a bash script; here is an example:
while read LINE
do
    BOM1=`echo "$LINE" | awk -F $'\t' '{print $1}'`
    BOM2=`echo "$LINE" | awk -F $'\t' '{print $2}'`
done < file.txt
I tried to rewrite the same thing in a Python script:
import subprocess
with open(PT_tmp_bom_list,"r+") as Tmp_list_file:
    for line in Tmp_list_file:
        cmd="echo {} | awk -F '\t' '{print $1}'".format(line)
        subprocess.call(cmd, shell=True)
I have several questions:
line is a string, but I cannot output it. I tried:
cmd="echo {} ".format(line)
and it says: The system cannot find the file specified. That means I can't get the line to awk.
The line should look like:
<deliverydir>/bom/bom_list.txt**TAB**<bom_list_dir>/bom_list.txt**TAB**Internal User
The second question: if I get a line from echo, how should I use the awk command on that line?
You definitely do not need external programs for this; Python easily subsumes the functionality of Awk and then some.
with open(PT_tmp_bom_list,"r+") as Tmp_list_file:
    for line in Tmp_list_file:
        bom1, bom2, _ = line.rstrip('\n').split('\t')
Take out the , _ if the lines have exactly two fields.
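A minimal, self-contained sketch of that loop, using an in-memory sample line in place of the real file:

```python
import io

# One tab-separated sample line in the format shown in the question.
sample = io.StringIO(
    "<deliverydir>/bom/bom_list.txt\t<bom_list_dir>/bom_list.txt\tInternal User\n"
)
for line in sample:
    bom1, bom2, _ = line.rstrip('\n').split('\t')
```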

How can I get rid of the new line created after this bash command?

I'm trying to store the result of this BASH command in a variable:
localip = subprocess.check_output(["ifconfig | grep 'inet addr' | cut -d ':' -f2 | grep 'Bcast' | cut -d ' ' -f1"], shell=True)
but if I print localip it always has a newline at the end:
pi@raspberrypi:~/Documents $ python change.py
192.168.1.6
pi@raspberrypi:~/Documents $
Which is of course not what I need, so I've tried something like this:
localip[0].replace("\n", "")
or simply
localip.replace("\n", "")
but still no good... does anyone know how to get rid of the newly created line? Thanks!
You haven't posted the code that actually prints localip but one possible reason is that you don't print the return value of str.replace. str.replace doesn't mutate the existing string but it returns a new copy instead.
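A minimal sketch of that pitfall, with a stand-in string instead of the real check_output result:

```python
localip = "192.168.1.6\n"      # stand-in for the check_output result

# No effect: strings are immutable, and the returned copy is discarded.
localip.replace("\n", "")

# Works: reassign the result (strip() would equally trim the trailing newline).
localip = localip.replace("\n", "")
```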
In the meantime I solved in that way:
localip = subprocess.check_output(["ifconfig | grep 'inet addr' | cut -d ':' -f2 | grep 'Bcast' | cut -d ' ' -f1 | tr -d '\n'"], shell=True)
piping a tr -d '\n' at the end.
Thanks.

Sending piped commands with pxssh/pexpect

I am currently working with pxssh for Python and it's working like a charm, however it seems like it doesn't process piped (|) commands - it interprets them as separate ones.
At the moment I have the following:
s = pxssh.pxssh()
s.login(sshHost, sshUsername, sshPassword)
s.sendline("cat somefile | grep something | cut -d \' -f 4")
It works properly with any commands that are not piped, however I need to send a piped one at the moment.
Is there a way around this with pxssh, or can you please suggest a way to implement another solution for this command?
It's not clear to me why pxssh would behave as you describe. Are you sure the issue is not that your \' is interpreted by Python, whereas you want it to be interpreted by the remote shell? That would be better spelled like so:
s.sendline("cat somefile | grep something | cut -d \\' -f 4")
You certainly do have alternatives. One would be to use a single command instead of a pipeline, such as:
s.sendline("sed -n '/something/ { s/\\([^,]*,\\)\\{3\\}\\([^,]*\\),.*/\\2/; p }'")
As a special case of that, you could launch a subshell in which to run your pipeline:
s.sendline('''bash -c "cat somefile | grep something | cut -d \\' -f 4"''')

How to store the result of an executed shell command in a variable in python? [duplicate]

This question already has answers here:
Running shell command and capturing the output
(21 answers)
Closed 1 year ago.
I need to store the result of a shell command that I executed in a variable, but I couldn't get it working. I tried:
import os
call = os.system("cat syscall_list.txt | grep f89e7000 | awk '{print $2}'")
print call
But it prints the result in the terminal, and the value of call is zero, which presumably just indicates success. How do I get the result stored in a variable?
Use the subprocess module instead:
import subprocess
output = subprocess.check_output("cat syscall_list.txt | grep f89e7000 | awk '{print $2}'", shell=True)
Edit: this is new in Python 2.7. In earlier versions this should work (with the command rewritten as shown below):
import subprocess
output = subprocess.Popen(['awk', '/f89e7000/ {print $2}', 'syscall_list.txt'], stdout=subprocess.PIPE).communicate()[0]
As a side note, you can rewrite
cat syscall_list.txt | grep f89e7000
To
grep f89e7000 syscall_list.txt
And you can even replace the entire statement with a single awk script:
awk '/f89e7000/ {print $2}' syscall_list.txt
Leading to:
import subprocess
output = subprocess.check_output(['awk', '/f89e7000/ {print $2}', 'syscall_list.txt'])
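If an external process isn't required at all, the same grep/awk pipeline can also be sketched in pure Python (shown here on a made-up in-memory sample instead of syscall_list.txt):

```python
# Hypothetical file contents; real data would come from open('syscall_list.txt').
lines = ["f89e7000 sys_read 3\n", "deadbeef sys_write 4\n"]

# grep f89e7000 | awk '{print $2}' in one comprehension.
matches = [line.split()[1] for line in lines if "f89e7000" in line]
```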
In python 3 you can use
import subprocess as sp
output = sp.getoutput('whoami --version')
print (output)
os.popen works for this. popen opens a pipe to or from a command; the return value is an open file object connected to the pipe, which can be read. split('\n') converts the output to a list:
import os
list_of_ls = os.popen("ls").read().split('\n')
print list_of_ls
import os
list_of_call = os.popen("cat syscall_list.txt | grep f89e7000 | awk '{print $2}'").read().split('\n')
print list_of_call
commands.getstatusoutput would work well for this situation (though the commands module is deprecated since Python 2.6 and removed in Python 3):
import commands
print(commands.getstatusoutput("cat syscall_list.txt | grep f89e7000 | awk '{print $2}'"))
