Python os.popen quotes within quotes error - python

I have the following command, but it does not work. Can anyone help identify the issue?
cur_usage=os.popen("""df -k \tmp |tail -1 | awk '{{print $4"\n"$5}}'| grep '%'|tr -d '%'""").read()
print(cur_usage)

There are a couple of things you need to do here:
If you're on a Unix-like OS, you should change \tmp to /tmp
You need to either change \n to \\n or mark the string as a raw string.
Either of the following should work for you:
curr_usage = os.popen("""df -k /tmp |tail -1 | awk '{{print $4"\\n"$5}}'| grep '%'|tr -d '%'""").read().strip()
or
curr_usage = os.popen(r"""df -k /tmp |tail -1 | awk '{{print $4"\n"$5}}'| grep '%'|tr -d '%'""").read().strip()
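As an aside, if the only goal is the use percentage of /tmp (an assumption based on what the pipeline above prints), a minimal pure-Python sketch with shutil avoids the shell and its quoting issues entirely; note that df's Use% can differ slightly because of reserved blocks:
import shutil

# rough equivalent of the df pipeline: percentage of /tmp that is in use
usage = shutil.disk_usage("/tmp")
percent_used = round(usage.used / usage.total * 100)
print(percent_used)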

Related

How to filter shell output to only number with decimal?

I have a CI/CD config which requires the Python version to be set as the default by pyenv. I want the python2 -V output to show only the number, for example 2.7.18. But rather than showing 2.7.18, it shows the full text Python 2.7.18.
But when I use it with python3 (python -V), it shows the correct and current python3 version (3.9.0).
I use this code to try to show the numbers only: $(python -V | grep -Eo '[0-9]\.[0-9]\.[10-19]').
And to set the default with pyenv: pyenv global $(python3 -V | grep -Eo '[0-9]\.[0-9]\.[10-19]') $(python -V | grep -Eo '[0-9]\.[0-9]\.[10-19]')
So pyenv $(python3 version) $(python2 version)
(Image of the wrong output omitted.)
Thanks!
A simple way would be to just replace the string Python with the empty string, if it exists.
Here is a quick one-liner:
python -V 2>&1 | sed -e "s/Python//g" | xargs
This prints the Python version: it redirects stderr to stdout, replaces "Python" with the empty string, and xargs without arguments returns the trimmed input string.
Here are a few more ways to get the version number:
# Print 1 word per line, then select the last word:
python -V 2>&1 | xargs -n1 | tail -n1
# Print the last word:
python -V 2>&1 | perl -lane 'print $F[-1];'
# Print the first stretch of 1 or more { digits or periods }:
python -V 2>&1 | grep -Po '[\d.]+'
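If asking the interpreter directly is acceptable, a small sketch (not pyenv-specific) sidesteps the text processing entirely, since platform.python_version() already returns just the number:
# prints e.g. 2.7.18 or 3.9.0, with no "Python" prefix; works on python2 and python3
import platform
print(platform.python_version())
From the shell that would be: python -c 'import platform; print(platform.python_version())'.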

Bash/Python processes matching

I can use
pgrep -f 'keyword1 | keyword2'
to run a pgrep and return all processes that match either keyword.
How can I use & to do this instead? I just want processes that contain both keywords.
The following patterns failed:
pgrep -f 'keyword1 & keyword2'
pgrep -f 'keyword2 && keyword2'
From man pgrep(1), OPTIONS:
-f    The pattern is normally only matched against the process name. When -f is set, the full command line is used.
Side question:
Is there a built-in Python library for running these commands? I couldn't seem to find one, and everyone suggested using subprocess.Popen(), which is how I'm running the pgrep command; however, I'd prefer a pure Python solution if one is available.
I'm not sure you can do that with pgrep; you can, however, use awk:
ps ax -o pid,cmd | awk '{pid = $1; $1=""}/[k]eyword1/ && /keyword2/ {print pid}'
The reason I use [k]eyword1 is to avoid matching the awk process itself.
If PCRE is supported with pgrep something like this would work:
pgrep -f '(?=.*keyword1)(?=.*keyword2)'
You can use an 'or' (|) with wildcards, reversing the pattern in the second alternative, to match the two keywords in either order:
pgrep -f 'keyword1.*keyword2|keyword2.*keyword1'
The typical way to do a grep 'and' is to grep multiple times. Since pgrep only returns PIDs, you have to filter the process list directly and then extract the PID yourself:
ps ax -o pid,cmd | grep 'keyword1' | grep 'keyword2' | awk '{print $1}'
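For the side question: there is no pgrep wrapper in the standard library, but a minimal sketch that reads /proc directly (Linux-only; keyword1 and keyword2 are placeholders) matches both keywords against the full command line without spawning any subprocess:
import os

def pgrep_all(*keywords):
    # return PIDs whose full command line contains every keyword
    pids = []
    for entry in os.listdir("/proc"):
        if not entry.isdigit():
            continue
        try:
            with open("/proc/%s/cmdline" % entry, "rb") as f:
                cmdline = f.read().replace(b"\x00", b" ").decode("utf-8", "replace")
        except OSError:
            # the process may have exited between listdir() and open()
            continue
        if all(keyword in cmdline for keyword in keywords):
            pids.append(int(entry))
    return pids

print(pgrep_all("keyword1", "keyword2"))
The third-party psutil package offers the same idea portably, if a dependency is acceptable.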

Execute complex bash script within Python

I have a bash script which I run on my .csv file, and then I run a Python script on the output of the bash script. I would like to combine everything into a single script, but the bash script is quite complex and I couldn't find a way to use it from Python.
grep "$(grep -E "tcp|udp" results.csv | grep -E "Critical|High|Medium" | awk -F "\"*,\"*" '{print $8}')" results.csv | sort -t',' -k4,4 -k8,8 | awk -F "\"*,\"*" '{print $5,"port",$7"/"$6,$8}' | sed '/tcp\|udp/!d' | awk '!a[$0]++' | sed '/,port,\/,/d' > out
I tried this both as a string and as a parametrized command with subprocess; however, there seem to be too many special characters for everything to work.
Is there a simpler way to run this command in Python?
P.S. I know there are multiple questions & answers regarding this same topic, but none of them worked for me.
Could you please escape all the double quotes with \? Please try it out and let us know if it works:
os.system(" grep \"$(grep -E \"tcp|udp\" results.csv | grep -E \"Critical|High|Medium\" | awk -F \"\\\"*,\\\"*\" '{print $8}')\" results.csv | sort -t',' -k4,4 -k8,8 | awk -F \"\\\"*,\\\"*\" '{print $5,\"port\",$7\"/\"$6,$8}' | sed '/tcp\|udp/!d' | awk '!a[$0]++' | sed '/,port,\/,/d' > out ")
The whole command can be put into "your_command_with \"escaped\" double quotes".
Have a nice day
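Another way to sidestep most of the escaping (a sketch, assuming the pipeline itself already does what you want) is to keep the command in a raw triple-quoted string and hand it to subprocess with shell=True, so Python never needs the backslashes at all:
import subprocess

# raw triple-quoted string: the inner " and \ are passed to the shell untouched
cmd = r"""grep "$(grep -E "tcp|udp" results.csv | grep -E "Critical|High|Medium" | awk -F "\"*,\"*" '{print $8}')" results.csv | sort -t',' -k4,4 -k8,8 | awk -F "\"*,\"*" '{print $5,"port",$7"/"$6,$8}' | sed '/tcp\|udp/!d' | awk '!a[$0]++' | sed '/,port,\/,/d' > out"""
subprocess.run(cmd, shell=True, check=True)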

OpenStack endpoints API request on OS X

I've got the following bash script to parse endpoints JSON:
echo curl -s -H "X-Auth-Token: my_access_token" -X GET "https://api.selvpc.ru/identity/v3/endpoints?interface=public" | python -mjson.tool | grep -Pi '^\s*"url":\s*".*",?$' | awk '{print $2}' | tr -d '"' | sed "s/[%\\\$](tenant_id)s/my_project_id/g")
But bash says:
-bash: syntax error near unexpected token `)'
My hosting provider says this script works well on Linux-based OSes, but there is no guarantee it works on OS X. What could the syntax issue be?
EDIT:
If I use the following:
curl -s -H "X-Auth-Token: my_access_token" -X GET "https://api.selvpc.ru/identity/v3/endpoints?interface=public" | python -mjson.tool
the JSON parses as expected. But with grep -Pi '^\s*"url":\s*".*",?$' added, I get the grep usage message:
usage: grep [-abcDEFGHhIiJLlmnOoqRSsUVvwxZ] [-A num] [-B num] [-C[num]]
[-e pattern] [-f file] [--binary-files=value] [--color=when]
[--context[=num]] [--directories=action] [--label] [--line-buffered]
[--null] [pattern] [file ...]
So I guess the first problem is the grep error?
As @4ae1e1 suggested, please use a JSON processor for the job. jq is great, and it's worthwhile investing your time to learn it.
wget https://github.com/stedolan/jq/releases/download/jq-1.5/jq-osx-amd64
mv jq-osx-amd64 jq
chmod u+x jq
curl -s -H "X-Auth-Token: $TOKEN" https://api.selvpc.ru/identity/v3/endpoints?interface=public | \
./jq -r .endpoints[].url
That will get you a list of OpenStack API endpoints.
I think a Python script using python-keystoneclient would be easier to understand and maintain.
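If you would rather stay in Python but without the python-keystoneclient dependency, a standard-library sketch (the token is a placeholder; the URL and the endpoints[].url structure are taken from above) does the same extraction as the jq call:
import json
import urllib.request

# placeholder token, same public endpoint listing URL as in the question
req = urllib.request.Request(
    "https://api.selvpc.ru/identity/v3/endpoints?interface=public",
    headers={"X-Auth-Token": "my_access_token"},
)
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)

for endpoint in data["endpoints"]:
    print(endpoint["url"])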

very complex quotes in python/shell string

I have a very long string, ssh_cmd, which I get from:
cmd = """kill -9 `ps -ef|grep "udp_receiver"|grep -v "grep"|awk '{print $2}'`"""
HostName="133.33.22.1"
ssh_cmd = """ssh -t inria_spoofing@{0} 'sudo nohup bash -c "{1} > /nohup.out 2>&1 &"'""".format(HostName, cmd)
the resulting ssh_cmd is:
ssh -t kitty@133.33.22.1 'sudo nohup bash -c "kill -9 `ps -ef|grep "udp_receiver"|grep -v "grep"|awk '{print $2}'` > /nohup.out 2>&1 &"'
however, I'm afraid that when I run
child = pexpect.spawn(ssh_cmd)
there will be a problem,
so how should I organize the string?
thanks!
To answer the question, here's the proper ssh_cmd:
ssh -t kitty@133.33.22.1 "sudo nohup bash -c \"kill -9 \\\`ps -ef | grep 'udp_receiver' | grep -v 'grep' | awk '{print \\\$2}'\\\` > /nohup.out 2>&1 &\""
Basically, you need to escape double quotes, backticks and backslashes in a command each time you embed this command in another one. I did not use single quotes except at the lower level because you cannot use escaped single quotes inside single quotes.
You also need to escape the $: inside the double-quoted string it would otherwise be expanded by the shell, even though it sits between single quotes, because those single quotes are themselves inside the double quotes.
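An alternative to counting backslashes by hand (a sketch, assuming pexpect is used as in the question) is to skip the local shell entirely by giving pexpect.spawn the ssh arguments as a list, and to let shlex.quote do the quoting for the remote shell:
import shlex
import pexpect

host = "kitty@133.33.22.1"
# the command that bash -c should run remotely, written once and unescaped
inner = "kill -9 `ps -ef | grep 'udp_receiver' | grep -v grep | awk '{print $2}'` > /nohup.out 2>&1 &"
remote = "sudo nohup bash -c " + shlex.quote(inner)

# with an args list, no local shell ever parses the string
child = pexpect.spawn("ssh", ["-t", host, remote])
Only one layer of quoting remains (the remote login shell's), and shlex.quote takes care of it.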
