I've got the following bash script to parse endpoints JSON:
echo curl -s -H "X-Auth-Token: my_access_token" -X GET "https://api.selvpc.ru/identity/v3/endpoints?interface=public" | python -mjson.tool | grep -Pi '^\s*"url":\s*".*",?$' | awk '{print $2}' | tr -d '"' | sed "s/[%\\\$](tenant_id)s/my_project_id/g")
But bash says:
-bash: syntax error near unexpected token `)'
My hosting provider says this script works well on Linux-based OSes, but there's no guarantee it works on OS X. What could the syntax issue be?
EDIT:
If I use the following:
curl -s -H "X-Auth-Token: my_access_token" -X GET "https://api.selvpc.ru/identity/v3/endpoints?interface=public" | python -mjson.tool
the JSON parses as expected. But adding grep -Pi '^\s*"url":\s*".*",?$' causes this grep usage error:
usage: grep [-abcDEFGHhIiJLlmnOoqRSsUVvwxZ] [-A num] [-B num] [-C[num]]
[-e pattern] [-f file] [--binary-files=value] [--color=when]
[--context[=num]] [--directories=action] [--label] [--line-buffered]
[--null] [pattern] [file ...]
So I guess the first problem is the grep error?
As #4ae1e1 suggested, please use a JSON processor for the job. jq is great and it's worthwhile investing your time to learn it.
wget https://github.com/stedolan/jq/releases/download/jq-1.5/jq-osx-amd64
mv jq-osx-amd64 jq
chmod u+x jq
curl -s -H "X-Auth-Token: $TOKEN" https://api.selvpc.ru/identity/v3/endpoints?interface=public | \
./jq -r .endpoints[].url
That will get you a list of OpenStack API endpoints.
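If you'd rather not fetch a jq binary at all, the same extraction can be done with the Python you already have in the pipeline. A sketch — the sample document's shape is an assumption inferred from the jq filter .endpoints[].url:

```python
import json

# Trimmed sample of the Identity v3 endpoints response; its shape is an
# assumption inferred from the jq filter .endpoints[].url above.
sample = '{"endpoints": [{"url": "https://api.example.com/v2/%(tenant_id)s"}]}'

def endpoint_urls(doc_text):
    """Pure-Python equivalent of the jq filter .endpoints[].url."""
    return [e["url"] for e in json.loads(doc_text)["endpoints"]]

print("\n".join(endpoint_urls(sample)))
```

In practice you'd feed curl's output in via sys.stdin instead of the hard-coded sample.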
I think a Python script using python-keystoneclient could be easier to understand and maintain.
Related
I have a python script that I am executing with cron job. This script generates some output while executing and I wish to post it to Slack channel in real time.
Here is a bash script that I have:
#!/bin/bash
log_file=logs_$(date '+\%Y-\%m-\%d_\%H:\%M').txt
cd /path/to/script/run.py > /path/to/logs/${log_file} 2>&1
cat /path/to/logs/${log_file} | while read LINE; do
(echo "$LINE" | grep -e "Message" ) && curl -X POST --silent --data-urlencode \
"payload={\"text\": \"$(echo $LINE | sed "s/\"/'/g")\"}" "https://hooks.slack.com/services/xxxxxx";
done
This script works, but of course it posts all messages to Slack only after the Python script has finished executing. Is there any way I could configure it so that messages are sent to Slack in real time, while the Python script is still running?
You may be able to read the output from your run.py script via process substitution:
#!/bin/bash
log_file=logs_$(date '+%Y-%m-%d_%H:%M').txt
while read -r line ; do
echo "$line"
(echo "$line" | grep -e "Message" ) && curl -X POST --silent --data-urlencode \
"payload={\"text\": \"$(echo $line | sed "s/\"/'/g")\"}" "https://hooks.slack.com/services/xxxxxx";
done < <(/path/to/script/run.py 2>&1) >> "$log_file"
It may also prove useful to paste your code into shellcheck.net and have a look at the suggested changes.
Your script shouldn't work at all: you're not executing run.py, you're changing your working directory into it, so unless run.py is a directory, the cd will fail.
Also, commands in a bash script are executed sequentially, so if you run your Python command to completion and only then read the log, it's no wonder you're not getting the lines in real time.
What I would do is use pipes and xargs:
#!/bin/bash
/path/to/script/run.py | grep -e "Message" | sed "s/\"/'/g" | xargs -I{} curl -L -X POST --silent --data-urlencode 'payload={"text":"{}"}' https://hooks.slack.com/services/xxx
I've added -L to the curl command because hooks.slack.com makes a redirect to api.slack.com and without that flag curl will stop after the 302 instead of following the Location header in the response.
I'm currently writing a dockerfile, in which the project's lines of code shall be counted via cloc, and sent to a server. The server is providing a RESTful API. From this API I want to get a Data-Transfer-Object, modify it and send it back to the same API to update it.
Since the runner does not preserve variables across more than one line, I (probably) have to put everything on one line (except the static API variable):
ENV API=http://example-url.com/api/datapoints/
The curl command that I've got so far is this:
RUN curl
-d (curl -s ${API} | python3 -c "import sys, json;
newDatapointDTO=json.load(sys.stdin);
newDatapointDTO['metric']="`cloc ./src/ --json`";
print(newDatapointDTO)")
-H "Content-Type: application/json"
-X PUT ${API}
The problem here is that the runner does not like the Python command being interrupted to splice in a variable from bash. So how do I get the output of cloc ./src/ --json into Python?
I also think the command is a bit too complicated - there should probably be a better solution for this...
Converting your command to JSON using jq -Rs . <<'EOF', pasting your command, and then EOF gives us:
RUN ["bash", "-c", "curl \n -d (curl -s ${API} | python3 -c \"import sys, json; \n newDatapointDTO=json.load(sys.stdin); \n newDatapointDTO['metric']=\"`cloc ./src/ --json`\"; \n print(newDatapointDTO)\") \n -H \"Content-Type: application/json\" \n -X PUT ${API}\n"]
Taking out the extra whitespace and fixing up the quoting makes that:
RUN ["bash", "-c", "curl -d (curl -s ${API} | python3 -c \"import sys, json;\nnewDatapointDTO=json.load(sys.stdin);\nnewDatapointDTO['metric']=$(cloc ./src/ --json);print(newDatapointDTO)\") -H \"Content-Type: application/json\" -X PUT ${API}"]
However, because that code is using shell injection to insert data into Python, it's still a Really Bad Idea. A saner version of your Python code might instead look like:
import sys, json, subprocess
newDatapointDTO=json.load(sys.stdin)
newDatapointDTO['metric'] = subprocess.run(['cloc', './src/', '--json'], check=True, stdout=subprocess.PIPE).stdout
print(newDatapointDTO)
...so wrapping that in a shell command might look like:
pythonScript="import sys, json, subprocess
newDatapointDTO=json.load(sys.stdin)
newDatapointDTO['metric'] = subprocess.run(['cloc', './src/', '--json'], check=True, stdout=subprocess.PIPE).stdout
print(newDatapointDTO)"
curl -d "$(curl -s "$API" | python -c "$pythonScript")" \
-H "Content-Type: application/json" \
-X PUT "$API"
...so, generating a JSON encoding of a command line that calls that would look like:
jq -Rs . <<'EOF'
pythonScript="import sys, json, subprocess
newDatapointDTO=json.load(sys.stdin)
newDatapointDTO['metric'] = subprocess.run(['cloc', './src/', '--json'], check=True, stdout=subprocess.PIPE).stdout
print(newDatapointDTO)"
curl -d "$(curl -s "$API" | python -c "$pythonScript")" \
-H "Content-Type: application/json" \
-X PUT "$API"
EOF
...which gives us the output:
"pythonScript=\"import sys, json, subprocess\nnewDatapointDTO=json.load(sys.stdin)\nnewDatapointDTO['metric'] = subprocess.run(['cloc', './src/', '--json'], check=True, stdout=subprocess.PIPE).stdout\nprint(newDatapointDTO)\"\n\ncurl -d \"$(curl -s \"$API\" | python -c \"$pythonScript\")\" \\\n -H \"Content-Type: application/json\" \\\n -X PUT \"$API\"\n"
...so we know you can put in your Dockerfile the following:
RUN ["bash", "-c", "pythonScript=\"import sys, json, subprocess\nnewDatapointDTO=json.load(sys.stdin)\nnewDatapointDTO['metric'] = subprocess.run(['cloc', './src/', '--json'], check=True, stdout=subprocess.PIPE).stdout\nprint(newDatapointDTO)\"\n\ncurl -d \"$(curl -s \"$API\" | python -c \"$pythonScript\")\" \\\n -H \"Content-Type: application/json\" \\\n -X PUT \"$API\"\n"]
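One more caveat about the Python shown above: on Python 3, subprocess.run(...).stdout is bytes, and print() on a dict emits Python repr (single quotes), which the API would likely reject as JSON. A hedged variant that addresses both (merge_metric is a hypothetical helper name; cloc and the API call are left as comments since they're external):

```python
import json

def merge_metric(dto_json, cloc_json):
    """Insert cloc's JSON report into the DTO and return a JSON string."""
    dto = json.loads(dto_json)
    dto['metric'] = json.loads(cloc_json)
    return json.dumps(dto)  # double-quoted, valid JSON (unlike print(dict))

# Wiring it up would look like (not run here; cloc and the API are external):
#   import subprocess, sys
#   cloc = subprocess.run(['cloc', './src/', '--json'], check=True,
#                         stdout=subprocess.PIPE, text=True)  # text=True: str, not bytes
#   print(merge_metric(sys.stdin.read(), cloc.stdout))
```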
I have a CI/CD config which requires the Python version to be set as the default by pyenv. I want the python2 -V output to show only the version number, for example 2.7.18. But rather than showing 2.7.18, it shows the full text Python 2.7.18.
But when I use the same approach with python3 -V, it shows the correct, current python3 version (3.9.0).
I use this code to try to show the numbers only: $(python -V | grep -Eo '[0-9]\.[0-9]\.[10-19]').
And to set default with pyenv : pyenv global $(python3 -V | grep -Eo '[0-9]\.[0-9]\.[10-19]') $(python -V | grep -Eo '[0-9]\.[0-9]\.[10-19]')
So pyenv $(python3 version) $(python2 version)
Here is the image :
Image of wrong output
Thanks!
A simple way would be to just replace the string Python with the empty string, if present.
Here's a quick one-liner:
python -V 2>&1 | sed -e "s/Python//g" | xargs
That prints just the version: stderr is redirected to stdout (python2 writes its -V output to stderr, which is why your grep had nothing to match), sed replaces "Python" with "", and xargs without arguments trims the surrounding whitespace.
Here are a few more ways to get the version number:
# Print 1 word per line, the select the last word:
python -V 2>&1 | xargs -n1 | tail -n1
# Print the last word:
python -V 2>&1 | perl -lane 'print $F[-1];'
# Print the first stretch of 1 or more { digits or periods }:
python -V 2>&1 | grep -Po '[\d.]+'
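Any of those can then be substituted straight into the pyenv command. A sketch using -E (so it also works with BSD grep) and matching any number of digits per component, unlike the [10-19] character class in the question:

```shell
# 2>&1 matters: python2 prints its -V output to stderr.
py3_ver="$(python3 -V 2>&1 | grep -Eo '[0-9]+\.[0-9]+\.[0-9]+')"
echo "$py3_ver"
# Hypothetical final step once python2 is also on PATH:
# pyenv global "$py3_ver" "$(python2 -V 2>&1 | grep -Eo '[0-9]+\.[0-9]+\.[0-9]+')"
```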
I have the following command, but it does not work. Can anyone help me figure out the issue?
cur_usage=os.popen("""df -k \tmp |tail -1 | awk '{{print $4"\n"$5}}'| grep '%'|tr -d '%'""").read()
print(cur_usage)
There are a couple of things you need to do here:
If you're on a Unix-like OS, you should change \tmp to /tmp
You need to either change \n to \\n or mark the string as a raw string.
One of either of the following should work for you:
curr_usage = os.popen("""df -k /tmp |tail -1 | awk '{{print $4"\\n"$5}}'| grep '%'|tr -d '%'""").read().strip()
or
curr_usage = os.popen(r"""df -k /tmp |tail -1 | awk '{{print $4"\n"$5}}'| grep '%'|tr -d '%'""").read().strip()
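As a side note, on Python 3 the subprocess module is generally preferred over os.popen. A sketch of the same pipeline using a raw string, so the \n needs no double-escaping (the awk braces are also un-doubled, since this string isn't going through str.format):

```python
import subprocess

# shell=True because the command relies on pipes; the raw string keeps
# Python from turning \n into a real newline before awk sees it.
cmd = r"""df -k /tmp | tail -1 | awk '{print $4 "\n" $5}' | grep '%' | tr -d '%'"""
cur_usage = subprocess.check_output(cmd, shell=True, text=True).strip()
print(cur_usage)  # usage percentage of /tmp, digits only
```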
I am using python=2.7 and pexpect=4.5.0 on ubuntu 16.04
This is the code:
telnet.sendline("ls --color=never | grep -v bootimage | xargs -n1 rm -rf")
I'm sending this line to an embedded Linux machine from my Ubuntu computer; I'm in a virtualenv.
However, what this embedded machine gets is:
ls --color=never | grep -v bootimage | xar gs -n1 rm -rf
I mean, why does xargs become xar gs ...??? How do I fix it?
Note:
1. I also send other, shorter commands; they work fine.
2. It used to work. This does not happen consistently.
Using a raw string may be a better choice, e.g. telnet.sendline(r"ls --color=never | grep -v bootimage | xargs -n1 rm -rf")