I am trying to trigger a Python or shell script whenever a desktop notification arrives, using dbus-monitor.
I am using the command like this:
dbus-monitor "interface='org.freedesktop.Notifications'" | grep --line-buffered "string" | xargs -I '{}' python3 ./test.py {}
After that, I try to send a desktop notification from another terminal using:
notify-send "hello" "world"
The output for the above custom notification is:
string "notify-send"
string ""
string "hello"
string "world "
string "urgency"
string "notify-send"
string ""
string "hello"
string "world "
string "urgency"
But if the output of this command is 10 lines, the Python script gets called for every line.
My expectation is to call the Python script once per notification, and to get all of the output in a single line as a parameter for the Python script.
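One way to get that behaviour is to drop xargs and let Python read dbus-monitor's output itself, batching the string lines between notifications. Below is a minimal sketch, not a drop-in solution: it assumes the "method call" header line that dbus-monitor prints before each Notify call, and it joins the batched lines with spaces before handing them to test.py as a single argument.
#!/usr/bin/env python3
# Sketch: run dbus-monitor, group its "string ..." lines per Notify call,
# and invoke test.py once per notification with the whole batch as one argument.
import subprocess

monitor = subprocess.Popen(
    ["dbus-monitor", "interface='org.freedesktop.Notifications',member='Notify'"],
    stdout=subprocess.PIPE, text=True)

strings = []  # payload lines of the notification currently being read
for line in monitor.stdout:
    line = line.strip()
    if line.startswith("method call") and strings:
        # A new Notify call begins: flush the previous notification in one go.
        # (Each notification is flushed when the next one arrives, which is
        # acceptable for a long-running monitor.)
        subprocess.run(["python3", "./test.py", " ".join(strings)])
        strings = []
    if line.startswith("string "):
        strings.append(line.split(" ", 1)[1].strip('"'))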
It is wise to take advantage of systemd's integration with D-Bus.
Using the systemd integration, the programmer has better control over, and visibility into, the D-Bus integration, and can also take advantage of systemd's logging and monitoring mechanisms.
There is a good article here about systemd and D-Bus with Python.
There is also a closely related answer to your question in this answer.
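For illustration, here is a minimal listener sketch using the dbus-python and PyGObject bindings rather than a dbus-monitor pipeline; the match rule and the handle() callback are my assumptions, not taken from the linked article, and eavesdrop='true' may be restricted on newer D-Bus daemons:
# Sketch: subscribe to Notify calls on the session bus with dbus-python.
import dbus
from dbus.mainloop.glib import DBusGMainLoop
from gi.repository import GLib

def handle(bus, message):
    # Called for every message matching the rule registered below.
    if message.get_member() == "Notify":
        print("notification:", message.get_args_list())

DBusGMainLoop(set_as_default=True)  # must be set before connecting to the bus
session = dbus.SessionBus()
session.add_match_string(
    "interface='org.freedesktop.Notifications',member='Notify',eavesdrop='true'")
session.add_message_filter(handle)
GLib.MainLoop().run()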
I've got a Docker Service Log that takes in NiFi Actions and I want to capture only Log Entries that include "Successfully sent" and "Failed to process session" (and nothing more). They should be captured in a directory called "nifi_logs" in the present working directory. I need to do all of this using Python.
This is what I got so far:
docker_log = 'docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log'
subprocess.Popen(docker_log, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
I believe subprocess.Popen() is having difficulty with the double quotes used in the grep, as nifi1.log is completely empty. If I change the first command to the following:
docker_log = 'docker service logs nifi | grep session >> $PWD/nifi_logs/nifi1.log'
The Python code works just fine and captures all log entries with "session" in nifi1.log. As I explained above, though, I need to grep for 2 kinds of log entries, and both include multiple words, meaning I need to use quotes.
If I were to just run this command on the Terminal without Python:
docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log
The log generates the entries just fine, so I know the Docker Service command is written correctly.
I've tried switching the single and double quotes around, I've tried using \" instead of " within the single quotes ... nifi1.log continues to be empty.
I also tried using os.system() instead of subprocess.Popen(), but I run into the same problem (and I believe os.system() is somewhat deprecated).
Any ideas what I'd need to change docker_log to so that it properly greps for the 2 search criteria? So you're aware: this question is not asking how I generate the log entries (I know which Docker services I'm looking for, and they generate properly), just what I need to do to get Python's subprocess.Popen to accept a command with quotes in it.
Thank you for your assistance, @David. Looking at your example, I found a solution: I removed stdout=subprocess.PIPE from subprocess.Popen and now it accepts double quotes just fine!
docker_log = 'docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log'
subprocess.Popen(docker_log, shell=True, stderr=subprocess.STDOUT)
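One plausible explanation, for what it's worth: the quotes were never the problem. With stdout=subprocess.PIPE and stderr=subprocess.STDOUT, everything the pipeline writes to stderr lands in a pipe nobody reads; once the pipe buffer fills, the pipeline stalls before anything reaches the file. Here is a sketch of a variant that also waits for the pipeline to finish, with the command string unchanged from the question:
import subprocess

cmd = ('docker service logs nifi '
       '| grep -e "Successfully sent" -e "Failed to process session" '
       '>> "$PWD"/nifi_logs/nifi1.log')

# subprocess.run() blocks until the shell pipeline exits, and with no unread
# PIPE attached nothing can stall on a full pipe buffer.
subprocess.run(cmd, shell=True, stderr=subprocess.STDOUT)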
After some searching and checking previous answers (like Passing objects from python to powershell), apparently the best way to send objects from a Python script to a PowerShell script or command is as JSON.
However, with something like this (dir_json.py):
from json import dumps
from pathlib import Path
for fn in Path('.').glob('**/*'):
    print(dumps({'name': str(fn)}))
You can do this:
python .\dir_json.py | ConvertFrom-JSON
And the result is OK, but the problem I'm hoping to solve is that ConvertFrom-Json seems to wait until the script has completed before reading any of the JSON, even though the individual JSON objects end on each line. This can easily be verified by adding a line like time.sleep(1) after the print.
Is there a better way to send objects from Python to PowerShell than using JSON objects? And is there a way to actually stream them as they are written, instead of passing the entire output of the Python script after the script completes?
I ran into jq, which was recommended by "people on the internet" as a solution to this type of problem, on the grounds that ConvertFrom-Json doesn't allow streaming but jq does. However, it did nothing to improve my situation:
python .\dir_json_slow.py | jq -cn --stream 'fromstream(1|truncate_stream(inputs))' | ConvertFrom-JSON
To make jq play nice, I did change the script to write a list of objects instead of separate objects:
from sys import stdout
from time import sleep
from json import dumps
from pathlib import Path
first = True
stdout.write('[\n')
for fn in Path('.').glob('**/*'):
    if first:
        stdout.write(dumps({'name': str(fn)}))
        first = False
    else:
        stdout.write(',\n' + dumps({'name': str(fn)}))
    stdout.flush()
    sleep(.1)
stdout.write('\n]')
(Note that the problem isn't ConvertFrom-Json holding things up at the end; jq itself only starts writing output once the Python script completes.)
As long as each line[1] that your python script outputs is a complete JSON object by itself, you can use a ForEach-Object call to process each output line as it is being received by PowerShell and call ConvertFrom-Json for each:
python .\dir_json.py | ForEach-Object { ConvertFrom-JSON $_ }
A simplified example that demonstrates that streaming occurs, pausing between lines processed (waiting for a keypress):
# Prompts for a keystroke after each line emitted by the Python command.
python -c 'from json import dumps; print(dumps({''name'': ''foo''})); print(dumps({''name'': ''bar''}))' |
ForEach-Object { ConvertFrom-Json $_ | Out-Host; pause }
Note: The Out-Host call is only used to work around a display bug in PowerShell, still present as of PowerShell 7.2: Out-Host forces synchronous printing of the implicit table-formatting that is applied - see this answer.
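One caveat worth adding on the Python side (my addition, not part of the answer above): when stdout is connected to a pipe, Python block-buffers it, so for true line-by-line streaming each JSON line should be flushed explicitly, e.g.:
# dir_json.py variant: emit one JSON object per line and flush each line,
# since Python block-buffers stdout when it is connected to a pipe.
from json import dumps
from pathlib import Path

for fn in Path('.').glob('**/*'):
    print(dumps({'name': str(fn)}), flush=True)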
ConvertFrom-Json - atypically for PowerShell cmdlets - collects all input up front before emitting the object(s) that the JSON input has been parsed into, which can be demonstrated as follows:
# Prompts for a keystroke first, and only after *both*
# strings have been emitted does ConvertFrom-Json produce output.
& { '{ "name": "foo" }'; pause; '{ "name": "bar" }' } |
ConvertFrom-Json | Out-Host
[1] PowerShell relays output from external programs such as Python invariably line by line. By contrast, a PowerShell-native command is free to emit any object to the pipeline, including multiline strings.
I'm passing the result of the execution of a command to Python as input, like so:
$ python parse_ips.py "$(kubectl get configmap ...)"
This works fine when executing from the command line; however, I'm now trying to edit the file using PyCharm. Therefore I need the escaped version of the result of this command, which I can paste into PyCharm's debug configuration, as I can't execute the command in real time like I can on the command line.
However, I am struggling to find a way to replicate the escaping bash does behind the scenes, so I can use the result as an argument within the PyCharm configuration. Running the above kubectl command results in a multi-line string which includes spaces and quotes. When I paste this into PyCharm it just interprets it as multiple arguments. I'm looking for the escaped result, which I could paste directly into the command line, or into PyCharm's debug configuration, to achieve the same result with a fixed parameter for testing.
Any help would be greatly appreciated!
Edit: To clarify, I mean on the command line the result of the $(kubectl ...) command is passed into the python program as a single command line argument when it is surrounded by quotes ("$(kubectl ...)"). So in the python program, you can access sys.argv[1] and it will contain the entire execution output of $(kubectl get configmap ...). However, if I execute that command myself on the command line, the result is a multi-line string.
If I then copy the result of that into PyCharm (or even on the command line again), it is interpreted as many command line arguments. E.g. it would look something like this:
$ python parse_ips.py apiVersion: v1
data:
item1: ifconfig-push 127.0.0.0 255.255.0.0
item2: ifconfig-push 127.0.0.1 255.255.0.0
item3: ifconfig-push 127.0.0.2 255.255.0.0
...
And so on. This obviously doesn't work the way it did before, so I am unable to test my program without making the kubectl call from the command line each time. I was looking to replicate what "$(kubectl ...)" gets converted into, so that the entire output can be passed as a single command line argument.
I am struggling to find a way to replicate the escaping bash does behind the scenes
Typically use printf "%q" to escape stuff.
printf "%q" "$(kubectl get configmap ....)"
This is printf as the bash builtin command. It differs from the coreutils printf, the newest versions of which also support %q, with a different quoting style:
/usr/bin/printf "%q" "$(kubectl get configmap ....)"
Modern bash (4.4+) also has a quoting parameter transformation:
var="$(kubectl get configmap ....)"
echo "${var@Q}"
And there is also the quoting style output by set -x.
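Since the surrounding scripts are Python anyway, shlex.quote() offers the same kind of escaping from Python; a sketch (the trailing kubectl arguments are elided here, as in the question):
# Print a shell-quoted version of the kubectl output, safe to paste back
# on a command line (or into a run configuration) as a single argument.
import shlex
import subprocess

out = subprocess.run(["kubectl", "get", "configmap"],  # remaining args elided
                     capture_output=True, text=True).stdout
print(shlex.quote(out))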
I would suggest using a file:
kubectl get configmap ... > /tmp/tempfile
python parse_ips.py "$(cat /tmp/tempfile)"
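Or, since parse_ips.py is Python anyway, it could read the file directly instead of receiving the text as an argument; a sketch using the same temp file path:
# parse_ips.py variant: read the captured kubectl output from the temp file,
# sidestepping the shell-quoting question entirely.
with open("/tmp/tempfile") as f:
    configmap_text = f.read()
print(len(configmap_text.splitlines()), "lines read")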
With xclip you can copy command output straight to the X server clipboard, which is handy:
printf "%q" "$(kubectl get configmap ...)" | xclip -selection clipboard
# then in another window:
python parse_ips.py <right mouse click><select paste>
I'm trying to run a Python script in PowerShell. While using the sys module, I stumbled upon the problem of getting the return value of a PowerShell function. The function in question is Date -f 'dd\/MM\/yyyy', which returns 14/03/2019, and I'd like that value to be used as an argument in the Python script. So far I've been able to get literal arguments from the command line, e.g. text.
This Python script (see sys.argv docs):
import sys
print(sys.argv)
called like this in PowerShell:
python .\test.py -date $(Date -f "dd\/MM\/yyyy")
outputs:
['.\\test.py', '-date', '14/03/2019']
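If the script then needs the value as an actual date rather than text, here is a small sketch parsing the argument shown above (the -date flag handling is simplified to a fixed position):
import sys
from datetime import datetime

date_arg = sys.argv[2]  # '14/03/2019' in the invocation shown above
parsed = datetime.strptime(date_arg, "%d/%m/%Y").date()
print(parsed.isoformat())  # '2019-03-14'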
On a general note, I recommend using better date formats than dd/MM/yyyy - ISO 8601's yyyy-MM-dd would be a good one.
If I understand correctly, you want to have the output of the command Date -f 'dd\/MM\/yyyy' fed into your Python script (script.py).
What you are looking for are so-called "pipes", or output redirection. Looking at the official documentation (https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/write-output?view=powershell-6), the following might work:
$DATE = Date -f 'dd\/MM\/yyyy'
Write-Output $DATE | python script.py
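Note that a value sent through the pipeline like this arrives on the script's standard input, not in sys.argv, so script.py would read it along these lines (a sketch):
# script.py: a value piped in via | arrives on stdin, not as an argument.
import sys

date_str = sys.stdin.readline().strip()
print("received:", date_str)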
So I have a vulnerable program which is to be exploited using a buffer overflow. After analysis I have all the relevant values (buffer size, the address to be injected, etc.).
The issue is that I need to automate the inputs given to it.
Precisely, here's what happens:
The program asks for a normal input (no vulnerability present here).
It asks for a second input; this is to be injected with the shellcode.
I have tried sending the input from a file (written with the payload, say via python -c 'print ...'), but the file somehow sends the ASCII representation and messes up the desired input.
Things I have tried: I wrote a shell script like this:
echo -e "<first input>\r"
python -c 'print "A"*100 + "<shell code>" + "<ret>"'
After this I send this file as input: $ ./vuln < File
Is there any way to send the output from Python while the program is prompting for the second input?
You can use expect, a program used to automate interactions with programs that expose text terminal interfaces.
The script should look like this:
#!/usr/bin/expect
spawn ./programUnderTest
expect "firstPrompt"
send "firstInput\r"
expect "secondPrompt"
send "secondInput\r"
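If you would rather stay in Python, the pexpect module provides the same mechanism; here is a sketch, where the binary name, prompt strings, and payload placeholders are taken from the question:
# Automate both prompts with pexpect (pip install pexpect).
import pexpect

child = pexpect.spawn("./vuln")
child.expect("first prompt")    # placeholder for the program's first prompt
child.sendline("normal input")  # the benign first answer
child.expect("second prompt")   # placeholder for the second prompt
child.sendline(b"A" * 100 + b"<shell code>" + b"<ret>")  # the payload
child.interact()                # stay attached, e.g. to a spawned shell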