Running a python script from crontab

I've got a python program which runs via crontab, and that works perfectly. However, I decided to add the ability to notify me of what it's doing, and suddenly it's failing. It runs fine from the command line; however, running it from crontab causes it to fail with:
libnotify-Message: Unable to get session bus: /bin/dbus-launch terminated abnormally with the following error: Autolaunch error: X11 initialization failed.
What am I doing wrong?
Edit
I would like this program to still run from cron and be able to take advantage of notifying the user of its work. Is there any way to do this?
Edit 2
I've tried using root's crontab with sudo -u esr python script.py, yet this also fails, silently at that.
Edit 3
It is possible! Here's the code.
* * * * * su $user -c "DBUS_SESSION_BUS_ADDRESS=$(grep -z DBUS_SESSION_BUS_ADDRESS /proc/$(ps -au esr | grep -i "gnome-session" | awk '{ print $1 }')/environ | sed -e 's/DBUS_SESSION_BUS_ADDRESS=//') $(whereis notify-send | awk '{ print $2 }') -u normal -t 20000 \"Hello\" "

* * * * * su esr -c "DBUS_SESSION_BUS_ADDRESS=$(grep -z DBUS_SESSION_BUS_ADDRESS /proc/$(ps -au esr | grep -i "gnome-session" | awk '{ print $1 }')/environ | sed -e 's/DBUS_SESSION_BUS_ADDRESS=//') $(whereis notify-send | awk '{ print $2 }') -u normal -t 20000 \"Hello\" "
As per a suggestion, here is an explanation (unfortunately not mine):

You're trying to run a script that requires user resources in an environment where said resources are not available. You will have to strip the script of all references to PyGTK and to the session bus if you want this to work.

I just wanted to mention that the following recipe works for users of the awesome window manager:
*/1 * * * * DBUS_SESSION_BUS_ADDRESS=$(grep -zi DBUS /proc/$(pgrep awesome)/environ | sed -r -e 's/^DBUS_SESSION_BUS_ADDRESS=//') DISPLAY=":0.0" notify-send -t 0 blah blah
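The same approach can be wrapped in a small guard script so the cron job simply does nothing when no desktop session is running. A minimal sketch, assuming the username esr and a GNOME session as in the question (the DISPLAY value is also an assumption); run it from esr's own crontab or as root so /proc/<pid>/environ is readable:
#!/bin/bash
# Find a process belonging to the user's desktop session.
pid=$(pgrep -u esr gnome-session | head -n 1)
if [ -n "$pid" ]; then
    # Borrow the session bus address from that process's environment.
    export DBUS_SESSION_BUS_ADDRESS=$(grep -z DBUS_SESSION_BUS_ADDRESS "/proc/$pid/environ" | tr -d '\0' | sed 's/^DBUS_SESSION_BUS_ADDRESS=//')
    export DISPLAY=:0
    notify-send -u normal -t 20000 "Hello from cron"
else
    # No session bus available: skip the notification instead of crashing.
    echo "no desktop session found; skipping notification"
fi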

You're trying to use a GUI (GTK+ library calls) from a cron program that has no access to a graphical session. You need to avoid creating dialogs and windows when you run it from cron.

Related

PHP script triggering Python script in background

Current Situation
I created a PHP script to start the Python script.
The script is as follows:
$python_file = "/var/www/web/test.py 2>&1 | tee -a /tmp/mylog 2>/dev/null >/dev/null &";
$command = "nohup python3 ".$python_file;
exec($command);
Problem:
After triggering the PHP script, it keeps on running and finally returns a 504 error page.
Expected Solution
After triggering the above script, it should return immediately after the exec statement. Is this possible?
Add & so the command runs in the background (and drop the & that was already inside $python_file, so it isn't doubled):
$python_file = "/var/www/web/test.py 2>&1 | tee -a /tmp/mylog 2>/dev/null >/dev/null";
$command = "nohup python3 ".$python_file." &";
exec($command);
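For reference, exec() hands that string to /bin/sh, so the command that actually runs looks like the sketch below (paths taken from the question). It can only return immediately because every stream of the pipeline is redirected and the whole pipeline is backgrounded; if it still wrote to the web server's stdout, PHP would keep waiting:
nohup python3 /var/www/web/test.py 2>&1 | tee -a /tmp/mylog 2>/dev/null >/dev/null &
# After the PHP request has returned, check that the detached script is still running:
pgrep -af test.py
tail -n 5 /tmp/mylog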

Is it possible to compile microbit python code locally?

I am running Ubuntu 22.04 with xorg.
I need to find a way to compile microbit python code locally to a firmware hex file. Firstly, I followed the guide here https://microbit-micropython.readthedocs.io/en/latest/devguide/flashfirmware.html.
After a lot of debugging, I got to this point: https://pastebin.com/MGShD31N
However, the file platform.h does exist.
sawntoe@uwubuntu:~/Documents/Assignments/2022/TVP/micropython$ ls /home/sawntoe/Documents/Assignments/2022/TVP/micropython/yotta_modules/mbed-classic/api/platform.h
/home/sawntoe/Documents/Assignments/2022/TVP/micropython/yotta_modules/mbed-classic/api/platform.h
sawntoe@uwubuntu:~/Documents/Assignments/2022/TVP/micropython$
At this point, I gave up on this and tried using the Mu editor with the AppImage. However, Mu requires Wayland, and I am on Xorg.
Does anyone have any idea if this is possible? Thanks.
Mu and the uflash command are able to retrieve your Python code from .hex files. Using uflash you can do the following for example:
uflash my_script.py
I think that what you want is possible to do, but it's harder than just using their web Python editor: https://python.microbit.org/v/2
Peter Till answers the original question. The addition below builds on this answer by showing how to automate the build and load process. I use Debian. The original question states that Ubuntu is used, which is built on Debian.
A script to find and mount the micro:bit
When code is loaded to the micro:bit, the board is unmounted from the system. So each time you have new code to load, you have to remount the board.
I modified a script to find and mount the micro:bit.
#!/bin/bash
BASEPATH="/media/$(whoami)/"
MICRO="MICROBIT"
if [ $# -eq 0 ]
then
    echo "no argument supplied, use 'mount' or 'unmount'"
    exit 1
fi
if [ $1 == "--help" ]
then
    echo "mounts or unmounts a BBC micro:bit"
    echo "args: mount - mount the microbit, unmount - unmount the microbit"
    exit 0
fi
# how many MICRO found in udisksctl dump
RESULTS=$(udisksctl dump | grep IdLabel | grep -c -i $MICRO)
case "$RESULTS" in
    0 ) echo "no $MICRO found in 'udisksctl dump'"
        exit 0
        ;;
    1 ) DEVICELABEL=$(udisksctl dump | grep IdLabel | grep -i $MICRO | cut -d ":" -f 2 | sed 's/^[ \t]*//')
        DEVICE=$(udisksctl dump | grep -i "IdLabel: \+$DEVICELABEL" -B 12 | grep " Device:" | cut -d ":" -f 2 | sed 's/^[ \t]*//')
        DEVICEPATH="$BASEPATH""$DEVICELABEL"
        echo "found one $MICRO, device: $DEVICE"
        if [[ -z $(mount | grep "$DEVICE") ]]
        then
            echo "$DEVICELABEL was unmounted"
            if [ $1 == "mount" ]
            then
                udisksctl mount -b "$DEVICE"
                exit 0
            fi
        else
            echo "$DEVICELABEL was mounted"
            if [ $1 == "unmount" ]
            then
                udisksctl unmount -b "$DEVICE"
                exit 0
            fi
        fi
        ;;
    * ) echo "more than one $MICRO found"
        ;;
esac
echo "exiting without doing anything"
echo "exiting without doing anything"
I alias this script to mm in my .bashrc file.
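The alias in ~/.bashrc might look like this (the script location and the hard-coded mount argument are assumptions on my part):
alias mm='~/bin/microbit-mount.sh mount'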
Automate mounting the micro:bit and flashing the python file
I use the inotifywait command to run mm and then run uflash to load the .py file I am working on. Each time the python file is saved, the aliased command mm is run, followed by the uflash command.
while inotifywait -e modify <your_file>.py ; do mm && uflash <your_file>.py ; done
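inotifywait is provided by the inotify-tools package; on Debian/Ubuntu it can be installed with:
sudo apt install inotify-tools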
Okay, so elaborating on Peter Till's answer.
Firstly, you can use uflash:
uflash path/to/your/code
Or, you can use microfs:
ufs put path/to/main.py
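Both tools are ordinary Python packages, so a minimal setup sketch (file names are placeholders) could be:
pip install uflash microfs
uflash my_script.py   # build a hex combining MicroPython and the script, and copy it to an attached MICROBIT drive
ufs ls                # list files on a board that is already running MicroPython
ufs put main.py       # copy main.py onto the board; it runs on reset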
Working Ubuntu 22.04 host CLI setup with Carlos Atencio's Docker to build your own firmware
After trying to set up the toolchain for a while, I finally decided to Google for a Docker image with the toolchain, and found https://github.com/carlosperate/docker-microbit-toolchain at this commit from Carlos Atencio, a Micro:bit Foundation employee, and that just absolutely worked:
# Get examples.
git clone https://github.com/bbcmicrobit/micropython
cd micropython
git checkout 7fc33d13b31a915cbe90dc5d515c6337b5fa1660
# Get Docker image.
docker pull ghcr.io/carlosperate/microbit-toolchain:latest
# Build setup to be run once.
docker run -v $(pwd):/home --rm ghcr.io/carlosperate/microbit-toolchain:latest yt target bbc-microbit-classic-gcc-nosd@https://github.com/lancaster-university/yotta-target-bbc-microbit-classic-gcc-nosd
docker run -v $(pwd):/home --rm ghcr.io/carlosperate/microbit-toolchain:latest make all
# Build one example.
docker run -v $(pwd):/home --rm ghcr.io/carlosperate/microbit-toolchain:latest \
tools/makecombinedhex.py build/firmware.hex examples/counter.py -o build/counter.hex
# Build all examples.
docker run -v $(pwd):/home --rm ghcr.io/carlosperate/microbit-toolchain:latest \
bash -c 'for f in examples/*; do b="$(basename "$f")"; echo $b; tools/makecombinedhex.py build/firmware.hex "$f" -o "build/${b%.py}.hex"; done'
And you can then flash the example you want to run with:
cp build/counter.hex "/media/$USER/MICROBIT/"
Some further comments at: Generating micropython + python code `.hex` file from the command line for the BBC micro:bit

Post real time output to Slack with bash script

I have a python script that I am executing with a cron job. This script generates some output while executing, and I wish to post it to a Slack channel in real time.
Here is a bash script that I have:
#!/bin/bash
log_file=logs_$(date '+\%Y-\%m-\%d_\%H:\%M').txt
cd /path/to/script/run.py > /path/to/logs/${log_file} 2>&1
cat /path/to/logs/${log_file} | while read LINE; do
(echo "$LINE" | grep -e "Message" ) && curl -X POST --silent --data-urlencode \
"payload={\"text\": \"$(echo $LINE | sed "s/\"/'/g")\"}" "https://hooks.slack.com/services/xxxxxx";
done
This script works, but it of course posts all messages to Slack only once the python script has already finished executing. Is there any way I could configure it so that messages would be sent to Slack in real time while the python script is still being executed?
You may be able to read the output from your run.py script via process substitution:
#!/bin/bash
log_file=logs_$(date '+%Y-%m-%d_%H:%M').txt
while read -r line ; do
    echo "$line"
    (echo "$line" | grep -e "Message" ) && curl -X POST --silent --data-urlencode \
        "payload={\"text\": \"$(echo $line | sed "s/\"/'/g")\"}" "https://hooks.slack.com/services/xxxxxx";
done < <(/path/to/script/run.py 2>&1) >> "$log_file"
It may also prove useful to paste your code into shellcheck.net and have a look at the suggested changes.
Your script shouldn't work at all as you're not executing run.py but you're changing your working directory into it, so unless run.py is a directory, your script should fail.
Also, commands in a bash script are executed sequentially, so if you launch your python command and only then read the log, it's no wonder that you're not getting log lines in real time.
What I would do is use some pipes and xargs:
#!/bin/bash
/path/to/script/run.py | grep -e "Message" | sed "s/\"/'/g" | xargs -I{} curl -L -X POST --silent --data-urlencode 'payload={"text":"{}"}' https://hooks.slack.com/services/xxx
I've added -L to the curl command because hooks.slack.com makes a redirect to api.slack.com and without that flag curl will stop after the 302 instead of following the Location header in the response.

Crontab: read argument from file

I am trying to execute a python script (with chmod +x) which accepts several options via a cron job. One option is a password which I don't want to store in the crontab file, so I saved it with chmod 600 in my user's home directory (OS: Raspbian).
My crontab line is:
* * * * 5 [ $(date +\%d) -le 07 ] && /opt/scripts/myscript.py -p '$(< /home/pi/mypasswordfile)' >> /tmp/backup.log 2>&1
The line
/opt/scripts/myscript.py -p '$(< /home/pi/mypasswordfile)' >> /tmp/backup.log 2>&1
is executed correctly with bash, but not from the crontab. This makes sense, as cron does not use bash to execute the command - but how do I do it correctly?
Thanks in advance!
I generally recommend against putting any complex syntax directly into crontab files. Put it into a script, and run the script from crontab. So create a script like runmyscript.sh that contains:
#!/bin/bash
if [ $(date +%d) -le 7 ]
then
    /opt/scripts/myscript.py -p "$(< /home/pi/mypasswordfile)"
fi
and change the crontab to:
* * * * 5 /opt/scripts/runmyscript.sh >> /tmp/backup.log 2>&1
You could just capture your password and pass it as an argument using cat and backticks:
/opt/scripts/myscript.py -p `cat /home/pi/mypasswordfile` >> /tmp/backup.log
Disclosure: backticks have been deprecated in favor of $(), but sometimes $() just doesn't fit the scenario.
Simply add
SHELL=/bin/bash
to your crontab file, to use bash instead of /bin/sh to execute the commands.
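With that in place, the entry from the question can stay in the crontab largely as-is. A sketch (note the double quotes around $(< ...): the single quotes in the original line would stop the substitution from expanding at all):
SHELL=/bin/bash
* * * * 5 [ $(date +\%d) -le 07 ] && /opt/scripts/myscript.py -p "$(< /home/pi/mypasswordfile)" >> /tmp/backup.log 2>&1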
I would try:
bash -c '/opt/scripts/myscript.py -p $(< /home/pi/mypasswordfile)'
Also, sometimes you might need to pass environment variables, especially DISPLAY, for some programs to run correctly, for example:
* * * * 5 env DISPLAY=:0 [ $(date +\%d) -le 07 ] && bash -c '/opt/scripts/myscript.py -p $(< /home/pi/mypasswordfile)' >> /tmp/backup.log 2>&1

Script works differently when run from the terminal and when run from Python

I have a short bash script foo.sh
#!/bin/bash
cat /dev/urandom | tr -dc 'a-z1-9' | fold -w 4 | head -n 1
When I run it directly from the shell, it runs fine, exiting when it is done
$ ./foo.sh
m1un
$
but when I run it from Python
$ python -c "import subprocess; subprocess.call(['./foo.sh'])"
ygs9
it outputs the line but then just hangs forever. What is causing this discrepancy?
Adding the trap -p command to the bash script, stopping the hung python process and running ps shows what's going on:
$ cat foo.sh
#!/bin/bash
trap -p
cat /dev/urandom | tr -dc 'a-z1-9' | fold -w 4 | head -n 1
$ python -c "import subprocess; subprocess.call(['./foo.sh'])"
trap -- '' SIGPIPE
trap -- '' SIGXFSZ
ko5o
^Z
[1]+ Stopped python -c "import subprocess; subprocess.call(['./foo.sh'])"
$ ps -H -o comm
COMMAND
bash
python
foo.sh
cat
tr
fold
ps
Thus, subprocess.call() executes the command with the SIGPIPE signal ignored (set to SIG_IGN). When head does its job and exits, the remaining processes do not receive the broken pipe signal and do not terminate.
Having the explanation of the problem at hand, it was easy to find the bug in the python bugtracker, which turned out to be issue#1652.
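On Linux you can also see the inherited disposition directly by comparing the child's ignored-signal mask; SIGPIPE is signal 13, so the 0x1000 bit in SigIgn marks it as ignored. A quick check (assuming a Python 2 interpreter is installed as python):
# Run from an interactive shell: the SIGPIPE bit is normally not set in the child.
grep SigIgn /proc/self/status
# Run via Python 2's subprocess: the 0x1000 bit shows up, i.e. SIGPIPE is ignored.
python -c "import subprocess; subprocess.call(['grep', 'SigIgn', '/proc/self/status'])"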
The problem of Python 2 handling SIGPIPE in a non-standard way (i.e., ignoring it) is already covered in Leon's answer, and the fix is given in the link: set SIGPIPE to default (SIG_DFL) with, e.g.,
import signal
signal.signal(signal.SIGPIPE,signal.SIG_DFL)
You can try to unset SIGPIPE from within your script with, e.g.,
#!/bin/bash
trap SIGPIPE # reset SIGPIPE
cat /dev/urandom | tr -dc 'a-z1-9' | fold -w 4 | head -n 1
but, unfortunately, it doesn't work, as per the Bash reference manual
Signals ignored upon entry to the shell cannot be trapped or reset.
A final comment: you have a useless use of cat here; it's better to write your script as:
#!/bin/bash
tr -dc 'a-z1-9' < /dev/urandom | fold -w 4 | head -n 1
Yet, since you're using Bash, you might as well use the read builtin as follows (this will advantageously replace fold and head):
#!/bin/bash
read -n4 a < <(tr -dc 'a-z1-9' < /dev/urandom)
printf '%s\n' "$a"
It turns out that with this version, you'll have a clear idea of what's going on (and the script will not hang):
$ python -c "import subprocess; subprocess.call(['./foo'])"
hcwh
tr: write error: Broken pipe
tr: write error
$
$ # script didn't hang
(Of course, it works with no errors under Python 3.) And telling Python to use the default handler for SIGPIPE works well too:
$ python -c "import signal; import subprocess; signal.signal(signal.SIGPIPE,signal.SIG_DFL); subprocess.call(['./foo'])"
jc1p
$
(and this also works with Python 3).
