/dev/mem access denied on raspberry pi - python

I am working with my Raspberry Pi and writing a Python CGI script that serves a webpage to control my GPIO output pins. The script crashes when I try to import RPi.GPIO as GPIO. This is the error I am getting:
File "./coffee.py", line 7, in <module>
import RPi.GPIO as GPIO
RuntimeError: No access to /dev/mem. Try running as root!
My code works perfectly when I run the script with sudo, but when it is triggered from a URL on my apache2 server it says that I do not have access to /dev/mem. I have already tried editing the sudoers file with visudo and that did not work. This is what my sudoers file looks like:
#includedir /etc/sudoers.d
pi ALL=(ALL) NOPASSWD: ALL
www-data ALL=(root) NOPASSWD: /usr/bin/python3 /usr/lib/cgi-bin/coffee.py *
apache2 ALL = (root) NOPASSWD: /usr/lib/cgi-bin/coffee.py
Is there any way that I can run my script as root from a URL call? Can anyone tell me what I am doing wrong?

I found that adding www-data to the gpio user group worked fine:
sudo usermod -aG gpio www-data
You can also add www-data to the memory user group:
sudo usermod -aG kmem www-data
As mentioned, it is a bad idea, but for me it was necessary.
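Note that supplementary group membership is only picked up by processes started after the change, so restart Apache before retrying:
sudo service apache2 restart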

Your problem is that the script is not executed as root. It is executed as the user that apache runs as.
Your apache process runs as a specific user, probably www-data. You could change the user that apache runs as. You should be able to find this in /etc/apache2/envvars:
# Since there is no sane way to get the parsed apache2 config in scripts, some
# settings are defined via environment variables and then used in apache2ctl,
# /etc/init.d/apache2, /etc/logrotate.d/apache2, etc.
export APACHE_RUN_USER=www-data
export APACHE_RUN_GROUP=www-data
If you change that to root you should have access. Normally this would be a terrible security hole, but you are doing direct memory access already. Be very careful!
If you are uncomfortable with that, then you need to update your command so it is executed as root (this is a good way, but it requires that you understand what you are doing!). You can do this by altering the way you call it, by wrapping the call in a script which itself changes the user, or by using setuid (similar in spirit to Apache's suEXEC). Wrapping it in a script seems the best way to me, as that lets your sudoers entry grant the privileges for only that one command, and it doesn't require you to understand the full implications of setuid; a rough sketch of that approach follows.
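A minimal sketch of that wrapper idea, assuming the privileged GPIO code is moved into a separate script (the path /usr/local/bin/coffee_gpio.py below is just a placeholder, adjust it to your layout): the CGI front-end stays unprivileged and only the one whitelisted command runs as root.
#!/usr/bin/python3
# /usr/lib/cgi-bin/coffee.py - CGI front-end, runs as www-data
import subprocess
print("Content-Type: text/plain")
print()
# only this exact command is allowed without a password in sudoers
rc = subprocess.call(['sudo', '/usr/bin/python3', '/usr/local/bin/coffee_gpio.py', 'on'])
print("coffee_gpio.py exited with", rc)
The matching sudoers entry (put it in a file under /etc/sudoers.d/ and edit it with visudo) would be:
www-data ALL=(root) NOPASSWD: /usr/bin/python3 /usr/local/bin/coffee_gpio.py *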

Related

bash script that is run from Python reaches sudo timeout

This is a long bash script (400+ lines) that is originally invoked from a django app like so -
os.system('./bash_script.sh &> bash_log.log')
It stops on a random command in the script. If the order of commands is changed, it hangs on another command in approx. the same location.
sshing to the machine that runs the django app, and running sudo ./bash_script.sh, asks for a password and then runs all the way.
I can't see the message it shows when it hangs, because I couldn't make it redirect to the log file. I assume it's a sudo password prompt.
Tried -
sudo -v in the script - didn't help.
sshing to the machine and manually extending the sudo timeout in /etc/sudoers - didn't help, I think because the django app is already running and still uses the previous timeout.
splitting the script in two, and running one part in a separate thread, like so -
import os
from subprocess import Popen
from threading import Thread

def basher(command, log_path):
    # open the log for writing so Popen can redirect stdout/stderr into it
    with open(log_path, 'w') as log:
        Popen(command, stdout=log, stderr=log).wait()

script_thread = Thread(target=basher, args=('./bash_script_pt1.sh', 'bash_log_pt1.log'))
script_thread.start()
os.system('./bash_script_pt2.sh &> bash_log_pt2.log')  # I know it's deprecated, not sure if maybe it's better in this case
script_thread.join()
The logs showed that part 1 ended ok, but part 2 still hangs, albeit later in the code than when they were together.
I thought about editing /etc/sudoers from inside the Python code and then re-logging in via su - user. There are snippets around showing how to pass the password using pty, but I don't understand the mechanics of it and could not get it to work.
I also noted that ps aux | grep bash_script.sh shows that the script is being run twice. As -
/bin/bash bash_script.sh
and as
sh -c bash_script.sh.
I assume os.system has an internal shell=True going on.
I don't understand the Linux entities/mechanics in play to figure out what's happening.
My guess is that the django app has different, more limited permissions than the script itself, and the script inherits those restrictions because it is executed by the app.
You need to find out what permissions and identity the script has when you run it just from bash, what it has when you run it via django, and then figure out what the difference is; the snippet below will show you.
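One quick way to see the difference is to print the identity and groups the process actually runs with, both from a plain shell and from inside the django app right before it launches the bash script:
import os, pwd, grp
print('uid:', os.getuid(), 'euid:', os.geteuid())
print('user:', pwd.getpwuid(os.geteuid()).pw_name)
print('groups:', [grp.getgrgid(g).gr_name for g in os.getgroups()])
If the two runs report different users or group sets, that is where your sudo behaviour diverges.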

How to use a python package with sudo privileges inside Flask?

I have a Flask setup on my Raspberry Pi 4 Model B via this tutorial.
OS = Ubuntu Server 20.04.2 LTS
Python = 3.8
Please keep in mind that I am using a virtual env for my Flask application as shown in the tutorial, and the Flask application itself runs absolutely fine.
Now I installed Adafruit_DHT in the same venv and tried using the following code in one of the endpoints
import Adafruit_DHT
humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, 24)
to which I am getting the following error
File "/usr/local/lib/python3.8/dist-packages/Adafruit_DHT/common.py", line 81, in read
return platform.read(sensor, pin)
File "/usr/local/lib/python3.8/dist-packages/Adafruit_DHT/Raspberry_Pi_2.py", line 34, in read
raise RuntimeError('Error accessing GPIO.')
RuntimeError: Error accessing GPIO.
So, after that, I created a simple python script, say z.py, and put the above code in it. Then I activated the same Flask venv using
source venv1/bin/activate
and ran the script using
python z.py
Again I got the same error. But if I run the above command with sudo
sudo python z.py
then the script executes perfectly fine and I get the following output
87.0999984741211 29.399999618530273
So now the question arises: how do I use the Adafruit_DHT package inside the Flask app with root permissions?
I don't think giving 777 permissions to the www-data group would be the right choice, nor does running the Flask app as root seem like a great idea.
I have tried installing the Adafruit_DHT package globally with sudo, but I still have to run z.py with sudo.
So what is the correct way to do this?
I believe the package will be trying to access the device /dev/gpiomem (possibly /dev/gpiochip0 or /dev/gpiochip1).
I think the neatest way to address this would be to have those devices owned by a group other than root and to give that group permission to access the device, e.g.
sudo su
groupadd gpio
chgrp gpio /dev/gpio*
chmod g+rw /dev/gpio*
Then I'd go ahead and add your user to that group (by default this is ubuntu, but you may have created another user):
usermod -a -G gpio ubuntu
Now you've created a group called "gpio" that has permission to access your Pi's GPIO devices, and added your user to that group.
Please note, I have not tested this.
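One way to sanity-check the permissions from inside the Flask venv, without sudo, is something like this (a sketch: /dev/gpiomem is an assumption, substitute whichever device your traceback actually points at, and remember the new group membership only applies to sessions or services started after the usermod):
import os
import Adafruit_DHT

dev = '/dev/gpiomem'  # assumed device node; may be /dev/gpiochip0 on some images
print(dev, 'accessible:', os.access(dev, os.R_OK | os.W_OK))
humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, 24)
print(humidity, temperature)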

How to start/stop service from python script running in Flask and Apache server using Debian?

I'm trying to start and stop services from a python script that is running using Flask and Apache.
To get the status from memcached, for example, I'm using
os.popen('service memcached status').read(), and it works like a charm.
The problem is that when I try to start/stop doing something like
os.popen('service memcached stop').read(), it just does nothing (I checked from the shell and the service is still running).
To summarize, I can get the status but can't start or stop the service, and I don't know why that happens.
Does anyone have any suggestion?
Thanks,
I checked the apache logs in /var/log/apache2/error.log and the problem was that I needed more privileges to execute start/stop. But when I tried to use
os.popen('sudo service memcached stop').read()
I got an error saying that sudo required a password.
To solve this problem I typed in the shell:
visudo
which opened the /etc/sudoers file. And there I added the line
www-data ALL=(ALL) NOPASSWD:ALL
I understand that this means I am giving the user www-data permission to execute sudo without a password.
To quit, press Ctrl+X and then y to save.
Note: www-data is the user that Apache runs as.
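If you would rather not give www-data unrestricted sudo, a narrower entry that only whitelists the service commands you actually call should also work (the /usr/sbin/service path is typical for Debian, check yours with which service):
www-data ALL=(ALL) NOPASSWD: /usr/sbin/service memcached *
The Python side stays exactly the same: os.popen('sudo service memcached stop').read().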

Giving django app root access?

I am trying to build an api using django which should alter iptables using POST parameters. I am using django 1.4, djangorestframework and python-iptables. The problem I am facing is that python-iptables needs root access to change iptables rules. I can change the rules by running sudo python and working from the python shell. I can also change them by using the iptdump module, which takes in iptables parameters and creates the rules, which I can then save in a file (iptables.rules.txt) and apply with fabric's local('sudo iptables-restore < iptables.rules.txt'). But this will always prompt the user for a root password. Is there a way I can give the django app root privileges so that I can bypass the sudo password prompt?
If you really need a part of the application to run as root, you could rewrite it as a daemon, and communicate with it from the main Django application as suggested in this answer. I would only recommend it if the alternative below does not suit your requirements.
The alternative, sudo iptables-restore < iptables.rules.txt, is much simpler: just tell sudo not to ask for a password for this one command by adding this to your /etc/sudoers file:
djangouser ALL=(ALL) NOPASSWD: /sbin/iptables-restore
Where djangouser is the user the Django process is running as.
EDIT: You can avoid writing an intermediate file by sending the new rules directly to the iptables-restore process:
import iptdump
import subprocess

ipt = iptdump.Iptables()
proc = subprocess.Popen(['sudo', '/sbin/iptables-restore'],
                        stdin=subprocess.PIPE)
proc.communicate(ipt.dump())
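One caveat, in case this runs under Python 3: communicate() expects bytes there, so assuming ipt.dump() returns a str you would need proc.communicate(ipt.dump().encode()); under Python 2 the call above works as written.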

Most pythonic way of running a single command with sudo rights

I have a python script which is performing some nagios configuration. The script is running as a user which has full sudo rights (the user can run any command with sudo, without password prompt). The final step in the configuration is this:
open(NAGIOS_COMMAND_FILE, 'a').write(cmdline)
The NAGIOS_COMMAND_FILE is only writable by root, so this command should be run by root. I can think of two ways of achieving this (both unsatisfactory):
Run the whole script as root. I do not like doing this, since any error in my script would then run with full root rights.
Put the open(NAGIOS_COMMAND_FILE, 'a').write(cmdline) command in a separate script, and use the subprocess library to call that script, with sudo. I do not like creating an extra script just to run a single command.
I suppose there is no way of changing the running user just for a single command, in my current script, or am I wrong?
Why don't you give write permission on NAGIOS_COMMAND_FILE to your user, who has all the sudo rights anyway?
Never, ever run a web server as root or as a user with full sudo privileges. This isn't a pythonic thing, it is a "keep my server from being pwned" thing.
Look at os.seteuid, the "principle of least privilege", and man sudoers, and run your server as a regular user such as "httpd-server", where "httpd-server" has a sudoers entry that lets it write to NAGIOS_COMMAND_FILE and nothing more. And then be sure that what you write to the command file is as clean as you can make it.
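A sketch of that sudoers route (untested, and the paths are assumptions: tee is usually /usr/bin/tee on Debian-style systems, and the command-file path in the sudoers line is whatever NAGIOS_COMMAND_FILE actually is): let sudo run only tee -a on that one file and pipe the command line into it.
import subprocess
# NAGIOS_COMMAND_FILE and cmdline are the same values the existing script already builds
proc = subprocess.Popen(['sudo', '/usr/bin/tee', '-a', NAGIOS_COMMAND_FILE],
                        stdin=subprocess.PIPE)
proc.communicate(cmdline.encode())  # .encode() for Python 3; pass cmdline directly on Python 2
The matching sudoers entry would be along these lines:
httpd-server ALL=(root) NOPASSWD: /usr/bin/tee -a /usr/local/nagios/var/rw/nagios.cmd
This keeps the web process unprivileged while still letting it append to the file.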
It is actually possible to change the user for a single command.
Fabric provides a way to log in to a server as any user. It relies on ssh connections, I believe. So you could connect to localhost as a different user from your python script and execute the desired command.
http://docs.fabfile.org/en/1.4.3/api/core/decorators.html
Anyway, as others have already pointed out, it is best to give the user running the script permission for this one operation and avoid relying on root for execution.
I would agree with the post above: either give your user write permissions on the NAGIOS_COMMAND_FILE or add that user to a group that has those permissions, like nagcmd.
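For the group route, the commands would look roughly like this (the command-file path is only an example, use whatever NAGIOS_COMMAND_FILE points at, and the new group membership only applies after the user logs in again or the service is restarted):
sudo usermod -aG nagcmd httpd-server
sudo chgrp nagcmd /usr/local/nagios/var/rw/nagios.cmd
sudo chmod g+rw /usr/local/nagios/var/rw/nagios.cmd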
