Most pythonic way of running a single command with sudo rights - python

I have a Python script that performs some Nagios configuration. The script runs as a user that has full sudo rights (the user can run any command with sudo, without a password prompt). The final step in the configuration is this:
open(NAGIOS_COMMAND_FILE, 'a').write(cmdline)
The NAGIOS_COMMAND_FILE is only writable by root, so this command should be run by root. I can think of two ways of achieving this (both unsatisfactory):
Run the whole script as root. I do not like doing this, since any error in my script would then run with full root rights.
Put the open(NAGIOS_COMMAND_FILE, 'a').write(cmdline) command in a separate script, and use the subprocess library to call that script, with sudo. I do not like creating an extra script just to run a single command.
I suppose there is no way of changing the running user just for a single command, in my current script, or am I wrong?

Why don't you give write permission on NAGIOS_COMMAND_FILE to the user who has all the sudo rights?

Never, ever run a web server as root or as a user with full sudo privileges. This isn't a pythonic thing, it is a "keep my server from being pwned" thing.
Look at os.seteuid, the "principle of least privilege", and man sudoers, and run your server as a regular "httpd-server" user, where "httpd-server" has sudoer permission to write to NAGIOS_COMMAND_FILE. And then be sure that what you write to the command file is as clean as you can make it.
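One way to act on that advice without the separate helper script from option 2 (a sketch, not a drop-in solution; the file path and the sudoers rule are assumptions) is to let the unprivileged user run a single whitelisted command, such as tee -a on the command file, and feed it the line on stdin:
import subprocess

NAGIOS_COMMAND_FILE = "/usr/local/nagios/var/rw/nagios.cmd"  # assumed path, adjust to your install

def append_as_root(line):
    # Append one line to the root-owned command file without a helper script.
    # Assumes a sudoers rule along the lines of:
    #   httpd-server ALL=(root) NOPASSWD: /usr/bin/tee -a /usr/local/nagios/var/rw/nagios.cmd
    subprocess.run(
        ["sudo", "-n", "tee", "-a", NAGIOS_COMMAND_FILE],
        input=line.encode(),
        stdout=subprocess.DEVNULL,
        check=True,
    )

append_as_root("example external command\n")
The -n flag makes sudo fail immediately instead of prompting, so a misconfigured sudoers entry shows up as a CalledProcessError rather than a hung process.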

It is actually possible to change user for a single command.
Fabric provides a way to log in as any user to a server. It relies on ssh connections I believe. So you could connect to localhost with a different user in your python script and execute the desired command.
http://docs.fabfile.org/en/1.4.3/api/core/decorators.html
Anyway, as others have already pointed out, it is best to give the user running the script permission to execute this one command and avoid relying on root for execution.
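For completeness, a minimal sketch of the Fabric 1.x approach described above (fabric.api exists only in Fabric 1.x; the host, user, and passwordless sudo rule are assumptions):
from fabric.api import env, sudo  # Fabric 1.x API, as in the docs linked above

env.host_string = "localhost"     # SSH back into the same machine
env.user = "nagiosadmin"          # hypothetical account with the required sudo rights

# Run just the one privileged command; the rest of the script stays unprivileged.
sudo('echo "example external command" >> /usr/local/nagios/var/rw/nagios.cmd')
Fabric's sudo() runs the whole string under sudo's shell, so the redirection also happens as root.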

I would agree with the posts above: either give your user write permissions on the NAGIOS_COMMAND_FILE, or add that user to a group that has those permissions, like nagcmd.

Related

How do I execute ssh via Robot Framework's Process library

I'm at a loss for what to think, here. There must be something I'm missing since I'm able to execute this perfectly fine in another, similar context, but not here. I am trying to run the following console command via Robot Framework for the sake of testing my system's reaction to a network outage: ssh it#128.0.0.1 sudo ifconfig ens192 down (the user & host ip are modified for posting here.)
To accomplish this, I'm using the following line of code:
Run Process 'ssh it#128.0.0.1 sudo ifconfig ens192 down' shell=True stdout=stdout stderr=stderr
Now, I've executed this myself in console and saw that the result was an inability to ping the ip, which is what I want. So I know for certain that the command itself isn't a problem. What is a problem is the fact that I get the following result: /bin/sh: ssh it#128.0.0.1 sudo ifconfig ens192 down: command not found
That's fine, though, because I believe this is a simple issue of the ssh instance not having the it user's $PATH. For reasons described below, I'm going to guess its $PATH variable is empty. Thus, I modified the keyword call as such:
Run Process 'ssh it#128.0.0.1 PATH\=/usr/sbin sudo ifconfig ens192 down' shell=True stdout=stdout stderr=stderr
The result of this command is as follows: /bin/sh: ssh it#128.0.0.1 PATH=/usr/sbin sudo ifconfig ens192 down: No such file or directory, which I believe to be referring to the items in the PATH attribute. I've tried several different things to determine what the problem could be, such as why it would be unable to find these files in an instance where, executed manually in the console, these directories are indeed present.
Given that I have less than a year's worth of experience with Robot Framework, I'm certain that there's just something I'm not understanding about the Process library. Could I have an explanation as to what I am missing for this execution and, if possible, an explanation as to how I should be executing an ssh command via Robot Framework?
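I can't say for certain that this is what is happening here, but the shape of that first error is telling: /bin/sh only reports the whole string as one "command not found" when it has parsed it as a single word, which is exactly what literal quote characters cause (Robot Framework passes quotes through to the shell rather than stripping them). A plain-subprocess sketch of the same symptom:
import subprocess

# The literal quotes make /bin/sh look for a program whose *name* is the whole string:
subprocess.run("'ssh it#128.0.0.1 sudo ifconfig ens192 down'", shell=True)
# -> /bin/sh: ssh it#128.0.0.1 sudo ifconfig ens192 down: command not found

# Without the embedded quotes the shell splits the words normally and finds ssh:
# subprocess.run("ssh it#128.0.0.1 sudo ifconfig ens192 down", shell=True)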
A few extra things worth pointing out:
The it user is brand new, having been created via useradd it and copying the home directory contents into its new space.
I have already added an entry into the device's sudoers file to allow the it user to execute ifconfig without requiring password entry.
On that note, I've sent the executing user's ssh keys over to the device such that it does not require a password to log into it as that user.
I have attempted to execute several basic commands. I have intermittently gotten ls to work but have been unable to get a greater sense of what $PATH the Robot Framework ssh instance has because echo doesn't seem to work.

bash script that is run from Python reaches sudo timeout

This is a long bash script (400+ lines) that is originally invoked from a Django app like so -
os.system('./bash_script.sh &> bash_log.log')
It stops on a random command in the script. If the order of commands is changed, it hangs on another command in approx. the same location.
SSHing to the machine that runs the Django app and running sudo ./bash_script.sh manually asks for a password and then runs all the way through.
I can't see the message it presents when it hangs, as I couldn't make it redirect to the log file. I assume it's a sudo password request.
Tried -
sudo -v in the script - didn't help.
ssh to the machine and manually extend the sudo timeout in /etc/sudoers - didn't help, I think because the Django app is already running and uses the previous timeout.
splitting the script in two and running one part in a separate thread, like so -
from subprocess import Popen
from threading import Thread
import os

def basher(command, log_path):
    # Write the command's output to its own log file and wait for it to finish.
    with open(log_path, 'w') as log:
        Popen(command, stdout=log, stderr=log).wait()

script_thread = Thread(target=basher, args=('./bash_script_pt1.sh', 'bash_log_pt1.log'))
script_thread.start()
os.system('./bash_script_pt2.sh &> bash_log_pt2.log')  # I know it's deprecated, not sure if maybe it's better in this case
script_thread.join()
The logs showed that part 1 ended ok, but part 2 still hangs, albeit later in the code than when they were together.
I thought about editing /etc/sudoers from inside the Python code and then re-logging in via su - user. There are snippets around showing how to pass the password using pty, but I don't understand the mechanics of it and could not get it to work.
I also noted that ps aux | grep bash_script.sh shows that the script is being run twice. As -
/bin/bash bash_script.sh
and as
sh -c bash_script.sh.
I assume os.system has an internal shell=True going on.
I don't understand the Linux entities/mechanics in play to figure out what's happening.
My guess is that the Django app has different, more limited permissions than the script itself does, and the script inherits those restrictions because it is being executed by the app.
You need to find out what permissions the script has when you run it just from bash, and what it has when you run it via django, and then figure out what the difference is.
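One way to do that comparison from inside the Django process itself is a small identity dump right before the os.system call, followed by the same snippet run from your interactive shell; a minimal sketch (where you send the output is up to you):
import grp
import os
import pwd

def report_identity(tag):
    # Print who this process really is and what PATH it sees.
    uid, euid = os.getuid(), os.geteuid()
    user = pwd.getpwuid(uid).pw_name
    groups = [grp.getgrgid(g).gr_name for g in os.getgroups()]
    print("[%s] uid=%s (%s) euid=%s groups=%s PATH=%s"
          % (tag, uid, user, euid, groups, os.environ.get("PATH")))

report_identity("before bash_script.sh")
Differences in the effective user, group membership, or PATH between the two runs are the usual culprits when a script behaves differently under a web app.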

"sudo" operations from python daemon

I am writing a python administrative daemon on linux that needs to start/stop other services. Following the principle of least privilege, I want to run this normally with regular user privileges but when it needs to start/stop other services, I want it to become root. Essentially I want to do what sudo would do from the command line. I cannot directly exec sudo from the daemon because it has no tty. I want to avoid running the daemon as root when it does not need to run as root. Is there any way to do this from python without needing to use sudo?
Thank you in advance.
Ranga.
In my case I had a Flask back end that needed to do something privileged. Rather than use sudo, I broke it up into two back ends - one unprivileged, plus a small privileged piece.
It is also possible to run sudo in a pty but I decided against this approach as it does indeed have a security flaw.
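To make the split concrete, here is a minimal sketch of the small privileged piece (the socket path, the allow-list, and the line protocol are all assumptions, not a finished design): it runs as root, listens on a Unix socket, and refuses anything outside a fixed vocabulary, while the main daemon stays unprivileged.
# privileged_helper.py - run only this small piece as root (e.g. from its own systemd unit)
import os
import socket
import subprocess

SOCKET_PATH = "/run/admin-helper.sock"       # hypothetical path
ALLOWED_SERVICES = {"nginx", "postgresql"}   # hypothetical allow-list

def main():
    if os.path.exists(SOCKET_PATH):
        os.unlink(SOCKET_PATH)
    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    server.bind(SOCKET_PATH)
    os.chmod(SOCKET_PATH, 0o660)             # limit who may talk to the helper
    server.listen(1)
    while True:
        conn, _ = server.accept()
        with conn:
            request = conn.recv(1024).decode().strip()   # e.g. "start nginx"
            parts = request.split()
            if len(parts) == 2 and parts[0] in ("start", "stop") and parts[1] in ALLOWED_SERVICES:
                subprocess.run(["systemctl", parts[0], parts[1]], check=False)
                conn.sendall(b"ok\n")
            else:
                conn.sendall(b"refused\n")

if __name__ == "__main__":
    main()
The unprivileged daemon just connects to the socket and sends, for example, b"start nginx"; the point is that the code path running as root is tiny and only understands a fixed set of requests.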

Running python cron script as non-root user

I have a small problem running a python script as a specific user account in my CentOS 6 box.
My cron.d/cronfile looks like this:
5 17 * * * reports /usr/local/bin/report.py > /var/log/report.log 2>&1
The account reports exists and all the files that are to be accessed by that script are chowned and chgrped to reports. The python script is chmod a+r. The python script starts with a #!/usr/bin/env python.
But this is not the problem. The problem is that I see nothing in the logfile. The python script doesn't even start to run! Any ideas why this might be?
If I change the user to root instead of reports in the cronfile, it runs fine. However I cannot run it as root in production servers.
If you have any questions please ask :)
/e:
If I do sudo -u reports python report.py it works fine.
Cron jobs run with the permissions of the user that the cron job was setup under.
I.E. Whatever is in the cron table of the reports user, will be run as the reports user.
If you're having to use sudo to get the script to run when logged in as reports, then the script likely won't run as a cron job either. Can you run this script when logged in as reports without sudo? If not, then the cron job can't either. Make sense?
Check your logs - are you getting permissions errors?
There are a myriad of reasons why your script would need certain privs, but an easy way to fix this is to set the cron job up under root instead of reports. The longer way is to see what exactly is requiring elevated permissions and fix that. Is it file permissions? A protected command? Maybe adding reports to certain groups would allow you to run it under reports instead of root.
*Be ULTRA careful if/when you set up cron jobs as root.
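When a cron job seems to never even start, it helps to make the script leave a trace before it does anything else; a minimal sketch you could paste at the top of report.py (the log path is arbitrary):
#!/usr/bin/env python
# Record who and where we are, so a silent cron failure still leaves evidence.
import datetime
import getpass
import os

with open("/tmp/report_env.log", "a") as log:
    log.write("%s user=%s cwd=%s PATH=%s\n" % (
        datetime.datetime.now(), getpass.getuser(),
        os.getcwd(), os.environ.get("PATH", "")))
If that line never appears, the problem is upstream of the script (the cron.d entry, the user field, a missing execute bit); if it does appear, the recorded user and PATH usually show how the cron environment differs from your interactive shell.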

How to gain root privileges in python via a graphical sudo?

Part of my python program needs administrator access. How can I gain root privileges using a GUI popup similar to the gksudo command?
I only need root privileges for a small part of my program so it would be preferable to only have the privileges for a particular function.
I'm hoping to be able to do something like:
gksudo(my_func, 'description of why password is needed')
gksudo can be used to launch programs running with administrator privileges. The part of your application that needs to run as root, must be able to be invoked as a separate process from the command line. If you need some form of communication between the two, you could use sockets or watch for a file, etc.
You have two options here:
You will need to make the part of the program that requires root privileges a separate file and then execute the file like this:
>>> import subprocess
>>> subprocess.call(['gksudo','python that_file.py'])
which will bring up the password prompt and run that_file.py as root
You could also require that the program be run as root from the start and just have the user type "gksudo python your_program.py" on the command line - which is obviously not the best idea if your program is normally launched from a menu.
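A variation on option 1 that keeps everything in one file (a sketch, not a complete recipe): have the script re-invoke itself behind gksudo for just the privileged entry point.
import os
import subprocess
import sys

def privileged_part():
    # The small piece of the program that genuinely needs root.
    print("running with euid", os.geteuid())

if __name__ == "__main__":
    if "--privileged" in sys.argv:
        privileged_part()
    elif os.geteuid() != 0:
        # Re-run only this entry point behind the graphical password prompt;
        # the rest of the program keeps running unprivileged afterwards.
        subprocess.call(["gksudo",
                         "%s %s --privileged" % (sys.executable, os.path.abspath(__file__))])
gksudo also accepts a message option (-m on most builds) that maps onto the "description of why password is needed" part of the pseudo-code above.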
