sudo/suid non-root nesting fails - python

I have a python script (which must be called as root), which calls a bash script (which must be called as non-root), which sometimes needs to call sudo. This does not work - the "leaf" sudo calls give the message "$user is not in the sudoers file. This incident will be reported." How can I make this work?
The code (insert your non-root username in place of "your_username_here"):
tezt.py:
#!/usr/bin/python3
import os
import pwd
import subprocess

def run_subshell_as_user(cmd_args, user_name=None, **kwargs):
    cwd = os.getcwd()
    user_obj = pwd.getpwnam(user_name)
    # Set up the child process environment
    new_env = os.environ.copy()
    new_env["PWD"] = cwd
    if user_name is not None:
        new_env["HOME"] = user_obj.pw_dir
        new_env["LOGNAME"] = user_name
        new_env["USER"] = user_name
    # This function is run after the fork and before the exec in the child
    def suid_func():
        os.setgid(user_obj.pw_gid)
        os.setuid(user_obj.pw_uid)
    return subprocess.Popen(
        cmd_args,
        preexec_fn=suid_func,
        cwd=cwd,
        env=new_env,
        **kwargs).wait() == 0

run_subshell_as_user(["./tezt"], "your_username_here")  # <-- HERE
tezt:
#!/bin/bash
sudo ls -la /root
Then run it as:
sudo ./tezt.py
Does anyone know why this doesn't work? The user can run sudo under normal circumstances. Why does "user -sudo-> root -suid-> user" work fine, but then when you try to sudo from there it fails?

I'd suggest using sudo itself to drop privileges rather than doing so by hand -- where possible it's more thorough, modifying the effective ids as well as the real uid and gid. (If you do want to modify the full set yourself, try changing setuid() to setreuid(), and likewise setgid() to setregid().)
...this would mean passing something to Popen akin to the following:
["sudo", "-u", "your_username_here", "--"] + cmd_args

Related

python subprocess popen execute as different user

I am trying to execute a command in Python 3.6 as a different user with Popen from subprocess, but it still executes as the user who called the script (I plan to call it as root). I am using threads, so it is important that I don't violate user rights when two threads execute in parallel.
proc = subprocess.Popen(['echo $USER; touch myFile.txt'],
                        shell=True,
                        env={'FOO': 'bar', 'USER': 'www-data'},
                        stdout=subprocess.PIPE)
The example above still creates myFile.txt owned by my own user id, 1000.
I tried different approaches:
- copying os.environ and changing the user, as described in "Run child processes as different user from a long running Python process" (note that answer targets Python 2)
- using start_new_session=True, as described in https://docs.python.org/3.6/library/subprocess.html#popen-constructor
My last option is to prefix the command with sudo -u username, but I don't think that is the elegant way.
The standard way (POSIX only) would be to use preexec_fn to set the gid and uid, as described in more detail in this answer.
Something like this should do the trick -- for completeness I've also modified your initial snippet to include the other environment variables you'd likely want to set, but just setting preexec_fn should be sufficient for the simple command you are running:
import os, pwd, subprocess

def demote(user_uid, user_gid):
    def result():
        os.setgid(user_gid)
        os.setuid(user_uid)
    return result

def exec_cmd(username):
    # get user info from username
    pw_record = pwd.getpwnam(username)
    homedir = pw_record.pw_dir
    user_uid = pw_record.pw_uid
    user_gid = pw_record.pw_gid
    env = os.environ.copy()
    env.update({'HOME': homedir, 'LOGNAME': username, 'PWD': os.getcwd(),
                'FOO': 'bar', 'USER': username})
    # execute the command
    proc = subprocess.Popen(['echo $USER; touch myFile.txt'],
                            shell=True,
                            env=env,
                            preexec_fn=demote(user_uid, user_gid),
                            stdout=subprocess.PIPE)
    proc.wait()

exec_cmd('www-data')
Note that you'll also need to make sure the current working directory is accessible (e.g. writable) by the demoted user, since it is not overridden explicitly.
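On Python 3.9 and later, Popen grew `user`, `group`, and `extra_groups` parameters that perform the same setgid/setuid dance internally, so the hand-rolled preexec_fn is no longer needed. A sketch (still requires starting as root, like the preexec_fn approach):

```python
import pwd
import subprocess

def run_as(username, argv):
    """Run argv demoted to `username` (Python 3.9+, must start as root)."""
    rec = pwd.getpwnam(username)
    return subprocess.run(
        argv,
        user=rec.pw_uid,
        group=rec.pw_gid,       # primary group from the passwd entry
        extra_groups=[],        # drop supplementary groups as well
        capture_output=True,
        text=True,
    )
```

This also avoids the documented thread-safety caveat of preexec_fn, which matters for the parallel-threads requirement in the question.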

sudo pass automatic password in Python

I want to call a .sh file from a Python script. This requires sudo permissions, and I want to pass the password automatically without getting a prompt. I tried using subprocess.
(VAR1 is the variable I want to pass; permissions.sh is the sh file I want to call from the Python script.)
process = subprocess.Popen(['sudo', './permissions.sh', VAR1], stdin = subprocess.PIPE, stdout = subprocess.PIPE)
process.communicate(password)
Then I tried using pexpect
child = pexpect.spawn('sudo ./permissions.sh'+VAR1)
child.sendline(password)
In both cases it still prompts for the password on the terminal. I want to pass the password automatically. I do not want to use the os module. How can this be done?
I would use pexpect, but you need to tell it what to expect after the sudo, like so:
#import the pexpect module
import pexpect
# here you issue the command with "sudo"
child = pexpect.spawn('sudo /usr/sbin/lsof')
# it will prompt something like: "[sudo] password for < generic_user >:"
# you "expect" to receive a string containing keyword "password"
child.expect('password')
# if it's found, send the password
child.sendline('S3crEt.P4Ss')
# read the output
print(child.read())
# the end
# use python3 for pexpect module e.g python3 myscript.py
import pexpect
# command with "sudo"
child = pexpect.spawn('sudo rm -f')
# it will prompt a line like "abhi#192.168.0.61's password:"
# as the word 'password' appears in the line pass it as argument to expect
child.expect('password')
# enter the password
child.sendline('mypassword')
# must be there
child.interact()
# output
print(child.read())

how to type sudo password when using subprocess.call?

I defined a function that switches my proxy settings every now and then.
The problem is that I want it to run in a loop without manual intervention. When I execute the program with sudo it runs smoothly the first time, but the second time it asks me for my sudo password. Here is the bit of code:
def ProxySetting(Proxy):
    print "ProxyStetting(Proxy)"
    call("networksetup -setwebproxy 'Wi-Fi' %s" "on" % Proxy, shell=True)
    call("networksetup -setsecurewebproxy 'Wi-Fi' %s" "on" % Proxy, shell=True)
    call("networksetup -setftpproxy 'Wi-Fi' %s" "on" % Proxy, shell=True)
I could use threading, but I'm sure there is a way of doing it that won't cause problems. How can I hard-code my sudo password so that it is supplied at the beginning of the function?
Here is how you can execute a sudo command without an interactive prompt asking you to type your password:
from subprocess import call
pwd='my password'
cmd='ls'
call('echo {} | sudo -S {}'.format(pwd, cmd), shell=True)
Another method of passing your password to a shell command through Python, which won't show it in command history or ps output, is:
p = subprocess.Popen(['sudo', self.resubscribe_script], stdin=subprocess.PIPE)
p.communicate('{}\n'.format(self.sudo_password))
Note that communicate only allows one input to be given to stdin (and in Python 3 the argument must be bytes unless the pipe was opened in text mode); there are other methods for getting a reusable input.
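Combining the two answers above into one sketch (assumes sudo is installed; the helper names are illustrative): `-S` makes sudo read the password from stdin instead of the TTY, and `-p ''` suppresses the prompt text.

```python
import subprocess

def sudo_argv(cmd_args):
    # -S: read password from stdin; -p '': suppress the prompt string
    return ["sudo", "-S", "-p", ""] + list(cmd_args)

def run_with_sudo(cmd_args, password):
    proc = subprocess.Popen(sudo_argv(cmd_args),
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            text=True)
    # With text=True, communicate() accepts a str; without it, a bare
    # str raises TypeError in Python 3 -- you would pass bytes instead.
    out, _ = proc.communicate(password + "\n")
    return proc.returncode, out
```

Unlike the echo-pipe variant, this keeps the password out of the shell command line entirely.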

permanently change directory python scripting/what environment do python scripts run in?

I have a small git_cloner script that clones my company's projects correctly. In all my scripts, I use a function that hasn't given me problems yet:
def call_sp(command, **arg_list):
    p = subprocess.Popen(command, shell=True, **arg_list)
    p.communicate()
At the end of this individual script, I use:
call_sp('cd {}'.format(branch_path))
This line does not change the directory of the terminal I ran my script in to branch_path; even worse, it annoyingly asks me for my password! When I remove the cd line above, the script no longer demands a password before completing. I wonder:
1. How are these python scripts actually running? Since the cd command had no permanent effect, I assume the script spawns its own private subprocess, separate from what the terminal is doing, and then kills itself when the script finishes?
2. Based on how #1 works, how do I force my scripts to change the terminal's directory permanently, to save me time?
3. Why would merely running a change of directory ask me for my password?
The full script is below, thank you,
Cody
#!/usr/bin/env python
import subprocess
import sys
import time
from os.path import expanduser

home_path = expanduser('~')
project_path = home_path + '/projects'
d = {'cwd': ''}

# calling from script:
# ./git_cloner.py projectname branchname
# to make a new branch say ./git_cloner.py project branchname
# interactive:
# just run ./git_cloner.py
if len(sys.argv) == 3:
    project = sys.argv[1]
    branch = sys.argv[2]
if len(sys.argv) < 3:
    while True:
        project = raw_input('Enter a project name (i.e., mainworkproject):\n')
        if not project:
            continue
        break
    while True:
        branch = raw_input('Enter a branch name (i.e., dev):\n')
        if not branch:
            continue
        break

def call_sp(command, **arg_list):
    p = subprocess.Popen(command, shell=True, **arg_list)
    p.communicate()

print "making new branch \"%s\" in project \"%s\"" % (branch, project)
this_project_path = '%s/%s' % (project_path, project)
branch_path = '%s/%s' % (this_project_path, branch)
d['cwd'] = project_path
call_sp('mkdir %s' % branch, **d)
d['cwd'] = branch_path
git_string = 'git clone ssh://git@git/home/git/repos/{}.git {}'.format(project, d['cwd'])
# see what you're doing to maybe need to cancel
print '\n'
print "{}\n\n".format(git_string)
call_sp(git_string)
time.sleep(30)
call_sp('git checkout dev', **d)
time.sleep(2)
call_sp('git checkout -b {}'.format(branch), **d)
time.sleep(5)
# ...then I make some symlinks, which work
call_sp('cp {}/dev/settings.py {}/settings.py'.format(project_path, branch_path))
print 'dont forget "git push -u origin {}"'.format(branch)
call_sp('cd {}'.format(branch_path))
You cannot use Popen to change the current directory of the running script. Popen will create a new process with its own environment. If you do a cd within that, it will change directory for that running process, which will then immediately exit.
If you want to change the directory for the script you could use os.chdir(path), then all subsequent commands in the script will be run from that new path.
Child processes cannot alter the environment of their parents though, so you can't have a process you create change the environment of the caller.
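A minimal demonstration of the difference (using a temporary directory so it is side-effect free):

```python
import os
import subprocess
import tempfile

start = os.getcwd()
target = os.path.realpath(tempfile.mkdtemp())

# "cd" run in a child shell changes only that child's directory;
# the child exits immediately and our cwd is untouched.
subprocess.call("cd {}".format(target), shell=True)
assert os.getcwd() == start

# os.chdir changes the cwd of this Python process itself, so every
# subsequent subprocess it spawns inherits the new directory.
os.chdir(target)
assert os.getcwd() == target
```

The same asymmetry is why the script cannot change the invoking terminal's directory: the terminal's shell is the parent, and the script is just another child.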

subprocess.popen seems to fail when run from crontab

I'm running a script from crontab that just sshes to a node, runs a command, and stores the results in a file.
The call that seems to be failing is subprocess.Popen.
Here is the python function:
def _executeSSHCommand(sshcommand, user, node):
    '''
    Simple function to execute an ssh command on a remote node.
    '''
    sshunixcmd = '/usr/bin/ssh %s@%s \'%s\'' % (user, node, sshcommand)
    process = subprocess.Popen([sshunixcmd],
                               shell=True,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE)
    process.wait()
    result = process.stdout.readlines()
    return result
When it's run from the command line it executes correctly, but from cron it seems to fail with the error message below.
Here are the crontab entries:
02 * * * * /home/matt/scripts/check-diskspace.py >> /home/matt/logs/disklog.log
Here are the errors:
Sep 23 17:02:01 timmy CRON[13387]: (matt) CMD (/home/matt/scripts/check-diskspace.py >> /home/matt/logs/disklog.log)
Sep 23 17:02:01 timmy CRON[13386]: (CRON) error (grandchild #13387 failed with exit status 2)
I'm going blind trying to find exactly where I have gone so wrong. Any ideas?
The cron PATH is very limited. You should either use the absolute path to your ssh (/usr/bin/ssh) or set PATH in the first line of your crontab.
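For example, a crontab that sets PATH explicitly before the job line (the paths here are illustrative; list whatever directories your script's commands live in):

```shell
PATH=/usr/local/bin:/usr/bin:/bin
02 * * * * /home/matt/scripts/check-diskspace.py >> /home/matt/logs/disklog.log
```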
You probably need to pass ssh the -i argument to tell ssh to use a specific key file. The problem is that your environment is not set up to tell ssh which key to use.
The fact that you're using python here is a bit of a red herring.
For everything ssh-related in python, you might consider using paramiko. Using it, the following code should do what you want.
import paramiko
client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect(node, username=user)
stdout = client.exec_command(ssh_command)[1]  # exec_command returns (stdin, stdout, stderr)
return stdout.readlines()
When running python scripts from cron, the environment PATH can be a hangup, as user1652558 points out.
To expand on this answer with example code to add custom PATH values to the environment for a subprocess call:
import os
import subprocess

# whatever user PATH values you need
my_path = "/some/custom/path1:/some/custom/path2"

# append the custom values to the current PATH settings
my_env = os.environ.copy()
my_env["PATH"] = my_path + ":" + my_env["PATH"]

# subprocess call
resp = subprocess.check_output([cmd], env=my_env, shell=True)
