Python script executes Kubernetes command: kubectl apply -f config_file.yaml

I have written a python script that opens a deployment config_file.yaml, modifies some parameters and saves it again, using pyyaml. This python script will be executed on the master node of a Kubernetes cluster.
Once the new file is generated, my intention is to execute
kubectl apply -f config_file.yaml
from the python script to apply the modifications to the deployment.
I have been reading how to do it with the kubernetes python client, but it does not seem to be prepared to execute kubectl apply.
So the other option is to create a bash script and execute it from the python script.
Bash script:
#!/bin/bash
sudo kubectl apply -f config_file.yaml
I give it execute permissions:
chmod +x shell_script.sh
Python script:
import subprocess
subprocess.call(['./shell_script.sh'])
But an error appears:
File "/usr/lib/python2.7/subprocess.py", line 1047, in _execute_child
raise child_exception
OSError: [Errno 13] Permission denied
I don't know how to resolve this error. I have tried giving permissions to the bash script, but nothing has worked.

I do not know much about Kubernetes, but I think I might be able to help.
I am basically suggesting that you run the command directly from the Python script, instead of having Python run a bash script which runs the command.
import os
command = 'kubectl apply -f config_file.yaml'
password = 'yourpassword'
# Pipe the password to sudo via its -S (read from stdin) flag
p = os.system('echo %s | sudo -S %s' % (password, command))
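If sudo is not actually required (or passwordless sudo is configured), a simpler sketch is to call kubectl directly with subprocess, avoiding both the intermediate bash script and echoing a password. This assumes kubectl is on the PATH and the current user can reach the cluster:
import subprocess
# Passing an argument list (no shell) avoids quoting problems
ret = subprocess.call(['kubectl', 'apply', '-f', 'config_file.yaml'])
if ret != 0:
    print('kubectl apply failed with exit code %d' % ret)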

If I understand correctly, you are using python to dynamically modify static yaml files. If that is the case, I would recommend using helm, which is perfect for making static yaml files dynamic :-)

How are you running the python script?
I think you are running the python script as a non-sudo user. Try running the python script as a sudo user; that way your subprocess will have access to that file.
If this fixes your issue, let me know.

Related

docker and python using symfony process

I am using Laradock and want to be able to run a python script from my laravel app using Symfony Process. From inside the root of my container I can run "python3 script_name.py arg1" and it runs just fine. pip list shows all the modules needed. When I run it from inside Laravel, it tells me:
"import pymysql ImportError: No module named 'pymysql'"
I have used a non-docker Laravel app to do this just fine, using:
$script = storage_path().'/app/script.py';
$process = new Process('python3 '. $script." ".session('division'));
What am I missing?
On *nix, make sure that PYTHONPATH is configured correctly for all users, or try setting the full path to python3.
How to check:
First, find out which user php runs as:
php -r "print shell_exec( 'whoami' );" // somebody
Then run the script as that user:
su somebody -c 'python3 script_name.py arg1'

Python Heroku allow pushed .exe to run - OSError: [Errno 13] Permission denied

I had to push an .exe file to heroku to be able to create invoice pdfs. It works locally without any problems, but on heroku I get an error:
OSError: [Errno 13] Permission denied
Probably because I am not allowed to execute .exe files. So I somehow need to create a rule that allows this file to execute.
I pushed wkhtmltopdf.exe to heroku and I access this file in my method to create a pdf:
MYDIR = os.path.dirname(__file__)
path_wkthmltopdf = os.path.join(MYDIR + "/static/executables/", "wkhtmltopdf.exe")
config = pdfkit.configuration(wkhtmltopdf=path_wkthmltopdf)
I have not been able to find a solution yet.
EDIT:
I tried giving permissions with chmod through the heroku bash and also adding a linux executable, but I still get the same error:
~/static/executables $ chmod a+x wkhtmltopdf-linux.exe
~ $ chmod a+x static/executables/wkhtmltopdf-linux.exe
Using sudo gave me:
bash: sudo: command not found
I'm not very familiar with heroku, but if you can somehow get access to a terminal in your application's environment (for example, ssh into your server), you need to change the permissions of that file so it can be executed. To do that, run this in that terminal:
sudo chmod a+x /path/to/file/FILENAME
Also, I'm pretty sure your app on Heroku runs on Linux, specifically Ubuntu, since that is the default (link).
That means there may be difficulties running Windows executables.
Okay, I managed to fix this with a buildpack. In addition, wkhtmltopdf-pack must be installed and added to requirements.txt.
Then you have to set a config var in heroku for the wkhtmltopdf executable, which will be generated from the files provided in the buildpack. Do not search for an .exe file:
heroku config:set WKHTMLTOPDF_BINARY=wkhtmltopdf-pack
You can also see all your config vars in the heroku dashboard under settings; you can create them there as well instead of using the CLI.
Then you have to tell the pdfkit configuration where to find the WKHTMLTOPDF_BINARY:
In my config.py:
import os
import subprocess
WKHTMLTOPDF_CMD = subprocess.Popen(
    ['which', os.environ.get('WKHTMLTOPDF_BINARY', 'wkhtmltopdf')],  # Note we default to 'wkhtmltopdf' as the binary name
    stdout=subprocess.PIPE).communicate()[0].strip()
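As a side note, the same lookup can be done without spawning a which process by using shutil.which; a sketch, assuming Python 3.3+ (shutil.which returns None when the binary is not on the PATH):
import os
import shutil
# Resolve the binary on the PATH without calling out to 'which'
WKHTMLTOPDF_CMD = shutil.which(os.environ.get('WKHTMLTOPDF_BINARY', 'wkhtmltopdf'))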
For the pdfkit configuration:
config = pdfkit.configuration(wkhtmltopdf=app.config['WKHTMLTOPDF_CMD'])
Now you should be able to create the pdf. Example:
the_pdf = pdfkit.from_string("something", False, configuration=config)
Credit to this tutorial:
https://artandlogic.com/2016/12/generating-pdfs-wkhtmltopdf-heroku/

manage.py command in crontab not working

I have created an executable .sh script which contains code to run a django management command.
cron.sh
#!/bin/sh
. /path/to/env/activate
cd /path/to/project
/path/to/env/bin/python manage.py some_command
I can confirm that the script and the manage.py command work by executing the script directly in a terminal:
$ /path/to/cron.sh
When I do the same via crontab, it is not working as expected.
What am I doing wrong? I can confirm there is nothing wrong with crontab; it is executing the cron.sh file, but /path/to/env/bin/python manage.py some_command is not working as expected.
The cron log also shows:
CRON[14768]: (root) CMD /path/to/cron.sh > /dev/null 2>&1
I am using the Bitnami Django AMI (Ubuntu 14.04.5 LTS).
Update
After removing /dev/null I am now getting this error:
"Cannot locate wrapped file"
It seems to be a PATH problem. I do not know if django uses specific paths that must be set, but AFAIK the crontab PATH is very limited for security reasons. Just to check whether that is the problem, run the following in a shell terminal:
echo $PATH
You will get a complete PATH, for instance:
/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/lib/jvm/default/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl
In your crontab, put it above your code:
PATH=/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/lib/jvm/default/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl
Tell me if this works. If it does, try to trim down the provided PATH, or even better, use absolute locations in your code.
I have to say that I don't know if you can perform a cd in cron like this. I have always used absolute paths or cd /some/dir && /path/to/script args.
P.S.: I cannot make comments yet; for this reason I put this in an answer.
The problem is that you're not using the script that Bitnami uses to load all the environment variables (/opt/bitnami/scripts/setenv.sh).
I would try using this script:
#!/bin/sh
. /opt/bitnami/scripts/setenv.sh
. /path/to/env/activate
cd /path/to/project
/path/to/env/bin/python manage.py some_command

Using Fabric in Python, trying to run tcsh commands on a remote bash server

Sorry if the title is confusing. I am running a remote python script over the local network using Fabric:
# Path to file being run
env.source = 'c/yadda/yadda'
env.file = 'run_tests.py'
env.set = 'source ~USERNAME/ENVIRONMENT'

# Create the task of changing the directory and running the test file from there
def link():
    print('Connecting to remote computer and setting environment...')
    run('%s' % env.set)
    run('cd %s && ./%s' % (env.source, env.file))
--- The env.set is used to bring the Python version up to a more recent one so that the file commands work (open with was causing problems).
THE PROBLEM is that env.set contains tcsh commands such as setenv which, when run in the Bash shell, give errors. Is there any way that I can write the above to incorporate tcsh commands?
Never mind, I solved it myself :)
By setting this at the top:
env.shell = '/bin/rbash -l -c'
I got around this environment issue.
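For anyone with a similar setup, note that each Fabric run() call starts a fresh shell on the remote end, so environment prepared in one run() does not persist to the next. A minimal sketch of an alternative (assuming Fabric 1.x, and that env.set works in the chosen shell) chains everything into a single run():
def link():
    print('Connecting to remote computer and setting environment...')
    # Chain setup and execution so they share one shell session
    run('%s && cd %s && ./%s' % (env.set, env.source, env.file))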

How to execute remote pysftp commands under a specific shell

I use the python module pysftp to connect to a remote server. Below you can see the python code:
import pysftp
import sys
import sqr_common

srv = pysftp.Connection(host="xxxxxx", username="xxxx",
                        password="xxxxx")
command = "/usr/bin/bash"
command2 = "APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that the command /usr/bin/bash starts an infinite process, so my script will never be executed. Can anyone tell me how to choose a shell (for example bash) on the remote server and execute my command in it? Is there any pysftp function that allows me to choose the shell?
Try this:
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
This problem is not specific to Python; it is more about how to execute commands under a specific shell.
If you need to run only a single command, you can use the bash -c switch:
bash -c "echo 123"
You can run multiple commands separated by ;:
bash -c "echo 123 ; echo 246"
If you need to run many commands under a specific shell, remotely create a shell script file (a .bash file) and execute it:
bash myscript.bash
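Applied to the pysftp code above, a minimal sketch (your_command is a hypothetical placeholder for whatever program needs APSHOME set; pysftp's Connection.execute runs a single command and returns its output):
import pysftp

# One bash -c invocation: the export and the command share the same shell
with pysftp.Connection(host="xxxxxx", username="xxxx", password="xxxxx") as srv:
    output = srv.execute('/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; your_command"')
    print(output)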
