I am struggling to launch my fabfile from within my Python script. I have looked at similar posts on Stack Overflow, but they don't solve my problem... or maybe they do and I'm just not understanding them. Not sure.
My script writes to the fab file depending on what the user wants to run on the remote host. Here is an example of the fabfile:
[root@ip-50-50-50-50 bakery]# cat fabfile.py
from fabric.api import run
def deploy():
    run('wget -P /tmp https://s3.amazonaws.com/MyBucket/httpd-2.2.26-1.1.amzn1.x86_64.rpm')
    run('sudo yum localinstall /tmp/httpd-2.2.26-1.1.amzn1.x86_64.rpm')
I then need to run the fabfile from my script. If I run the following manually from the command line, it works fine:
fab -f fabfile.py -u ec2-user -i id_rsa -H 10.10.15.150 deploy
1) How do I run that from inside my script with all of the options?
2) The IP address is a variable called "bakery_internalip". How do I call that variable as part of the fab line?
Try with subprocess:
import subprocess
subprocess.call(['fab', '-f', 'fabfile.py', '-u', 'ec2-user', '-i', 'id_rsa', '-H', bakery_internalip, 'deploy'])
should do the trick
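If you also want the script to fail loudly when fab exits with a non-zero status, here is a minimal sketch (bakery_internalip is assumed to be set earlier by your script; the value below is only a placeholder):

import subprocess

bakery_internalip = "10.10.15.150"  # placeholder: set earlier by your script

cmd = ['fab', '-f', 'fabfile.py',
       '-u', 'ec2-user', '-i', 'id_rsa',
       '-H', bakery_internalip, 'deploy']

# check_call raises CalledProcessError if fab returns a non-zero exit code
subprocess.check_call(cmd)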
You can call Fabric-enabled tasks directly from code. Usually you will have to set the env dictionary first to specify the key and hosts, but it is very straightforward:
# the settings for the env dict
from fabric.api import env, execute
env.hosts = ["10.10.15.150", ]
env.user = "ec2-user"
env.key_filename = "id_rsa"
# and call the function itself
from fabfile import deploy
execute(deploy, hosts=env.hosts)
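To wire in the dynamic IP from the question, something like this should work (a sketch; bakery_internalip is assumed to be set elsewhere in your script, and key_filename points at the same id_rsa used on the command line):

from fabric.api import env, execute
from fabfile import deploy

bakery_internalip = "10.10.15.150"  # placeholder: provided by your script
env.user = "ec2-user"
env.key_filename = "id_rsa"

# pass the dynamic host list straight to execute()
execute(deploy, hosts=[bakery_internalip])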
You can find more inspiration in the fabric documentation:
http://docs.fabfile.org/en/1.11/usage/execution.html#using-execute-with-dynamically-set-host-lists
I have a Python script that I run with the following command:
python3 scan.py --api_token 5563ff177863e97a70a45dd4 --base_api_url http://101.102.34.66:4242/scanjob/ --base_report_url http://101.102.33.66:4242/ --job_id 42
This works perfectly when I run it on the command line.
In my Dockerfile, I have tried ARG and ENV; neither seems to work.
#ARG api_token
#ARG username
#ARG password
# Configure AWS arguments
#RUN aws configure set aws_access_key_id $AWS_KEY \
# && aws configure set aws_secret_access_key $AWS_SECRET_KEY \
# && aws configure set default.region $AWS_REGION
### copy bash script and change permission
RUN mkdir workspace
COPY scan-api.sh /workspace
RUN chmod +x /workspace/scan-api.py
CMD ["/python3", "/workspace/scan-api.py"]
So how do I define these flagged arguments in the Dockerfile?
And what command should I use when running the image?
Since you want to override the values at run time, you can do this in two ways:
As arguments to the docker run command
As environment variables passed to the docker run command
The first is the simplest and does not require any change to the Dockerfile:
docker run --rm my_image python3 /workspace/scan-api.py --bar tet --api_token 5563ff177863e97a70a45dd4 --base_api_url http://101.102.34.66:4242/scanjob/ --base_report_url http://101.102.33.66:4242/ --job_id 42
and here is my simple script, which just prints the arguments it receives:
import sys

print("All args:", sys.argv[1:])
Using ENV, you will need to change the Dockerfile.
I'll show the approach for one argument; you can do the same for all of them:
FROM python:3.7-alpine3.9
ENV API_TOKEN=default_token
CMD ["sh", "-c", "python /workspace/scan-api.py $API_TOKEN"]
This way you can override the value at run time, or fall back to the default if you don't:
docker run -it --rm -e API_TOKEN=new_token my_image
CMD takes exactly the same arguments you used from the command line.
CMD ["/python3", "scan.py", "--api_token", "5563ff177863e97a70a45dd4", "--base_api_url", "http://101.102.34.66:4242/scanjob/", "--base_report_url", "http://101.102.33.66:4242/", "--job_id", "42"]
It's confusing.
You will need to use the shell form of ENTRYPOINT (or CMD) in order to get environment variable substitution, e.g.
ENTRYPOINT python3 /workspace/scan-api.py --api_token "${TOKEN}" ...
And then run the container using something of the form:
docker run --interactive --tty --env=TOKEN=${TOKEN} ...
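Put together, a minimal Dockerfile sketch along these lines (the base image, file paths, and default token value are assumptions, not taken from the question):

FROM python:3.7-alpine3.9
COPY scan-api.py /workspace/scan-api.py
ENV TOKEN=default_token
# shell form, so ${TOKEN} is expanded by the shell when the container starts
ENTRYPOINT python3 /workspace/scan-api.py --api_token "${TOKEN}"

You would then run it with something like docker run --rm --env=TOKEN=5563ff177863e97a70a45dd4 my_image.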
HTH!
I'm trying to run a shell script from Python. The script, run.sh, contains:
mongod --config /opt/mongodb/mongod.conf
and I call it from Python like this:
subprocess.call(['bash', 'run.sh'])
but it fails with mongod: not found.
When I run it in the terminal it works.
How can I fix this?
You don't need to use bash. Just run it as a normal script, the same way you do in the terminal:
import subprocess
subprocess.call(['./run.sh'])
Also, it seems that mongod is not on your system PATH, so you need to use the absolute path to mongod in your run.sh:
#!/bin/bash
/opt/mongodb-linux-x86_64-ubuntu1404-3.0.6/bin/mongod --config /opt/mongodb/mongod.conf
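If you'd rather not hard-code the path inside run.sh, an alternative sketch (not part of the answer above; the MongoDB install directory is assumed from the example path) is to extend PATH for the subprocess from Python:

import os
import subprocess

# prepend the (assumed) MongoDB bin directory so run.sh can find mongod
env = dict(os.environ)
env["PATH"] = "/opt/mongodb-linux-x86_64-ubuntu1404-3.0.6/bin:" + env["PATH"]

subprocess.call(["./run.sh"], env=env)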
Try:
import os
os.system('bash run.sh')
and update the command in run.sh to:
#!/bin/sh
/usr/bin/mongod --quiet --config /opt/mongodb/mongod.conf
Overview
I'm trying to use python fabric to run an ssh command as root on a remote server.
The command: nohup ./foo &
foo is expected to run for several days. I must be able to disassociate foo from fabric's remote ssh session, and put foo in the background.
The Fabric FAQ says you should use something like screen or tmux when you run your fabric script (which runs the backgrounded command). I tried that, but my fabric script still hung. foo is not hanging.
Question
How do I use fabric to run this command on a remote server without the script hanging: nohup ./foo &
Details
This is my script:
#!/bin/sh
# Credit: https://unix.stackexchange.com/a/20895/6766
if "true" : '''\'
then
exec "/nfs/it/network_python/$OSREL/bin/python" "$0" "$#"
exit 127
fi
'''
from getpass import getpass
import os
from fabric import Connection, Config
assert os.geteuid()==0, "ERROR: Must run as root"
for host in ['host1.foo.local', 'host2.foo.local']:
    # Make an ssh connection to the host...
    conn = Connection(host)
    # The script always hangs at this line
    result = conn.run('nohup ./foo &', warn=True, hide=True)
I always open a tmux session to run the aforementioned script in; even doing so, the script hangs when I get to conn.run(), above.
I'm running the script on a vanilla CentOS 6.5 VM; it runs under python 2.7.10 and fabric 2.1.
The Fabric FAQ is unclear... I thought the FAQ wanted tmux used on the local side when I executed the Fabric script.
The correct way to fix this problem is to replace nohup in the remote command with screen -d -m <command>. Now I can run the whole script locally with no hangs (and I don't have to use tmux in the local terminal).
Explicitly, I have to rewrite the last line of my script in my question as:
# Remove &, and nohup...
result = conn.run('screen -d -m ./foo', warn=True, hide=True)
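If you would rather keep nohup, a commonly cited alternative (a sketch, not the fix described above) is to redirect all of foo's streams and disable the pty, so run() is not left waiting on an open channel:

# alternative sketch: keep nohup, but detach stdin/stdout/stderr and skip the pty
result = conn.run('nohup ./foo > /dev/null 2>&1 < /dev/null &',
                  pty=False, warn=True, hide=True)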
I'll try to explain this as simply as possible.
I have a dockerised Python app. Within this app, at some point, I try to run a docker command that launches another (LibreOffice) container, like so:
import subprocess
file_path = 'path_to_file'
args = ['docker', 'run', '-it', '-v', '/tmp:/tmp',
'lcrea/libreoffice-headless', '--headless', '--convert-to', 'pdf', file_path,
'--outdir', '/tmp']
process = subprocess.run(args,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
timeout=timeout)
I end my python app's Dockerfile with a command which starts the server:
CMD python3 -m app.run_app
What is interesting is that when I start the Python app like this, it works fine:
docker-compose run -p 9090:9090 backend /bin/bash
root@74430c3f1f0c:/src# python3 -m app.run_app
But when I start it just using docker-compose up, the LibreOffice container is never launched. I am sure of this because when I run docker ps -a, in the first case a LibreOffice container has been created, while in the second there is none.
What is going on here?
I found the error. I was passing in the -it option, which made the process fail with "the input device is not a TTY". All I had to do was take it out...
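For reference, the working call is the same as the one in the question with the -it flag dropped (a sketch; the timeout value here is an arbitrary placeholder):

import subprocess

file_path = 'path_to_file'

# same docker command as before, minus '-it': no TTY is available under docker-compose up
args = ['docker', 'run', '-v', '/tmp:/tmp',
        'lcrea/libreoffice-headless', '--headless', '--convert-to', 'pdf', file_path,
        '--outdir', '/tmp']
process = subprocess.run(args,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE,
                         timeout=60)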
I use the Python module pysftp to connect to a remote server. Below you can see the Python code:
import pysftp
import sys
import sqr_common
srv = pysftp.Connection(host="xxxxxx", username="xxxx",
                        password="xxxxx")
command = "/usr/bin/bash"
command2="APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that the command /usr/bin/bash starts an interactive shell that never exits, so my second command is never executed. Can anyone help me choose a shell on the remote server (for example bash) and execute a command in it? Is there any pysftp function that lets me choose the shell?
Try this:
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
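Applied to the code in the question, that would look something like this (a sketch that keeps the placeholder host and credentials; any follow-up commands go inside the same -c string):

import pysftp

srv = pysftp.Connection(host="xxxxxx", username="xxxx", password="xxxxx")

# run the export (and anything that depends on it) inside one bash -c invocation,
# so everything shares a single shell rather than separate execute() calls
srv.execute('/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "')

srv.close()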
This problem is not specific to Python; it is more about how to execute commands under a specific shell.
If you only need to run a single command, you can use the bash -c switch:
bash -c "echo 123"
You can run multiple commands separated by ;:
bash -c "echo 123 ; echo 246"
If you need to run many commands under a specific shell, create a shell script (a .bash file) on the remote host and execute it:
bash myscript.bash