I’m using Python 3.6 and Fabric 2.4. I’m using Fabric to SSH into a server and run some commands. I need to set an environment variable for the commands being run on the remote server. The documentation indicates that something like this should work:
from fabric import task

@task(hosts=["servername"])
def do_things(c):
    c.run("command_to_execute", env={"KEY": "VALUE"})
But that doesn’t work. Something like this should also be possible:
from fabric import task

@task(hosts=["servername"])
def do_things(c):
    c.config.run.env = {"KEY": "VALUE"}
    c.run("command_to_execute")
But that doesn’t work either. I feel like I’m missing something. Can anyone help?
I was able to do it by setting inline_ssh_env=True and then explicitly setting the env variable, e.g.:
with Connection(host=hostname, user=username, inline_ssh_env=True) as c:
    c.config.run.env = {"MY_VAR": "this worked"}
    c.run('echo $MY_VAR')
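For intuition, here is a rough, illustrative sketch of what inline_ssh_env makes Fabric do: instead of submitting the variables through the SSH protocol (which servers commonly restrict via AcceptEnv), the command string itself is prefixed with export statements. The function name is made up for illustration; it is not Fabric API:

```python
def inline_env_command(command, env):
    """Roughly mimic Fabric's inline_ssh_env behavior: prefix the
    command with export statements so the variables exist in the
    restricted shell the SSH server spawns for the command."""
    if not env:
        return command
    exports = " && ".join("export %s=%s" % (k, v) for k, v in sorted(env.items()))
    return "%s && %s" % (exports, command)

print(inline_env_command("echo $MY_VAR", {"MY_VAR": "ok"}))
```

Note that real Fabric also shell-quotes the values; this sketch skips that for clarity.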
As stated on the Fabric site:
The root cause of this is typically because the SSH server runs non-interactive commands via a very limited shell call: /path/to/shell -c "command" (for example, OpenSSH). Most shells, when run this way, are not considered to be either interactive or login shells; and this then impacts which startup files get loaded.
You can read more on this page: link
So what you are trying to do won't work; the solution is to set the environment variable you want explicitly:
from fabric import task

@task(hosts=["servername"])
def do_things(c):
    c.config.run.env = {"KEY": "VALUE"}
    c.run('echo export %s >> ~/.bashrc' % 'ENV_VAR=VALUE')
    c.run('source ~/.bashrc')
    c.run('echo $ENV_VAR')  # to verify whether it is set or not
    c.run("command_to_execute")
You can try this:
@task
def qa(ctx):
    ctx.config.run.env['counter'] = 22
    ctx.config.run.env['conn'] = Connection('qa_host')

@task
def sign(ctx):
    print(ctx.config.run.env['counter'])
    conn = ctx.config.run.env['conn']
    conn.run('touch mike_was_here.txt')
And run:
fab2 qa sign
When creating the Connection object, try adding inline_ssh_env=True.
Quoting the documentation:
Whether to send environment variables “inline” as prefixes in front of command strings (export VARNAME=value && mycommand here), instead of trying to submit them through the SSH protocol itself (which is the default behavior). This is necessary if the remote server has a restricted AcceptEnv setting (which is the common default).
According to that part of the official doc, the connect_kwargs attribute of the Connection object is intended to replace the env dict. I use it, and it works as expected.
I have a single-file script for operations automation (log file downloads, stopping/starting several containers; the user chooses what to do via command-line arguments) and I want to keep the Fabric functions in the same script, together with the argument-parsing class and possibly some others. How do I call Fabric functions from within the same Python script? I do not want to use "fab" as it is.
And as a side note, I'd like to have these calls parallel as well.
This is a model class that would ideally contain all necessary fabric functions:
class fabricFuncs:
    def appstate(self):
        env.hosts = hosts
        run('sudo /home/user/XXX.sh state')
This is launcher section:
if __name__ == "__main__":
    argParser().argParse()
    fabricFuncs().ihsstate()
argParser sets variables globally using the command-line arguments specified (just to clarify what that part does).
Sadly, this fails because no hosts are defined (env.hosts should be set inside the function... or is it too late to declare them there?).
EDIT1:
I have tried launching the fabric function using this:
for h in env.hosts:
    with settings(host_string=user + "@" + h):
        fabricFuncs().ihsstate()
It kind of works. I had hoped, though, that I would be able to parallelize the whole process using the fabric module as it is (via decorators) without wrapping the whole thing in threading code.
EDIT2:
I have tried this as well:
execute(fabricFuncs().ihsstate())
Which fails with:
Fatal error: Needed to prompt for the target host connection string (host: None)
Can I put the whole env.hosts variable into "settings" above without iterating over that list with a "for" statement?
EDIT3:
I have tried editing the fab function like this to see if env.hosts are set properly:
class fabricFuncs:
    def appstate(self):
        env.hosts = hosts
        print env.hosts
        run('sudo /home/user/XXX.sh state')
And it prints out correctly, but still the "run" command fails with:
Fatal error: Needed to prompt for the target host connection string (host: None)
Use the execute command:
from fabric.api import execute

argParser().argParse()
execute(fabricFuncs().ihsstate)
Note that execute takes the callable itself, not the result of calling it.
If you run the script without the fab command, env.hosts will be set to None, so if you want to use execute you also have to pass the hosts parameter.
try this:
from fabric.api import execute, run

if __name__ == "__main__":
    hosts = ["host1", "host2"]
    execute(run, 'sudo /home/user/XXX.sh state', hosts=hosts)
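As for the side note about parallelism: outside of fab and its decorators, one plain option is a thread pool around per-host calls. This is an illustrative sketch, with run_on_host as a stand-in for the real Fabric call (e.g. with settings(host_string=h): run(...)):

```python
from concurrent.futures import ThreadPoolExecutor

def run_on_host(host):
    # stand-in for the real per-host Fabric call, e.g.
    # with settings(host_string=host): run('sudo /home/user/XXX.sh state')
    return "state checked on %s" % host

hosts = ["host1", "host2"]
# one worker per host; map preserves the order of the hosts list
with ThreadPoolExecutor(max_workers=len(hosts)) as pool:
    results = list(pool.map(run_on_host, hosts))
```

Fabric 1's own @parallel decorator only kicks in when tasks are run through execute or fab, so a pool like this is the fallback when driving things from a plain script.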
I would like to ssh to another server to run some script.
But before I run the script, I need to change to the directory where the script is located and set some environment variables.
In my local host, it can be done by
os.chdir(path)
os.environ["xxx"] = "xxx"
But in paramiko, I am not sure if any method can accomplish the things above. The closest thing I found is
ssh.exec_command("cd /xxx/yyy;xxx.sh")
But I would rather not execute several commands chained together with ;.
Is there any other way to change directory or set environment variables when using SSH via paramiko?
For environment variables, I could not get them to be set; however, using an interactive shell will load the environment variables of the user. Those you can change in the .bashrc file.
For how to set up an interactive shell:
http://snipplr.com/view/12940/
I haven't found a solution yet for how to change the host directory; like you, I've been trying to use sshClient.exec_command("cd " + directory_name), but to no effect.
However, I can help with your question of issuing multiple commands. You could simply call sshClient.exec_command("command1; command2; command3;"). Alternatively, you could create a helper method such as:
def execCmd(ssh_client, *commands):
    for command in commands:
        stdin, stdout, stderr = ssh_client.exec_command(command)
        for line in stdout.readlines():
            print(line)
        for line in stderr.readlines():
            print(line)
cmds = [command1, command2, command3]
execCmd(SSH_Client, *cmds)
You can use the '|' pipe to combine different commands; it will work with ssh.exec_command().
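Since each exec_command() call runs in its own fresh shell on the server, the cd and the environment setup have to travel in the same command string as the script itself. A small helper sketch (the function name and paths are illustrative, not paramiko API):

```python
def build_remote_command(path, env, script):
    """Chain cd, exports and the script invocation into one string,
    because each exec_command() call starts a new shell on the server."""
    parts = ["cd %s" % path]
    parts += ["export %s=%s" % (k, v) for k, v in sorted(env.items())]
    parts.append(script)
    # && stops the chain early if cd (or an export) fails
    return " && ".join(parts)

cmd = build_remote_command("/xxx/yyy", {"MY_VAR": "1"}, "./xxx.sh")
# ssh.exec_command(cmd) would then run everything in one shell
```

Using && instead of ; has the advantage that the script is not run at all if the cd fails.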
I have just one command in fabfile.py:
@roles('dev')
def test():
    local('...')
Now, I can use --role=dev in every command, but this is extremely stupid.
What I want is to install my project in a host once, with a certain role, then use it without repeating this parameter.
I typically include the following in my fabfile.py:
if not len(env.roles):
    env.roles = ["test"]
This says if env.roles is not defined (via the command line for instance) that it should be defined as "test" in my case. So in your case I would alter the above to substitute dev for test and thus you would have:
if not len(env.roles):
    env.roles = ["dev"]
By doing this you should find that you get the behavior you are looking for while providing you the ability to override if you so desire at any point in the future.
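The default-with-override behavior can be seen with a tiny stand-in for fabric's env object (the class and function here are illustrative, not fabric API):

```python
class Env:
    """Tiny stand-in for fabric's env object, for illustration only."""
    def __init__(self, roles=None):
        self.roles = roles or []

def apply_default_role(env, default="dev"):
    # same logic as the fabfile snippet: set a role only if none was given
    if not len(env.roles):
        env.roles = [default]
    return env.roles

apply_default_role(Env())           # no role given -> ["dev"]
apply_default_role(Env(["stage"]))  # explicit role wins -> ["stage"]
```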
EDIT: I'm editing this to include a small example fabfile.py and explanation of usage.
env.roledefs = {
    'test': ['test.fabexample.com'],
    'stage': ['stage.fabexample.com'],
    'prod': ['web01.fabexample.com', 'web02.fabexample.com', 'web03.fabexample.com'],
}

# default role will be test
env.roles = ['test']
def git_pull():
    run("git pull")

def deploy():
    target = "/opt/apps/FOO"
    with cd(target):
        git_pull()
        sudo("service apache2 restart")
Now this fabfile will allow me to deploy code to any of three different environments: "test", "stage", or "prod". I select which environment I want to deploy to via the command line:
fab -R stage deploy
or,
fab --role=stage deploy
If I do not specify a role, fabric will default to 'test' due to env.roles being set. Note that fabric isn't used to do anything to the local box; instead it acts on the remote box (or boxes) defined in env.roledefs, although with some modifications it could be made to work locally as well.
Typically the fabric command is used from a development box to perform these operations remotely on the testing, staging, or production boxes, therefore specifying the role via the command line is not "extremely stupid" but is by design in this case.
You can use env.roledefs to associate roles with groups of hosts.
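Roughly, the -R/--role selection boils down to flattening the chosen roles into a host list via that mapping. A sketch using the hosts from the example fabfile above (hosts_for is illustrative, not fabric API):

```python
# the role-to-hosts mapping from the example fabfile
roledefs = {
    'test': ['test.fabexample.com'],
    'stage': ['stage.fabexample.com'],
    'prod': ['web01.fabexample.com', 'web02.fabexample.com', 'web03.fabexample.com'],
}

def hosts_for(roles):
    """Flatten the selected roles into the list of hosts to act on."""
    hosts = []
    for role in roles:
        hosts.extend(roledefs[role])
    return hosts

hosts_for(['stage'])  # ['stage.fabexample.com']
```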
I'm writing a script to set up a replica set for mongo in Python.
The first part of the script starts the processes and the second should configure the replica set.
From the command line I usually do:
config={_id:"aaa",members:[{_id:0,host:"localhost:27017"},{_id:1,host:"localhost:27018"},{_id:2,host:"localhost:27019",arbiterOnly:true}]}
rs.initiate(config)
rs.status();
And then I'm looking from rs.status() that all members are initialized
I want to do the same in python script.
In general, I'm looking for a good reference for MongoDB setup scripts (also sharding). I saw the Python script on their site; it is a good starting point (but it is only for a single machine and a single node in the replica set). I need to set everything up on different machines.
Thanks
If you run rs.initiate (without the (config)) the shell tells you which command it would run. In this case, it would be:
function (c) {
    return db._adminCommand({replSetInitiate: c});
}
In python this should be something like:
>>> from pymongo import Connection
>>> conn = Connection("morton.local:27017", slave_okay=True)
>>> db = conn.test  # a database handle
>>> db.command("replSetInitiate", config)
With config being your replica set configuration. http://api.mongodb.org/python/current/api/pymongo/database.html#pymongo.database.Database.command has some more information on calling commands.
Thanks Derick. Here are some remarks on your answer: 'replSetInitiate' is a DBA command, so run it against the 'admin' database, as here:
conn = Connection("localhost:27017", slave_okay=True)
conn.admin.command("replSetInitiate")
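Putting the pieces together, the shell config from the question translates into a plain Python dict (hosts and ids copied from the question), which is what gets passed as the second argument to the replSetInitiate admin command:

```python
# The rs.initiate() argument from the mongo shell, as a Python dict.
config = {
    "_id": "aaa",
    "members": [
        {"_id": 0, "host": "localhost:27017"},
        {"_id": 1, "host": "localhost:27018"},
        {"_id": 2, "host": "localhost:27019", "arbiterOnly": True},
    ],
}
# then, with an open connection:
# conn.admin.command("replSetInitiate", config)
```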
To get the output of rs.status() in pymongo, we can use the replSetGetStatus command like this:
def __init__(self):
    '''Constructor'''
    self.mdb = ReplicaSetConnection('localhost:27017', replicaSet='rs0')

def statusofcluster(self):
    '''Check the status of the cluster and print the output'''
    print "We are Inside Status of Cluster"
    output = self.mdb.admin.command('replSetGetStatus')
I have some years of solid experience working with Asterisk but am new to Python.
I want to connect from a Python script and receive some events. I have created a manager user with AMIUSERNAME and AMIPASSWORD as credentials and tested that it works OK. I have also installed StarPy.
Then I run the following script with the command python ami.py USERNAME PASSWORD:
import sys
from starpy import manager
f = manager.AMIFactory(sys.argv[1], sys.argv[2])
df = f.login('127.0.0.1',5038)
While monitoring the Asterisk console, nothing happens.
Does anyone know what I am missing?
I would like to send a Ping action and wait for a Pong response.
I suppose that f.login() returns an AMIProtocol instance that has a ping() method.
I don't know anything about starpy, so some vague advice:
Start Python as an interactive shell. Execute code and examine results on the spot. The help function is your friend; try help(df) after the last line of your script.
Look at the examples directory in starpy distribution. Maybe 90% of the code you need is already there.
The following is pulled from the ami module (and a few other places) in the Asterisk Test Suite. We use starpy extensively throughout the Test Suite, so you may want to check it out for some examples. Assume that the following code resides in some class with member method login.
def login(self):
    def on_login_success(ami):
        self.ami_factory.ping().addCallback(ping_response)
        return ami

    def on_login_error(reason):
        print "Failed to log into AMI"
        return reason

    def ping_response(ami):
        print "Got a ping response!"
        return ami

    self.ami_factory = manager.AMIFactory("user", "mysecret")
    self.ami_factory.login("127.0.0.1", 5038).addCallbacks(on_login_success, on_login_error)
Make sure as well that your manager.conf is configured properly. For the Asterisk Test Suite, we use the following:
[general]
enabled = yes
webenabled = yes
port = 5038
bindaddr = 127.0.0.1
[user]
secret = mysecret
read = system,call,log,verbose,agent,user,config,dtmf,reporting,cdr,dialplan,test
write = system,call,agent,user,config,command,reporting,originate