Fabric run command on different hosts simultaneously - python

I'm using Fabric and I want to download a file on several hosts simultaneously, but when I use
env.hosts = ['192.168.1.2', '192.168.1.3', '192.168.1.4']
I always get: No hosts found. Please specify (single) host string for connection:
from fabric.api import env, run, sudo, settings

env.user = 'root'  # all the servers have the same username
env.hosts = ['192.168.1.2', '192.168.1.3', '192.168.1.4']
env.key_filename = "~/.ssh/id_rsa"  # I have their SSH key

run('wget file')  # the command I need to run in parallel
I want to run this from Python code without using the fab command.

I usually use the @parallel decorator (http://docs.fabfile.org/en/1.13/usage/parallel.html) and do something like this:
from fabric.api import env, execute, parallel, task

env.use_ssh_config = True
env.user = 'ubuntu'
env.sudo_user = 'ubuntu'
env.roledefs = {
    'uat': ['website_uat'],
    'prod': ['website01', 'website02']
}

@task
def deploy(role_from_arg, **kwargs):
    # run download_file on each remote host in the role
    execute(download_file, role=role_from_arg, **kwargs)

@parallel
def download_file(*args, **kwargs):
    # some code to download the file here
    pass
Then I can run: fab deploy:prod
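Since the original question asked to avoid the fab command, the same fabfile can also be driven from a plain Python script with execute; a minimal sketch, assuming the module above is importable as fabfile:
from fabric.api import execute
from fabfile import download_file

# runs download_file on every host in the 'prod' role, in parallel
execute(download_file, role='prod')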

Related

python fabric run() sudo command without prompting for a password

I have this Python Fabric fabfile. I want to run the sudo command without being prompted for a password, and I would like to save the password in the file. Does Fabric3 no longer support the watchers option? Is there any way I can put the password in the script?
from fabric.api import *
from invoke import Responder

env.user = "usera"
env.password = "password"
env.sudo_user = "usera"
env.sudo_prompt = "Password:"

sudopass = Responder(
    pattern=r'Password:',
    response=env.password + '\n'
)

def itm_run():
    # result = run("sudo systemctl restart S99itm", pty=True, watchers=[sudopass])
    result = run("sudo systemctl restart S99itm")
    print(result)
The watchers still work; otherwise, check your pattern. You can also add echo=True.
As for the password, it is best to keep it in an environment variable or similar, so you can do env.password = os.getenv('SOMEHOST_SOMEUSER_PASSWORD').
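As a sketch of that suggestion, mirroring the asker's commented-out watchers call and assuming a SOMEHOST_SOMEUSER_PASSWORD variable is exported in the shell:
from os import getenv

from fabric.api import env, run
from invoke import Responder

env.password = getenv('SOMEHOST_SOMEUSER_PASSWORD')  # no password in the source file

sudopass = Responder(pattern=r'Password:', response=env.password + '\n')

def itm_run():
    result = run("sudo systemctl restart S99itm", pty=True, watchers=[sudopass])
    print(result)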

How to detect expired password using Python Fabric?

Using the following code, I'm unable to get Fabric to detect an expired password prompt at login. The session doesn't time out, and the abort_on_prompts parameter doesn't appear to be triggered. How can I configure Fabric to detect this state?
from fabric.api import env, run, execute
from fabric import network
from fabric.context_managers import settings

def host_type():
    with settings(abort_on_prompts=True):
        print("Using abort mode %(abort_on_prompts)s" % env)
        result = run('uname -s')
        return result

if __name__ == '__main__':
    print("Fabric v%(version)s" % env)
    env.user = 'myuser'
    env.password = 'user67user'
    env.hosts = ['10.254.254.143']
    host_types = execute(host_type)
Executing this script results in a hung session, as shown below:
Fabric v1.11.1.post1
[10.254.254.143] Executing task 'host_type'
Using abort mode True
[10.254.254.143] run: uname -s
[10.254.254.143] out: WARNING: Your password has expired.
[10.254.254.143] out: You must change your password now and login again!
[10.254.254.143] out: Changing password for myuser.
[10.254.254.143] out: (current) UNIX password:
Fabric includes a way to answer prompt questions.
You can use the prompts dictionary with settings. In the dictionary, every key is the prompt text on standard output that you want to answer, and the value is your answer.
So in your example:
from fabric.api import env, run, execute
from fabric import network
from fabric.context_managers import settings

def host_type():
    with settings(prompts={"(current) UNIX password": "new_password"}):
        result = run('uname -s')
        return result

if __name__ == '__main__':
    print("Fabric v%(version)s" % env)
    env.user = 'myuser'
    env.password = 'user67user'
    env.hosts = ['10.254.254.143']
    host_types = execute(host_type)
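If the goal is to detect the expired password rather than silently change it, one untested variation is to answer the prompt with a throwaway value under warn_only and inspect the captured output (in Fabric 1, run returns the command's stdout as a string):
from fabric.api import run
from fabric.context_managers import settings

def host_type():
    # feed the prompt a dummy reply so the session doesn't hang
    with settings(prompts={"(current) UNIX password": "dummy"}, warn_only=True):
        result = run('uname -s')
    if "password has expired" in result.lower():
        print("Expired password detected on this host")
    return result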

Python Fabric decorator

I have some Fabric tasks in my fabfile, and I need to initialize the env variable before they execute. I'm trying to use a decorator; it works, but Fabric always says "No hosts found. Please specify (single) host string for connection" even though, if I print the contents of env, everything looks good.
Also, I call my tasks from another Python script.
from fabric.api import *
from instances import find_instances

def init_env(func):
    def wrapper(*args, **kwargs):
        keysfolder = 'keys/'
        env.user = 'admin'
        env.key_filename = '%skey_%s_prod.pem' % (keysfolder, args[0])
        env.hosts = find_instances(args[1])
        return func(args[0], args[1])
    return wrapper

@init_env
def restart_apache2(region, groupe):
    print(env.hosts)
    run('/etc/init.d/apache2 restart')
    return True
My script which calls the fabfile:
from fabfile import init_env, restart_apache2
restart_apache2('eu-west-1', 'apache2')
Output of the print in restart_apache2:
[u'10.10.0.1', u'10.10.0.2']
Any idea why my task restart_apache2 doesn't use the env variable?
Thanks
EDIT:
Interestingly, if the script that calls the fabfile uses settings from fabric.api and sets a host IP directly, it works. This shows that my decorator has initialized the env variable correctly, since the key and user are passed to Fabric; it's only env.hosts that is not read by Fabric...
EDIT2:
I can reach my goal by using settings from fabric.api, like this:
@init_env
def restart_apache2(region, groupe):
    for i in env.hosts:
        with settings(host_string='%s@%s' % (env.user, i)):
            run('/etc/init.d/apache2 restart')
    return True
Bonus question: is there a solution for using env.hosts directly, without settings?
I'm guessing a little here, but I assume you've run into trouble because you're trying to solve two problems at once.
The first issue relates to multiple hosts. Fabric includes the concept of roles, which are just groups of machines that you can issue commands to in one go. The information from the find_instances function can be used to populate this data:
from fabric.api import *
from something import find_instances

env.roledefs = {
    'eu-west-1': find_instances('eu-west-1'),
    'eu-west-2': find_instances('eu-west-2'),
}

@task
def restart_apache2():
    run('/etc/init.d/apache2 restart')
The second issue is that you have different keys for different groups of servers. One way to resolve this is to use an SSH config file, which keeps the details of the keys and user accounts out of your Fabric code. You can either add an entry per instance to your ~/.ssh/config, or use a local SSH config file (env.use_ssh_config and env.ssh_config_path):
Host instance00
    User admin
    IdentityFile keys/key_instance00_prod.pem

Host instance01
    User admin
    IdentityFile keys/key_instance01_prod.pem
# ...
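If you go the SSH config route, a couple of env flags in the fabfile are enough (the project-local path below is just an example):
from fabric.api import env

env.use_ssh_config = True             # read ~/.ssh/config
# env.ssh_config_path = 'ssh_config'  # or point Fabric at a project-local file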
On the command line, you should then be able to issue commands like:
fab restart_apache2 -R eu-west-1
Or, you can still do single hosts:
fab restart_apache2 -H apache2
In your script, these two are equivalent when using the execute function:
from fabric.api import execute
from fabfile import restart_apache2

execute(restart_apache2, roles=['eu-west-1'])
execute(restart_apache2, hosts=['apache2'])

Fabric Sudo No Password Solution

This question is about best practices. I'm running a deployment script with Fabric. My deployment user 'deploy' needs sudo to restart services, so I am using the sudo function from Fabric to run these commands in my script. This works fine but prompts for a password during script execution. I DON'T want to type a password during deployments. What's the best practice here? The only solution I can think of is changing the sudo permissions to not require a password for the commands my deployment user runs. This doesn't seem right to me.
The ideal solution is to create a user on your server that is used only for deployment (e.g., deploy) and set env.user = 'deploy' in your fabfile. Then, on your servers, you can give that user the necessary permissions on a command-by-command basis in a sudoers file:
IMPORTANT: Always use sudo visudo to modify a sudoers file
Cmnd_Alias RELOAD_SITE = /bin/bash -l -c supervisorctl*, /usr/bin/supervisorctl*
deploy ALL = NOPASSWD: RELOAD_SITE
You can add as many Cmnd_Alias directives as is needed by the deploy user, then grant NOPASSWD access for each of those commands. See man sudoers for more details.
I like to keep my deploy-specific sudoers config in /etc/sudoers.d/deploy and include that file from /etc/sudoers by adding #includedir /etc/sudoers.d at the end.
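For illustration, a minimal fabfile that leans on that sudoers entry might look like this (the hostname is hypothetical; supervisorctl reload is one command matched by the RELOAD_SITE alias above):
from fabric.api import env, sudo

env.user = 'deploy'
env.hosts = ['myserver']  # hypothetical host

def reload_site():
    # permitted by the NOPASSWD rule above, so no password prompt appears
    sudo('supervisorctl reload')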
You can use:
from fabric.api import env
# [...]
env.password = 'yourpassword'
The best way to do this is with subtasks. You can prompt for a password in the fabfile and never expose any passwords, nor make reckless configuration changes to sudo on the remote system(s).
import getpass
from fabric.api import env, parallel, run, task
from fabric.decorators import roles
from fabric.tasks import execute

env.roledefs = {'my_role': ['host1', 'host2']}

@task
# @parallel -- uncomment if you need parallel execution, it'll work!
@roles('my_role')
def deploy(*args, **kwargs):
    print('deploy args:', args, kwargs)
    print('password:', env.password)
    run('echo hello')

@task
def prompt(task_name, *args, **kwargs):
    env.password = getpass.getpass('sudo password: ')
    execute(task_name, *args, role='my_role', **kwargs)
Note that you can even combine this with parallel execution and the prompt task still only runs once, while the deploy task runs for each host in the role, in parallel.
Finally, an example of how you would invoke it:
$ fab prompt:deploy,some_arg,another_arg,key=value
It seems sudo may not be such a bad option after all. You can specify which commands a user can run and which arguments the commands may take (man sudoers). If the problem is just having to type the password, an option is to use the pexpect module to log in automatically, maybe with a password that you store encrypted:
import sys
import pexpect

pwd = getEncryptedPassword()  # placeholder: however you retrieve the stored password
cmd = "yourcommand"
sCmd = pexpect.spawn('sudo {0}'.format(cmd))
sCmd.logfile_read = sys.stdout
sCmd.expect('Password:')
sCmd.sendline(pwd)
sCmd.expect(pexpect.EOF)
Use the keyring module to store and access passwords securely.
Here's how I do it with Fabric 2:
from fabric import task
import keyring

@task
def restart_apache(connection):
    # set the password once with keyring.set_password('some-host', 'some-user', 'passwd')
    connection.config.sudo.password = keyring.get_password(connection.host, 'some-user')
    connection.sudo('service apache2 restart')
You could also use GPG or any other command-line password tool. For example:
connection.config.sudo.password = connection.local('gpg --quiet -d /path/to/secret.gpg', hide=True).strip()
The secret.gpg file can be generated with echo "mypassword" | gpg -e > secret.gpg. The hide argument avoids echoing the password to the console.
To retain support for --prompt-for-sudo-password, add a conditional:
if not connection.config.sudo.password:
    connection.config.sudo.password = keyring.get_password(connection.host, 'some-user')
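An invocation might then look like this (some-host is a placeholder; note that Fabric 2 exposes restart_apache on the command line as restart-apache):
$ fab -H some-host --prompt-for-sudo-password restart-apache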
You can also use passwords for multiple machines:
from fabric.api import env

env.hosts = ['user1@host1:port1', 'user2@host2:port2']
env.passwords = {'user1@host1:port1': 'password1', 'user2@host2:port2': 'password2'}
See this answer: https://stackoverflow.com/a/5568219/552671
As Bartek also suggests, enable password-less sudo for the deployment user in the sudoers file.
Something like:
run('echo "{0} ALL=(ALL) NOPASSWD: ALL" >> /etc/sudoers'.format(env.user))

How to force the fabric connect to remote host before run() executed?

I am using Fabric to write my rsync wrapper. The variable env.host_string is set by execute() when running a task, so to get env.host_string I call run('pwd') first, and then run rsync.
Is it possible to make sure the user has set env.hosts before some checkpoint is reached, such as the condition src == env.host_string?
from fabric.api import run, env, task, abort
from string import Template

@task
def sync_to_same_dir(src, path):
    cmd_template = Template("""rsync --dry-run -e ssh -avr $user@$src_host:$path/ $path/""")
    if path.endswith("/"):
        path = path[:-1]
    run("pwd")  # it's a workaround to force a connection
    if src == env.host_string:
        abort("cannot rsync to the same host")
    cmd = cmd_template.substitute(path=path, user=env.user, src_host=src)
    run(cmd)
I found the answer in Fabric's source code. There is a simple idea behind it: how does run check for a host when it needs one?
@needs_host
def put(local_path=None, remote_path=None, use_sudo=False,
        mirror_local_mode=False, mode=None):
    """
    Upload one or more files to a remote host.
    """
Tracing needs_host shows that it prompts for hosts when the user hasn't assigned any:
No hosts found. Please specify (single) host string for connection:
We can rewrite the task as:
from fabric.network import needs_host

@task
@needs_host
def the_task_needs_host():
    pass
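Applied to the original task, a sketch might look like this; @needs_host prompts for a host up front, so the run("pwd") workaround goes away:
from string import Template
from fabric.api import abort, env, run, task
from fabric.network import needs_host

@task
@needs_host
def sync_to_same_dir(src, path):
    path = path.rstrip("/")
    if src == env.host_string:
        abort("cannot rsync to the same host")
    cmd = Template("rsync --dry-run -e ssh -avr $user@$src_host:$path/ $path/")
    run(cmd.substitute(path=path, user=env.user, src_host=src))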
What are you trying to do? A task knows what host it's using without any other Fabric calls:
fab -f host-test.py foo
[98.98.98.98] Executing task 'foo'
98.98.98.98
98.98.98.98
Done.
And here is the script, e.g.:
#!/usr/bin/env python
from fabric.api import *

env.user = 'mgoose'

@task
@hosts("98.98.98.98")
def foo():
    print(env.host)
    print(env.host_string)
So you don't have to do anything special to know what host your task is on.
