How can I pass a parameter to a fabric task when calling "fab" from the command line? For example:
def task(something=''):
    print "You said %s" % something
$ fab task "hello"
You said hello
Done.
Is it possible to do this without prompting with fabric.operations.prompt?
Fabric 2 task arguments documentation:
http://docs.pyinvoke.org/en/latest/concepts/invoking-tasks.html#task-command-line-arguments
Fabric 1.X uses the following syntax for passing arguments to tasks:
fab task:'hello world'
fab task:something='hello'
fab task:foo=99,bar=True
fab task:foo,bar
You can read more about it in the Fabric docs.
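For reference, here is a minimal sketch of a Fabric 1.x fabfile that the invocations above could target (the parameter names are just illustrative; note that every value arrives as a string):

# fabfile.py -- Fabric 1.x / Python 2 sketch; parameter names are illustrative
def task(something='', foo=None, bar=None):
    # fab task:'hello world'      -> something='hello world'
    # fab task:something='hello'  -> something='hello'
    # fab task:foo=99,bar=True    -> foo='99', bar='True' (strings, not int/bool)
    # fab task:foo,bar            -> something='foo', foo='bar' (positional)
    print "something=%r, foo=%r, bar=%r" % (something, foo, bar)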
In Fabric 2, simply add the argument to your task function. For example, to pass the version argument to task deploy:
@task
def deploy(context, version):
    ...
Run it as follows:
fab -H host deploy --version v1.2.3
Fabric even documents the options automatically:
$ fab --help deploy
Usage: fab [--core-opts] deploy [--options] [other tasks here ...]
Docstring:
none
Options:
-v STRING, --version=STRING
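If you want the argument to be optional and to show its own help text, you can give the parameter a default value and pass a help mapping to the decorator. A small sketch (the default value and description are just examples):

from fabric import task

@task(help={"version": "version tag to deploy, e.g. v1.2.3"})
def deploy(context, version="latest"):
    """Deploy the given version to the target host."""
    ...

fab --help deploy will then show the docstring and the description next to --version.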
Fabric 1.x arguments are understood with very basic string parsing, so you have to be a bit careful with how you send them.
Here are a few examples of different ways to pass arguments to the following test function:
@task
def test(*args, **kwargs):
    print("args:", args)
    print("named args:", kwargs)
$ fab "test:hello world"
('args:', ('hello world',))
('named args:', {})
$ fab "test:hello,world"
('args:', ('hello', 'world'))
('named args:', {})
$ fab "test:message=hello world"
('args:', ())
('named args:', {'message': 'hello world'})
$ fab "test:message=message \= hello\, world"
('args:', ())
('named args:', {'message': 'message = hello, world'})
I use double quotes here to take the shell out of the equation; single quotes may work better on some platforms. Also note the escapes for characters that Fabric treats as delimiters.
More details in the docs:
http://docs.fabfile.org/en/1.14/usage/fab.html#per-task-arguments
You need to pass all values as strings, especially if you are using subprocess to run the scripts, or you will get an error. You then need to convert them back to int/boolean types yourself.
def print_this(var):
    print str(var)

fab print_this:'hello world'
fab print_this:'hello'
fab print_this:'99'
fab print_this:'True'
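Since everything arrives as a string, here is a small sketch of converting arguments back inside the task (the task name and parameters are hypothetical):

# Fabric 1.x sketch -- convert string arguments back to the types you need
def scale(count='1', dry_run='False'):
    count = int(count)                     # '99'   -> 99
    dry_run = dry_run.lower() == 'true'    # 'True' -> True
    print "count=%d, dry_run=%r" % (count, dry_run)

Invoked as, for example, fab scale:count=3,dry_run=True.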
If someone is looking to pass parameters from one task to another in fabric2, just use the environment dictionary for that:
from fabric import task, Connection

@task
def qa(ctx):
    ctx.config.run.env['counter'] = 22
    ctx.config.run.env['conn'] = Connection('qa_host')

@task
def sign(ctx):
    print(ctx.config.run.env['counter'])
    conn = ctx.config.run.env['conn']
    conn.run('touch mike_was_here.txt')
And run:
fab2 qa sign
Related
I have the following CSV:
ip, arg1, arg2
1.2.3.4, foo, bar
1.3.4.5, baz, bub
I am trying to perform:
For each row in the CSV: SSH into the IP specified,
Execute commands such as apt-get with arg1 for that row
Execute custom commands that will utilize arg2 for that row
Fabric has execute(do_work, hosts=host_list) but I can't really specify the right context for that. So far, I have hacked together something:
from fabric.api import env, execute

arguments = {}

def _deploy():
    print env.host_string, arguments[env.host_string]

def deploy():
    global arguments
    arguments['1.2.3.4'] = ('foo', 'bar')
    arguments['2.3.4.5'] = ('baz', 'bub')
    execute(_deploy, hosts=arguments.keys())
This is printed:
[1.2.3.4] Executing task '_deploy'
1.2.3.4 ('foo', 'bar')
[2.3.4.5] Executing task '_deploy'
2.3.4.5 ('baz', 'bub')
Currently, this hasn't broken anything. Is there a better way or even a better lib for me to do this?
Note: I am not a fan of paramiko because it's too low level.
Not elegant per se, but this is the solution that I am settling with:
def special_echo(matrix):
key = 'ubuntu#' + env.host
name = matrix[key]
run('echo %s `hostname --ip-address`' % name)
A = {}
A['ubuntu#54.219.171.62'] = 'first'
A['ubuntu#52.53.149.140'] = 'second'
A['ubuntu#54.183.255.58'] = 'third'
execute(special_echo, A, hosts=A.keys())
Results in:
[ubuntu@54.219.171.62] Executing task 'special_echo'
[ubuntu@54.219.171.62] run: echo first `hostname --ip-address`
[ubuntu@54.219.171.62] out: first 172.31.1.234
[ubuntu@54.219.171.62] out:
[ubuntu@54.183.255.58] Executing task 'special_echo'
[ubuntu@54.183.255.58] run: echo third `hostname --ip-address`
[ubuntu@54.183.255.58] out: third 172.31.15.36
[ubuntu@54.183.255.58] out:
[ubuntu@52.53.149.140] Executing task 'special_echo'
[ubuntu@52.53.149.140] run: echo second `hostname --ip-address`
[ubuntu@52.53.149.140] out: second 172.31.8.138
[ubuntu@52.53.149.140] out:
This is an edge case in Fabric that comes up from time to time. I can show you a strategy I've used in the past, but I'll start with a safer variation of your own approach.
The env dictionary is designed to store environment data that can be used by other tasks. Any data that should be shared between tasks but that cannot be passed as arguments can go here, so I'd change your code to keep arguments in env:
from fabric.api import env, execute, task

def _deploy():
    print env.host_string, env.arguments[env.host_string]

@task
def deploy():
    env.arguments = {}
    env.arguments['1.2.3.4'] = ('foo', 'bar')
    env.arguments['2.3.4.5'] = ('baz', 'bub')
    execute(_deploy, hosts=env.arguments.keys())
The second strategy is one I've used myself in the past for work like this: pass all the arguments as part of the host string and let the task parse them into usable data. It keeps you from passing a large dictionary around globally, and for your use-case it might be effective:
hosts.csv
1.2.3.4, foo, bar
1.3.4.5, baz, bub
1.4.5.6, complex-args, maybe with spaces
fabfile.py
from fabric.api import env, execute, run, settings, task

def _parse_args(argument_string):
    # you should really use the csv module here to split properly (escapes, etc.)
    args = argument_string.split(', ')
    if len(args) != 3:
        raise ValueError("Only the host and 2 arguments are allowed")
    return args

def _deploy():
    host, arg1, arg2 = _parse_args(env.host_string)
    with settings(host_string=host):
        # run("apt-get '{}'".format(arg1))
        # run("service '{}' restart".format(arg2))
        print "[{}] apt-get '{}'".format(env.host_string, arg1)
        print "[{}] service '{}' restart".format(env.host_string, arg2)

@task
def deploy():
    with open('hosts.csv') as f:
        hosts = [line.strip() for line in f]
    execute(_deploy, hosts=hosts)
Here's the output you'd see using this strategy:
$ fab deploy
[1.2.3.4, foo, bar] Executing task '_deploy'
[1.2.3.4] apt-get 'foo'
[1.2.3.4] service 'bar' restart
[1.3.4.5, baz, bub] Executing task '_deploy'
[1.3.4.5] apt-get 'baz'
[1.3.4.5] service 'bub' restart
[1.4.5.6, complex-args, maybe with spaces] Executing task '_deploy'
[1.4.5.6] apt-get 'complex-args'
[1.4.5.6] service 'maybe with spaces' restart
Done.
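As the comment in _parse_args suggests, the csv module copes with quoting and embedded commas better than a plain split(', '). A sketch of what that swap might look like:

import csv

def _parse_args(argument_string):
    # csv handles quoted fields and embedded commas; skipinitialspace drops
    # the space after each comma, matching the "ip, arg1, arg2" format above
    args = next(csv.reader([argument_string], skipinitialspace=True))
    if len(args) != 3:
        raise ValueError("Only the host and 2 arguments are allowed")
    return args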
I am trying click (the command-line interface package for Python); when running the following code I get the error Error: No such command "abcd".
import click

@click.group()
@click.option('--source', required=True)
@click.pass_context
def cli(ctx, source):
    ctx.obj = "pass it"

@cli.command()
@click.argument('abcd')
@click.pass_context
def hello(ctx, abcd):
    click.echo("Hello, World")

if __name__ == '__main__':
    cli()
I am running it as follows
python playclick.py --source this abcd
"abcd" is being treated as a separate command because of the space (this is a characteristic of your shell, not of click specifically).
If you want the value of source to be "this abcd", use quotes:
python playclick.py --source "this abcd"
To actually provide the abcd argument, you need to call the hello command – the argument is for that command:
python playclick.py --source this hello 123456
The hello command will have an argument of 123456.
Breaking down the entire line:
--source this provides the source argument to the main cli command.
hello is the command to run (try python playclick.py --source this and you'll get an error because there is no command), and 123456 is the argument named abcd to that command.
For those who are only using @click.argument but are still getting the same "no such command" error, what finally helped me solve this issue with click 7.1.2 was simply removing uppercase letters from the command name.
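If renaming the function is inconvenient, click also lets you set the command name explicitly on the decorator, which avoids any mismatch between the function name and what you type on the command line. A sketch, reusing the cli group from the question above:

@cli.command(name="greet")          # "greet" is what you type on the command line
@click.argument('value')
def Some_Legacy_Function_Name(value):
    click.echo(value)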
I always use fabric to deploy my processes from my local pc to remote servers.
If I have a python script like this:
test.py:
import time

while True:
    print "Hello world."
    time.sleep(1)
Obviously, this script runs continuously.
I deploy this script to the remote server and execute my fabric script like this:
...
sudo("python test.py")
Fabric will always wait for test.py to return and won't exit. How can I make the fabric script finish immediately and ignore the return of test.py?
Usually, Celery is preferred for this kind of asynchronous task processing.
This explains in detail the use of Celery and Fabric together.
from fabric.api import hosts, env, execute, run
from celery import task

env.skip_bad_hosts = True
env.warn_only = True

@task()
def my_celery_task(testhost):
    host_string = "%s@%s" % (testhost.SSH_user_name, testhost.IP)

    @hosts(host_string)
    def my_fab_task():
        env.password = testhost.SSH_password
        run("ls")

    try:
        result = execute(my_fab_task)
        if isinstance(result.get(host_string, None), BaseException):
            raise result.get(host_string)
    except Exception as e:
        print "my_celery_task -- %s" % e.message
sudo("python test.py 2>/dev/null >/dev/null &")
or redirect the output to some other file instead of /dev/null
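A slightly more robust variant is to use nohup and keep a log. A sketch (the log path is illustrative, and pty=False generally helps Fabric 1.x detach cleanly from backgrounded processes):

from fabric.api import sudo

def start_test_script():
    # nohup + redirection + '&' detaches the process so fab returns immediately;
    # pty=False keeps Fabric from tearing the process down with the session
    sudo("nohup python test.py > /tmp/test.log 2>&1 &", pty=False)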
This code worked for me:
fabricObj.execute("(nohup python your_file.py > /dev/null < /dev/null &)&")
Where fabricObj is an object of a fabric wrapper class (defined internally) that talks to the fabric code.
When I run this fabfile.py...
from fabric.api import env, run, local, cd

def setenv(foo):
    env.hosts = ['myhost']

def mycmd(foo):
    setenv(foo)
    print(env.hosts)
    run('ls')
with this command fab mycmd:bar. I get this output...
['myhost']
No hosts found. Please specify (single) host string for connection:
What, what?! I don't get it? I've set the env.hosts and it seems to be valid "inside" the mycmd function, but for some reason that run command doesn't know about the hosts I've specified.
Color me confused. Any help would be appreciated!
@Chris, the reason you're seeing this behavior is that the host list is constructed before the task function is called. So even though you're changing env.hosts inside the function, it is too late for it to have any effect.
Whereas the command fab setenv:foo mycmd:bar would have resulted in what you expected:
$ fab setenv:foo mycmd:bar
[myhost] Executing task 'mycmd'
['myhost']
[myhost] run: ls
This is the same as the accepted answer, but because of the way setenv is defined, an argument is needed.
Another example:
from fabric.api import env, run, local, cd

env.hosts = ['other_host']

def setenv(foo):
    env.hosts = ['myhost']

def mycmd(foo):
    setenv(foo)
    print('env.hosts inside mycmd: %s' % env.hosts)
    run('ls')
The output of this is:
$ fab mycmd:bar
[other_host] Executing task 'mycmd'
env.hosts inside mycmd: ['myhost']
[other_host] run: ls
Fatal error: Name lookup failed for other_host
Underlying exception:
(8, 'nodename nor servname provided, or not known')
Aborting.
As you can see, the host-list is already set to ['other_host', ] when fabric starts to execute mycmd.
The way you are doing it is not normally how I would use Fabric.
from fabric.api import *

def hostname():
    env.hosts = ['myhosts']

def mycmd():
    print env.hosts
    run('ls -l')
To run this I would then do
fab hostname mycmd
This allows you to separate which host/hosts you want to perform the command on.
Hope it helps.
Have you tried using the hosts decorator?
from fabric.api import env, run, hosts

@hosts('myhost')
def mycmd(foo):
    print(env.hosts)
    run('ls')
I know this question is super old, but just in case someone stumbles across this: I have found that you don't need to call this as a fabfile per se (your file doesn't need to be called "fabfile.py", and the command doesn't need to be fab setenv:foo mycmd:bar). Since you are importing the needed fab elements, you can call the file anything you want (let's call it "testfile.py") and simply use the execute function in the file. That would make your command python testfile.py.
Inside the testfile, set everything up like you would normally, but kick things off by calling the task through execute. Your file would look something like this:
from fabric.api import env, run, execute

def setenv(foo):
    env.hosts = ['myhost']
    execute(mycmd, foo)

def mycmd(bar):
    run('ls')

setenv('foo')
It's important to note that the execute call looks like a regular function call: it calls the function and passes the arguments as a comma-separated list. You can find more information here.
So you'd start your program which would first call setenv, then setenv would execute the mycmd function. With this, you can also set multiple hosts within the same array. Something like:
env.hosts=['myhost1','myhost2','myhost3']
I have figured out how to make it work:
from fabric.api import env, run, local, cd

def setenv(foo):
    env.hosts = ['myhost']
    return env

def mycmd(foo):
    env = setenv(foo)
    print(env.hosts)
    run('ls')
What is the "cleanest" way to implement a command-line UI, similar to git's, for example:
git push origin/master
git remote add origin git://example.com master
Ideally also allowing the more flexible parsing, for example,
jump_to_folder app theappname v2
jump_to_folder app theappname source
jump_to_folder app theappname source v2
jump_to_folder app theappname build v1
jump_to_folder app theappname build 1
jump_to_folder app theappname v2 build
jump_to_folder is the script's name, app is the command, theappname is a "fixed-location" parameter, and "build", "v2", etc. are arguments (for example, possible arguments would be any number, any number prefixed with a v, or build/source/tmp/config).
I could just manually parse the arguments with a series of if/else/elifs, but there must be a more elegant way to do this?
As an entirely theoretical example, I could describe the UI schema...
app:
    fixed: application_name
    optional params:
        arg subsection:
            "build"
            "source"
            "tmp"
            "config"
        arg version:
            integer
            "v" + integer
Then parse the supplied arguments through the above schema, and get a dictionary:
>>> print schema.parse(["app", "theappname", "v1", "source"])
{
    "application_name": "theappname",
    "params": {
        "subsection": "source",
        "version": "v1"
    }
}
Does such a system exist? If not, how would I go about implementing something along these lines?
argparse is perfect for this, specifically "sub-commands" and positional args
import argparse

def main():
    arger = argparse.ArgumentParser()

    # Arguments for top-level, e.g. "subcmds.py -v"
    arger.add_argument("-v", "--verbose", action="count", default=0)

    subparsers = arger.add_subparsers(dest="command")

    # Make parser for "subcmds.py info ..."
    info_parser = subparsers.add_parser("info")
    info_parser.add_argument("-m", "--moo", dest="moo")

    # Make parser for "subcmds.py create ..."
    create_parser = subparsers.add_parser("create")
    create_parser.add_argument("name")
    create_parser.add_argument("additional", nargs="*")

    # Parse
    opts = arger.parse_args()

    # Print option object for debug
    print opts

    if opts.command == "info":
        print "Info command"
        print "--moo was %s" % opts.moo
    elif opts.command == "create":
        print "Creating %s" % opts.name
        print "Additional: %s" % opts.additional
    else:
        # argparse will error on unexpected commands, but
        # in case we mistype one of the elif statements...
        raise ValueError("Unhandled command %s" % opts.command)

if __name__ == '__main__':
    main()
This can be used like so:
$ python subcmds.py create myapp v1 blah
Namespace(additional=['v1', 'blah'], command='create', name='myapp', verbose=0)
Creating myapp
Additional: ['v1', 'blah']
$ python subcmds.py info --moo
usage: subcmds.py info [-h] [-m MOO]
subcmds.py info: error: argument -m/--moo: expected one argument
$ python subcmds.py info --moo 1
Namespace(command='info', moo='1', verbose=0)
Info command
--moo was 1
The cmd module would probably work well for this.
Example:
import cmd

class Calc(cmd.Cmd):
    def do_add(self, arg):
        print sum(map(int, arg.split()))

if __name__ == '__main__':
    Calc().cmdloop()
Run it:
$ python calc.py
(Cmd) add 4 5
9
(Cmd) help
Undocumented commands:
======================
add help
(Cmd)
See the Python docs or PyMOTW site for more info.
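A small variation: giving do_add a docstring makes help list it as a documented command, and returning True from a do_* method exits the loop. A sketch:

import cmd

class Calc(cmd.Cmd):
    def do_add(self, arg):
        """add A B ... -- print the sum of the given integers"""
        print sum(map(int, arg.split()))

    def do_quit(self, arg):
        """quit -- exit the calculator"""
        return True   # returning True ends cmdloop()

if __name__ == '__main__':
    Calc().cmdloop()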
Straight from one of my scripts:
import sys

def prog1_func1_act1(): print "pfa1"
def prog2_func2_act2(): print "pfa2"

commands = {
    "prog1 func1 act1": prog1_func1_act1,
    "prog2 func2 act2": prog2_func2_act2
}

try:
    commands[" ".join(sys.argv[1:])]()
except KeyError:
    print "Usage: ", commands.keys()
It's a pretty quick and dirty solution, but works great for my usage. If I were to clean it up a bit, I would probably add argparse to the mix for parsing positional and keyword arguments.
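A sketch of what that combination might look like (the sub-commands and handlers are illustrative): register each handler with set_defaults and let argparse pick it.

import argparse

def deploy(args): print "deploying %s" % args.name
def status(args): print "status, verbose=%s" % args.verbose

parser = argparse.ArgumentParser()
subparsers = parser.add_subparsers(dest="command")

deploy_p = subparsers.add_parser("deploy")
deploy_p.add_argument("name")
deploy_p.set_defaults(func=deploy)

status_p = subparsers.add_parser("status")
status_p.add_argument("-v", "--verbose", action="store_true")
status_p.set_defaults(func=status)

args = parser.parse_args()
args.func(args)   # call the handler registered for the chosen sub-command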
Python has a module for parsing command line options, optparse (now superseded by argparse).
You might want to take a look at cliff – Command Line Interface Formulation Framework
Here's my suggestion.
Change your grammar slightly.
Use optparse.
Ideally also allowing the more flexible parsing, for example,
jump_to_folder -n theappname -v2 cmd
jump_to_folder -n theappname cmd source
jump_to_folder -n theappname -v2 cmd source
jump_to_folder -n theappname -v1 cmd build
jump_to_folder -n theappname -1 cmd build
jump_to_folder -n theappname -v2 cmd build
Then you have 1 or 2 args: the command is always the first arg. Its optional argument is always the second arg.
Everything else is options, in no particular order.
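A minimal optparse sketch of that grammar (option names follow the examples above; -v takes the version as its value, so -v2 parses as version "2"):

# Sketch of the suggested grammar using optparse (Python 2 era)
from optparse import OptionParser

parser = OptionParser(usage="jump_to_folder -n APPNAME [-v VERSION] cmd [subsection]")
parser.add_option("-n", "--name", dest="name", help="application name")
parser.add_option("-v", "--version", dest="version", help="version, e.g. 2")

options, args = parser.parse_args()
if not 1 <= len(args) <= 2:
    parser.error("expected a command and an optional subsection")

command = args[0]                              # the command is always the first arg
subsection = args[1] if len(args) == 2 else None
print "name=%s version=%s command=%s subsection=%s" % (
    options.name, options.version, command, subsection)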