I have a script with several functions:
def a():
    pass

def b():
    pass

def c():
    pass
These will be invoked, by design, depending on a command-line argument. I could write several if statements to decide which function should run:
if args.function == "a":
a()
elif args.function == "b":
b()
elif args.function == "c":
c()
But is there a better way to do this?
You could make a dictionary like so:

d = {"a": a,
     "b": b,
     "c": c}
and then dispatch
d[args.function]()
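A minimal end-to-end sketch combining the dispatch dict with argparse (the argument name function mirrors the question; the rest is assumed for illustration):

import argparse

def a(): print("a")
def b(): print("b")
def c(): print("c")

dispatch = {"a": a, "b": b, "c": c}

parser = argparse.ArgumentParser()
# Restricting choices to the dict keys rejects unknown names up front
parser.add_argument("function", choices=dispatch)
args = parser.parse_args()

dispatch[args.function]()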
Perhaps you are looking for a library like click? It lets you easily add command-line subcommands with a decorator.
import click

@click.group()
def cli():
    pass

@cli.command()
def a():
    print("I am a")

@cli.command()
def b():
    print("Je suis b")

if __name__ == '__main__':
    cli()
Sample output:
bash$ ./ick.py --help
Usage: ick.py [OPTIONS] COMMAND [ARGS]...

Options:
  --help  Show this message and exit.

Commands:
  a
  b

bash$ ./ick.py a
I am a
Try using eval
eval(function_name_passed_as_argument + "()")
def a():
    pass

def b():
    pass

eval(args.function + "()")
This doesn't require any if/else logic: the function name passed as an argument is executed directly.
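As an aside (not part of the suggestion above), the same dynamic lookup can be done without eval by resolving the name through the module's globals(); the error handling here is an assumption for illustration:

# Look the function up by name instead of evaluating a string
func = globals().get(args.function)
if callable(func):
    func()
else:
    raise SystemExit("Unknown function: {}".format(args.function))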
You can make a dictionary as already pointed out, but how are you going to handle bad input? I would create a fallback function and use the dict.get method:
def fallback():
    print('That command does not exist')
    # add any code you want to be run for
    # a bad input here...

functions = {
    'a': a,
    'b': b
}
Then call the function by retrieving it:
functions.get(args.function.lower(), fallback)()
Python's standard library also includes argparse, which is the usual way to build command-line applications in Python.
The basics:
import argparse
parser = argparse.ArgumentParser()
parser.parse_args()
Building on that, with a verbosity option added (sketched after the output below), you can get something like this:
$ python3 prog.py -v
verbosity turned on

$ python3 prog.py --help
usage: prog.py [-h] [-v]

optional arguments:
  -h, --help     show this help message and exit
  -v, --verbose  increase output verbosity
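For completeness, a sketch of the code that would produce output like the above; it follows the standard argparse tutorial, and the exact option wiring is an assumption since the snippet above stops at parse_args():

import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-v", "--verbose", action="store_true",
                    help="increase output verbosity")
args = parser.parse_args()

if args.verbose:
    # Matches the "verbosity turned on" line shown above
    print("verbosity turned on")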
I'm using cloup for my CLI for its constraints feature.
I have some commands a and b which have no common arguments.
import cloup

@cloup.group()
def cli():
    pass

@cli.command(show_constraints=True)
@cloup.option("--foo")
def a(**kwargs):
    print("hello")

@cli.command(show_constraints=True)
@cloup.option("--bar")
def b(**kwargs):
    pass

cli()
Currently, I get the following output:
$ python3 main.py
Usage: main.py [OPTIONS] COMMAND [ARGS]...

Options:
  --help  Show this message and exit.

Commands:
  a
  b

$ python3 main.py a --foo hey
hello
So far, this works as expected. Now I also want a to be the default command, thus I'd like to see:
$ python3 main.py --foo hey
hello
I know that I can give cli itself some behaviour as follows:

@cloup.group(invoke_without_command=True)
def cli():
    print("custom behaviour")
That will give
$ python3 main.py
custom behaviour
I thought that I could forward the call to a in the cli function, but the group cli does not know the option --foo of command a:
$ python3 main.py --foo hey
Usage: main.py [OPTIONS] COMMAND [ARGS]...
Try 'main.py --help' for help.
Error: No such option: --foo
I'm stuck here. I found an answer to a related question (A command without name, in Click), but I have to use cloup.group. So if I applied the solution from there ...
@cloup.group(cls=DefaultGroup, default='a', default_if_no_args=True)
def cli():
    pass
... I'd get
Traceback (most recent call last):
  File "main.py", line 11, in <module>
    @cloup.option("--foo")
  File "<SNIP>/.venv_3.6.9/lib/python3.6/site-packages/click/core.py", line 1834, in decorator
    cmd = command(*args, **kwargs)(f)
  File "<SNIP>/.venv_3.6.9/lib/python3.6/site-packages/click/decorators.py", line 184, in decorator
    cmd = _make_command(f, name, attrs, cls)  # type: ignore
  File "<SNIP>/.venv_3.6.9/lib/python3.6/site-packages/click/decorators.py", line 152, in _make_command
    **attrs,
TypeError: __init__() got an unexpected keyword argument 'show_constraints'
And that's only the tip of the iceberg: any other features of cloup.group also become unavailable.
I guess one could merge the groups of cloup and click-default-group, but that looks horribly time-consuming. Is there an easier way to get a default command in cloup?
I also found https://click.palletsprojects.com/en/8.0.x/api/?highlight=group#click.Context.ignore_unknown_options. But if I understood correctly, only commands have a context and groups do not, so it wouldn't help.
Author of Cloup here. You can try this:
"""
This example requires click-default-group.
"""
import cloup
from click import Context, HelpFormatter
from click_default_group import DefaultGroup
class GroupWithDefaultCommand(cloup.Group, DefaultGroup):
# Optional: mark default command with "*"
def format_subcommand_name(
self, ctx: click.Context, name: str, cmd: click.Command
) -> str:
if name == self.default_cmd_name:
name = name + "*"
return super().format_subcommand_name(ctx, name, cmd)
#cloup.group(cls=GroupWithDefaultCommand, default='alice')
def cli():
pass
#cli.command()
#cloup.option("--foo")
def alice(**kwargs):
print("Called alice with", kwargs)
#cli.command()
#cloup.option("--bar")
def bob(**kwargs):
print("Called bob with", kwargs)
if __name__ == '__main__':
cli()
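Assuming click-default-group's usual routing of unrecognized leading arguments to the default command (an assumption, not verified here), invoking the script without naming a command should then dispatch to alice:

$ python main.py --foo hey     # should behave like: python main.py alice --foo hey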
As far as I can see, the only thing you lose is the "Did you mean ...?" suggestion for mistyped commands (from Cloup); the "*" marking the default command (from click-default-group) was actually pretty easy to reimplement with the Group.format_subcommand_name method introduced by Cloup, as shown above. Let me know if you find any other problems. If it works well, I may add it to the examples folder.
Nonetheless, I'd suggest not using a default command at all. In the click-default-group issue tracker you can see that it conflicts with click-help-colors and click-repl. So unless you're prepared to fix issues that may arise from having a default command, don't have one. As an alternative, you can simply suggest that your users define an alias for the default command (e.g. with the Unix alias command).
This question is about the Python click package and relates to control flow based on arguments passed to the CLI.
I'm trying to build a master CLI to sit at the top directory of my repo, which will control many different modes of operation from one central place. When you invoke this command, it will take perhaps n options, where the first will determine which module to invoke, and then pass along the n-1 args to another module. But I want each module's commands and options to be defined within its respective module, not the main CLI controller, as I'm trying to keep the main controller simple and each module nicely abstracted away.
Here is a simple example of what I want:
import click

@click.command()
@click.option('--foo-arg', default='asdf')
def foo(foo_arg):
    click.echo(f"Hello from foo with arg {foo_arg}")

@click.command()
@click.option('--bar-arg', default='zxcv')
def bar(bar_arg):
    click.echo(f"Hello from bar with arg {bar_arg}")

@click.command()
@click.option('--mode', type=click.Choice(["foo", "bar"]))
def cli(mode):
    if mode == "foo":
        foo()
    if mode == "bar":
        bar()

if __name__ == '__main__':
    cli()
In this example, foo() and bar() should be thought of as modules that are buried within the repo and which may also require a large number of CLI options, which I don't want to overload the main cli.py with. I feel like foo()'s CLI options should live within foo for contextual reasons. The following is how I want it to work.
Example 1:
python -m cli --mode foo --foo-arg asdf
should produce
Hello from foo with arg asdf
Example 2:
python -m cli --mode bar --bar-arg zxcv
should produce
Hello from bar with arg zxcv
Example 3:
python -m cli --mode foo --bar-arg qwer
should fail since foo() doesn't have the --bar-arg option.
Disclaimer: I know that I could register foo and bar as separate commands (invoked via python -m cli foo --foo-arg asdf, i.e. with foo as a subcommand instead of the --mode foo option). However, for a reason beyond the scope of this question, I need foo or bar to be specified via the --mode option. This is a limitation of a tool that interacts with my app, which is unfortunately outside my control.
Is there a way to parse the args and make control flow possible based on a subset of args and then pass the remaining ones to a subsequent module, while not defining every module's options as decorators on def cli()?
Using an option to invoke a subcommand can be achieved by using click.MultiCommand and a custom parse_args() method like:
Custom Class
def mode_opts_cmds(mode_opt_name, namespace=None):

    class RemoteCommandsAsModeOpts(click.MultiCommand):
        def __init__(self, *args, **kwargs):
            super(RemoteCommandsAsModeOpts, self).__init__(*args, **kwargs)
            self.mode_opt_name = '--{}'.format(mode_opt_name)
            opt = next(p for p in self.params if p.name == mode_opt_name)
            assert isinstance(opt.type, click.Choice)
            choices = set(opt.type.choices)
            self.commands = {k: v for k, v in (
                namespace or globals()).items() if k in choices}
            for command in self.commands.values():
                assert isinstance(command, click.Command)

        def parse_args(self, ctx, args):
            try:
                args.remove(self.mode_opt_name)
            except ValueError:
                pass
            super(RemoteCommandsAsModeOpts, self).parse_args(ctx, args)

        def list_commands(self, ctx):
            return sorted(self.commands)

        def get_command(self, ctx, name):
            return self.commands[name]

    return RemoteCommandsAsModeOpts
Using the Custom Class
To make use of the custom class, invoke the mode_opts_cmds() function to create the custom class and then use the cls parameter to pass that class to the click.group() decorator.
@click.group(cls=mode_opts_cmds('mode'))
@click.option('--mode', type=click.Choice(["foo", "bar"]))
def cli(mode):
    """My wonderful cli"""
How does this work?
This works because click is a well-designed OO framework. The @click.group() decorator usually instantiates a click.Group object but allows this behavior to be overridden with the cls parameter. So it is a relatively easy matter to inherit from click.MultiCommand (the base class of click.Group) in our own class and override the desired methods.
In this case we override parse_args() so that, when the command line is parsed, we simply remove --mode, and then the command line parses just like a normal sub-command.
To make this a library function
If the custom class is going to reside in a library, then instead of using the default globals() from the library file, the namespace will need to be passed in. The custom class creator method would then need to be called with something like:
@click.group(cls=mode_opts_cmds('mode', namespace=globals()))
Test Code
import click

@click.command()
@click.option('--foo-arg', default='asdf')
def foo(foo_arg):
    """the foo command is ok"""
    click.echo(f"Hello from foo with arg {foo_arg}")

@click.command()
@click.option('--bar-arg', default='zxcv')
def bar(bar_arg):
    """bar is my favorite"""
    click.echo(f"Hello from bar with arg {bar_arg}")

@click.group(cls=mode_opts_cmds('mode'))
@click.option('--mode', type=click.Choice(["foo", "bar"]))
def cli(mode):
    """My wonderful cli command"""

if __name__ == "__main__":
    commands = (
        '--mode foo --foo-arg asdf',
        '--mode bar --bar-arg zxcv',
        '--mode foo --bar-arg qwer',
        '--mode foo --help',
        '',
    )

    import sys, time
    time.sleep(1)
    print('Click Version: {}'.format(click.__version__))
    print('Python Version: {}'.format(sys.version))
    for command in commands:
        try:
            time.sleep(0.1)
            print('-----------')
            print('> ' + command)
            time.sleep(0.1)
            cli(command.split())
        except BaseException as exc:
            if str(exc) != '0' and \
                    not isinstance(exc,
                                   (click.ClickException, SystemExit)):
                raise
Test Results
Click Version: 7.0
Python Version: 3.6.3 (v3.6.3:2c5fed8, Oct 3 2017, 18:11:49) [MSC v.1900 64 bit (AMD64)]
-----------
> --mode foo --foo-arg asdf
Hello from foo with arg asdf
-----------
> --mode bar --bar-arg zxcv
Hello from bar with arg zxcv
-----------
> --mode foo --bar-arg qwer
Usage: test.py foo [OPTIONS]
Try "test.py foo --help" for help.

Error: no such option: --bar-arg
-----------
> --mode foo --help
Usage: test.py foo [OPTIONS]

  the foo command is ok

Options:
  --foo-arg TEXT
  --help          Show this message and exit.
-----------
>
Usage: test.py [OPTIONS] COMMAND [ARGS]...

  My wonderful cli command

Options:
  --mode [foo|bar]
  --help            Show this message and exit.

Commands:
  bar  bar is my favorite
  foo  the foo command is ok
You could use callbacks for validation: https://click.palletsprojects.com/en/7.x/options/#callbacks-for-validation
You can also use multi-command chaining, so each argument is passed to the command that needs it:
https://click.palletsprojects.com/en/7.x/commands/#multi-command-chaining
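For reference, a minimal sketch of a validation callback using click's documented (ctx, param, value) callback signature; the option name and the validation rule are placeholders for illustration:

import click

def validate_mode(ctx, param, value):
    # The callback receives the context, the parameter object, and the parsed value
    if value not in ("foo", "bar"):
        raise click.BadParameter("mode must be 'foo' or 'bar'")
    return value

@click.command()
@click.option('--mode', callback=validate_mode)
def cli(mode):
    click.echo("mode is {}".format(mode))

if __name__ == '__main__':
    cli()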
Using the CLI library click I have an application script app.py with the two sub commands read and write:
import click

@click.group()
@click.pass_context
def cli(ctx):
    pass

@cli.command()
@click.pass_context
def read(ctx):
    print("read")

@cli.command()
@click.pass_context
def write(ctx):
    print("write")
I want to declare a common option --format. I know I can add it as an option to the command group via
@click.group()
@click.option('--format', default='json')
@click.pass_context
def cli(ctx, format):
    ctx.obj['format'] = format
But then I cannot give the option after the command, which in my use case is a lot more natural. I want to be able to issue in the shell:
app.py read --format XXX
But with the outlined set-up I get the message Error: no such option: --format. The script only accepts the option before the command.
So my question is: How can I add a common option to both sub commands so that it works as if the option were given to each sub command?
AFAICT, this is not possible with Click. The docs state that:
Click strictly separates parameters between commands and subcommands.
What this means is that options and arguments for a specific command
have to be specified after the command name itself, but before any
other command names.
A possible workaround is writing a common_options decorator. The following example uses the fact that click.option is a function that returns a decorator, and that such decorators can be applied in series. In other words, the following:
#click.option("-a")
#click.option("-b")
def hello(a, b):
pass
is equivalent to the following:
def hello(a, b):
    pass

hello = click.option("-a")(click.option("-b")(hello))
The drawback is that you need to have the common argument set on all your subcommands. This can be resolved through **kwargs, which collects keyword arguments as a dict.
(Alternatively, you could write a more advanced decorator that would feed the arguments into the context or something like that, but my simple attempt didn't work and I'm not ready to try more advanced approaches. I might edit the answer later and add them.)
With that, we can make a program:
import click
import functools

@click.group()
def cli():
    pass

def common_options(f):
    options = [
        click.option("-a", is_flag=True),
        click.option("-b", is_flag=True),
    ]
    return functools.reduce(lambda x, opt: opt(x), options, f)

@cli.command()
@common_options
def hello(**kwargs):
    print(kwargs)
    # to get the value of b:
    print(kwargs["b"])

@cli.command()
@common_options
@click.option("-c", "--citrus")
def world(citrus, a, **kwargs):
    print("citrus is", citrus)
    if a:
        print(kwargs)
    else:
        print("a was not passed")

if __name__ == "__main__":
    cli()
Yes, you can add common options to subcommands so that they can go after the subcommand name. You can have options on the parent command as well as on subcommands. Check out the code snippet below:
import click
from functools import wraps

@click.group()
def cli():
    pass

def common_options(f):
    @wraps(f)
    @click.option('--option1', '-op1', help='Option 1 help text', type=click.FLOAT)
    @click.option('--option2', '-op2', help='Option 2 help text', type=click.FLOAT)
    def wrapper(*args, **kwargs):
        return f(*args, **kwargs)
    return wrapper

@cli.group(invoke_without_command=True)
@common_options
@click.pass_context
def parent(ctx, option1, option2):
    ctx.ensure_object(dict)
    if ctx.invoked_subcommand is None:
        click.secho('Parent group is invoked. Perform specific tasks to do!', fg='bright_green')

@parent.command()
@click.option('--sub_option1', '-sop1', help='Sub option 1 help text', type=click.FLOAT)
@common_options
def sub_command1(option1, option2, sub_option1):
    click.secho('Perform sub command 1 operations', fg='bright_green')

@parent.command()
@click.option('--sub_option2', '-sop2', help='Sub option 2 help text', type=click.FLOAT)
@common_options
def sub_command2(option1, option2, sub_option2):
    click.secho('Perform sub command 2 operations', fg='bright_green')

if __name__ == "__main__":
    cli()
Usage
parent --help
=> prints parent group help text with options and sub commands
parent --option1 10 --option2 12
=> Parent group is invoked. Perform specific tasks to do!
parent sub_command1 --help
=> prints sub command 1 help text with options on sub commands
parent sub_command1 --option1 15 --option2 7 --sub_option1 5
=> Perform sub command 1 operations
parent sub_command2 --option1 15 --option2 7 --sub_option2 4
=> Perform sub command 2 operations
I am able to pass command line arguments when running
python <filename>.py arg1
But when I try to pass command-line arguments while running pytest, it fails with the error below. Can you please advise?
pytest <filename>.py arg1
ERROR: file not found: arg1
EDIT:
For example, I am thinking of using it this way, assuming I have passed an argument and am reading it via sys.argv:
import sys

arg = sys.argv[3]

def f():
    return 3

def test_function():
    assert f() == arg
Your pytest <filename>.py arg1 command is trying to call pytest on two modules, <filename>.py and arg1, but there is no module arg1.
If you want to pass some argument before running pytest, then run pytest from a Python script after extracting your variable.
As others suggested, though, you would probably want to parameterize your tests in some other way; try parameterized pytest (a minimal sketch follows the example below).
# run.py
import pytest
import sys

def main():
    # extract your arg here
    print('Extracted arg is ==> %s' % sys.argv[2])
    pytest.main([sys.argv[1]])

if __name__ == '__main__':
    main()
Call this using python run.py filename.py arg1.
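For reference, a minimal sketch of the parametrization approach mentioned above; the add function and the values are placeholders rather than anything from the question:

import pytest

def add(x, y):
    return x + y

# Each tuple becomes its own test case, removing the need for a command-line argument
@pytest.mark.parametrize("x, y, expected", [(1, 2, 3), (2, 2, 4)])
def test_add(x, y, expected):
    assert add(x, y) == expected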
Here's the method I just cooked up from reading the parameterized pytest docs and hacking for a while... I don't know how stable or good it is going to be overall since I just got it working.
I did however check that HTML coverage generation works with this method.
Add a file to your test directory for configuring the command-line args you want to pass:
tests\conftest.py
# this is just so we can pass --server and --port from the pytest command line
def pytest_addoption(parser):
    ''' attaches optional cmd-line args to the pytest machinery '''
    parser.addoption("--server", action="append", default=[], help="real server hostname/IP")
    parser.addoption("--port", action="append", default=[], help="real server port number")
and then add a test file, with this special pytest_generate_tests function which is called when collecting a test function
tests\test_junk.py
def pytest_generate_tests(metafunc):
    ''' just to attach the cmd-line args to a test-class that needs them '''
    server_from_cmd_line = metafunc.config.getoption("server")
    port_from_cmd_line = metafunc.config.getoption("port")
    print('command line passed for --server ({})'.format(server_from_cmd_line))
    print('command line passed for --port ({})'.format(port_from_cmd_line))
    # check if this function is in a test-class that needs the cmd-line args
    if server_from_cmd_line and port_from_cmd_line and hasattr(metafunc.cls, 'real_server'):
        # now set the cmd-line args on the test class
        metafunc.cls.real_server = server_from_cmd_line[0]
        metafunc.cls.real_port = int(port_from_cmd_line[0])

class TestServerCode(object):
    ''' test-class that might benefit from optional cmd-line args '''
    real_server = None
    real_port = None

    def test_valid_string(self):
        assert self.real_server != None
        assert self.real_port != None

    def test_other(self):
        from mypackage import my_server_code
        if self.real_server != None:
            assert "couldn't find host" not in my_server_code.version(self.real_server, self.real_port)
then run (with HTML coverage, for example) with:
pytest tests\test_junk.py --server="abc" --port=123 --cov-report html --cov=mypackage
It seems monkeypatch also works.
Example:
import sys

def test_example(monkeypatch):
    # f() stands in for the code under test that reads sys.argv
    monkeypatch.setattr(sys, 'argv', ['/path/to/binary', 'opt1', '...'])
    assert f() == '...'

def test_another():
    # sys.argv is not modified here
    assert f() != '...'
So I have a little script in Python, and for its help command I want to print each method's docstring. For example,
~$ myscript.py help update
would print myClass.update.__doc__ to the screen. The code I was trying to run is this:
import sys

class myClass:
    def update(self):
        """ update method help """

    def help(self):
        method = sys.argv[2:3][0]
        if method == "update":
            print "Help: " + self.update.__doc__

myClass = myClass()
myClass.help()
It works, but as my collection of methods grows it will be a pain in the ass to keep the help working as intended. Is there any way to call something like self.method.__doc__ dynamically? Thanks.
Instead of using this:
if method == 'update':
    help_string = self.update.__doc__
you could use more flexible solution:
help_string = getattr(self, method).__doc__
Just make sure that you catch AttributeError (it will be raised when there is no method with the given name).
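A small sketch of what that error handling might look like (the helper name and fallback message are assumptions):

def get_help(obj, method_name):
    # Fall back gracefully when the method does not exist on the object
    try:
        return getattr(obj, method_name).__doc__
    except AttributeError:
        return "No help available for %r" % method_name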
This will do it:
method = sys.argv[2:3][0]  # This is a bit odd; why not sys.argv[2]?
print "Help: " + getattr(self, method).__doc__
I would use argparse for this:
import argparse
import inspect

class myClass(object):
    """description for program"""

    def update(self):
        """update method help"""
        print 'update command'

    def something(self):
        """something command help"""
        print 'something command'

if __name__ == '__main__':
    program = myClass()
    parser = argparse.ArgumentParser(description=program.__doc__)
    subparsers = parser.add_subparsers()

    for name, method in inspect.getmembers(program, predicate=inspect.ismethod):
        subparser = subparsers.add_parser(name, help=method.__doc__)
        subparser.set_defaults(method=method)

    args = parser.parse_args()
    args.method()
Example on the command line:
$ python ~/test/docargparse.py --help
usage: docargparse.py [-h] {something,update} ...

description for program

positional arguments:
  {something,update}
    something         something command help
    update            update method help

optional arguments:
  -h, --help          show this help message and exit

$ python ~/test/docargparse.py
usage: docargparse.py [-h] {something,update} ...
docargparse.py: error: too few arguments

$ python ~/test/docargparse.py update
update command

$ python ~/test/docargparse.py something
something command
Something like
import inspect

class T:
    def test(self):
        '''test'''
        pass

for t in inspect.getmembers(T, predicate=inspect.ismethod):
    print t[1].__doc__
should scale pretty well.