Custom conflict handling for ArgumentParser - python

What I need
I need an ArgumentParser with a conflict-handling scheme that resolves a registered set of duplicate arguments but raises on all other conflicting arguments.
What I tried
My initial approach (see also the code example at the bottom) was to subclass ArgumentParser, add a _handle_conflict_custom method, and then instantiate the subclass with conflict_handler='custom', thinking that the _get_handler method would pick it up.
The Problem
This raises an error, because ArgumentParser inherits from _ActionsContainer, which provides _get_handler and the _handle_conflict_{strategy} methods, and then internally instantiates an _ArgumentGroup (which also inherits from _ActionsContainer). That group knows nothing about the method newly defined on the ArgumentParser subclass, so its _get_handler fails to find the custom handler.
Overriding the _get_handler method is not feasible for the same reason.
I have created a (rudimentary) class diagram illustrating the relationships, and hopefully the problem with subclassing ArgumentParser to achieve what I want.
Motivation
I (think that I) need this because I have two scripts that handle distinct parts of a workflow. I would like to be able to use them separately as scripts, but also have one script that imports the methods of both and does everything in one go.
This combined script should support all the options of the two individual scripts, but I don't want to duplicate the (extensive) argument definitions, which would force me to make changes in multiple places.
That part is easily solved by importing the ArgumentParsers of the part scripts and using them as parents, like so: combined_parser = ArgumentParser(parents=[arg_parser1, arg_parser2]).
The scripts have duplicate options, e.g. for the work directory, so I need to resolve those conflicts.
This could also be done with conflict_handler='resolve', as sketched below.
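For reference, a minimal sketch of that parents/resolve combination (parser and option names are illustrative; parent parsers are typically created with add_help=False so the -h/--help flags don't clash):

from argparse import ArgumentParser

arg_parser1 = ArgumentParser(add_help=False)
arg_parser1.add_argument('--workdir')

arg_parser2 = ArgumentParser(add_help=False)
arg_parser2.add_argument('--workdir')

# 'resolve' silently lets the later --workdir definition win.
combined_parser = ArgumentParser(parents=[arg_parser1, arg_parser2],
                                 conflict_handler='resolve')
print(combined_parser.parse_args(['--workdir', '/tmp']))  # Namespace(workdir='/tmp')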
But because there are a lot of possible arguments (the set is not up to our team, since we have to maintain compatibility), I also want the script to raise an error if something that hasn't been explicitly allowed to conflict gets defined twice, instead of quietly overriding the other flag and potentially causing unwanted behavior.
Other suggestions to achieve these goals (keeping both scripts separate, enabling use of one script that wraps both, avoiding code duplication and raising on unexpected duplicates) are welcome.
Example Code
from argparse import ArgumentParser


class CustomParser(ArgumentParser):
    def _handle_conflict_custom(self, action, conflicting_actions):
        registered = ['-h', '--help', '-f']
        conflicts = conflicting_actions[:]
        use_error = False
        while conflicts:
            # renamed so the loop doesn't shadow the `action` parameter used below
            option_string, conflicting_action = conflicts.pop()
            if option_string in registered:
                continue
            else:
                use_error = True
                break
        if use_error:
            self._handle_conflict_error(action, conflicting_actions)
        else:
            self._handle_conflict_resolve(action, conflicting_actions)


if __name__ == '__main__':
    ap1 = ArgumentParser()
    ap2 = ArgumentParser()
    ap1.add_argument('-f')  # registered, so should be resolved
    ap2.add_argument('-f')
    ap1.add_argument('-g')  # not registered, so should raise
    ap2.add_argument('-g')
    # this raises before ever resolving anything, for the stated reasons
    ap3 = CustomParser(parents=[ap1, ap2], conflict_handler='custom')
Other questions
I am aware of these similar questions:
python argparse subcommand with dependency and conflict
argparse conflict when used with two connected python3 scripts
Handling argparse conflicts
... and others
But even though some of them provide interesting insights into argparse usage and conflicts, they seem to address issues that are not related to mine.

While I agree that FMc's approach is probably the better one in terms of long-term viability, I have found a way to get a custom handler into the ArgumentParser.
The key is to subclass the _ActionsContainer class, which actually defines the handler methods, and then replace the base classes that ArgumentParser and _ArgumentGroup inherit from.
In the case below I've simply added a handler that ignores any conflicts, but you could add any custom logic you want.
import argparse


class IgnorantActionsContainer(argparse._ActionsContainer):
    def _handle_conflict_ignore(self, action, conflicting_actions):
        pass


argparse.ArgumentParser.__bases__ = (argparse._AttributeHolder, IgnorantActionsContainer)
argparse._ArgumentGroup.__bases__ = (IgnorantActionsContainer,)

parser = argparse.ArgumentParser(conflict_handler="ignore")
parser.add_argument("-a", type=int, default=1)
parser.add_argument("-a", type=int, default=2)
parser.add_argument("-a", type=int, default=3)
parser.add_argument("-a", type=int, default=4)
print(parser.parse_args())
Running python custom_conflict_handler.py -h prints:
usage: custom_conflict_handler.py [-h] [-a A] [-a A] [-a A] [-a A]
optional arguments:
-h, --help show this help message and exit
-a A
-a A
-a A
-a A
Running python custom_conflict_handler.py prints:
Namespace(a=1)
Running python custom_conflict_handler.py -a 5 prints:
Namespace(a=5)

For various reasons -- notably the needs of testing -- I have adopted the
habit of always defining argparse configuration in the form of a data
structure, typically a sequence of dicts. The actual creation of the
ArgumentParser is done in a reusable function that simply builds the parser
from the dicts. This approach has many benefits, especially for more complex
projects.
If each of your scripts were to shift to that model, I would think that you
might be able to detect any configuration conflicts in that function and raise
accordingly, thus avoiding the need to inherit from ArgumentParser and mess
around with understanding its internals.
I'm not certain I understand your conflict-handling needs very well, so the
demo below simply hunts for duplicate options and raises if it sees one, but I
think you should be able to understand the approach and assess whether it might
work for your case. The basic idea is to solve your problem in the realm
of ordinary data structures rather than in the byzantine world of argparse.
import sys
import argparse
from collections import Counter

OPTS_CONFIG1 = (
    {
        'names': 'path',
        'metavar': 'PATH',
    },
    {
        'names': '--nums',
        'nargs': '+',
        'type': int,
    },
    {
        'names': '--dryrun',
        'action': 'store_true',
    },
)

OPTS_CONFIG2 = (
    {
        'names': '--foo',
        'metavar': 'FOO',
    },
    {
        'names': '--bar',
        'metavar': 'BAR',
    },
    {
        'names': '--dryrun',
        'action': 'store_true',
    },
)

def main(args):
    ap = define_parser(OPTS_CONFIG1, OPTS_CONFIG2)
    opts = ap.parse_args(args)
    print(opts)

def define_parser(*configs):
    # Validation: adjust as needed.
    tally = Counter(
        nm
        for config in configs
        for d in config
        for nm in d['names'].split()
    )
    for k, n in tally.items():
        if n > 1:
            raise Exception(f'Duplicate argument configurations: {k}')
    # Define and return parser.
    ap = argparse.ArgumentParser()
    for config in configs:
        for d in config:
            kws = dict(d)
            xs = kws.pop('names').split()
            ap.add_argument(*xs, **kws)
    return ap

if __name__ == '__main__':
    main(sys.argv[1:])
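Incidentally, both config tuples above define --dryrun, so running the demo as-is trips the validation; a quick way to see the check fire (sketch):

try:
    define_parser(OPTS_CONFIG1, OPTS_CONFIG2)
except Exception as exc:
    print(exc)  # Duplicate argument configurations: --dryrun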

There is an approach that's marginally less hacky than Hans's. You can simply subclass argparse._ActionsContainer and argparse._ArgumentGroup, and have your parser inherit from your ActionsContainer after argparse.ArgumentParser; that way your container sits before argparse._ActionsContainer in the MRO, so its methods take precedence. Here's an example:
import argparse
from typing import Iterable, Any


class ArgumentGroup(argparse._ArgumentGroup):
    def _handle_conflict_custom(
        self,
        action: argparse.Action,
        conflicting_actions: Iterable[tuple[str, argparse.Action]],
    ) -> None:
        ...


class ActionsContainer(argparse._ActionsContainer):
    def _handle_conflict_custom(
        self,
        action: argparse.Action,
        conflicting_actions: Iterable[tuple[str, argparse.Action]],
    ) -> None:
        ...

    def add_argument_group(self, *args: Any, **kwargs: Any) -> ArgumentGroup:
        group = ArgumentGroup(self, *args, **kwargs)
        self._action_groups.append(group)
        return group


class ArgumentParser(argparse.ArgumentParser, ActionsContainer):
    ...
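A minimal usage sketch built on the classes above (the parent parsers and the --workdir flag are illustrative; with the no-op ... bodies the custom handler effectively ignores conflicts):

parent1 = argparse.ArgumentParser(add_help=False)
parent1.add_argument('--workdir')

parent2 = argparse.ArgumentParser(add_help=False)
parent2.add_argument('--workdir')

# conflict_handler='custom' is looked up as _handle_conflict_custom, both on the
# parser subclass above and on its argument groups.
combined = ArgumentParser(parents=[parent1, parent2], conflict_handler='custom')
print(combined.parse_args(['--workdir', '/tmp']))  # should print Namespace(workdir='/tmp')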

Based on FMc's approach, I have created something a little more elaborate. I know this isn't Code Review, but feedback is still welcome. Also, maybe it helps someone to see this fleshed out a bit more.
import argparse
from collections import Counter, OrderedDict
from typing import List, Dict, Any
from copy import deepcopy


class OptionConf:
    def __init__(self):
        self._conf = OrderedDict()  # type: Dict[str, List[Dict[str, Any]]]
        self._allowed_dupes = list()  # type: List[str]

    def add_conf(self, command, *conf_args, **conf_kwargs):
        if command not in self._conf:
            self._conf[command] = []
        conf_kwargs['*'] = conf_args
        self._conf[command].append(conf_kwargs)

    def add_argument(self, *conf_args, **conf_kwargs):
        self.add_conf('add_argument', *conf_args, **conf_kwargs)

    def register_allowed_duplicate(self, flag):
        self._allowed_dupes.append(flag)

    def generate_parser(self, **kwargs):
        argument_parser = argparse.ArgumentParser(**kwargs)
        for command, conf_kwargs_list in self._conf.items():
            command_func = getattr(argument_parser, command)
            for conf_kwargs in conf_kwargs_list:
                list_args = conf_kwargs.pop('*', [])
                command_func(*list_args, **conf_kwargs)
                conf_kwargs['*'] = list_args
        return argument_parser

    def _get_add_argument_conf_args(self):
        for command, kwargs_list in self._conf.items():
            if command != 'add_argument':
                continue
            return kwargs_list
        return []

    def resolve_registered(self, other):
        if self.__class__ == other.__class__:
            conf_args_list = self._get_add_argument_conf_args()  # type: List[Dict[str, Any]]
            other_conf_args_list = other._get_add_argument_conf_args()  # type: List[Dict[str, Any]]
            # find all argument names of both parsers
            all_names = []
            for conf_args in conf_args_list:
                all_names += conf_args.get('*', [])
            all_other_names = []
            for other_conf_args in other_conf_args_list:
                all_other_names += other_conf_args.get('*', [])
            # check for dupes and throw if appropriate
            found_allowed_dupes = []
            tally = Counter(all_names + all_other_names)
            for name, count in tally.items():
                if count > 1 and name not in self._allowed_dupes:
                    raise Exception(f'Duplicate argument configurations: {name}')
                elif count > 1:
                    found_allowed_dupes.append(name)
            # merge them in a new OptionConf, preferring the args of self (AS OPPOSED TO ORIGINAL RESOLVE)
            new_opt_conf = OptionConf()
            for command, kwargs_list in self._conf.items():
                for kwargs in kwargs_list:
                    list_args = kwargs.get('*', [])
                    new_opt_conf.add_conf(command, *list_args, **kwargs)
            for command, kwargs_list in other._conf.items():
                for kwargs in kwargs_list:
                    # if it's another argument, we remove dupe names
                    if command == 'add_argument':
                        all_names = kwargs.pop('*', [])
                        names = [name for name in all_names if name not in found_allowed_dupes]
                        # and only add if there are names left
                        if names:
                            new_opt_conf.add_argument(*deepcopy(names), **deepcopy(kwargs))
                        # put names back
                        kwargs['*'] = all_names
                    else:
                        # if not, we just add it
                        list_args = kwargs.pop('*', [])
                        new_opt_conf.add_conf(command, *deepcopy(list_args), **deepcopy(kwargs))
                        # put list args back
                        kwargs['*'] = list_args
            return new_opt_conf
        raise NotImplementedError()


if __name__ == '__main__':
    opts_conf = OptionConf()
    opts_conf.add_argument('pos_arg')
    opts_conf.add_argument('-n', '--number', metavar='N', type=int)
    opts_conf.add_argument('-i', '--index')
    opts_conf.add_argument('-v', '--verbose', action='store_true')

    opts_conf2 = OptionConf()
    opts_conf2.add_argument('-n', '--number', metavar='N', type=int)
    opts_conf2.add_argument('-v', action='store_true')

    opts_conf.register_allowed_duplicate('-n')
    opts_conf.register_allowed_duplicate('--number')

    try:
        resolved_opts = opts_conf.resolve_registered(opts_conf2)
    except Exception as e:
        print(e)  # raises on -v

    opts_conf.register_allowed_duplicate('-v')
    resolved_opts = opts_conf.resolve_registered(opts_conf2)

    ap = resolved_opts.generate_parser(description='does it work?')
    ap.parse_args(['-h'])

Related

How to check input arguments in a python script with CLI?

I'm writing a small script to learn Python. The script prints a chess tournament table for N players. It has a simple CLI with a single argument N. Now I'm trying the following approach:
import argparse

def parse_args(argv: list[str] | None = None) -> int:
    parser = argparse.ArgumentParser(description="Tournament tables")
    parser.add_argument('N', help="number of players (2 at least)", type=int)
    args = parser.parse_args(argv)
    if args.N < 2:
        parser.error("N must be 2 at least")
    return args.N

def main(n: int) -> None:
    print(f"Here will be the table for {n} players")

if __name__ == '__main__':
    main(parse_args())
But this seems to have a flaw: the function main doesn't check n for invalid input (as that's the job of the CLI parser). So if somebody calls main directly from another module (a tester, for example), they may call it with, let's say, 0, and the program will most likely crash.
How should I properly handle this issue?
I'm considering several possible ways, but not sure what is the best.
Add proper value checking and error handling to main. This option looks ugly to me, as it violates the DRY principle and forces main to duplicate the CLI's job.
Just document that main must only be called with n >= 2, and that its behaviour is undefined otherwise. Possibly combine that with adding an assertion to main, like this:
assert n >= 2, "n must be 2 or more"
Perhaps such a function should not be external at all? So the whole chosen idiom is wrong and the script's entry point should be rewritten another way.
???
You could have main do all the checking and raise ArgumentError if something is amiss. Then catch that exception and forward it to the parser for display. Something along these lines:
import argparse

def run_with_args(argv: list[str] | None = None) -> None:
    parser = argparse.ArgumentParser(description="Tournament tables")
    parser.add_argument('N', help="number of players (2 at least)", type=int)
    args = parser.parse_args(argv)
    try:
        main(args.N)
    except argparse.ArgumentError as ex:
        parser.error(str(ex))

def main(n: int) -> None:
    if n < 2:
        # ArgumentError expects the offending argument as its first parameter;
        # None is fine when raising from outside argparse itself.
        raise argparse.ArgumentError(None, "N must be 2 at least")
    print(f"Here will be the table for {n} players")

if __name__ == '__main__':
    run_with_args()
If you don't want to expose argparse.ArgumentError to library users of main, you can also create a custom exception type instead.
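A sketch of that variant, with a hypothetical InvalidPlayerCount exception standing in for the custom type:

import argparse

class InvalidPlayerCount(ValueError):
    """Raised by main() when the player count is out of range."""

def run_with_args(argv: list[str] | None = None) -> None:
    parser = argparse.ArgumentParser(description="Tournament tables")
    parser.add_argument('N', help="number of players (2 at least)", type=int)
    args = parser.parse_args(argv)
    try:
        main(args.N)
    except InvalidPlayerCount as ex:
        parser.error(str(ex))  # CLI users still get the usual argparse error message

def main(n: int) -> None:
    if n < 2:
        raise InvalidPlayerCount("N must be 2 at least")
    print(f"Here will be the table for {n} players")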
A common way of structuring argparse code when you want to test functions/CLI is to have the main function take the sys.argv list and call parse_args from within main, like so:
arg.py
import argparse

def parse_args(argv: list[str] | None = None) -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Tournament tables", prog="prog")
    parser.add_argument("N", help="number of players (2 at least)", type=int)
    args = parser.parse_args(argv)
    if args.N < 2:
        parser.error("N must be 2 at least")
    return args

def main(argv: list[str] | None = None) -> None:
    args = parse_args(argv)
    print(f"Here will be the table for {args.N} players")

if __name__ == "__main__":
    main()
This way a test can call main with a hypothetical CLI:
test_main.py
import pytest
from arg import main

def test_main(capsys):
    with pytest.raises(SystemExit):
        main(["0"])
    out, err = capsys.readouterr()
    assert err.splitlines()[-1] == "prog: error: N must be 2 at least"
I’ve been using Pydantic liberally for enforcing data typing at runtime, within my code itself. Your N>=2 is easily enforced with a validator.
It's a very robust, extremely widely used library, and it's very fast, as it's more of a data-ingestion validator than a type checker.
And you could write the call as follows. How you call main is entirely up to you: direct call, argparse, click…
from pydantic import BaseModel, validator  # Pydantic v1-style validator

class ParamChecker(BaseModel):
    n: int

    @validator('n')
    def n2(cls, v):
        if v < 2:
            raise ValueError('must be 2+')
        return v

def main(n: int) -> None:
    params = ParamChecker(**locals())
Pydantic also gets you informative, if not really end user friendly, error messages.
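For illustration, calling the main above with an invalid value should trigger the validator (a sketch; the exact message format depends on your Pydantic version):

main(4)  # passes validation

try:
    main(1)
except Exception as exc:  # pydantic.ValidationError
    print(exc)  # mentions "must be 2+"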

Avoid redundancy in argparse defaults

import argparse

class A:
    def __init__(self, val=1):
        self.val = val

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--val', default=1)
    args = parser.parse_args()
    a = A(val=args.val)
Quite often I find myself with a structure as shown above, where a class takes an optional argument with a default value. For the argparse I need to define this default value again. If it changes, I have to change it in multiple spots.
Leaving the default value in the class is not an option, as the class might be initialised from somewhere else as well.
I considered setting the argparse default value to None and only passing the value on when it isn't None, but with multiple variables that leads to convoluted code.
The only other thing coming to my mind is a section on the top of the page to set constants like "DEFAULT_val" so the defaults are defined only once, but I'm wondering if there's a better solution.
import argparse
import inspect

def get_default_args(func):
    signature = inspect.signature(func)
    return {
        k: v.default
        for k, v in signature.parameters.items()
        if v.default is not inspect.Parameter.empty
    }

class A:
    def __init__(self, val=1):
        self.val = val

if __name__ == '__main__':
    default_values = get_default_args(A.__init__)
    parser = argparse.ArgumentParser()
    parser.add_argument('--val', default=default_values['val'])
    args = parser.parse_args()
    a = A(val=args.val)
Found the solution in the answers to another question
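Not part of the linked answer, but another pattern for the "only pass it when it was actually given" idea from the question is argparse.SUPPRESS: options that were not supplied never appear in the namespace, so the class default applies and is defined in exactly one place. A sketch:

import argparse

class A:
    def __init__(self, val=1):
        self.val = val

parser = argparse.ArgumentParser()
# With default=SUPPRESS, 'val' only ends up in the namespace if --val was passed.
parser.add_argument('--val', type=int, default=argparse.SUPPRESS)
args = parser.parse_args()
a = A(**vars(args))  # falls back to the constructor default when --val is absent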

Can't pickle local object 'ArgumentParser.__init__.<locals>.identity

import argparse
import pickle
parser = argparse.ArgumentParser(description='Process some integers.')
_ = pickle.dumps(parser)
In my code, the ArgumentParser object gets serialized, but at runtime I get the error Can't pickle local object 'ArgumentParser.__init__.<locals>.identity'.
In Lib/argparse.py, identity is a function defined locally inside the __init__ method, and this prevents serialization. If I convert this function to a method, serialization succeeds, but I don't think that is the best solution, since it means changing the Python library file itself. What is the best way to serialize the parser object?
I had a program that used sub-parsers, and initializing the ArgumentParser noticeably delayed startup, even for --help. While experimenting with pickling the parser to work around that, I ran into this same error. I found the following to work:
from argparse import ArgumentParser
from pickle import dumps

def identity(string):
    return string

p = ArgumentParser()
# Re-register the default type converter with a module-level function,
# replacing the unpicklable local defined in ArgumentParser.__init__.
p.register('type', None, identity)
x = dumps(p)
I created a subclass, class ArgumentParserSerializable(ArgumentParser), and defined the identity method as a static method. It also works.
import os as _os
import sys as _sys
from argparse import ArgumentParser, HelpFormatter, SUPPRESS
from gettext import gettext as _


class ArgumentParserSerializable(ArgumentParser):
    # Copy of ArgumentParser.__init__, except that the local identity()
    # function is replaced by the static method registered below.
    def __init__(self,
                 prog=None,
                 usage=None,
                 description=None,
                 epilog=None,
                 parents=[],
                 formatter_class=HelpFormatter,
                 prefix_chars='-',
                 fromfile_prefix_chars=None,
                 argument_default=None,
                 conflict_handler='error',
                 add_help=True,
                 allow_abbrev=True):
        superinit = super(ArgumentParser, self).__init__
        superinit(description=description,
                  prefix_chars=prefix_chars,
                  argument_default=argument_default,
                  conflict_handler=conflict_handler)

        # default setting for prog
        if prog is None:
            prog = _os.path.basename(_sys.argv[0])

        self.prog = prog
        self.usage = usage
        self.epilog = epilog
        self.formatter_class = formatter_class
        self.fromfile_prefix_chars = fromfile_prefix_chars
        self.add_help = add_help
        self.allow_abbrev = allow_abbrev

        add_group = self.add_argument_group
        self._positionals = add_group(_('positional arguments'))
        self._optionals = add_group(_('optional arguments'))
        self._subparsers = None

        # register types (a picklable static method instead of the local function)
        self.register('type', None, self.identity)

        # add help argument if necessary
        # (using explicit default to override global argument_default)
        default_prefix = '-' if '-' in prefix_chars else prefix_chars[0]
        if self.add_help:
            self.add_argument(
                default_prefix + 'h', default_prefix * 2 + 'help',
                action='help', default=SUPPRESS,
                help=_('show this help message and exit'))

        # add parent arguments and defaults
        for parent in parents:
            self._add_container_actions(parent)
            try:
                defaults = parent._defaults
            except AttributeError:
                pass
            else:
                self._defaults.update(defaults)

    @staticmethod
    def identity(string):
        return string
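A quick round-trip check built on the subclass above (sketch):

import pickle

parser = ArgumentParserSerializable(description='Process some integers.')
restored = pickle.loads(pickle.dumps(parser))
print(type(restored).__name__)  # ArgumentParserSerializable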

How to run the original function after decorating it as a click command? [duplicate]

I have a function which is wrapped as a command using click. So it looks like this:
@click.command()
@click.option('-w', '--width', type=int, help="Some helping message", default=0)
[... some other options ...]
def app(width, [... some other option arguments...]):
    [... function code...]
I have different use cases for this function. Sometimes calling it through the command line is fine, but sometimes I would also like to call the function directly:
from file_name import app
width = 45
app(45, [... other arguments ...])
How can we do that? How can we call a function that has been wrapped as a command using click? I found this related post, but it is not clear to me how to adapt it to my case (i.e., build a Context class from scratch and use it outside of a click command function).
EDIT: I should have mentioned: I cannot (easily) modify the package that contains the function to call. So the solution I am looking for is how to deal with it from the caller side.
You can call a click command function from regular code by reconstructing the command line from parameters. Using your example it could look something like this:
call_click_command(app, width, [... other arguments ...])
Code:
import click

def call_click_command(cmd, *args, **kwargs):
    """ Wrapper to call a click command

    :param cmd: click cli command function to call
    :param args: arguments to pass to the function
    :param kwargs: keyword arguments to pass to the function
    :return: None
    """

    # Get positional arguments from args
    arg_values = {c.name: a for a, c in zip(args, cmd.params)}
    args_needed = {c.name: c for c in cmd.params
                   if c.name not in arg_values}

    # build and check opts list from kwargs
    opts = {a.name: a for a in cmd.params if isinstance(a, click.Option)}
    for name in kwargs:
        if name in opts:
            arg_values[name] = kwargs[name]
        else:
            if name in args_needed:
                arg_values[name] = kwargs[name]
                del args_needed[name]
            else:
                raise click.BadParameter(
                    "Unknown keyword argument '{}'".format(name))

    # check positional arguments list
    for arg in (a for a in cmd.params if isinstance(a, click.Argument)):
        if arg.name not in arg_values:
            raise click.BadParameter("Missing required positional "
                                     "parameter '{}'".format(arg.name))

    # build parameter lists
    opts_list = sum(
        [[o.opts[0], str(arg_values[n])] for n, o in opts.items()], [])
    args_list = [str(v) for n, v in arg_values.items() if n not in opts]

    # call the command
    cmd(opts_list + args_list)
How does this work?
This works because click is a well-designed OO framework. The click.Command object can be introspected to determine what parameters it is expecting. Then a command line can be constructed that will look like the command line that click is expecting.
Test Code:
import click

@click.command()
@click.option('-w', '--width', type=int, default=0)
@click.option('--option2')
@click.argument('argument')
def app(width, option2, argument):
    click.echo("params: {} {} {}".format(width, option2, argument))
    assert width == 3
    assert option2 == '4'
    assert argument == 'arg'

width = 3
option2 = 4
argument = 'arg'

if __name__ == "__main__":
    commands = (
        (width, option2, argument, {}),
        (width, option2, dict(argument=argument)),
        (width, dict(option2=option2, argument=argument)),
        (dict(width=width, option2=option2, argument=argument),),
    )

    import sys, time
    time.sleep(1)
    print('Click Version: {}'.format(click.__version__))
    print('Python Version: {}'.format(sys.version))
    for cmd in commands:
        try:
            time.sleep(0.1)
            print('-----------')
            print('> {}'.format(cmd))
            time.sleep(0.1)
            call_click_command(app, *cmd[:-1], **cmd[-1])
        except BaseException as exc:
            if str(exc) != '0' and \
                    not isinstance(exc, (click.ClickException, SystemExit)):
                raise
Test Results:
Click Version: 6.7
Python Version: 3.6.3 (v3.6.3:2c5fed8, Oct 3 2017, 18:11:49) [MSC v.1900 64 bit (AMD64)]
-----------
> (3, 4, 'arg', {})
params: 3 4 arg
-----------
> (3, 4, {'argument': 'arg'})
params: 3 4 arg
-----------
> (3, {'option2': 4, 'argument': 'arg'})
params: 3 4 arg
-----------
> ({'width': 3, 'option2': 4, 'argument': 'arg'},)
params: 3 4 arg
I tried the following code with Python 3.7 and Click 7:
import click

@click.command()
@click.option('-w', '--width', type=int, default=0)
@click.option('--option2')
@click.argument('argument')
def app(width, option2, argument):
    click.echo("params: {} {} {}".format(width, option2, argument))
    assert width == 3
    assert option2 == '4'
    assert argument == 'arg'

app(["arg", "--option2", "4", "-w", 3])
app(["arg", "-w", 3, "--option2", "4"])
app(["-w", 3, "--option2", "4", "arg"])
All the app calls are working fine!
This use-case is described in the docs.
Sometimes, it might be interesting to invoke one command from another command. This is a pattern that is generally discouraged with Click, but possible nonetheless. For this, you can use the Context.invoke() or Context.forward() methods.
import click

cli = click.Group()

@cli.command()
@click.option('--count', default=1)
def test(count):
    click.echo('Count: %d' % count)

@cli.command()
@click.option('--count', default=1)
@click.pass_context
def dist(ctx, count):
    ctx.forward(test)
    ctx.invoke(test, count=42)
They work similarly, but the difference is that Context.invoke() merely invokes another command with the arguments you provide as a caller, whereas Context.forward() fills in the arguments from the current command.
If you just want to call the underlying function, you can directly access it via click.Command.callback. Click stores the original, unwrapped Python function as the command's callback attribute. Note that calling the function directly bypasses all Click validation, and no Click context information will be available.
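For instance, a minimal self-contained sketch (a single --width option stands in for the question's full option list):

import click

@click.command()
@click.option('-w', '--width', type=int, default=0)
def app(width):
    click.echo(f"width={width}")

# app is a click.Command; app.callback is the original, undecorated function.
app.callback(width=45)  # prints: width=45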
Here is example code that iterates over all click.Command objects in the current Python module and makes a dictionary of callable functions out of them.
import sys
from functools import partial
from inspect import getmembers

import click

all_functions_of_click_commands = {}

def _call_click_command(cmd: click.Command, *args, **kwargs):
    result = cmd.callback(*args, **kwargs)
    return result

# Pull out all Click commands from the current module
module = sys.modules[__name__]
for name, obj in getmembers(module):
    if isinstance(obj, click.Command) and not isinstance(obj, click.Group):
        # Create a wrapper Python function that calls the click Command.
        # Click uses dashes in command names and a dash is not valid Python syntax.
        name = name.replace("-", "_")
        # We also set the docstring of this function correctly.
        func = partial(_call_click_command, obj)
        func.__doc__ = obj.__doc__
        all_functions_of_click_commands[name] = func
A full example can be found in binance-api-test-tool source code.
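Usage of the resulting dictionary would then look roughly like this (the command name and keyword argument are hypothetical):

# Keys are the Pythonized command names; values call the underlying callbacks,
# so click's own parsing and validation are bypassed, as noted above.
run_my_command = all_functions_of_click_commands["my_command"]
run_my_command(width=45)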
Here is a solution to call a click function with a dictionary of options:
Sometimes, it might be interesting to invoke one command from another
command. This is a pattern that is generally discouraged with Click,
but possible nonetheless. For this, you can use the Context.invoke()
or Context.forward() methods.
They work similarly, but the difference is that Context.invoke()
merely invokes another command with the arguments you provide as a
caller, whereas Context.forward() fills in the arguments from the
current command. Both accept the command as the first argument and
everything else is passed onwards as you would expect.
Example:
import click

cli = click.Group()

@cli.command()
@click.option('--opt1', default=1)
@click.option('--opt2', default=2)
def test(opt1, opt2):
    print(opt1)
    print(opt2)

@cli.command()
@click.pass_context
def dist(ctx):
    args = {"opt1": 3, "opt2": 4}
    ctx.invoke(test, **args)

if __name__ == "__main__":
    dist()

argparse action or type for comma-separated list

I want to create a command line flag that can be used as
./prog.py --myarg=abcd,e,fg
and inside the parser have this be turned into ['abcd', 'e', 'fg'] (a tuple would be fine too).
I have done this successfully using action and type, but I feel like one is likely an abuse of the system or missing corner cases, while the other is right. However, I don't know which is which.
With action:
import argparse

class SplitArgs(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        setattr(namespace, self.dest, values.split(','))

parser = argparse.ArgumentParser()
parser.add_argument('--myarg', action=SplitArgs)
args = parser.parse_args()
print(args.myarg)
Instead with type:
import argparse

def list_str(values):
    return values.split(',')

parser = argparse.ArgumentParser()
parser.add_argument('--myarg', type=list_str)
args = parser.parse_args()
print(args.myarg)
The simplest solution is to consider your argument as a string and split.
#!/usr/bin/env python3

import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--myarg", type=str)
d = vars(parser.parse_args())
if d.get("myarg") is not None:  # only split when the option was actually given
    d["myarg"] = [s.strip() for s in d["myarg"].split(",")]
print(d)
Result:
$ ./toto.py --myarg=abcd,e,fg
{'myarg': ['abcd', 'e', 'fg']}
$ ./toto.py --myarg="abcd, e, fg"
{'myarg': ['abcd', 'e', 'fg']}
I find your first solution to be the right one. The reason is that it allows you to better handle defaults:
names: List[str] = ['Jane', 'Dave', 'John']

parser = argparse.ArgumentParser()
parser.add_argument('--names', default=names, action=SplitArgs)
args = parser.parse_args()
names = args.names
This doesn't work with list_str because the default would have to be a string.
Your custom action is the closest to how it is done internally for other argument types. IMHO there should be a _StoreCommaSeparatedAction added to argparse in the stdlib, since it is a somewhat common and useful argument type.
It can be used with an added default as well.
Here is an example without using an action (no SplitArgs class):
from argparse import ArgumentParser
from typing import List

class Test:
    def __init__(self):
        self._names: List[str] = ["Jane", "Dave", "John"]

    @property
    def names(self):
        return self._names

    @names.setter
    def names(self, value):
        self._names = [name.strip() for name in value.split(",")]

test_object = Test()
parser = ArgumentParser()
parser.add_argument(
    "-n",
    "--names",
    dest="names",
    default=",".join(test_object.names),  # Joining the default here is important.
    help="a comma separated list of names as an argument",
)

print(test_object.names)
parser.parse_args(namespace=test_object)
print(test_object.names)
Here is another example using SplitArgs class inside a class completely
"""MyClass
Demonstrates how to split and use a comma separated argument in a class with defaults
"""
import sys
from typing import List
from argparse import ArgumentParser, Action
class SplitArgs(Action):
def __call__(self, parser, namespace, values, option_string=None):
# Be sure to strip, maybe they have spaces where they don't belong and wrapped the arg value in quotes
setattr(namespace, self.dest, [value.strip() for value in values.split(",")])
class MyClass:
def __init__(self):
self.names: List[str] = ["Jane", "Dave", "John"]
self.parser = ArgumentParser(description=__doc__)
self.parser.add_argument(
"-n",
"--names",
dest="names",
default=",".join(self.names), # Joining the default here is important.
action=SplitArgs,
help="a comma separated list of names as an argument",
)
self.parser.parse_args(namespace=self)
if __name__ == "__main__":
print(sys.argv)
my_class = MyClass()
print(my_class.names)
sys.argv = [sys.argv[0], "--names", "miigotu, sickchill,github"]
my_class = MyClass()
print(my_class.names)
And here is how to do it in a function based situation, with a default included
class SplitArgs(Action):
    def __call__(self, parser, namespace, values, option_string=None):
        # Be sure to strip, maybe they have spaces where they don't belong and wrapped the arg value in quotes
        setattr(namespace, self.dest, [value.strip() for value in values.split(",")])

names: List[str] = ["Jane", "Dave", "John"]

parser = ArgumentParser(description=__doc__)
parser.add_argument(
    "-n",
    "--names",
    dest="names",
    default=",".join(names),  # Joining the default here is important.
    action=SplitArgs,
    help="a comma separated list of names as an argument",
)
parser.parse_args()
I know this post is old but I recently found myself solving this exact problem. I used functools.partial for a lightweight solution:
import argparse
from functools import partial
csv_ = partial(str.split, sep=',')
p = argparse.ArgumentParser()
p.add_argument('--stuff', type=csv_)
p.parse_args(['--stuff', 'a,b,c'])
# Namespace(stuff=['a', 'b', 'c'])
If you're not familiar with functools.partial, it allows you to create a partially "frozen" function/method. In the above example, I created a new function (csv_) that is essentially a copy of str.split() except that the sep argument has been "frozen" to the comma character.
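To see the frozen function on its own, outside of argparse:

from functools import partial

csv_ = partial(str.split, sep=',')
print(csv_('abcd,e,fg'))  # ['abcd', 'e', 'fg']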
