Python argparse subparser dest parameter doesn't work with a parent

I'm writing a utility that will have multiple modules, and which module gets run is determined by an argument. Each module has its own arguments, but all modules share 4 standard arguments. To get this to work I just set the 'parents' param when creating the subparsers, but the problem is I also need to be able to determine which module was called on the command line. It looks like the 'dest' param is the way to do this, but for some reason having both 'parents' and 'dest' set at the same time does not work.
import argparse
parser = argparse.ArgumentParser() # main parser
parser.add_argument("--foo", action='store_true')
subparsers = parser.add_subparsers(dest='cmd')
# without 'parents=[parser]' it properly stores 'bar' in cmd
# however '--foo' MUST be before 'bar'
bar = subparsers.add_parser("bar", parents=[parser], add_help=False)
bar.add_argument("--test", action='store_true')
# should be able to have '--foo' before OR after 'bar'
parser.parse_args(['--foo', 'bar', '--test'])
In this code, the add_subparsers call sets the dest to 'cmd'. Then I can parse the arguments and check args.cmd to get the name of the module called (in this case, bar). However, when parents is set, the value of cmd is always None. Currently my workaround is to have an empty main parser and simply copy-paste the 4 standard args into every subparser, which works but is not exactly desirable.
My question: Is there another way to determine which module was called? Why does this even happen?

Thanks to the information provided by @hpaulj in the comments on the OP, I managed to get this working.
Basically, you need your main parser and a parent parser. You will then set the parents attribute of your subparsers to be the parent parser. Based on the example you gave, the following should be a working example:
import argparse
# Create parsers
parser = argparse.ArgumentParser()
parent_parser = argparse.ArgumentParser()
# Add arguments to parent parser
parent_parser.add_argument("--foo", action='store_true')
# Create subparser
subparsers = parser.add_subparsers(dest='cmd')
# Add to the subparser
bar = subparsers.add_parser("bar", parents=[parent_parser], add_help=False)
bar.add_argument("--test", action='store_true')
baz = subparsers.add_parser("baz", parents=[parent_parser], add_help=False)
baz.add_argument("--baz-test", action="store_true")
# note: '--foo' is now defined on each subcommand, so it goes after the subcommand name, e.g. 'bar --foo --test'
print(parser.parse_args(['bar', '--test']))
print(parser.parse_args(["baz", "--baz-test"]))
This outputs the following:
Namespace(cmd='bar', foo=False, test=True)
Namespace(baz_test=True, cmd='baz', foo=False)
You should then be able to do things like this:
args = parser.parse_args()
if args.cmd == "bar":
print("bar was specified")
elif args.cmd == "baz":
print("baz was specified")
It might not be the perfect solution, but it should work.
(Tested using Python 3.5.2)
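If the if/elif chain grows with the number of modules, the set_defaults(func=...) pattern from the argparse documentation is another way to dispatch. A minimal sketch along the same lines (run_bar and run_baz are placeholder entry points, not part of the original example):
import argparse

def run_bar(args):
    print("bar was specified", args)

def run_baz(args):
    print("baz was specified", args)

parent_parser = argparse.ArgumentParser(add_help=False)
parent_parser.add_argument("--foo", action='store_true')

parser = argparse.ArgumentParser()
subparsers = parser.add_subparsers(dest='cmd')

bar = subparsers.add_parser("bar", parents=[parent_parser])
bar.add_argument("--test", action='store_true')
bar.set_defaults(func=run_bar)

baz = subparsers.add_parser("baz", parents=[parent_parser])
baz.add_argument("--baz-test", action="store_true")
baz.set_defaults(func=run_baz)

args = parser.parse_args(['bar', '--test'])
args.func(args)   # dispatches to run_bar; args.cmd is still 'bar' as well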

Related

Disable argparse arguments from being overwritten

I have a legacy Python application which uses some options in its CLI, using argparse, like:
parser = argparse.ArgumentParser()
parser.add_argument('-f', default='foo')
Now I need to remove this option: its value must no longer be overwritten by users and has to keep its default value (say 'foo' in the example). Is there a way to keep the option internally but prevent it from showing up and being overwritten by users (so that I can keep the rest of the code as it is)?
It's not entirely clear what you can and can't do with the parser. Can you edit the setup, or only modify the results of parsing?
If you can edit the setup, you could replace the add_argument line with a set_defaults call:
parser.set_defaults(f='foo')
https://docs.python.org/3/library/argparse.html#parser-defaults
The -f option won't appear in the usage or help, but f will still appear in the args.
Or you could leave the argument in, but suppress its help display:
parser.add_argument('-f', default='foo', help=argparse.SUPPRESS)
https://docs.python.org/3/library/argparse.html#help
Setting the value after parsing is also fine.
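Putting the two variants together as a runnable sketch (using the -f option from the question):
import argparse

# Variant 1: drop the option entirely and pin the value with set_defaults()
parser = argparse.ArgumentParser()
parser.set_defaults(f='foo')
print(parser.parse_args([]))   # Namespace(f='foo'); -f is no longer accepted on the command line

# Variant 2: keep the option but hide it from usage/help with argparse.SUPPRESS
parser = argparse.ArgumentParser()
parser.add_argument('-f', default='foo', help=argparse.SUPPRESS)
print(parser.parse_args([]))   # Namespace(f='foo')
parser.print_help()            # -f is omitted from the help text (it can still be passed explicitly)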
Yes, you can do that. After the arguments are parsed (args = parser.parse_args()), args is a Namespace, so you can do this:
parser = argparse.ArgumentParser()
args = parser.parse_args()
args.foo = 'foo value'
print(args)
>>> Namespace(OTHER_OPTIONS, foo='foo value')
I assumed that you still want the value available in your parser, so your original code will keep working, but you do not want it to be an option for the user.
I think it doesn't make sense for the argparse module to provide this as a standard option, but there are several easy ways to achieve what you want.
The most obvious way is to just overwrite the value after having called parse_args() (as already mentioned in comments and in another answer):
args.f = 'foo'
However, the user may not be aware that the option is not supported anymore and that the application is now assuming the value "foo". Depending on the use case, it might be better to warn the user about this. The argparse module has several options to do this.
Another possibility is to use an Action class to do a little magic. For example, you could print a warning if the user provided an option that is not supported anymore, or even use the built-in error handling.
import argparse
class FooAction(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        if values != 'foo':
            print('Warning: option `-f` has been removed, assuming `-f foo` now')
            # Or use the built-in error handling like this:
            # parser.error('Option "-f" is not supported anymore.')
            # You could override the argument value like this:
            # setattr(namespace, self.dest, 'foo')
parser = argparse.ArgumentParser()
parser.add_argument('-f', default='foo', action=FooAction)
args = parser.parse_args()
print('option=%s' % args.f)
You could also just limit the choices to only "foo" and let argparse create an error for other values:
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('-f', default='foo', choices=['foo'])
args = parser.parse_args()
print('option=%s' % args.f)
Calling python test.py -f bar would then result in:
usage: test.py [-h] [-f {foo}]
test.py: error: argument -f: invalid choice: 'bar' (choose from 'foo')

How to make conditional arguments using argparse? [duplicate]

I have done as much research as possible but I haven't found the best way to make certain cmdline arguments necessary only under certain conditions, in this case only if other arguments have been given. Here's what I want to do at a very basic level:
p = argparse.ArgumentParser(description='...')
p.add_argument('--argument', required=False)
p.add_argument('-a', required=False) # only required if --argument is given
p.add_argument('-b', required=False) # only required if --argument is given
From what I have seen, other people seem to just add their own check at the end:
if args.argument and (args.a is None or args.b is None):
    # raise argparse error here
Is there a way to do this natively within the argparse package?
I've been searching for a simple answer to this kind of question for some time. All you need to do is check if '--argument' is in sys.argv, so basically for your code sample you could just do:
import argparse
import sys
if __name__ == '__main__':
    p = argparse.ArgumentParser(description='...')
    p.add_argument('--argument', required=False)
    p.add_argument('-a', required='--argument' in sys.argv)  # only required if --argument is given
    p.add_argument('-b', required='--argument' in sys.argv)  # only required if --argument is given
    args = p.parse_args()
This way, required receives either True or False depending on whether the user has used --argument. I've already tested it; it seems to work, and it guarantees that -a and -b behave independently of each other.
You can implement a check by providing a custom action for --argument, which will take an additional keyword argument to specify which other action(s) should become required if --argument is used.
import argparse
class CondAction(argparse.Action):
    def __init__(self, option_strings, dest, nargs=None, **kwargs):
        x = kwargs.pop('to_be_required', [])
        super(CondAction, self).__init__(option_strings, dest, **kwargs)
        self.make_required = x
    def __call__(self, parser, namespace, values, option_string=None):
        for x in self.make_required:
            x.required = True
        try:
            return super(CondAction, self).__call__(parser, namespace, values, option_string)
        except NotImplementedError:
            pass
p = argparse.ArgumentParser()
x = p.add_argument("--a")
p.add_argument("--argument", action=CondAction, to_be_required=[x])
The exact definition of CondAction will depend on what, exactly, --argument should do. But, for example, if --argument is a regular, take-one-argument-and-save-it type of action, then just inheriting from argparse._StoreAction should be sufficient.
In the example parser, we save a reference to the --a option inside the --argument option, and when --argument is seen on the command line, it sets the required flag on --a to True. Once all the options are processed, argparse verifies that any option marked as required has been set.
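For completeness, a sketch of that _StoreAction variant (note _StoreAction is a private argparse class, so treat this as illustrative rather than a stable API):
import argparse

class CondStoreAction(argparse._StoreAction):
    def __init__(self, option_strings, dest, **kwargs):
        self.make_required = kwargs.pop('to_be_required', [])
        super(CondStoreAction, self).__init__(option_strings, dest, **kwargs)

    def __call__(self, parser, namespace, values, option_string=None):
        # mark the linked actions as required, then store the value as usual
        for action in self.make_required:
            action.required = True
        super(CondStoreAction, self).__call__(parser, namespace, values, option_string)

p = argparse.ArgumentParser()
a = p.add_argument("--a")
p.add_argument("--argument", action=CondStoreAction, to_be_required=[a])

print(p.parse_args(['--argument', 'x', '--a', '1']))   # Namespace(a='1', argument='x')
# p.parse_args(['--argument', 'x']) exits with:
#   error: the following arguments are required: --a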
Your post-parsing test is fine, especially if testing for defaults with is None suits your needs.
http://bugs.python.org/issue11588 'Add "necessarily inclusive" groups to argparse' looks into implementing tests like this using the groups mechanism (a generalization of mutually exclusive groups).
I've written a set of UsageGroups that implement tests like xor (mutually exclusive), and, or, and not. I thought those were comprehensive, but I haven't been able to express your case in terms of those operations (it looks like I need nand, i.e. not-and; see below).
This script uses a custom Test class that essentially implements your post-parsing test; seen_actions is the list of Actions that the parse has seen.
class Test(argparse.UsageGroup):
    def _add_test(self):
        self.usage = '(if --argument then -a and -b are required)'
        def testfn(parser, seen_actions, *vargs, **kwargs):
            "custom error"
            actions = self._group_actions
            if actions[0] in seen_actions:
                if actions[1] not in seen_actions or actions[2] not in seen_actions:
                    msg = '%s - 2nd and 3rd required with 1st'
                    self.raise_error(parser, msg)
            return True
        self.testfn = testfn
        self.dest = 'Test'
p = argparse.ArgumentParser(formatter_class=argparse.UsageGroupHelpFormatter)
g1 = p.add_usage_group(kind=Test)
g1.add_argument('--argument')
g1.add_argument('-a')
g1.add_argument('-b')
print(p.parse_args())
Sample output is:
1646:~/mypy/argdev/usage_groups$ python3 issue25626109.py --arg=1 -a1
usage: issue25626109.py [-h] [--argument ARGUMENT] [-a A] [-b B]
(if --argument then -a and -b are required)
issue25626109.py: error: group Test: argument, a, b - 2nd and 3rd required with 1st
The usage and error messages still need work, and it doesn't do anything that a post-parsing test can't.
Your test raises an error if (argument & (!a or !b)). Conversely, what is allowed is !(argument & (!a or !b)) = !(argument & !(a and b)). By adding a nand test to my UsageGroup classes, I can implement your case as:
p = argparse.ArgumentParser(formatter_class=argparse.UsageGroupHelpFormatter)
g1 = p.add_usage_group(kind='nand', dest='nand1')
arg = g1.add_argument('--arg', metavar='C')
g11 = g1.add_usage_group(kind='nand', dest='nand2')
g11.add_argument('-a')
g11.add_argument('-b')
The usage is (using !() to mark a 'nand' test):
usage: issue25626109.py [-h] !(--arg C & !(-a A & -b B))
I think this is the shortest and clearest way of expressing this problem using general purpose usage groups.
In my tests, inputs that parse successfully are:
''
'-a1'
'-a1 -b2'
'--arg=3 -a1 -b2'
Ones that are supposed to raise errors are:
'--arg=3'
'--arg=3 -a1'
'--arg=3 -b2'
For (required) optional arguments I've come up with a quick-n-dirty solution like this.
Assumptions: (1) '--help' should display the help and not complain about a missing required argument, and (2) we're parsing sys.argv.
p = argparse.ArgumentParser(...)
p.add_argument('-required', ..., required = '--help' not in sys.argv )
This can easily be modified to match a specific setting.
For required positionals (which will become unrequired if e.g. '--help' is given on the command line) I've come up with the following: [positionals do not allow for a required=... keyword arg!]
p.add_argument('pattern', ..., nargs = '+' if '--help' not in sys.argv else '*' )
Basically, this turns the number of required occurrences of 'pattern' on the command line from one-or-more into zero-or-more in case '--help' is specified.
Here is a simple and clean solution with these advantages:
No ambiguity or loss of functionality caused by oversimplified parsing with the in sys.argv test.
No need to implement a special argparse.Action or argparse.UsageGroup class.
Simple usage even for multiple and complex deciding arguments.
I noticed just one considerable drawback (which some may find desirable): The help text changes according to the state of the deciding arguments.
The idea is to use argparse twice:
Parse the deciding arguments instead of the oversimplified use of the in sys.argv test. For this we use a short parser not showing help and the method .parse_known_args() which ignores unknown arguments.
Parse everything normally while reusing the parser from the first step as a parent and having the results from the first parser available.
import argparse
# First parse the deciding arguments.
deciding_args_parser = argparse.ArgumentParser(add_help=False)
deciding_args_parser.add_argument(
'--argument', required=False, action='store_true')
deciding_args, _ = deciding_args_parser.parse_known_args()
# Create the main parser with the knowledge of the deciding arguments.
parser = argparse.ArgumentParser(
description='...', parents=[deciding_args_parser])
parser.add_argument('-a', required=deciding_args.argument)
parser.add_argument('-b', required=deciding_args.argument)
arguments = parser.parse_args()
print(arguments)
Until http://bugs.python.org/issue11588 is solved, I'd just use nargs:
p = argparse.ArgumentParser(description='...')
p.add_argument('--arguments', required=False, nargs=2, metavar=('A', 'B'))
This way, if anybody supplies --arguments, it will have 2 values.
Maybe the CLI is a bit less readable this way, but the code is much smaller. You can make up for that with good docs/help.
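A quick check of that behaviour with explicit argument lists:
import argparse

p = argparse.ArgumentParser(description='...')
p.add_argument('--arguments', required=False, nargs=2, metavar=('A', 'B'))

print(p.parse_args([]))                          # Namespace(arguments=None)
print(p.parse_args(['--arguments', 'x', 'y']))   # Namespace(arguments=['x', 'y'])
# p.parse_args(['--arguments', 'x']) exits with:
#   error: argument --arguments: expected 2 arguments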
This is really the same as @Mira's answer, but I wanted to show it for the case where giving an option means an extra arg is also required:
For instance, if --option foo is given then some args are also required that are not required if --option bar is given:
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument('--option', required=True,
help='foo and bar need different args')
if 'foo' in sys.argv:
parser.add_argument('--foo_opt1', required=True,
help='--option foo requires "--foo_opt1"')
parser.add_argument('--foo_opt2', required=True,
help='--option foo requires "--foo_opt2"')
...
if 'bar' in sys.argv:
parser.add_argument('--bar_opt', required=True,
help='--option bar requires "--bar_opt"')
...
It's not perfect - for instance, proggy --option foo --foo_opt1 bar is ambiguous - but for what I needed to do it's OK.
Add an additional simple "pre" parser to check for --argument, but use parse_known_args().
pre = argparse.ArgumentParser()
pre.add_argument('--argument', required=False, action='store_true', default=False)
args_pre, _ = pre.parse_known_args()
p = argparse.ArgumentParser()
p.add_argument('--argument', required=False)
p.add_argument('-a', required=args_pre.argument)
p.add_argument('-b', required=not args_pre.argument)
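For illustration, the same idea with explicit argument lists (here --argument is made a store_true flag in both parsers so the sketch is self-consistent):
import argparse

pre = argparse.ArgumentParser(add_help=False)
pre.add_argument('--argument', required=False, action='store_true', default=False)
args_pre, _ = pre.parse_known_args(['--argument', '-a', '1'])

p = argparse.ArgumentParser()
p.add_argument('--argument', required=False, action='store_true')
p.add_argument('-a', required=args_pre.argument)        # required, because --argument was seen
p.add_argument('-b', required=not args_pre.argument)    # mirrors the snippet above
print(p.parse_args(['--argument', '-a', '1']))          # Namespace(a='1', argument=True, b=None)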

How can I get Python argparse to stop overwriting positional arguments in the child parser

I am attempting to get my script working, but argparse keeps overwriting my positional arguments from the parent parser. How can I get argparse to honor the parent's value for these? It does keep values from optional args.
Here is a very simplified version of what I need. If you run this, you will see that the args are overwritten.
testargs.py
#! /usr/bin/env python3
import argparse
import sys
def main():
    preparser = argparse.ArgumentParser(add_help=False)
    preparser.add_argument('first',
                           nargs='?')
    preparser.add_argument('outfile',
                           nargs='?',
                           type=argparse.FileType('w', encoding='utf-8'),
                           default=sys.stdout,
                           help='Output file')
    preparser.add_argument(
        '--do-something', '-d',
        action='store_true')
    # Parse args with preparser, and find config file
    args, remaining_argv = preparser.parse_known_args()
    print(args)
    parser = argparse.ArgumentParser(
        parents=[preparser],
        description=__doc__)
    parser.add_argument(
        '--clear-screen', '-c',
        action='store_true')
    args = parser.parse_args(args=remaining_argv, namespace=args)
    print(args)
if __name__ == '__main__':
    main()
And call it with testargs.py something /tmp/test.txt -d -c
You will see it keeps the -d but drops both the positional args and reverts them to defaults.
EDIT: see additional comments in the accepted answer for some caveats.
When you specify parents=[preparser], it means that parser is an extension of preparser, and it will try to parse all of the arguments relevant to preparser, which it is never given.
Let's say the preparser only has one positional argument, first, and the parser only has one positional argument, second. When you make parser a child of preparser, it expects both arguments:
import argparse
parser1 = argparse.ArgumentParser(add_help=False)
parser1.add_argument("first")
parser2 = argparse.ArgumentParser(parents=[parser1])
parser2.add_argument("second")
args2 = parser2.parse_args(["arg1","arg2"])
assert args2.first == "arg1" and args2.second == "arg2"
However, the remaining arguments left over from parser1 are just ['arg2'], which is not the correct set of arguments for parser2:
parser1 = argparse.ArgumentParser(add_help=False)
parser1.add_argument("first")
args1, remaining_args = parser1.parse_known_args(["arg1","arg2"])
parser2 = argparse.ArgumentParser(parents=[parser1])
parser2.add_argument("second")
>>> args1
Namespace(first='arg1')
>>> remaining_args
['arg2']
>>> parser2.parse_args(remaining_args)
usage: test.py [-h] first second
test.py: error: the following arguments are required: second
To only process the arguments that were not handled by the first pass, do not specify it as the parent to the second parser:
parser1 = argparse.ArgumentParser(add_help=False)
parser1.add_argument("first")
args1, remaining_args = parser1.parse_known_args(["arg1","arg2"])
parser2 = argparse.ArgumentParser() #parents=[parser1]) #NO PARENT!
parser2.add_argument("second")
args2 = parser2.parse_args(remaining_args,args1)
assert args2.first == "arg1" and args2.second == "arg2"
The 2 positionals are nargs='?'. A positional like that is always 'seen', since an empty list matches that nargs.
The first time through, '/tmp/test.txt' matches with first and is put in the Namespace. The second time through there isn't any string to match, so the default is used - the same as if you had not given that string the first time.
If I change first to have the default nargs, I get
error: the following arguments are required: first
from the 2nd parser. Even though there's a value in the Namespace, it still tries to get a value from argv (it's like a default, but not quite).
Defaults for positionals with nargs='?' (or *) are tricky. They are optional, but not in quite the same way as optionals. The positional Actions are still called, but with an empty list of values.
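A tiny illustration of that behaviour (standalone sketch):
import argparse

p = argparse.ArgumentParser()
p.add_argument('first', nargs='?', default='DEFAULT')

ns = argparse.Namespace(first='from_first_pass')
print(p.parse_args([], namespace=ns))
# Namespace(first='DEFAULT') -- the '?' positional is 'seen' even with no strings,
# so its action runs with the default and clobbers the value already in the namespace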
I don't think the parents feature does anything for you. preparser already handles that set of arguments; there's no need to handle them again in parser, especially since all the relevant argument strings have been stripped out.
Another option is to leave the parents in, but use the default sys.argv[1:] in the 2nd parser. (but beware of side effects like opening files)
args = parser.parse_args(namespace=args )
A third option is to parse the arguments independently and merge them with a dictionary update.
adict = vars(preparse_args)
adict.update(vars(parser_args))
# taking some care in who overrides who
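Sketched with trimmed-down versions of the question's parsers (only two options kept here):
import argparse

preparser = argparse.ArgumentParser(add_help=False)
preparser.add_argument('first', nargs='?')
preparser.add_argument('--do-something', '-d', action='store_true')

parser = argparse.ArgumentParser()   # no parents=, so no actions are shared
parser.add_argument('--clear-screen', '-c', action='store_true')

pre_args, remaining_argv = preparser.parse_known_args(['/tmp/test.txt', '-d', '-c'])
main_args = parser.parse_args(remaining_argv)

adict = vars(pre_args)
adict.update(vars(main_args))   # the second parser's values win on any clash
print(adict)   # {'first': '/tmp/test.txt', 'do_something': True, 'clear_screen': True}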
For more details look in argparse.py file at ArgumentParser._get_values, specifically the not arg_strings cases.
A note about the FileType. That type works nicely for small scripts where you will use the files right away and exit. It isn't so good on large programs where you might want to close the file after use (close stdout???), or use files in a with context.
edit - note on parents
add_argument creates an Action object, and adds it to the parser's list of actions. parse_args basically matches input strings with these actions.
parents just copies those Action objects (by reference) from parent to child. To the child parser it is just as though the actions were created with add_argument directly.
parents is most useful when you are importing a parser and don't have direct access to its definition. If you are defining both parent and child, then parents just saves you some typing/cut-n-paste.
This and other SO questions (mostly triggered by the by-reference copy) show that the developers did not intend you to use both the parent and child to do parsing. It can be done, but there are glitches that they did not consider.
===================
I can imagine defining a custom Action class that would 'behave' in a situation like this. It might, for example, check the namespace for some not default value before adding its own (possibly default) value.
Consider, for example if I changed the action of first to 'append':
preparser.add_argument('first', action='append', nargs='?')
The result is:
1840:~/mypy$ python3 stack37147683.py /tmp/test.txt -d -c
Namespace(do_something=True, first=['/tmp/test.txt'], outfile=<_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'>)
Namespace(clear_screen=True, do_something=True, first=['/tmp/test.txt', None], outfile=<_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'>)
From the first parser, first=['/tmp/test.txt']; from the second, first=['/tmp/test.txt', None].
Because of the append, the item from the first is preserved, and a new default has been added by the second parser.

argparse python argument from directory

I have a file structure like this:
project/
    main_prog.py
    tools/
        script.py
        md_script/
            __init__.py
            md_script.py
I search in tools for local Python modules; in this example it's md_script. I want to use it as a positional argument, like install in my code, but when I use it I get an error:
./jsh.py md_script
usage: jsh.py [-h] {install,call,list,log,restore} ... [md_script]
jsh.py: error: invalid choice: 'md_script' (choose from 'install', 'call', 'list', 'log', 'restore')
Python 3.4 on Ubuntu 14.10
Here is my code:
parser = argparse.ArgumentParser(prog='jsh.py',
                                 description='Some help.', epilog='Example of usage: some help')
subparsers = parser.add_subparsers()
parser_install = subparsers.add_parser('install', help = 'Install new project.')
parser_install.add_argument('install', nargs='?', help = 'Name of project to be installed')
if os.path.isdir(full/path/to/tools/):
    name_arg = next(os.walk(full/path/to/tools))[1]
    tools_arg = parser.add_argument_group('Tools', 'Modules from tools')
    for element in name_arg:
        tools_arg.add_argument(element, nargs='?', help='md_script description')
args = parser.parse_args()
try:
    if not len(sys.argv) > 1:
        parser.print_help()
    elif 'install' in args:
        do_some_stuff
    elif element in args:
        do_some_md_script_stuff
    else:
        parser.print_help()
The usage line shows what's wrong:
usage: jsh.py [-h] {install,call,list,log,restore} ... [md_script]
You need to use something like
jsh.py install md_script
You specified subparsers, so you have to give it a subparser name.
From the usage it also looks like you created other subparsers, call, list, etc that you don't show in the code.
You also define positional arguments after creating the subparsers. That's where the [md_script] comes from. Be careful about creating a lot of nargs='?' positionals (including the argument for the install subparser); this could make things confusing for your users, and in fact it seems to be confusing you. Remember that the subparsers entry is in effect a positional argument (one that requires 1 string).
I'd suggest experimenting with a simpler parser before creating one this complicated.
So from your comments and examples I see that your goal is to let the user name a module, so your script can invoke it in some way or other. For that, populating the subparsers with these names makes sense.
I wonder why you also create an optional positional argument with the same name:
module_pars = subparsers.add_parser(element, help = 'Modules from tools')
module_pars.add_argument(element, nargs='?', help=element+' description')
Is that because you are using the presence of the attribute as evidence that this subparser was invoked?
elif element in args:
    do_some_md_script_stuff
The argparse documentation has a couple of other ideas.
One particularly effective way of handling sub-commands is to combine the use of the add_subparsers() method with calls to set_defaults() so that each subparser knows which Python function it should execute.
and
However, if it is necessary to check the name of the subparser that was invoked, the dest keyword argument to the add_subparsers() call will work:
These avoid the messiness of a '?' positional argument, freeing you to use subparser arguments for real information.
subparsers = parser.add_subparsers(dest='module')
....
for element in name_arg:
# module_pars = 'parser_'+element # this does nothing
module_pars = subparsers.add_parser(element, help = 'Modules from tools')
module_pars.set_defaults(func = do_some_md_script_stuff)
# or module_pars.set_defaults(element='I am here')
module_pars.add_argument('real_argument')
Now you can check:
if args.module == 'md_script':
    do_some_md_script_stuff(args)
or
if hasattr(args, 'func'):
    args.func(args)
With the alternative set_defaults, your original test should still work:
if element in args:
    do_some_md_script_stuff
I did it like this, and it's exactly what I wanted.
if os.path.isdir(TOOLS_PATH):
    name_arg = next(os.walk(TOOLS_PATH))[1]
    for element in name_arg:
        module_pars = 'parser_'+element
        module_pars = subparsers.add_parser(element, help = 'Modules from tools')
        module_pars.add_argument(element, nargs='?', help=element+' description')
I didn't test it, because I don't have a test module, but ./jsh.py md_script goes into the elif element in args: branch and prints 'md_script', so it looks like it works.
Thanks for all replies.
Edit: I tested it. In add_argument I had to change nargs='?' to nargs='*' to catch more than one argument.
And to catch arguments from the command line I used this:
elif args:
    for element in name_arg:
        if element in args:
            element_arg = sys.argv[2:]
            done_cmd,msg = opt_exec_module(element,*element_arg)
            my_logger(done_cmd,msg)
Not very elegant but it works.

argparse - Combining parent parser, subparsers and default values

I wanted to define different subparsers in a script, with both inheriting options from a common parent, but with different defaults. It doesn't work as expected, though.
Here's what I did:
import argparse
# this is the top level parser
parser = argparse.ArgumentParser(description='bla bla')
# this serves as a parent parser
base_parser = argparse.ArgumentParser(add_help=False)
base_parser.add_argument('-n', help='number', type=int)
# subparsers
subparsers = parser.add_subparsers()
subparser1= subparsers.add_parser('a', help='subparser 1',
parents=[base_parser])
subparser1.set_defaults(n=50)
subparser2 = subparsers.add_parser('b', help='subparser 2',
parents=[base_parser])
subparser2.set_defaults(n=20)
args = parser.parse_args()
print args
When I run the script from the command line, this is what I get:
$ python subparse.py b
Namespace(n=20)
$ python subparse.py a
Namespace(n=20)
Apparently, the second set_defaults overwrites the first one in the parent. Since there wasn't anything about it in the argparse documentation (which is pretty detailed), I thought this might be a bug.
Is there some simple solution for this? I could check the args variable afterwards and replace None values with the intended defaults for each subparser, but that's what I expected argparse to do for me.
This is Python 2.7, by the way.
set_defaults loops through the actions of the parser, and sets each default attribute:
def set_defaults(self, **kwargs):
    ...
    for action in self._actions:
        if action.dest in kwargs:
            action.default = kwargs[action.dest]
Your -n argument (an action object) was created when you defined the base_parser. When each subparser is created using parents, that action is added to the ._actions list of each subparser. It doesn't define new actions; it just copies pointers.
So when you use set_defaults on subparser2, you modify the default for this shared action.
This Action is probably the 2nd item in the subparser1._actions list (the help action -h is the first).
subparser1._actions[1].dest # 'n'
subparser1._actions[1] is subparser2._actions[1] # true
If that 2nd statement is True, that means the same action is in both lists.
If you had defined -n individually for each subparser, you would not see this. They would have different action objects.
I'm working from my knowledge of the code, not anything in the documentation. It was pointed out recently in Cause Python's argparse to execute action for default that the documentation says nothing about add_argument returning an Action object. Those objects are an important part of the code organization, but they don't get much attention in the documentation.
Copying parent actions by reference also creates problems if the 'resolve' conflict handler is used, and the parent needs to be reused. This issue was raised in
argparse conflict resolver for options in subcommands turns keyword argument into positional argument
and Python bug issue:
http://bugs.python.org/issue22401
A possible solution, for both this issue and that, is to (optionally) make a copy of the action, rather than share the reference. That way the option_strings and defaults can be modified in the children without affecting the parent.
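One way to sketch that "copy instead of share" idea today, without patching argparse, is to copy the parent's actions by hand instead of using parents= (this pokes at private attributes, so it is illustrative only):
import argparse
import copy

base_parser = argparse.ArgumentParser(add_help=False)
base_parser.add_argument('-n', help='number', type=int)

parser = argparse.ArgumentParser(description='bla bla')
subparsers = parser.add_subparsers()

for name, default in [('a', 50), ('b', 20)]:
    sub = subparsers.add_parser(name, help='subparser ' + name)
    for action in base_parser._actions:        # _actions is a private attribute
        sub._add_action(copy.copy(action))     # shallow copy, so no longer shared
    sub.set_defaults(n=default)

print(parser.parse_args(['a']))   # Namespace(n=50)
print(parser.parse_args(['b']))   # Namespace(n=20)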
What's happening
The problem here is that parser arguments are objects, and when a parser inherits from its parents, it adds a reference to the parent's action to its own list. When you call set_defaults, it sets the default on this object, which is shared across the subparsers.
You can examine the subparsers to see this:
>>> a1 = [ action for action in subparser1._actions if action.dest=='n' ].pop()
>>> a2 = [ action for action in subparser2._actions if action.dest=='n' ].pop()
>>> a1 is a2 # same object in memory
True
>>> a1.default
20
>>> type(a1)
<class 'argparse._StoreAction'>
First solution: Explicitly add this argument to each subparser
You can fix this by adding the argument to each subparser separately rather than adding it to the base class.
subparser1= subparsers.add_parser('a', help='subparser 1',
parents=[base_parser])
subparser1.add_argument('-n', help='number', type=int, default=50)
subparser2= subparsers.add_parser('b', help='subparser 2',
parents=[base_parser])
subparser2.add_argument('-n', help='number', type=int, default=20)
...
Second solution: multiple base classes
If there are many subparsers which share the same default value, and you want to avoid this, you can create different base classes for each default. Since parents is a list of base classes, you can still group the common parts into another base class, and pass the subparser multiple base classes to inherit from. This is probably unnecessarily complicated.
import argparse
# this is the top level parser
parser = argparse.ArgumentParser(description='bla bla')
# this serves as a parent parser
base_parser = argparse.ArgumentParser(add_help=False)
# add common args
# for group with 50 default
base_parser_50 = argparse.ArgumentParser(add_help=False)
base_parser_50.add_argument('-n', help='number', type=int, default=50)
# for group with 20 default
base_parser_20 = argparse.ArgumentParser(add_help=False)
base_parser_20.add_argument('-n', help='number', type=int, default=20)
# subparsers
subparsers = parser.add_subparsers()
subparser1= subparsers.add_parser('a', help='subparser 1',
parents=[base_parser, base_parser_50])
subparser2 = subparsers.add_parser('b', help='subparser 2',
parents=[base_parser, base_parser_20])
args = parser.parse_args()
print args
First solution with shared args
You can also share a dictionary for the arguments and use unpacking to avoid repeating all the arguments:
import argparse
# this is the top level parser
parser = argparse.ArgumentParser(description='bla bla')
n_args = '-n',
n_kwargs = {'help': 'number', 'type': int}
# subparsers
subparsers = parser.add_subparsers()
subparser1= subparsers.add_parser('a', help='subparser 1')
subparser1.add_argument(*n_args, default=50, **n_kwargs)
subparser2 = subparsers.add_parser('b', help='subparser 2')
subparser2.add_argument(*n_args, default=20, **n_kwargs)
args = parser.parse_args()
print args
I wanted multiple subparsers to inherit common arguments as well, but the parents functionality from argparse gave me issues too as the others have explained. Fortunately, there's a very simple solution: create a function to add the arguments instead of creating a parent.
I pass both subparser1 and subparser2 to a function, parent_parser, which adds the common argument, -n.
import argparse
# this is the top level parser
parser = argparse.ArgumentParser(description='bla bla')
# this serves as a parent parser
def parent_parser(parser_to_update):
    parser_to_update.add_argument('-n', help='number', type=int)
    return parser_to_update
# subparsers
subparsers = parser.add_subparsers()
subparser1 = subparsers.add_parser('a', help='subparser 1')
subparser1 = parent_parser(subparser1)
subparser1.set_defaults(n=50)
subparser2 = subparsers.add_parser('b', help='subparser 2')
subparser2 = parent_parser(subparser2)
subparser2.set_defaults(n=20)
args = parser.parse_args()
print(args)
When I run the script:
$ python subparse.py b
Namespace(n=20)
$ python subparse.py a
Namespace(n=50)
