Set default for all subparsers on top level parser - python

I have an argparse parser with several subcommands, some of which share an option (via a parent parser). Now I want to set a default value for such an option regardless of which subparser is executed in the end. My non-working code looks like this:
from argparse import ArgumentParser
base = ArgumentParser(add_help=False)
base.add_argument('--foo', action='store_true')
parser = ArgumentParser()
subparsers = parser.add_subparsers(dest='action')
s1 = subparsers.add_parser('a', parents=[base])
s2 = subparsers.add_parser('b', parents=[base])
parser.set_defaults(foo=42)
print(parser.parse_args(['a']))
s1.set_defaults(foo=43)
print(parser.parse_args(['a']))
This prints
Namespace(action='a', foo=False)
Namespace(action='a', foo=43)
I have many subparsers and many options, so I want to avoid saving every subparser by name and calling set_defaults on each one. Can that be done?
I will only know the value I want to set after creating all the parsers, so I cannot specify the defaults in the call to add_argument.
Background: what I am actually working on
The defaults I want to set come from a config file. I actually have two parsers: one that only finds the config file, and one that parses the subcommands. I need to define both parsers up front because I override the help method of the first parser with the help method of the second parser, so that the full --help text can be displayed before the config is parsed (parsing the config might fail, in which case I could not display the help text). A reduced version of my code looks like this:
import argparse
base = argparse.ArgumentParser(add_help=False)
base.add_argument("--config", help="config file to use")
p1 = argparse.ArgumentParser(parents=[base])
p1.add_argument('remainder', nargs=argparse.REMAINDER)
p2 = argparse.ArgumentParser(parents=[base])
s = p2.add_subparsers(dest='action')
s1 = s.add_parser('a') # add some options
s2 = s.add_parser('b') # add some options
# and so on
p1.print_help = p2.print_help
a1 = p1.parse_args()
config = load_my_config(a1.config)
p2.set_defaults(**config.get_my_defaults())
a2 = p2.parse_args(a1.remainder)

I found a solution for listing all the subparsers: don't keep a variable for each individual sub-parser, only the _SubParsersAction object that was used to create them:
import argparse
p = argparse.ArgumentParser()
s = p.add_subparsers()
a = s.add_parser('a')
a.add_argument(...)
b = s.add_parser('b')
c = s.add_parser('c')
...
# now I don't need to remember all the variables a, b, c, ...
# in order to set the defaults on all of these sub-parsers
config = load_my_config_file()
defaults = config.get_defaults()
for name, sparser in s.choices.items():
    print("Setting defaults on sub-parser for '{}'".format(name))
    sparser.set_defaults(**defaults)
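Tying this back to the background code above, the loaded defaults can then be pushed both to the top-level parser p2 and to every sub-parser created through the saved _SubParsersAction s. A rough sketch, assuming the load_my_config/get_my_defaults helpers from the background snippet:
defaults = load_my_config(a1.config).get_my_defaults()
p2.set_defaults(**defaults)                # covers options defined directly on p2
for name, sparser in s.choices.items():    # s.choices maps sub-command name -> sub-parser
    sparser.set_defaults(**defaults)
a2 = p2.parse_args(a1.remainder)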

Related

Parsing args with different set of default values

I have a set of parameters for training and a set of parameters for tuning. They share the same names but have different default values. I'd like to use argparse to define which group of default values to use and also parse the values.
I have learned it is possible to use add_subparsers to set up a subparser for each mode. However, the parameters are identical, which means I'd have to define the same parameters twice (which is very long).
I also tried using two parsers: the first one parses a few args to determine which group of default values to use, and then parser.set_defaults(**defaults) sets the default values on the second parser, like this:
train_defaults = dict(
optimizer='AdamW',
lr=1e-3,
strategy='linear',
warmup_steps=5_000,
weight_decay=0.3
)
tune_defaults = dict(
optimizer='SGD',
lr=1e-2,
strategy='cosine',
warmup_steps=500,
weight_decay=0.0
)
selector = argparse.ArgumentParser(description='Mode Selector')
mode = selector.add_mutually_exclusive_group()
mode.add_argument('-t', '--train', action='store_true', help='train model')
mode.add_argument('-u', '--tune', action='store_true', help='tune model')
select, unknown = selector.parse_known_args()
defaults = tune_defaults if select.tune else train_defaults
parser.set_defaults(**defaults)
args, unknown = parser.parse_known_args()
But the two parsers conflict on some args; for example, -td refers to --train_data in parser, but it is also picked up by selector, which raises an exception:
usage: run.py [-h] [-pt | -pa] [-t] [-u] [-v]
run.py: error: argument -t/--train: ignored explicit argument 'd'
(This is an MWE; the actual args may vary.)
The multiple parsers solution, as you are finding, can be error-prone. I see two alternatives:
Use environment variables
Something like this:
import os
do_tuning = os.getenv("DO_THE_TUNING_MODE", None) is not None
...
defaults = tune_defaults if do_tuning else train_defaults
parser = argparse.ArgumentParser()
...
parser.set_defaults(**defaults)
args, unknown = parser.parse_known_args()
Use it like this:
DO_THE_TUNING_MODE=1 run.py <options>
or
export DO_THE_TUNING_MODE=1
run.py <options>
(or of course, don't set for training mode)
Pros:
Tuning/selection method is outside the parser so you don't get conflicts
A user can set a "state" in their shell session to tuning or training and not have to continuously set the option when running
Cons:
An environment variable is less straightforward to set than calling a command-line option for one-time use
It is easy to forget what your environment variable is set to
Use subparsers
This is probably the best solution. You indicated that you did not want to do that because you have so many options, but that's what functions are for.
def add_parsing_options(parser):
# All your 40 options go here
parser.add_argument(...)
parser.add_argument(...)
parser = argparse.ArgumentParser()
subparsers = parser.add_subparsers()
tuning_parser = subparsers.add_parser("tune")
training_parser = subparsers.add_parser("train")
add_parsing_options(tuning_parser)
add_parsing_options(training_parser)
tuning_parser.set_defaults(**tune_defaults)
training_parser.set_defaults(**train_defaults)
args, unknown = parser.parse_known_args()
Call it like this:
run.py train <options>
or
run.py tune <options>
Pros:
It is explicit when using the tool which mode is being used
Cons:
It is an extra parameter to type every time the tool is used
I partially resolved the question with some hard-coding.
Since the first parser is only used to set the default parameters of the second parser, it has only a few arguments; in my case, 2.
So what I did is split sys.argv into two parts:
import sys
select, unknown = selector.parse_known_args(sys.argv[:3])
args, unknown = parser.parse_known_args(sys.argv[3:])
Pros:
has most, if not all, of the pros of the other methods
no extra parameter to type every time
Cons:
the value 3 is a hard-coded "hyperparameter"

How to make conditional arguments using argparse? [duplicate]

I have done as much research as possible, but I haven't found the best way to make certain command-line arguments required only under certain conditions - in this case, only if other arguments have been given. Here's what I want to do at a very basic level:
p = argparse.ArgumentParser(description='...')
p.add_argument('--argument', required=False)
p.add_argument('-a', required=False) # only required if --argument is given
p.add_argument('-b', required=False) # only required if --argument is given
From what I have seen, other people seem to just add their own check at the end:
if args.argument and (args.a is None or args.b is None):
# raise argparse error here
Is there a way to do this natively within the argparse package?
I've been searching for a simple answer to this kind of question for some time. All you need to do is check if '--argument' is in sys.argv, so basically for your code sample you could just do:
import argparse
import sys
if __name__ == '__main__':
p = argparse.ArgumentParser(description='...')
p.add_argument('--argument', required=False)
p.add_argument('-a', required='--argument' in sys.argv) #only required if --argument is given
p.add_argument('-b', required='--argument' in sys.argv) #only required if --argument is given
args = p.parse_args()
This way, required receives either True or False depending on whether the user has used --argument. I already tested it; it seems to work, and it guarantees that -a and -b behave independently of each other.
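One caveat with this approach: it is a literal string match on sys.argv, so spellings that argparse itself accepts are not detected. A small illustration (assuming allow_abbrev is left at its default):
import sys
# argparse accepts '--argument=VALUE' and unambiguous abbreviations such as
# '--argum VALUE', but the membership test only sees the exact strings.
sys.argv = ['prog.py', '--argument=VALUE', '-a', '1', '-b', '2']
print('--argument' in sys.argv)   # False, although argparse would accept this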
You can implement a check by providing a custom action for --argument, which will take an additional keyword argument to specify which other action(s) should become required if --argument is used.
import argparse
class CondAction(argparse.Action):
def __init__(self, option_strings, dest, nargs=None, **kwargs):
x = kwargs.pop('to_be_required', [])
super(CondAction, self).__init__(option_strings, dest, **kwargs)
self.make_required = x
def __call__(self, parser, namespace, values, option_string=None):
for x in self.make_required:
x.required = True
try:
return super(CondAction, self).__call__(parser, namespace, values, option_string)
except NotImplementedError:
pass
p = argparse.ArgumentParser()
x = p.add_argument("--a")
p.add_argument("--argument", action=CondAction, to_be_required=[x])
The exact definition of CondAction will depend on what, exactly, --argument should do. But, for example, if --argument is a regular, take-one-argument-and-save-it type of action, then just inheriting from argparse._StoreAction should be sufficient.
In the example parser, we save a reference to the --a option inside the --argument option, and when --argument is seen on the command line, it sets the required flag on --a to True. Once all the options are processed, argparse verifies that any option marked as required has been set.
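For illustration, a minimal sketch of that variant, assuming --argument is a plain store-one-value option (CondStoreAction and to_be_required are names chosen here, and _StoreAction is a private class that may change between Python versions):
import argparse

class CondStoreAction(argparse._StoreAction):
    def __init__(self, option_strings, dest, **kwargs):
        # Remember which actions should become required once this option is seen.
        self.make_required = kwargs.pop('to_be_required', [])
        super(CondStoreAction, self).__init__(option_strings, dest, **kwargs)

    def __call__(self, parser, namespace, values, option_string=None):
        for action in self.make_required:
            action.required = True
        # Store the value exactly like a normal '--argument VALUE' option.
        super(CondStoreAction, self).__call__(parser, namespace, values, option_string)

p = argparse.ArgumentParser()
a = p.add_argument('-a')
b = p.add_argument('-b')
p.add_argument('--argument', action=CondStoreAction, to_be_required=[a, b])
print(p.parse_args())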
Your post-parsing test is fine, especially if testing for defaults with is None suits your needs.
http://bugs.python.org/issue11588 'Add "necessarily inclusive" groups to argparse' looks into implementing tests like this using the groups mechanism (a generalization of mutually exclusive groups).
I've written a set of UsageGroups that implement tests like xor (mutually exclusive), and, or, and not. I thought those were comprehensive, but I haven't been able to express your case in terms of those operations. (It looks like I need nand - not-and; see below.)
This script uses a custom Test class that essentially implements your post-parsing test. seen_actions is a list of the Actions that the parser has seen.
class Test(argparse.UsageGroup):
def _add_test(self):
self.usage = '(if --argument then -a and -b are required)'
def testfn(parser, seen_actions, *vargs, **kwargs):
"custom error"
actions = self._group_actions
if actions[0] in seen_actions:
if actions[1] not in seen_actions or actions[2] not in seen_actions:
msg = '%s - 2nd and 3rd required with 1st'
self.raise_error(parser, msg)
return True
self.testfn = testfn
self.dest = 'Test'
p = argparse.ArgumentParser(formatter_class=argparse.UsageGroupHelpFormatter)
g1 = p.add_usage_group(kind=Test)
g1.add_argument('--argument')
g1.add_argument('-a')
g1.add_argument('-b')
print(p.parse_args())
Sample output is:
1646:~/mypy/argdev/usage_groups$ python3 issue25626109.py --arg=1 -a1
usage: issue25626109.py [-h] [--argument ARGUMENT] [-a A] [-b B]
(if --argument then -a and -b are required)
issue25626109.py: error: group Test: argument, a, b - 2nd and 3rd required with 1st
The usage and error messages still need work, and it doesn't do anything that a post-parsing test can't.
Your test raises an error if (argument & (!a or !b)). Conversely, what is allowed is !(argument & (!a or !b)) = !(argument & !(a and b)). By adding a nand test to my UsageGroup classes, I can implement your case as:
p = argparse.ArgumentParser(formatter_class=argparse.UsageGroupHelpFormatter)
g1 = p.add_usage_group(kind='nand', dest='nand1')
arg = g1.add_argument('--arg', metavar='C')
g11 = g1.add_usage_group(kind='nand', dest='nand2')
g11.add_argument('-a')
g11.add_argument('-b')
The usage is (using !() to mark a 'nand' test):
usage: issue25626109.py [-h] !(--arg C & !(-a A & -b B))
I think this is the shortest and clearest way of expressing this problem using general purpose usage groups.
In my tests, inputs that parse successfully are:
''
'-a1'
'-a1 -b2'
'--arg=3 -a1 -b2'
Ones that are supposed to raise errors are:
'--arg=3'
'--arg=3 -a1'
'--arg=3 -b2'
For optional arguments I've come up with a quick-and-dirty solution like this.
Assumptions: (1) '--help' should display the help and not complain about missing required arguments, and (2) we're parsing sys.argv.
p = argparse.ArgumentParser(...)
p.add_argument('-required', ..., required = '--help' not in sys.argv )
This can easily be modified to match a specific setting.
For required positionals (which will become unrequired if e.g. '--help' is given on the command line) I've come up with the following: [positionals do not allow for a required=... keyword arg!]
p.add_argument('pattern', ..., nargs = '+' if '--help' not in sys.argv else '*' )
Basically, this turns the number of required occurrences of 'pattern' on the command line from one-or-more into zero-or-more when '--help' is specified.
Here is a simple and clean solution with these advantages:
No ambiguity or loss of functionality caused by oversimplified parsing with the in sys.argv test.
No need to implement a special argparse.Action or argparse.UsageGroup class.
Simple usage even for multiple and complex deciding arguments.
I noticed just one considerable drawback (which some may find desirable): The help text changes according to the state of the deciding arguments.
The idea is to use argparse twice:
Parse the deciding arguments first, instead of the oversimplified in sys.argv test. For this we use a short parser that does not show help, together with .parse_known_args(), which ignores unknown arguments.
Parse everything normally while reusing the parser from the first step as a parent and having the results from the first parser available.
import argparse
# First parse the deciding arguments.
deciding_args_parser = argparse.ArgumentParser(add_help=False)
deciding_args_parser.add_argument(
'--argument', required=False, action='store_true')
deciding_args, _ = deciding_args_parser.parse_known_args()
# Create the main parser with the knowledge of the deciding arguments.
parser = argparse.ArgumentParser(
description='...', parents=[deciding_args_parser])
parser.add_argument('-a', required=deciding_args.argument)
parser.add_argument('-b', required=deciding_args.argument)
arguments = parser.parse_args()
print(arguments)
Until http://bugs.python.org/issue11588 is solved, I'd just use nargs:
p = argparse.ArgumentParser(description='...')
p.add_argument('--arguments', required=False, nargs=2, metavar=('A', 'B'))
This way, if anybody supplies --arguments, it will have 2 values.
Maybe the resulting CLI is a bit less readable, but the code is much smaller. You can compensate for that with good docs/help.
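For example, unpacking afterwards could look like this (A_VAL and B_VAL are placeholders):
args = p.parse_args(['--arguments', 'A_VAL', 'B_VAL'])
if args.arguments is not None:
    a, b = args.arguments   # always exactly two values when --arguments is given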
This is really the same as #Mira's answer, but I wanted to show it for the case where a given option value makes extra args required:
For instance, if --option foo is given then some args are also required that are not required if --option bar is given:
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument('--option', required=True,
help='foo and bar need different args')
if 'foo' in sys.argv:
parser.add_argument('--foo_opt1', required=True,
help='--option foo requires "--foo_opt1"')
parser.add_argument('--foo_opt2', required=True,
help='--option foo requires "--foo_opt2"')
...
if 'bar' in sys.argv:
parser.add_argument('--bar_opt', required=True,
help='--option bar requires "--bar_opt"')
...
It's not perfect - for instance, proggy --option foo --foo_opt1 bar is ambiguous - but for what I needed to do it's OK.
Add an additional simple "pre"-parser to check for --argument, and use parse_known_args():
pre = argparse.ArgumentParser(add_help=False)  # add_help=False so the pre-parser does not swallow -h/--help
pre.add_argument('--argument', required=False, action='store_true', default=False)
args_pre, _ = pre.parse_known_args()
p = argparse.ArgumentParser()
p.add_argument('--argument', required=False)
p.add_argument('-a', required=args_pre.argument)
p.add_argument('-b', required=args_pre.argument)

Python argparse subparser dest parameter doesn't work with a parent

I'm writing a utility that will have multiple modules, and which module gets run is determined by an argument. Each module has its own arguments, but all modules share 4 standard arguments. To get this to work I just set the 'parents' param when creating the subparsers, but the problem is I also need to be able to determine which module was called on the command line. It looks like the 'dest' param is the way to do this, but for some reason having both 'parents' and 'dest' set at the same time does not work.
import argparse
parser = argparse.ArgumentParser() # main parser
parser.add_argument("--foo", action='store_true')
subparsers = parser.add_subparsers(dest='cmd')
# without 'parents=[parser]' it properly stores 'bar' in cmd
# however '--foo' MUST be before 'bar'
bar = subparsers.add_parser("bar", parents=[parser], add_help=False)
bar.add_argument("--test", action='store_true')
# should be able to have '--foo' before OR after 'bar'
parser.parse_args(['--foo', 'bar', '--test'])
In this code, the add_subparsers call sets dest to 'cmd'. Then I can parse the arguments and read args.cmd to get the name of the module called (in this case, bar). However, when parents is set, the value of cmd is always None. Currently my workaround is to have an empty main parser and simply copy-paste the 4 standard args into every subparser, which works but is not exactly desirable.
My question: Is there another way to determine which module was called? Why does this even happen?
Thanks to the information provided by #hpaulj in the comment to the OP, I managed to get this working.
Basically, you need your main parser and a separate parent parser. You then pass the parent parser in the parents parameter of each subparser. Based on the example you gave, the following should be a working example:
import argparse
# Create parsers
parser = argparse.ArgumentParser()
parent_parser = argparse.ArgumentParser()
# Add arguments to parent parser
parent_parser.add_argument("--foo", action='store_true')
# Create subparser
subparsers = parser.add_subparsers(dest='cmd')
# Add to the subparser
bar = subparsers.add_parser("bar", parents=[parent_parser], add_help=False)
bar.add_argument("--test", action='store_true')
baz = subparsers.add_parser("baz", parents=[parent_parser], add_help=False)
baz.add_argument("--baz-test", action="store_true")
# should be able to have '--foo' before OR after 'bar'
print(parser.parse_args(['bar', '--test']))
print(parser.parse_args(["baz", "--baz-test"]))
This outputs the following:
Namespace(cmd='bar', foo=False, test=True)
Namespace(baz_test=True, cmd='baz', foo=False)
You should then be able to do things like this:
args = parser.parse_args()
if args.cmd == "bar":
print("bar was specified")
elif args.cmd == "baz":
print("baz was specified")
It might not be the perfect solution, but it should work.
(Tested using Python 3.5.2)

Argparse: defaults from file

I have a Python script which takes a lot of arguments.
I currently use a configuration.ini file (read using configparser), but would like to allow the user to override specific arguments using command line.
If I only had two arguments, I'd have used something like:
if not arg1:
arg1 = config[section]['arg1']
But I don't want to do that for 30 arguments.
Is there an easy way to take optional arguments from the command line and fall back to the config file for everything else?
Try the following, using dict.update():
import argparse
import configparser
config = configparser.ConfigParser()
config.read('config.ini')
defaults = config['default']
parser = argparse.ArgumentParser()
parser.add_argument('-a', dest='arg1')
parser.add_argument('-b', dest='arg2')
parser.add_argument('-c', dest='arg3')
args = vars(parser.parse_args())
result = dict(defaults)
result.update({k: v for k, v in args.items() if v is not None}) # Update if v is not None
With this example of ini file:
[default]
arg1=val1
arg2=val2
arg3=val3
and
python myargparser.py -a "test"
result would contain:
{'arg1': 'test', 'arg2': 'val2', 'arg3': 'val3'}
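If you'd rather keep working with an argparse-style Namespace than with a plain dict, the merged result can be converted back, for example:
merged = argparse.Namespace(**result)
print(merged.arg1)   # 'test'
print(merged.arg2)   # 'val2'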
You can use a ChainMap from the collections module.
From the doc:
A ChainMap groups multiple dicts or
other mappings together to create a single, updateable view. [...]
Lookups search the underlying mappings successively until a key is
found. [...]
So, you could create
a config dict containing the key-value pairs from your config file,
a cmd_line_args dict containing the ones given on the command line
Then, create a ChainMap:
from collections import ChainMap
combined = ChainMap(cmd_line_args, config)
When you access combined['arg1'], arg1 will first be looked up in the cmd_line_args dict, and if it isn't found there, config['arg1'] will be returned.
You can chain as many dicts as you wish, which lets you combine as many levels of defaults as you wish.
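Put together, a minimal sketch might look like this (assuming the same config.ini with a [default] section as in the previous answer; the option names are illustrative):
import argparse
import configparser
from collections import ChainMap

cfg = configparser.ConfigParser()
cfg.read('config.ini')
config = dict(cfg['default'])   # e.g. {'arg1': 'val1', 'arg2': 'val2', 'arg3': 'val3'}

parser = argparse.ArgumentParser()
parser.add_argument('--arg1')
parser.add_argument('--arg2')
parser.add_argument('--arg3')

# Keep only the options the user actually supplied, so anything missing
# falls through to the config layer of the ChainMap.
cmd_line_args = {k: v for k, v in vars(parser.parse_args()).items() if v is not None}

combined = ChainMap(cmd_line_args, config)
print(combined['arg1'])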
You can use parser.set_defaults() to do a bulk override of the defaults, so that arguments not given on the command line are populated from the config. Conveniently, the default field passed to add_argument still acts as a last resort for arguments that appear neither on the command line nor in the config. The arguments do still need to be added to the parser somehow so that it recognizes them. Mostly, set_defaults() is useful if you already have an argparse parser set up and you just want the defaults to come from the config when they aren't specified on the command line:
import argparse
config = dict(
a=11,
b=13,
c=19
)
parser = argparse.ArgumentParser()
# add arguments to parser ...
parser.add_argument('-a', type=int)
parser.add_argument('-b', type=int)
parser.add_argument('-c', type=int)
parser.set_defaults(**config)
args = parser.parse_args()
print(args)
If you weren't already planning to set up a parser with all available parameters, you could alternatively use your config to set one up, giving the default directly for each argument, so there is no need for the additional set_defaults() step:
import argparse
parser = argparse.ArgumentParser()
config = dict(
a=11,
b=13,
c=19
)
for key, value in config.items():
parser.add_argument(f'-{key}', default=value, type=type(value))
args = parser.parse_args()
print(args)

argparse - Combining parent parser, subparsers and default values

I wanted to define different subparsers in a script, with both inheriting options from a common parent, but with different defaults. It doesn't work as expected, though.
Here's what I did:
import argparse
# this is the top level parser
parser = argparse.ArgumentParser(description='bla bla')
# this serves as a parent parser
base_parser = argparse.ArgumentParser(add_help=False)
base_parser.add_argument('-n', help='number', type=int)
# subparsers
subparsers = parser.add_subparsers()
subparser1= subparsers.add_parser('a', help='subparser 1',
parents=[base_parser])
subparser1.set_defaults(n=50)
subparser2 = subparsers.add_parser('b', help='subparser 2',
parents=[base_parser])
subparser2.set_defaults(n=20)
args = parser.parse_args()
print args
When I run the script from the command line, this is what I get:
$ python subparse.py b
Namespace(n=20)
$ python subparse.py a
Namespace(n=20)
Apparently, the second set_defaults overwrites the first one in the parent. Since there wasn't anything about it in the argparse documentation (which is pretty detailed), I thought this might be a bug.
Is there some simple solution for this? I could check the args variable afterwards and replace None values with the intended defaults for each subparser, but that's what I expected argparse to do for me.
This is Python 2.7, by the way.
set_defaults loops through the actions of the parser, and sets each default attribute:
def set_defaults(self, **kwargs):
...
for action in self._actions:
if action.dest in kwargs:
action.default = kwargs[action.dest]
Your -n argument (an action object) was created when you defined the base_parser. When each subparser is created using parents, that action is added to the ._actions list of each subparser. It doesn't define new actions; it just copies pointers.
So when you use set_defaults on subparser2, you modify the default for this shared action.
This Action is probably the 2nd item in the subparser1._actions list (the -h help action is the first).
subparser1._actions[1].dest # 'n'
subparser1._actions[1] is subparser2._actions[1] # true
If that 2nd statement is True, that means the same action is in both lists.
If you had defined -n individually for each subparser, you would not see this. They would have different action objects.
I'm working from my knowledge of the code, not anything in the documentation. It was pointed out recently in Cause Python's argparse to execute action for default that the documentation says nothing about add_argument returning an Action object. Those objects are an important part of the code organization, but they don't get much attention in the documentation.
Copying parent actions by reference also creates problems if the 'resolve' conflict handler is used, and the parent needs to be reused. This issue was raised in
argparse conflict resolver for options in subcommands turns keyword argument into positional argument
and Python bug issue:
http://bugs.python.org/issue22401
A possible solution, for both this issue and that, is to (optionally) make a copy of the action, rather than share the reference. That way the option_strings and defaults can be modified in the children without affecting the parent.
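For illustration, a rough sketch of that idea applied to the example above; it leans on argparse internals (_actions and _option_string_actions), so treat it as fragile and version-dependent rather than a supported API:
import copy

# Give each subparser its own copy of the shared '-n' action before setting
# per-subparser defaults, so the action inherited from the parent is untouched.
for sub in (subparser1, subparser2):
    for i, action in enumerate(sub._actions):
        if action.dest == 'n':
            clone = copy.copy(action)
            sub._actions[i] = clone
            for opt in clone.option_strings:   # re-point '-n' at the clone
                sub._option_string_actions[opt] = clone

subparser1.set_defaults(n=50)
subparser2.set_defaults(n=20)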
What's happening
The problem here is that parser arguments are objects, and when a parser inherits from its parents, it adds a reference to the parent's action to its own list. When you call set_defaults, it sets the default on this object, which is shared across the subparsers.
You can examine the subparsers to see this:
>>> a1 = [ action for action in subparser1._actions if action.dest=='n' ].pop()
>>> a2 = [ action for action in subparser2._actions if action.dest=='n' ].pop()
>>> a1 is a2 # same object in memory
True
>>> a1.default
20
>>> type(a1)
<class 'argparse._StoreAction'>
First solution: Explicitly add this argument to each subparser
You can fix this by adding the argument to each subparser separately rather than adding it to the base class.
subparser1= subparsers.add_parser('a', help='subparser 1',
parents=[base_parser])
subparser1.add_argument('-n', help='number', type=int, default=50)
subparser2= subparsers.add_parser('b', help='subparser 2',
parents=[base_parser])
subparser2.add_argument('-n', help='number', type=int, default=20)
...
Second solution: multiple base classes
If there are many subparsers which share the same default value, and you want to avoid this, you can create different base classes for each default. Since parents is a list of base classes, you can still group the common parts into another base class, and pass the subparser multiple base classes to inherit from. This is probably unnecessarily complicated.
import argparse
# this is the top level parser
parser = argparse.ArgumentParser(description='bla bla')
# this serves as a parent parser
base_parser = argparse.ArgumentParser(add_help=False)
# add common args
# for group with 50 default
base_parser_50 = argparse.ArgumentParser(add_help=False)
base_parser_50.add_argument('-n', help='number', type=int, default=50)
# for group with 20 default
base_parser_20 = argparse.ArgumentParser(add_help=False)
base_parser_20.add_argument('-n', help='number', type=int, default=20)
# subparsers
subparsers = parser.add_subparsers()
subparser1= subparsers.add_parser('a', help='subparser 1',
parents=[base_parser, base_parser_50])
subparser2 = subparsers.add_parser('b', help='subparser 2',
parents=[base_parser, base_parser_20])
args = parser.parse_args()
print args
First solution with shared args
You can also share a dictionary for the arguments and use unpacking to avoid repeating all the arguments:
import argparse
# this is the top level parser
parser = argparse.ArgumentParser(description='bla bla')
n_args = '-n',
n_kwargs = {'help': 'number', 'type': int}
# subparsers
subparsers = parser.add_subparsers()
subparser1= subparsers.add_parser('a', help='subparser 1')
subparser1.add_argument(*n_args, default=50, **n_kwargs)
subparser2 = subparsers.add_parser('b', help='subparser 2')
subparser2.add_argument(*n_args, default=20, **n_kwargs)
args = parser.parse_args()
print args
I wanted multiple subparsers to inherit common arguments as well, but the parents functionality from argparse gave me issues too, as others have explained. Fortunately, there's a very simple solution: create a function that adds the arguments, instead of creating a parent parser.
I pass both subparser1 and subparser2 to a function, parent_parser, which adds the common argument, -n.
import argparse
# this is the top level parser
parser = argparse.ArgumentParser(description='bla bla')
# this serves as a parent parser
def parent_parser(parser_to_update):
parser_to_update.add_argument('-n', help='number', type=int)
return parser_to_update
# subparsers
subparsers = parser.add_subparsers()
subparser1 = subparsers.add_parser('a', help='subparser 1')
subparser1 = parent_parser(subparser1)
subparser1.set_defaults(n=50)
subparser2 = subparsers.add_parser('b', help='subparser 2')
subparser2 = parent_parser(subparser2)
subparser2.set_defaults(n=20)
args = parser.parse_args()
print(args)
When I run the script:
$ python subparse.py b
Namespace(n=20)
$ python subparse.py a
Namespace(n=50)
