Python Click: custom error message

I use the excellent Python Click library for handling command line options in my tool. Here's a simplified version of my code (full script here):
@click.command(
    context_settings = dict( help_option_names = ['-h', '--help'] )
)
@click.argument('analysis_dir',
    type = click.Path(exists=True),
    nargs = -1,
    required = True,
    metavar = "<analysis directory>"
)
def mytool(analysis_dir):
    """ Do stuff """

if __name__ == "__main__":
    mytool()
If someone runs the command without any flags, they get the default click error message:
$ mytool
Usage: mytool [OPTIONS] <analysis directory>
Error: Missing argument "analysis_dir".
This is nice, but I'd quite like to tell (very) novice users that more help is available by using the help flag. In other words, add a custom sentence to the error message when the command is invalid telling people to try mytool --help for more information.
Is there an easy way to do this? I know I could remove the required attribute and handle this logic in the main function, but that feels kind of hacky for such a minor addition.
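For reference, that workaround would look roughly like this (a sketch based on the simplified example above; the exact messages are just illustrative):
import click

@click.command(context_settings=dict(help_option_names=['-h', '--help']))
@click.argument('analysis_dir', type=click.Path(exists=True), nargs=-1,
                metavar="<analysis directory>")  # note: no required=True
def mytool(analysis_dir):
    """ Do stuff """
    if not analysis_dir:
        # do the check by hand and print the extra hint
        click.echo("Error: Missing argument <analysis directory>.", err=True)
        click.echo("Try 'mytool --help' for more information.", err=True)
        raise SystemExit(2)

if __name__ == "__main__":
    mytool()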

Message construction for most errors in python-click is handled by the show method of the UsageError class: click.exceptions.UsageError.show.
So, if you redefine this method, you will be able to create your own customized error message. Below is an example of a customization which appends the help menu to any error message which answers this SO question:
def modify_usage_error(main_command):
    '''
    a method to append the help menu to a usage error

    :param main_command: top-level group or command object constructed by click wrapper
    :return: None
    '''
    from click._compat import get_text_stderr
    from click.utils import echo

    def show(self, file=None):
        import sys
        if file is None:
            file = get_text_stderr()
        color = None
        if self.ctx is not None:
            color = self.ctx.color
            echo(self.ctx.get_usage() + '\n', file=file, color=color)
        echo('Error: %s\n' % self.format_message(), file=file, color=color)
        # re-invoke the command with no arguments so the full help is printed
        sys.argv = [sys.argv[0]]
        main_command()

    click.exceptions.UsageError.show = show
Once you define your main command, you can then run the modifier script:
import click

@click.group()
def cli():
    pass

modify_usage_error(cli)
I have not explored whether there are runtime invocations of ClickException other than usage errors. If there are, then you might need to modify your custom error handler to first check that the exception has a ctx attribute before you add the line click.exceptions.ClickException.show = show, since it does not appear that ClickException is fed ctx at initialization.
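If all you want is the extra sentence from the original question, a lighter variant of the same idea is to wrap the existing show method rather than re-invoking the command. Here is a minimal sketch, assuming UsageError.show keeps its (self, file=None) signature; the hint text is just an example:
import click
from click.exceptions import UsageError

_original_show = UsageError.show

def show_with_hint(self, file=None):
    # print the normal usage + error output first
    _original_show(self, file)
    # then append a pointer to the help flag
    click.echo("Try 'mytool --help' for more information.", err=True)

UsageError.show = show_with_hint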

Related

Display full help text in python click

I am having the following problem and I fear there isn't a straightforward way to solve it, so I am asking here. I am using Click to implement a CLI, and I have created several grouped commands under the main command. This is the code:
@click.group()
def main():
    pass

@main.command()
def getq():
    '''Parameters: --questionnaire_id, --question_id, --session_id, --option_id'''
    click.echo('Question Answers')
When I type the main command alone in my terminal it lists all the subcommands with the help text next to each one. However, the text is not displayed fully for the case of getq. Instead, it displays only "Parameters: --questionnaire_id, --question_id,... ."
Is there a way to display it all?
Thank You
The easiest way to do this is to use the command's short_help argument:
@click.group()
def main():
    pass

@main.command(short_help='Parameters: --questionnaire_id, --question_id, --session_id, --option_id')
def getq():
    click.echo('Question Answers')
If you insist on using the docstring for this and want to override its automatic shortening, you could use a custom Group class that overrides the format_commands method to use cmd.help directly instead of the get_short_help_str method:
import click
from gettext import gettext as _


class FullHelpGroup(click.Group):
    def format_commands(self, ctx: click.Context, formatter: click.HelpFormatter) -> None:
        """Extra format methods for multi methods that adds all the commands
        after the options.
        """
        commands = []
        for subcommand in self.list_commands(ctx):
            cmd = self.get_command(ctx, subcommand)
            # What is this, the tool lied about a command.  Ignore it
            if cmd is None:
                continue
            if cmd.hidden:
                continue
            commands.append((subcommand, cmd))

        # allow for 3 times the default spacing
        if len(commands):
            limit = formatter.width - 6 - max(len(cmd[0]) for cmd in commands)

            rows = []
            for subcommand, cmd in commands:
                help = cmd.help if cmd.help is not None else ""
                rows.append((subcommand, help))

            if rows:
                with formatter.section(_("Commands")):
                    formatter.write_dl(rows)


@click.group(cls=FullHelpGroup)
def main():
    pass


@main.command()
def getq():
    '''Parameters: --questionnaire_id, --question_id, --session_id, --option_id'''
    click.echo('Question Answers')


if __name__ == "__main__":
    main()
You most probably also want to override max_content_width (at most 80 columns by default). You can do this through the context settings:
import shutil

@click.group(cls=FullHelpGroup,
             context_settings={'max_content_width': shutil.get_terminal_size().columns - 10})
def main():
    pass

Combine cloup.group and click-default-group

I'm using cloup for my CLI for its constraints feature.
I have some commands a and b which have no common arguments.
import cloup

@cloup.group()
def cli():
    pass

@cli.command(show_constraints=True)
@cloup.option("--foo")
def a(**kwargs):
    print("hello")

@cli.command(show_constraints=True)
@cloup.option("--bar")
def b(**kwargs):
    pass

cli()
To start with, I'd like the following (standard) output:
$ python3 main.py
Usage: main.py [OPTIONS] COMMAND [ARGS]...

Options:
  --help  Show this message and exit.

Commands:
  a
  b

$ python3 main.py a --foo hey
hello
So far, this works as expected. Now I also want a to be the default command, thus I'd like to see:
$ python3 main.py --foo hey
hello
I know that I can give cli itself some behaviour, as follows:
@cloup.group(invoke_without_command=True)
def cli():
    print("custom behaviour")
That will give
$ python3 main.py
custom behaviour
I thought that I could forward the call to a in the cli function, but the group cli does not know the option --foo of command a:
$ python3 main.py --foo hey
Usage: main.py [OPTIONS] COMMAND [ARGS]...
Try 'main.py --help' for help.
Error: No such option: --foo
I'm stuck here. I found an answer to the question here (A command without name, in Click), but I have to use cloup.group. So if I applied the solution there ...
@cloup.group(cls=DefaultGroup, default='a', default_if_no_args=True)
def cli():
    pass
... I'd get
Traceback (most recent call last):
  File "main.py", line 11, in <module>
    @cloup.option("--foo")
  File "<SNIP>/.venv_3.6.9/lib/python3.6/site-packages/click/core.py", line 1834, in decorator
    cmd = command(*args, **kwargs)(f)
  File "<SNIP>/.venv_3.6.9/lib/python3.6/site-packages/click/decorators.py", line 184, in decorator
    cmd = _make_command(f, name, attrs, cls)  # type: ignore
  File "<SNIP>/.venv_3.6.9/lib/python3.6/site-packages/click/decorators.py", line 152, in _make_command
    **attrs,
TypeError: __init__() got an unexpected keyword argument 'show_constraints'
And that's only the tip of the iceberg: any other features from cloup.group also become unavailable.
I guess one could merge the groups of cloup and click-default-group, but that looks horribly time-consuming. Is there an easier way to get a default command in cloup?
I also found https://click.palletsprojects.com/en/8.0.x/api/?highlight=group#click.Context.ignore_unknown_options. But if I understood correctly, only commands have a context and groups do not, so it wouldn't help.
Author of Cloup here. You can try this:
"""
This example requires click-default-group.
"""
import cloup
from click import Context, HelpFormatter
from click_default_group import DefaultGroup
class GroupWithDefaultCommand(cloup.Group, DefaultGroup):
# Optional: mark default command with "*"
def format_subcommand_name(
self, ctx: click.Context, name: str, cmd: click.Command
) -> str:
if name == self.default_cmd_name:
name = name + "*"
return super().format_subcommand_name(ctx, name, cmd)
#cloup.group(cls=GroupWithDefaultCommand, default='alice')
def cli():
pass
#cli.command()
#cloup.option("--foo")
def alice(**kwargs):
print("Called alice with", kwargs)
#cli.command()
#cloup.option("--bar")
def bob(**kwargs):
print("Called bob with", kwargs)
if __name__ == '__main__':
cli()
AFAICS, you'll only lose the "Did you mean ...?" suggestion for mistyped commands (from Cloup) and the "*" indicating the default command (from click-default-group); the latter was actually pretty easy to re-implement with the Group.format_subcommand_name method introduced by Cloup, as shown above. Let me know if you find any other problems. If it works well, I'll maybe add it to the examples folder.
Nonetheless, I'd suggest not using a default command at all. In the click-default-group issue tracker, you can see it conflicts with click-help-colors and click-repl. So, unless you're prepared to fix issues that may arise from having a default command, don't have one. As an alternative, you can just suggest that your users define an alias for the default command (e.g. with the Unix alias command).

Save a command line option's value in an object with Python's Click library

I want to parse some command line arguments with Python's Click library and save the provided values in an object.
My first guess would be to do it like this:
import click

class Configuration(object):

    def __init__(self):
        # configuration variables
        self.MyOption = None

        # method call
        self.parseCommandlineArguments()

    @click.command()
    @click.option('--myoption', type=click.INT, default=5)
    def parseCommandlineArguments(self, myoption):
        # save option's value in the object
        self.MyOption = myoption

# create an instance
configuration = Configuration()
print(configuration.MyOption)
However, this does not work; instead I get:
TypeError: parseCommandlineArguments() takes exactly 2 arguments (1 given)
Apparently, passing self to the decorated function is not the correct way to do it. If I remove self from the method arguments then I can e.g. do print(myoption) and it will print 5 on the screen but the value will not be known to any instances of my Configuration() class.
What is the correct way to handle this? I assume it has something to do with context handling in Click but I cannot get it working based on the provided examples.
If I'm understanding you correctly, you want a command line tool that will take configuration options and then do something with those options. If this is your objective then have a look at the example I posted. This example uses command groups and passes a context object through each command. Click has awesome documentation, be sure to read it.
import click
import json


class Configuration(object):
    """
    Having a custom context class is usually not needed.
    See the complex application documentation:
    http://click.pocoo.org/5/complex/
    """
    my_option = None
    number = None
    is_awesome = False
    uber_var = 900

    def make_conf(self):
        self.uber_var = self.my_option * self.number


pass_context = click.make_pass_decorator(Configuration, ensure=True)


@click.group(chain=True)
@click.option('-m', '--myoption', type=click.INT, default=5)
@click.option('-n', '--number', type=click.INT, default=0)
@click.option('-a', '--is-awesome', is_flag=True)
@pass_context
def cli(ctx, myoption, number, is_awesome):
    """
    this is where I will save the configuration
    and do whatever processing that is required
    """
    ctx.my_option = myoption
    ctx.number = number
    ctx.is_awesome = is_awesome
    ctx.make_conf()


@click.command('save')
@click.argument('output', type=click.File('w'))
@pass_context
def save(ctx, output):
    """save the configuration to a file"""
    json.dump(ctx.__dict__, output, indent=4, sort_keys=True)
    return click.secho('configuration saved', fg='green')


@click.command('show')
@pass_context
def show(ctx):
    """print the configuration to stdout"""
    return click.echo(json.dumps(ctx.__dict__, indent=4, sort_keys=True))


cli.add_command(save)
cli.add_command(show)
After this is installed, you can run commands like this:
mycli -m 30 -n 40 -a show
mycli -m 30 -n 40 -a save foo.json
mycli -m 30 -n 40 -a show save foo.json
The complex example is an excellent demo for developing a highly configurable multi chaining command line tool.

Use argparse with Setuptools entry_points

I'm writing a script which I want to distribute using Setuptools. I have added this script to the entry_points section in my setup.py.
From the setuptools docs:
The functions you specify are called with no arguments, and their return value is passed to sys.exit(), so you can return an errorlevel or message to print to stderr.
Since the method returns instead of exiting, it becomes more testable. For testability purposes I accept arguments in the method, defaulting to sys.argv. So far so good.
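For reference, the entry point is declared roughly like this (the package, module, and script names here are placeholders):
from setuptools import setup

setup(
    name='spam',
    py_modules=['spam'],
    entry_points={
        'console_scripts': [
            # main() returns a value, which setuptools passes to sys.exit()
            'spam = spam:main',
        ],
    },
)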
The problem arises when argparse is added to the mix. When argparse fails to parse the args, it calls sys.exit. I would really prefer that argparse didn't do this, since exiting is already handled by the setuptools wrapper. The first thing I could think of was to override argparse.ArgumentParser, but then I saw this:
# ===============
# Exiting methods
# ===============
def exit(self, status=0, message=None):
    if message:
        self._print_message(message, _sys.stderr)
    _sys.exit(status)

def error(self, message):
    """error(message: string)

    Prints a usage message incorporating the message to stderr and
    exits.

    If you override this in a subclass, it should not return -- it
    should either exit or raise an exception.
    """
    self.print_usage(_sys.stderr)
    self.exit(2, _('%s: error: %s\n') % (self.prog, message))
So the docstring states I should not return and stick with raising an exception. How should I solve this?
The main method, in case I didn't explain it thoroughly enough:
def main(args=sys.argv):
    parser = ArgumentParser(prog='spam')
    # parser is configured here
    parsed = parser.parse_args(args)
    # Parsed args are used here
The reason you don't want to return from error is that the parser will continue parsing. Some errors are raised near the end (e.g. about unparsed strings), but others can occur early (e.g. a bad type for the first argument string). The behavior of parse_args is unpredictable if you return from the error method. Normally you want the parser to quit and return control to your code.
What you want to do is wrap the parse_args() call in a try: except SystemExit: block. I often use test scripts like this:
for test in ['-o FILE',
             ...
            ]:
    print(test)
    try:
        print(parser.parse_args(test.split()))
    except SystemExit:
        pass
You could override error and/or exit to raise other kinds of exceptions. They could also bypass the usage message. But one way or another you need to trap the exception in your wrapper.
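As a sketch of that idea (the exception class and option below are made up; the pattern is what matters): subclass ArgumentParser so error raises instead of exiting, catch the exception in the entry-point wrapper, and return a message or status for setuptools to pass to sys.exit().
import argparse
import sys

class UsageError(Exception):
    pass

class RaisingArgumentParser(argparse.ArgumentParser):
    def error(self, message):
        # raise instead of printing usage and calling sys.exit(2)
        raise UsageError(message)

def main(args=None):
    parser = RaisingArgumentParser(prog='spam')
    parser.add_argument('--count', type=int)  # illustrative option
    try:
        parsed = parser.parse_args(args)
    except UsageError as exc:
        # the entry-point wrapper passes this return value to sys.exit(),
        # which prints it to stderr and exits with status 1
        return 'spam: error: %s' % exc
    # ... use parsed here ...
    return 0

if __name__ == '__main__':
    sys.exit(main())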
If you're starting on a fresh project or have time for some refactoring, then you might consider using the Click library. Click has both setuptools integration and 'testability' as features, among other considerations.
Here's an example / test-snippet from the docs that both creates a mini command-line interface, and then tests it immediately:
import click
from click.testing import CliRunner

def test_hello_world():
    @click.command()
    @click.argument('name')
    def hello(name):
        click.echo('Hello %s!' % name)

    runner = CliRunner()
    result = runner.invoke(hello, ['Peter'])
    assert result.exit_code == 0
    assert result.output == 'Hello Peter!\n'

Python argparse and controlling/overriding the exit status code

Apart from tinkering with the argparse source, is there any way to control the exit status code should there be a problem when parse_args() is called, for example, a missing required switch?
I'm not aware of any mechanism to specify an exit code on a per-argument basis. You can catch the SystemExit exception raised on .parse_args() but I'm not sure how you would then ascertain what specifically caused the error.
EDIT: For anyone coming to this looking for a practical solution, the following is the situation:

- ArgumentError() is raised appropriately when arg parsing fails. It is passed the argument instance and a message.
- ArgumentError() does not store the argument as an instance attribute, despite being passed it (which would be convenient).
- It is possible to re-raise the ArgumentError exception by subclassing ArgumentParser, overriding .error() and getting hold of the exception from sys.exc_info().
All that means the following code - whilst ugly - allows us to catch the ArgumentError exception, get hold of the offending argument and error message, and do as we see fit:
import argparse
import sys

class ArgumentParser(argparse.ArgumentParser):
    def _get_action_from_name(self, name):
        """Given a name, get the Action instance registered with this parser.
        If only it were made available in the ArgumentError object. It is
        passed as its first arg...
        """
        container = self._actions
        if name is None:
            return None
        for action in container:
            if '/'.join(action.option_strings) == name:
                return action
            elif action.metavar == name:
                return action
            elif action.dest == name:
                return action

    def error(self, message):
        exc = sys.exc_info()[1]
        if exc:
            exc.argument = self._get_action_from_name(exc.argument_name)
            raise exc
        super(ArgumentParser, self).error(message)

## usage:
parser = ArgumentParser()
parser.add_argument('--foo', type=int)
try:
    parser.parse_args(['--foo=d'])
except argparse.ArgumentError as exc:
    print(exc.message, '\n', exc.argument)
Not tested in any useful way. The usual don't-blame-me-if-it-breaks indemnity applies.
All the answers nicely explain the details of the argparse implementation.
Indeed, as proposed in the PEP (and pointed out by Rob Cowie), one should inherit from ArgumentParser and override the behavior of the error or exit methods.
In my case I just wanted to replace the usage printout with a full help printout in case of an error:
import argparse
import sys

class ArgumentParser(argparse.ArgumentParser):
    def error(self, message):
        self.print_help(sys.stderr)
        self.exit(2, '%s: error: %s\n' % (self.prog, message))
With this override in place, the main code can stay minimal:
# Parse arguments.
args = parser.parse_args()
# On error this will print the help and exit with an explanatory message.
Perhaps catching the SystemExit exception would be a simple workaround:
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('foo')
try:
    args = parser.parse_args()
except SystemExit:
    print("do something else")
Works for me, even in an interactive session.
Edit: Looks like @Rob Cowie beat me to it. Like he said, this doesn't have much diagnostic potential, unless you want to get silly and try to glean info from the traceback.
As of Python 3.9, this is no longer so painful. You can now handle this via the new argparse.ArgumentParser exit_on_error instantiation argument. Here is an example (slightly modified from the python docs: argparse#exit_on_error):
parser = argparse.ArgumentParser(exit_on_error=False)
parser.add_argument('--integers', type=int)
try:
    parser.parse_args('--integers a'.split())
except argparse.ArgumentError:
    print('Catching an argumentError')
    exit(-1)
You'd have to tinker. Look at argparse.ArgumentParser.error, which is what gets called internally. Or you could make the arguments non-mandatory, then check and exit outside argparse.
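A quick sketch of that second suggestion (the option name, message, and exit status below are arbitrary):
import argparse
import sys

parser = argparse.ArgumentParser()
parser.add_argument('--config')  # deliberately not required=True

args = parser.parse_args()
if args.config is None:
    # do the check ourselves so we control the exit status
    print('error: --config is required', file=sys.stderr)
    sys.exit(3)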
You can use one of the exiting methods: http://docs.python.org/library/argparse.html#exiting-methods. It should already handle situations where the arguments are invalid, however (assuming you have defined your arguments properly).
Using invalid arguments:
% [ $(./test_argparse.py > /dev/null 2>&1) ] || { echo error }
error # exited with status code 2
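If you do want to trigger those exiting methods yourself with a custom message or status, a small sketch (the validation, message, and codes are illustrative):
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('mode')
args = parser.parse_args()

if args.mode not in ('fast', 'slow'):
    # error() prints the usage line plus the message and exits with status 2
    parser.error('invalid mode: %r' % args.mode)
    # or, for full control over the status code:
    # parser.exit(status=4, message='invalid mode\n')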
I needed a simple method to catch an argparse error at application start and pass the error to a wxPython form. Combining the best answers from above resulted in the following small solution:
import argparse

# sub class ArgumentParser to catch an error message and prevent application closing
class MyArgumentParser(argparse.ArgumentParser):
    def __init__(self, *args, **kwargs):
        super(MyArgumentParser, self).__init__(*args, **kwargs)
        self.error_message = ''

    def error(self, message):
        self.error_message = message

    def parse_args(self, *args, **kwargs):
        # catch SystemExit exception to prevent closing the application
        result = None
        try:
            result = super().parse_args(*args, **kwargs)
        except SystemExit:
            pass
        return result

# testing -------
my_parser = MyArgumentParser()
my_parser.add_argument('arg1')
my_parser.parse_args()

# check for an error
if my_parser.error_message:
    print(my_parser.error_message)
running it:
>python test.py
the following arguments are required: arg1
Because argparse's error is a method and not an exception class, it's not possible to try/except all "unrecognized arguments" errors. If you want to do that, you need to override the parser's error function:
def print_help(errmsg):
    print(errmsg.split(' ')[0])

parser.error = print_help
args = parser.parse_args()
on an invalid input it will now print:
unrecognised
