I am trying to make two of my argparse arguments mutually exclusive. Basically, I want to prevent the --all option and the filenames argument from being passed together (which I think I have achieved).
But I also want another check: if I pass only python reader.py read --all, the filenames argument should get populated with all .txt files in the current directory.
So far I've come up with the following code:
import argparse
import glob
parser = argparse.ArgumentParser()
subcommands = parser.add_subparsers(title='subcommands')
read_command = subcommands.add_parser('read')
read_command.add_argument('filenames', type=argparse.FileType(), nargs = '+')
read_command.add_argument('-a', '--all', action='store_true')
parsed = parser.parse_args()
if parsed.all and parsed.filenames:
    raise SystemExit
if parsed.all:
    parsed.filenames = glob.glob('*.txt')
print(parsed)
The problem is that if I try to run python reader.py read --all I get error: too few arguments, because of the filenames argument.
Is there a way to make this work the way I want without creating a subcommand under read, for example python reader.py read all?
Also, how can I emit error messages through argparse? I'd like a proper message saying that filenames and --all can't be combined, instead of a bare SystemExit.
I also want to avoid add_mutually_exclusive_group, because this is just a snippet of my real parser where that approach wouldn't work (already checked in another SO topic).
I've heard about custom actions, but it would really help to see an example of one.
If filenames gets nargs="*", it should allow you to use --all alone. parsed.filenames will then be a [], which you can replace with the glob.
You could also test giving that argument a default derived from the glob - but see my caution regarding FileType.
Do you want the parser to open all the filenames you give it? Or would you rather open the files later yourself (preferably in a with context)? FileType opens the files (creating them if necessary), and in the process checks their existence (which is nice), but leaves it up to you (or program exit) to close them.
The documentation talks about issuing error messages yourself, and how to change them. parser.error('my message') will display the usage and the message, and then exit.
if parsed.all and parsed.filenames:
    parser.error("Do you want to read ALL or just %s?" % parsed.filenames)
It is also possible to trap SystemExit exceptions in a try/except clause.
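A minimal sketch of how these suggestions could fit together with the question's parser (filenames is left as plain strings here, per the caution about FileType above; untested):

import argparse
import glob

parser = argparse.ArgumentParser()
subcommands = parser.add_subparsers(title='subcommands')
read_command = subcommands.add_parser('read')
# nargs='*' allows "read --all" with no filenames; plain strings instead of FileType
read_command.add_argument('filenames', nargs='*')
read_command.add_argument('-a', '--all', action='store_true')

parsed = parser.parse_args()
# (assumes the read subcommand was actually given, as in the question)
if parsed.all and parsed.filenames:
    # error() prints the usage plus the message, then exits with status 2
    read_command.error("filenames and --all can't be combined")
if parsed.all:
    parsed.filenames = glob.glob('*.txt')
print(parsed)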
I wonder, is it possible to pass behave arguments, e.g. "-D environment", in some other way? By default they are taken from the behave file.
Maybe there is some way to keep each configuration in a different file? Maybe many behave files? Or something like: behave "path to file with arguments"?
At this point I figured out that I could write bash scripts containing the various configurations ("#!/bin/bash behave ...").
I'm asking because I want to easily manage my configurations when I run "behave -..." without editing many arguments.
I think you could take advantage of using custom json configuration files described in behave Advanced Cases section: https://behave.readthedocs.io/en/latest/new_and_noteworthy_v1.2.5.html#advanced-cases
from documentation:
# -- FILE: features/environment.py
import json
import os.path

def before_all(context):
    """Load and update userdata from JSON configuration file."""
    userdata = context.config.userdata
    configfile = userdata.get("configfile", "userconfig.json")
    if os.path.exists(configfile):
        assert configfile.endswith(".json")
        more_userdata = json.load(open(configfile))
        context.config.update_userdata(more_userdata)
        # -- NOTE: Reapplies userdata_defines from command-line, too.
So, if you'd like to use a custom config by specifying it at the command line, you could run it like this, for example:
behave -D conffile=myconfig.json
Then I would parametrize this line to something like:
myconfigfile = context.config.userdata["conffile"]
configfile = userdata.get("configfile", myconfigfile)
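Put together, a before_all along those lines might look like this (a sketch that assumes the -D conffile=... key from the command above and falls back to userconfig.json when no conffile is given):

# -- FILE: features/environment.py
import json
import os.path

def before_all(context):
    """Load and update userdata from a JSON configuration file."""
    userdata = context.config.userdata
    # picked with: behave -D conffile=myconfig.json
    configfile = userdata.get("conffile", "userconfig.json")
    if os.path.exists(configfile):
        assert configfile.endswith(".json")
        with open(configfile) as f:
            more_userdata = json.load(f)
        context.config.update_userdata(more_userdata)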
I tend to write a lot of command line utility programs and was wondering if
there is a standard way of messaging the user in Python. Specifically, I would like to print error and warning messages, as well as other more conversational output in a manner that is consistent with Unix conventions. I could produce these myself using the built-in print function, but the messages have a uniform structure so it seems like it would be useful to have a package to handle this for me.
For example, for commands that you run directly in the command line you might
get messages like this:
This is normal output.
error: no files given.
error: parse.c: no such file or directory.
error: parse.c:7:16: syntax error.
warning: /usr/lib64/python2.7/site-packages/simplejson:
not found, skipping.
If the commands might be run in a script or pipeline, they should include their name:
grep: /usr/dict/words: no such file or directory.
It would be nice if it could handle levels of verbosity.
These things are all relatively simple in concept, but can result in a lot of
extra conditionals and complexity for each print statement.
I have looked at the logging facility in Python, but it seems overly complicated and more suited for daemons than command line utilities.
I can recommend Inform. It is the only package I have seen that seems to address this need. It provides a variety of print functions that print in different circumstances or with different headers. For example:
log() -- prints to log file, no header
comment() -- prints if verbose, no header
display() -- prints if not quiet, no header
output() -- always prints, no header
warning() -- always prints with warning header
error() -- always prints with error header
fatal() -- always prints with error header, terminates program.
Inform refers to these functions as 'informants'. Informants are very similar to the Python print function in that they take any number of arguments and build the message by joining them together. They also allow you to specify a culprit, which is added to the front of the message.
For example, here is a simple search and replace program written using Inform.
#!/usr/bin/env python3
"""
Replace a string in one or more files.
Usage:
replace [options] <target> <replacement> <file>...
Options:
-v, --verbose indicate whether file is changed
"""
from docopt import docopt
from inform import Inform, comment, error, os_error
from pathlib import Path
# read command line
cmdline = docopt(__doc__)
target = cmdline['<target>']
replacement = cmdline['<replacement>']
filenames = cmdline['<file>']
Inform(verbose=cmdline['--verbose'], prog_name=True)
for filename in filenames:
    try:
        filepath = Path(filename)
        orig = filepath.read_text()
        new = orig.replace(target, replacement)
        comment('updated' if orig != new else 'unchanged', culprit=filename)
        filepath.write_text(new)
    except OSError as e:
        error(os_error(e))
Inform() is used to specify your preferences; comment() and error() are the informants that actually print the messages; and os_error() is a useful utility that converts OSError exceptions into a string that can be used as an error message.
If you were to run this, you might get the following output:
> replace -v tiger toe eeny meeny miny moe
eeny: updated
meeny: unchanged
replace error: miny: no such file or directory.
replace error: moe: no such file or directory.
Hopefully this gives you an idea of what Inform does. There is a lot more power there. For example, it provides a collection of utilities that are useful when printing messages. An example is os_error(), but there are others. You can also define your own informants, which is a way of handling multiple levels of verbosity.
import logging
logging.basicConfig(level=logging.DEBUG, format='%(asctime)s %(levelname)s %(message)s')
The level specified above controls the verbosity of the output.
You can attach handlers (this is where the complexity outweighs the benefit in my case) to the logging to send output to different places (https://docs.python.org/2/howto/logging-cookbook.html#multiple-handlers-and-formatters) but I haven't needed more than command line output to date.
To produce output you specify its verbosity as you log it:
logging.debug("This debug message will rarely appeal to end users")
I hadn't read your very last line; the answer seemed obvious by then, and I wouldn't have imagined that a single basicConfig line could be described as "overly complicated". It's all I use 60% of the time, when print is not enough.
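If you want command-line control over that verbosity, one common pattern is to map a repeatable -v flag onto the logging level (a sketch; the -v option is my own addition, not part of the answer above):

import argparse
import logging

parser = argparse.ArgumentParser()
# -v raises the verbosity: default WARNING, -v gives INFO, -vv gives DEBUG
parser.add_argument('-v', '--verbose', action='count', default=0)
args = parser.parse_args()

level = {0: logging.WARNING, 1: logging.INFO}.get(args.verbose, logging.DEBUG)
logging.basicConfig(level=level, format='%(levelname)s: %(message)s')

logging.debug("only shown with -vv")
logging.info("shown with -v")
logging.warning("always shown")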
I can check the presence of a file or folder using the os library very easily.
The following two links describe that:
directoryExistance fileExistance
I am attempting to use the subprocess library to do the same,
and I have tried a couple of approaches already:
1- status = subprocess.call(['test','-e',<path>]), which always returns 1, no matter what I pass as path.
2- Using getstatusoutput:
status, result = subprocess.getstatusoutput([<path>])
print(status)
print(result)
which works, in a way: the status variable is 126 if the file/folder exists and 127 when it doesn't, and the result variable contains the message /bin/sh: 1: : Permission denied.
But the second solution looks like a hack to me. Is there a better way of doing this?
The test command is a shell builtin, and on many platforms doesn't exist as an independent command you can run.
If you use shell=True to use the shell to run this command, you should pass in a single string, not a list of tokens.
status = subprocess.call("test -e '{}'".format(path), shell=True)
This will produce a malformed command if path contains any single quotes; try path.replace("'", r"\'") if you want to be completely correct and robust, or use one of the existing quoting functions (such as shlex.quote) to properly escape any shell metacharacters in the command you pass in.
The subprocess library now offers a function run() which is slightly less unwieldy than the old legacy call() function; if backwards compatibility is not important, you should probably switch to that... or, as several commenters have already implored you, not use subprocess for this task when portable, lightweight native Python solutions are available.
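For comparison, here is a sketch of both routes; the shell form uses shlex.quote for the escaping mentioned above, and the last line is the plain-Python check the commenters recommend:

import os.path
import shlex
import subprocess

path = "some file.txt"  # hypothetical path for illustration

# via the shell: test -e exits with status 0 if the path exists
status = subprocess.call("test -e {}".format(shlex.quote(path)), shell=True)
exists_via_shell = (status == 0)

# native and portable; no subprocess needed
exists_native = os.path.exists(path)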
As pointed out in the comments section,
status = subprocess.call(['test','-e',<path>])
can be made to work with shell expansion if we use shell=True.
Although using os.path might be much more efficient anyway.
I want to support a sub-command CLI model, like the one used by git. The particular bit I'm having trouble with is the "change directory" option. Like git, I want a -C DIR option which will have the program change to the specified directory before doing the sub-command. Not really a problem, using sub-parsers, BUT I also want to use the argparse.ArgumentParser(fromfile_prefix_chars='#') mechanism after the -C DIR argument is applied during parsing.
Here's the rub: fromfile argument expansion is performed by argparse before all other argument processing. Thus, any such fromfile arguments must either use absolute paths, or paths relative to the CWD at time the parser is invoked. I don't want absolute paths; I "need" to use fromfile paths that are relative to the -C DIR option. I wrote my own class ChdirAction(argparse.Action) to do the obvious. It worked fine, but since fromfile arguments were already expanded, it didn't give me what I want. (After discovering this not-what-I-want behavior, I looked at python3.5/argparse.py and found the same frustration embedded in cold, hard, unforgiving code.)
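For reference, the "obvious" version of such an action might look roughly like this (my own sketch, not the asker's actual class); it changes directory as soon as the option is parsed, which is exactly why it runs too late to influence fromfile expansion:

import argparse
import os

class ChdirAction(argparse.Action):
    """Change the working directory when -C DIR is parsed."""
    def __call__(self, parser, namespace, values, option_string=None):
        os.chdir(values)
        setattr(namespace, self.dest, values)

# hypothetical usage:
# parser.add_argument('-C', action=ChdirAction, metavar='DIR')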
Here's a directory diagram that might help explain what I want:
/ foo / aaa / iii / arg.txt
  |     |
  |     + jjj / arg.txt
  |     |
  |     + arg.txt
  |
  + bbb / iii / arg.txt
        |
        + jjj / arg.txt
Consider when the CWD is either aaa or bbb at the time command line arguments are parsed. If I run with something like prog -C ./iii #arg.txt
I want the parser to expand #arg.txt with arguments from /foo/aaa/iii/arg.txt. What actually happens is that fromfile expands from the contents of /foo/aaa/arg.txt. When CWD is /foo/aaa this is the "wrong" file; when /foo/bbb it raises "error: [Errno 2] No such file or directory: 'arg.txt'"
More generally, prog -C ./DIR #arg.txt should expand from /foo/aaa/DIR/arg.txt, and that should work even when the fromfile has "up-directory" parts, e.g. prog -C ./iii #../arg.txt should expand from /foo/aaa/arg.txt.
If this behavior can be made to happen, then I could -C DIR into any of {aaa,bbb}/{iii,jjj} and obtain consistent behaviour from a common command line construction.
As described, my problem isn't much of a problem. If I can provide the -C DIR, to be realized by an os.chdir(DIR) after argument parsing, then I can also construct appropriate fromfile arguments. They could be either absolute or relative to the CWD at parsing time (prior to any -C DIR taking effect). This might look like:
cd /foo/aaa; prog -C ./DIR #arg.txt #./DIR/arg.txt
I don't like it, but it would be okay. The REAL problem is that the actual change-directory argument I'm using is more like -C PATTERN. In my real problem case, PATTERN could be a simple path (absolute or relative). Or, it might be a glob pattern, or a partial name that has "non-trivial" resolution logic to find the actual directory for os.chdir(DIR). In this case (which I am struggling with), I can't have the invoker of the program resolve the actual location of the fromfile path.
Actually, I could, but that would put an inappropriate burden on the invoker. AND, when that invoker is an Eclipse launcher, I don't really have the control-flow power necessary to do it. So, it's back to having the program take care of its own needs; a nicer abstraction, but how do I implement it?
Even as I was fleshing out the question, I came up with an idea. So I tried it out and it's kinda, sorta, okay(ish). I can get a constrained version of what I really want, but it's good enough for me (for now), so I thought I might as well share. It might be good enough for you, too. Even better, it might elicit a true solution from somewhere, maybe S.Bethard?
My hack is to do parsing in two phases: the first is just enough to get the -C PATTERN argument by way of ArgumentParser.parse_known_args(...), without enabling the fromfile mechanism. If the result of that first (minimal) parsing yields a directory change argument, then I process it. The program aborts if more than a single -C PATTERN was specified, or the PATTERN can't be unambiguously resolved.
Then, I use a completely separate ArgumentParser object, configured with the full set of argument specifications that I actually want and parse it with the fromfile mechanism enabled.
There is some monkey business to get the --help argument to work (setting the proper conflict resolution policy, then merely accepting the arg in the first parser just to pass along to the second, which actually has all the "real" argument specs). Also, the first parser should support the same verbose/quiet options that the second one does, honoring their setting and also passing along from first to second parser.
Here's a simplified version of my application-level arg parser method. It doesn't support verbose/quiet options at the first parser stage. I've elided the complexity of how a -C PATTERN is resolved to an actual directory. Also, I cut out the majority of the second parser's argument specification, leaving just the second parser's -C PATTERN argument (needed for --help output).
NOTE: Both parsers have a -C PATTERN argument. In the chdirParser it is meaningful; in the argParser it's present only so it will show up in the help output. Something similar should be done for verbose/quiet options - probably not that tricky, but it's not (yet) important to me so I don't mind always reporting a change of directory, even in quiet mode.
def cli_args_from_argv():
    import argparse
    import glob
    import os
    import sys

    chdirParser = argparse.ArgumentParser(conflict_handler='resolve')
    chdirParser.add_argument("-C", dest="chdir_pattern", action="append", default=None)
    chdirParser.add_argument("--help", "-h", dest="help", action="store_true", default=False)
    (partial, remainder) = chdirParser.parse_known_args()

    if partial.help:
        remainder = ['--help']
    elif partial.chdir_pattern:
        if len(partial.chdir_pattern) > 1:
            print('Too many -C options - at most one may be given, but received: {!r}'.format(partial.chdir_pattern), file=sys.stderr)
            sys.exit(1)
        pattern = partial.chdir_pattern[0]
        resolved_dir = pattern
        if os.path.exists(resolved_dir):
            resolved_dir = pattern
        else:
            resolved_dir = None  # placeholder so the snippet runs
            ### ELIDED: resolution of pattern into an unambiguous and existing directory
        if not resolved_dir:
            print("Failed to resolve -C {!r}".format(pattern), file=sys.stderr)
            sys.exit(1)
        print("Changing to directory: {!r}".format(resolved_dir))
        print("")
        os.chdir(resolved_dir)

    argParser = argparse.ArgumentParser(usage="usage: PROG [common-args] SUBCMD [subcmd-args]", fromfile_prefix_chars=':')
    ### ELIDED: a bunch of add_argument(...)
    argParser.add_argument("-C", dest="chdir_spec", action="store", default=None, help="Before anything else, chdir to SPEC", metavar="SPEC")
    return argParser.parse_args(args=remainder)
I have a feeling that there's probably a better way... Do you know?
I think the resolve bit can be replaced with
chdirParser = argparse.ArgumentParser(add_help=False)
and omit the -h definition. Let the second parser handle sys.argv unchanged (since you are including, but ignoring, the -C argument there anyway).
That append and test for len(partial.chdir_pattern) > 1 makes sense if you expect the user to give several -C dir1 ... -C dir2 ... options. The alternative is to use the default store action, which ends up saving the last of those repetitions. Why might the user repeat the -C, and why should you care? Usually we just ignore repetitions.
You might replace
print("Failed to resolve -C {!r}".format(pattern), file=sys.stderr)
sys.exit(1)
with
chdirParser.error("Failed to resolve -C {!r}".format(pattern))
It prints the usage (with only -C) and does a sys.exit(2). Not quite the same, but may be close enough.
For the second parser, the -C might be simplified (using defaults):
argParser.add_argument("-C", "--chdir-spec", help="Before anything else, chdir to SPEC", metavar="SPEC")
And use the full sys.argv.
return argParser.parse_args()
Otherwise, using 2 parsers makes sense, since the fromfile is present in the changed directory (and you want to ignore any such file in the initial directory).
I thought maybe a :arg.txt string on the command line would give problems in the first parser, but with parse_known_args it will just be treated as an unknown positional. The proof's in the testing, though.
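Putting those suggestions together, the simplified two-phase parse might look roughly like this (a sketch of the suggestions above, with the pattern-resolution step still elided):

import argparse
import os

def cli_args_from_argv():
    # Phase 1: only -C, no help, no fromfile expansion
    chdirParser = argparse.ArgumentParser(add_help=False)
    chdirParser.add_argument("-C", dest="chdir_pattern")
    partial, _ = chdirParser.parse_known_args()

    if partial.chdir_pattern:
        resolved_dir = partial.chdir_pattern if os.path.exists(partial.chdir_pattern) else None
        # (real code would resolve globs/partial names here)
        if not resolved_dir:
            chdirParser.error("Failed to resolve -C {!r}".format(partial.chdir_pattern))
        os.chdir(resolved_dir)

    # Phase 2: the full parser, with fromfile expansion relative to the new CWD
    argParser = argparse.ArgumentParser(fromfile_prefix_chars=':')
    argParser.add_argument("-C", "--chdir-spec", metavar="SPEC",
                           help="Before anything else, chdir to SPEC")
    # ... the rest of the real argument specs go here ...
    return argParser.parse_args()  # full sys.argv; -C is parsed again but ignored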
I am using python 3 argparse. I have multiple files passed as argparse.FileType which I use to write some data. I want to check some conditions and open those files only if they are met. However argparse opens them immediately, and they are created even if I exit with error code.
import argparse
from sys import exit
def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('--condition', action='store_true')
    parser.add_argument('out1', type=argparse.FileType('w'))
    parser.add_argument('out2', type=argparse.FileType('w'))
    args = parser.parse_args()

    if not args.condition:
        print('ERROR: please use --condition')
        exit(2)

    args.out1.write('hello\n')
    args.out2.write('world\n')

if __name__ == '__main__':
    main()
If I run this example without passing the --condition argument, it will still create 2 new files. I don't want to create them in that case. Can I do that without accepting plain filenames and opening the files manually?
The simplest thing is to just accept the filenames as the default string type, and open the files later.
http://bugs.python.org/issue13824 (argparse.FileType opens a file and never closes it) implements a FileContext type, one that operates like FileType except that it returns a context that can be used later as:
with args.input() as f:
    f.read()
etc
But doing filename checking while parsing, without actually opening or creating a file, is not trivial. And handling stdin/out which should not be closed adds a complication.
In that bug issue, Steven Bethard, the argparse developer, notes that FileType was intended for quick scripts, not for larger projects where properly opening and closing files matters.
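A sketch of that string-type approach applied to the question's example; the files are only opened (and created) after the condition check passes:

import argparse
from sys import exit

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('--condition', action='store_true')
    # plain strings: nothing is opened or created during parsing
    parser.add_argument('out1')
    parser.add_argument('out2')
    args = parser.parse_args()

    if not args.condition:
        print('ERROR: please use --condition')
        exit(2)

    # open (and create) the output files only now
    with open(args.out1, 'w') as f1, open(args.out2, 'w') as f2:
        f1.write('hello\n')
        f2.write('world\n')

if __name__ == '__main__':
    main()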