In Python, how can we find out the command line arguments that were provided for a script, and process them?
Related: for some more specific examples, see Implementing a "[command] [action] [parameter]" style command-line interfaces? and How do I format positional argument help using Python's optparse?.
import sys
print("\n".join(sys.argv))
sys.argv is a list that contains all the arguments passed to the script on the command line. sys.argv[0] is the script name.
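For example, if the two lines above are saved as show_args.py (a hypothetical file name), a run might look like this:
$ python show_args.py foo bar
show_args.py
foo
bar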
Basically,
import sys
print(sys.argv[1:])
The canonical solution in the standard library is argparse (docs):
Here is an example:
from argparse import ArgumentParser
parser = ArgumentParser()
parser.add_argument("-f", "--file", dest="filename",
help="write report to FILE", metavar="FILE")
parser.add_argument("-q", "--quiet",
action="store_false", dest="verbose", default=True,
help="don't print status messages to stdout")
args = parser.parse_args()
argparse supports (among other things):
Multiple options in any order.
Short and long options.
Default values.
Generation of a usage help message.
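Assuming the snippet above is saved as report.py (a hypothetical name), the parsed values end up as attributes on args, so a minimal continuation might be:
if args.verbose:
    print("writing report to", args.filename)
Running python report.py -f out.txt would then print the status line, while adding -q suppresses it.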
Just going around evangelizing for argparse, which is better for these reasons, essentially (copied from the link):
The argparse module can handle positional and optional arguments, while optparse can handle only optional arguments.
argparse isn't dogmatic about what your command line interface should look like - options like -file or /file are supported, as are required options. optparse refuses to support these features, preferring purity over practicality.
argparse produces more informative usage messages, including command-line usage determined from your arguments, and help messages for both positional and optional arguments. The optparse module requires you to write your own usage string, and has no way to display help for positional arguments.
argparse supports actions that consume a variable number of command-line args, while optparse requires that the exact number of arguments (e.g. 1, 2, or 3) be known in advance.
argparse supports parsers that dispatch to sub-commands, while optparse requires setting allow_interspersed_args and doing the parser dispatch manually.
And my personal favorite:
argparse allows the type and action parameters to add_argument() to be specified with simple callables, while optparse requires hacking class attributes like STORE_ACTIONS or CHECK_METHODS to get proper argument checking.
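To illustrate that last point, here is a minimal sketch of passing a plain callable as type= (the even_int name is just for illustration):
import argparse

def even_int(value):
    # argparse calls this with the raw string and uses the return value
    n = int(value)
    if n % 2:
        raise argparse.ArgumentTypeError(f"{value!r} is not an even integer")
    return n

parser = argparse.ArgumentParser()
parser.add_argument("--count", type=even_int, default=0)
print(parser.parse_args(["--count", "4"]).count)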
There is also the argparse stdlib module (an "improvement" on the stdlib's optparse module). Example from the introduction to argparse:
# script.py
import argparse
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument(
'integers', metavar='int', type=int, choices=range(10),
nargs='+', help='an integer in the range 0..9')
parser.add_argument(
'--sum', dest='accumulate', action='store_const', const=sum,
default=max, help='sum the integers (default: find the max)')
args = parser.parse_args()
print(args.accumulate(args.integers))
Usage:
$ script.py 1 2 3 4
4
$ script.py --sum 1 2 3 4
10
If you need something fast and not very flexible:
main.py:
import sys
first_name = sys.argv[1]
last_name = sys.argv[2]
print("Hello " + first_name + " " + last_name)
Then run python main.py James Smith
to produce the following output:
Hello James Smith
The docopt library is really slick. It builds an argument dict from the usage string for your app.
For example, from the docopt README:
"""Naval Fate.
Usage:
naval_fate.py ship new <name>...
naval_fate.py ship <name> move <x> <y> [--speed=<kn>]
naval_fate.py ship shoot <x> <y>
naval_fate.py mine (set|remove) <x> <y> [--moored | --drifting]
naval_fate.py (-h | --help)
naval_fate.py --version
Options:
-h --help Show this screen.
--version Show version.
--speed=<kn> Speed in knots [default: 10].
--moored Moored (anchored) mine.
--drifting Drifting mine.
"""
from docopt import docopt
if __name__ == '__main__':
arguments = docopt(__doc__, version='Naval Fate 2.0')
print(arguments)
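docopt returns a plain dictionary keyed by the command words and <placeholders> from the usage string, so a follow-up dispatch is ordinary dict access. A minimal sketch, assuming the snippet above:
if arguments['ship'] and arguments['new']:
    for name in arguments['<name>']:
        print('Creating ship', name)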
One way to do it is using sys.argv. This will print the script name as the first argument and all the other parameters that you pass to it.
import sys
for arg in sys.argv:
    print(arg)

# Set the default arg to -h if no args were given:
if len(sys.argv) == 1:
    sys.argv[1:] = ["-h"]
I use optparse myself, but really like the direction Simon Willison is taking with his recently introduced optfunc library. It works by:
"introspecting a function
definition (including its arguments
and their default values) and using
that to construct a command line
argument parser."
So, for example, this function definition:
def geocode(s, api_key='', geocoder='google', list_geocoders=False):
is turned into this optparse help text:
Options:
-h, --help show this help message and exit
-l, --list-geocoders
-a API_KEY, --api-key=API_KEY
-g GEOCODER, --geocoder=GEOCODER
I like getopt from the stdlib, e.g.:
import getopt
import sys

# usage() is assumed to be a helper defined elsewhere that prints usage info and exits
try:
    opts, args = getopt.getopt(sys.argv[1:], 'h', ['help'])
except getopt.GetoptError as err:
    usage(err)

for opt, arg in opts:
    if opt in ('-h', '--help'):
        usage()

if len(args) != 1:
    usage("specify thing...")
Lately I have been wrapping something similar to this to make things less verbose (e.g. making "-h" implicit).
Note that optparse is now deprecated: "The optparse module is deprecated and will not be developed further; development will continue with the argparse module."
Pocoo's click is more intuitive, requires less boilerplate, and is at least as powerful as argparse.
The only weakness I've encountered so far is that you can't customize help pages much, but that usually isn't a requirement, and docopt seems like the clear choice when it is.
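For comparison, a minimal click sketch (assumes click is installed; the command, option, and argument names here are only illustrative):
import click

@click.command()
@click.option("--count", default=1, help="Number of greetings.")
@click.argument("name")
def hello(count, name):
    """Greet NAME the given number of times."""
    for _ in range(count):
        click.echo(f"Hello, {name}!")

if __name__ == "__main__":
    hello()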
import argparse
parser = argparse.ArgumentParser(description='Process some integers.')
parser.add_argument('integers', metavar='N', type=int, nargs='+',
help='an integer for the accumulator')
parser.add_argument('--sum', dest='accumulate', action='store_const',
const=sum, default=max,
help='sum the integers (default: find the max)')
args = parser.parse_args()
print(args.accumulate(args.integers))
Assuming the Python code above is saved into a file called prog.py, you can display the generated help with:
$ python prog.py -h
Reference: https://docs.python.org/3.3/library/argparse.html
You may be interested in a little Python module I wrote to make handling of command line arguments even easier (open source and free to use) - Commando
Yet another option is argh. It builds on argparse, and lets you write things like:
import argh
# declaring:
def echo(text):
"Returns given word as is."
return text
def greet(name, greeting='Hello'):
"Greets the user with given name. The greeting is customizable."
return greeting + ', ' + name
# assembling:
parser = argh.ArghParser()
parser.add_commands([echo, greet])
# dispatching:
if __name__ == '__main__':
parser.dispatch()
It will automatically generate help and so on, and you can use decorators to provide extra guidance on how the arg-parsing should work.
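Assuming the snippet above is saved as app.py (a hypothetical name), an invocation would look roughly like this:
$ python app.py greet John --greeting Hi
Hi, John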
I recommend looking at docopt as a simple alternative to these others.
docopt is a new project that works by parsing your --help usage message rather than requiring you to implement everything yourself. You just have to put your usage message in the POSIX format.
Also, with Python 3 you might find it convenient to use Extended Iterable Unpacking to handle optional positional arguments without additional dependencies:
import sys

try:
    _, arg1, arg2, arg3, *_ = sys.argv + [None] * 2
except ValueError:
    print("Not enough arguments", file=sys.stderr)  # the unhandled exception traceback is also meaningful enough
    sys.exit(-1)
The above argv unpack makes arg2 and arg3 "optional": if they are not specified in argv, they will be None, while if arg1 is not specified, ValueError will be thrown:
Traceback (most recent call last):
File "test.py", line 3, in <module>
_, arg1, arg2, arg3, *_ = sys.argv + [None] * 2
ValueError: not enough values to unpack (expected at least 4, got 3)
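For example, assuming the snippet is saved as test.py, running python test.py one two would leave arg1 == 'one', arg2 == 'two', and arg3 == None, while running it with no arguments triggers the ValueError shown above.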
My solution is entrypoint2. Example:
from entrypoint2 import entrypoint
@entrypoint
def add(file, quiet=True):
    ''' This function writes report.
    :param file: write report to FILE
    :param quiet: don't print status messages to stdout
    '''
    print(file, quiet)
help text:
usage: report.py [-h] [-q] [--debug] file
This function writes report.
positional arguments:
file write report to FILE
optional arguments:
-h, --help show this help message and exit
-q, --quiet don't print status messages to stdout
--debug set logging level to DEBUG
import sys
# Command line arguments are stored into sys.argv
# print(sys.argv[1:])
# I used the slice [1:] to print all the elements except the first.
# This is because the first element of sys.argv is the program name,
# so the first argument is sys.argv[1], the second is sys.argv[2], etc.
print("File name: " + sys.argv[0])
print("Arguments:")
for i in sys.argv[1:]:
print(i)
Let's name this file command_line.py and let's run it:
C:\Users\simone> python command_line.py arg1 arg2 arg3 ecc
File name: command_line.py
Arguments:
arg1
arg2
arg3
ecc
Now let's write a simple program, sum.py:
import sys
try:
    print(sum(map(float, sys.argv[1:])))
except ValueError:
    print("An error has occurred")
Result:
C:\Users\simone> python sum.py 10 4 6 3
23
This handles simple switches and value switches, with optional alternative (long-form) flags.
import sys
# [IN] argv - array of args
# [IN] switch - switch to seek
# [IN] val - expecting value
# [IN] alt - switch alternative
# returns value or True if val not expected
def parse_cmd(argv,switch,val=None,alt=None):
for idx, x in enumerate(argv):
if x == switch or x == alt:
if val:
if len(argv) > (idx+1):
if not argv[idx+1].startswith('-'):
return argv[idx+1]
else:
return True
# Expecting a value for -i:
i = parse_cmd(sys.argv[1:], "-i", True, "--input")
# No value needed for -p:
p = parse_cmd(sys.argv[1:], "-p")
Several of our biotechnology clients have posed these two questions recently:
How can we execute a Python script as a command?
How can we pass input values to a Python script when it is executed as a command?
I have included a Python script below which I believe answers both questions. Let's assume the following Python script is saved in the file test.py:
#
#----------------------------------------------------------------------
#
# file name: test.py
#
# input values: data - location of data to be processed
# date - date data were delivered for processing
# study - name of the study where data originated
# logs - location where log files should be written
#
# macOS usage:
#
# python3 test.py "/Users/lawrence/data" "20220518" "XYZ123" "/Users/lawrence/logs"
#
# Windows usage:
#
# python test.py "D:\data" "20220518" "XYZ123" "D:\logs"
#
#----------------------------------------------------------------------
#
# import needed modules...
#
import sys
import datetime
def main(argv):
#
# print message that process is starting...
#
print("test process starting at", datetime.datetime.now().strftime("%Y%m%d %H:%M"))
#
# set local values from input values...
#
data = sys.argv[1]
date = sys.argv[2]
study = sys.argv[3]
logs = sys.argv[4]
#
# print input arguments...
#
print("data value is", data)
print("date value is", date)
print("study value is", study)
print("logs value is", logs)
#
# print message that process is ending...
#
print("test process ending at", datetime.datetime.now().strftime("%Y%m%d %H:%M"))
#
# call main() to begin processing...
#
if __name__ == '__main__':
main(sys.argv)
The script can be executed on a macOS computer in a Terminal shell as shown below and the results will be printed to standard output (be sure the current directory includes the test.py file):
$ python3 test.py "/Users/lawrence/data" "20220518" "XYZ123" "/Users/lawrence/logs"
test process starting at 20220518 16:51
data value is /Users/lawrence/data
date value is 20220518
study value is XYZ123
logs value is /Users/lawrence/logs
test process ending at 20220518 16:51
The script can also be executed on a Windows computer in a Command Prompt as shown below and the results will be printed to standard output (be sure the current directory includes the test.py file):
D:\scripts>python test.py "D:\data" "20220518" "XYZ123" "D:\logs"
test process starting at 20220518 17:20
data value is D:\data
date value is 20220518
study value is XYZ123
logs value is D:\logs
test process ending at 20220518 17:20
This script answers both questions posed above and is a good starting point for developing scripts that will be executed as commands with input values.
Reason for the new answer:
Existing answers cover multiple options.
The standard option is to use argparse; a few answers provided examples from the documentation, and one answer pointed out its advantages. But all of them fail to explain the answer clearly enough for the OP's actual question, at least for newbies.
An example of argparse:
import argparse
def load_config(conf_file):
    pass

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    # Specifies one argument from the command line.
    # You can have any number of arguments like this.
    parser.add_argument("conf_file", help="configuration file for the application")
    args = parser.parse_args()
    config = load_config(args.conf_file)
The above program expects a config file as an argument. If you provide it, it will execute happily. If not, it will print the following:
usage: test.py [-h] conf_file
test.py: error: the following arguments are required: conf_file
You can also make an argument optional.
You can specify the expected type for the argument using the type keyword:
parser.add_argument("age", type=int, help="age of the person")
You can specify a default value for an argument using the default keyword.
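For example, a hypothetical optional flag with a type and a default (the name --retries is only illustrative):
parser.add_argument("--retries", type=int, default=3, help="number of retries (default: 3)")
If --retries is not given on the command line, args.retries will be 3.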
The argparse documentation will help you to understand it further.
Background Information
I am writing a python script that will contain a set of methods that will be triggered via the command line. Most functions will accept one or two parameters.
Problem
I've been reading up about the ArgumentParser but it's not clear to me how to write my code so that a function can be triggered using the "-" or "--" notation, and also ensure that if / when a specific function is invoked, the user is passing the correct number of arguments and type.
Code
Sample function inside the script:
def restore_db_dump(dump, dest_db):
"""restore src (dumpfile) to dest (name of database)"""
popen = subprocess.Popen(['psql', '-U', 'postgres', '-c', 'CREATE DATABASE ' + dest_db], stdout=subprocess.PIPE, universal_newlines=True)
print popen.communicate()
popen.stdout.close()
popen.wait()
    popen = subprocess.Popen(['pg_restore', '-U', 'postgres', '-j', '2', '-d', 'provisioning', '/tmp/' + dump + '.sql'], stdout=subprocess.PIPE, universal_newlines=True)
print popen.communicate()
popen.stdout.close()
popen.wait()
def main():
parser = argparse.ArgumentParser()
parser.add_argument('-r', '--restore', dest='restoredbname',action='store_const', const=restore_dump, help='Restore specified dump file as dbname. Must supply <pathtodumpfile> and <dbname>')
args = parser.parse_args()
if __name__ == '__main__':
main()
Code Execution
The help system seems to be working, as you can see below, but I don't know how to write logic that checks, when "restore_dump" is triggered, whether the user is passing the correct parameters:
lab-1:/tmp# ./test-db.py -h
usage: test-db.py [-h] [-r]
optional arguments:
-h, --help show this help message and exit
-r, --restore Restore specified dump file as dbname. Must supply
<pathtodumpfile> and <dbname>
Question
Can someone point me in the right direction on how to add logic that checks, when the restore_db_dump function is called, that the right number of parameters is passed?
As far as how to "link" the -r argument so that it triggers the right function, I saw another post here on stackoverflow so I'm going to check that out.
Thanks.
EDIT 1:
I forgot to mention that I'm currently reading: https://docs.python.org/2.7/library/argparse.html - 15.4.1. Example
But it's not clear to me how to apply this to my code. It seems that in the case of the sum function, the order of the parameters is the integers first and then the function name later.
In my case, I would like the function name first (as an optional arg) and then the parameters required by the function to follow.
EDIT 2:
Changed the code to look like this:
def main():
parser = argparse.ArgumentParser()
parse = argparse.ArgumentParser(prog='test-db.py')
parser.add_argument('-r', '--restore', nargs=2, help='Restore specified dump file as dbname. Must supply <pathtodumpfile> and <dbname>')
args = parser.parse_args()
if args.restore:
restore_db_dump(args.restore[0], args.restore[1])
if __name__ == '__main__':
main()
And when I run it with one missing arg, it now correctly returns an error! Which is great!!!
But I'm wondering how to fix the help so it's more meaningful. It seems that for each argument, the system is showing the word "RESTORE". How do I change this so that it's actually a useful message?
lab-1:/tmp# ./test-db.py -h
usage: test-db.py [-h] [-r RESTORE RESTORE]
optional arguments:
-h, --help show this help message and exit
-r RESTORE RESTORE, --restore RESTORE RESTORE
Restore specified dump file as dbname. Must supply
<pathtodumpfile> and <dbname>
Try the following:
def main():
parser = argparse.ArgumentParser()
parser.add_argument('-r', '--restore', nargs=2,
metavar=('path-to-dump-file', 'db-name'),
help='Restore specified dump file as dbname. Must supply <pathtodumpfile> and <dbname>')
args = parser.parse_args()
if args.restore:
restore_db_dump(args.restore[0], args.restore[1])
if __name__ == '__main__':
main()
Notes:
I have removed const=, because it's not clear whether you want a default.
There is now a nargs=2 parameter, so the -r option requires two values to be given.
The metavar parameter sets the names of the arguments to -r in the help text; a sample of the resulting help output is shown below.
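With those changes, the -h output should look roughly like this (exact line wrapping may differ):
usage: test-db.py [-h] [-r path-to-dump-file db-name]
optional arguments:
  -h, --help            show this help message and exit
  -r path-to-dump-file db-name, --restore path-to-dump-file db-name
                        Restore specified dump file as dbname. Must supply
                        <pathtodumpfile> and <dbname>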
I am writing a python script, it takes either 3 positional arguments (name, date, location, let's say) or 1 argument, which is a setup file which contains that information.
I know that I can use argparse and I can make the positional arguments optional with:
parser.add_argument('name_OR_setupFile')
parser.add_argument('date', nargs='?')
parser.add_argument('location', nargs='?')
and then I can error-check, to make sure that the user didn't do anything stupid
The problem is that now the help message will be very confusing, because it's unclear what the 1st argument really is. I'd LIKE a way to do this as two different add_argument lines, somehow, but I'm not sure how.
I also know that I could use a --setupFile argument, and make the three optional... but I'd rather not do that as well, if I don't have to.
A third option is to use:
parser.add_argument('ARGS', nargs='+', help='ARGS is either of the form setupFile, or name date location')
and then error check later...
ETA for clarification:
I want to be able to call the script with either:
python foo.py setupFile
or
python foo.py name date location
I want the help text to be something like:
usage:
foo.py setupFile
foo.py name date location
I think the clearest design using argparse is:
parser = argparse.ArgumentParser()
g = parser.add_mutually_exclusive_group()
g.add_argument('--setup','-s',metavar='FILE',help='your help')
g.add_argument('--name', nargs=3, metavar=('NAME', 'DATE', 'LOCATION'), help='your help')
parser.print_help() produces:
usage: ipython3 [-h] [--setup FILE | --name NAME DATE LOCATION]
optional arguments:
-h, --help show this help message and exit
--setup FILE, -s FILE
your help
--name NAME DATE LOCATION
your help
I've handled the 1 or 3 arguments requirement with mutually exclusive optionals. And used metavar to add clarity to the arguments. (As noted in another recent question, metavar does not work well with positionals.)
Another option is to use subparsers. That still requires a key word like setup and name, only they are entered without the --. And the help structure for subparsers is quite different.
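A minimal sketch of that subparser alternative (the sub-command names setup and name are just illustrative):
import argparse

parser = argparse.ArgumentParser()
sub = parser.add_subparsers(dest='command')

p_setup = sub.add_parser('setup', help='read name, date and location from a setup file')
p_setup.add_argument('file', help='setup file')

p_name = sub.add_parser('name', help='give name, date and location explicitly')
p_name.add_argument('name')
p_name.add_argument('date')
p_name.add_argument('location')

args = parser.parse_args()
Invocation would then be foo.py setup FILE or foo.py name NAME DATE LOCATION, and each sub-command gets its own help page.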
Not totally sure this is what you meant, but if I understand you correctly:
import os
import sys

def dem_args(*args):
    if len(args) == 1:
        if os.path.isfile(args[0]):
            # go file
            pass
        else:
            # error regarding this being a bad filename or nonexistent file
            pass
    elif len(args) == 3:
        # try to process / raise errors regarding name, date, location
        pass
    else:
        # error reg. wrong number of arguments; possible arguments are either this or that
        pass

if __name__ == '__main__':
    dem_args(*sys.argv[1:])
Ok, this is what I'm currently doing. I'm putting this here for people to comment on, and in case it ends up being useful, for posterity.
I'm actually solving an additional problem here. The problem is actually a little bit more complicated than even I specified. Because there's actually 3 ways to run the program, and I want to be able to have a --help option for only give me the details for one type. So I want -h, -h 1 and -h 2 to all do different things.
My current code is:
import argparse
baseParser = argparse.ArgumentParser(add_help=False)
baseParser.add_argument('-f', '--foo', help ='foo argument')
baseParser.add_argument('-h', '--help', nargs='?' , const = 'all')
parser1 = argparse.ArgumentParser(parents = [baseParser], add_help=False)
parser1.add_argument('name', help='name argument (type 1)')
parser1.add_argument('date', help='date argument')
parser1.add_argument('location', help='location argument')
setupParser=argparse.ArgumentParser(parents = [baseParser],add_help=False)
setupParser.add_argument('setup', help='setup file')
parser2 = argparse.ArgumentParser(parents = [baseParser],add_help=False)
parser2.add_argument('name', help='name argument (type 2)')
parser2.add_argument('baa', help='sheep?')
realParser = argparse.ArgumentParser(parents=[baseParser], add_help=False)
realParser.add_argument('ARGS', nargs = '*', help = 'positional arguments')
args = realParser.parse_args()
if args.help:
    if args.help == 'all':
        print('This product can be used in multiple ways:')
        print('setup')
        setupParser.print_usage()
        print('type1')
        parser1.print_usage()
        print('type2')
        parser2.print_usage()
        print('use help [type] for more details')
    elif args.help == 'setup':
        setupParser.print_help()
    elif args.help == '1':
        parser1.print_help()
    else:
        parser2.print_help()
    exit(0)
#actually parse the args in args.ARGS, and work with that
In Python, how can I parse the command line, edit the resulting parsed arguments object and generate a valid command line back with the updated values?
For instance, I would like python cmd.py --foo=bar --step=0 to call python cmd.py --foo=bar --step=1 with all the original --foo=bar arguments, ideally without extra arguments being added when a default value is used.
Is it possible with argparse?
You can use argparse to parse the command-line arguments, and then modify those as desired. At the moment however, argparse lacks the functionality to work in reverse and convert those values back into a command-line string. There is however a package for doing precisely that, called argunparse. For example, the following code in cmd.py
import sys
import argparse
import argunparse
parser = argparse.ArgumentParser()
unparser = argunparse.ArgumentUnparser()
parser.add_argument('--foo')
parser.add_argument('--step', type=int)
kwargs = vars(parser.parse_args())
kwargs['step'] += 1
prefix = f'python {sys.argv[0]} '
arg_string = unparser.unparse(**kwargs)
print(prefix + arg_string)
will print the desired command line:
python cmd.py --foo=bar --step=1
argparse is clearly designed to go one way, from sys.argv to the args namespace. No thought has been given to preserving information that would let you map things back the other way, much less do the mapping itself.
In general, multiple sys.argv could produce the same args. You could, for example, have several arguments that have the same dest. Or you can repeat 'optionals'. But for a restricted 'parser' setup there may be enough information to recreate a usable argv.
Try something like:
parser = argparse.ArgumentParser()
arg1 = parser.add_argument('--foo', default='default')
arg2 = parser.add_argument('bar', nargs=2)
and then examine the arg1 and arg2 objects. They contain all the information that you supplied to the add_argument method. Of course you could have defined those values in your own data structures beforehand, e.g.
{'option_string':'--foo', 'default':'default'}
{'dest':'bar', 'nargs':2}
and used those as input to add_argument.
While the parser may have enough information to recreate a useable sys.argv, you have to figure out how to do that yourself.
default=argparse.SUPPRESS may be handy. It keeps the parser from adding a default entry to the namespace. So if the option isn't used, it won't appear in the namespace.
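A small sketch of that behaviour:
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--foo', default=argparse.SUPPRESS)
print(vars(parser.parse_args([])))                # {}
print(vars(parser.parse_args(['--foo', 'bar'])))  # {'foo': 'bar'}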
This isn't possible in any easy way that I know of, then again I've never needed to do this.
But with the lack of information in the question in regards to how you call your script, I'll assume the following:
python test.py cmd --foo=bar --step=0
And what you could do is do:
from sys import argv
for index in range(1, len(argv)): # the first object is the script itself
if '=' in argv[index]:
param, value = argv[index].split('=', 1)
if param == '--step':
value = '1'
argv[index] = param + '=' + value
print(argv)
Note that this is very specific to --step and may be what you've already thought of and just wanted a "better way", but again, I don't think there is.
Depending on the scope, this works, at least from within the same module (argparse._sys is simply argparse's private reference to the sys module):
from pprint import pprint
import argparse

pprint(argparse._sys.argv)
Per the other answers, rebuilding is imperfect, but if you aren't doing anything too fancy and are okay with imperfect, something like this could work for you as a starting point:
def unparse_args(parser, parsed_args):
"""Unparse argparsed args"""
positional_args = [action.dest
for action in parser._actions
if not action.option_strings]
optionals = []
positionals = []
for key, value in vars(parsed_args).items():
if not value:
# none and false flags go away
continue
elif key in positional_args:
positionals.append(value)
elif value is True:
optionals.append(f"--{key}")
else:
optionals.append(f"--{key}={value}")
return " ".join(optionals + positionals)
Here's an example using this with a git clone clone:
parser = argparse.ArgumentParser(description='A sample git clone wrapper')
# options
parser.add_argument("-v", "--verbose", action="store_true",
help="be more verbose")
parser.add_argument("-q", "--quiet", action="store_true",
help="be more quiet")
parser.add_argument("--recurse-submodules", nargs='?',
help="initialize submodules in the clone")
parser.add_argument("--recursive", nargs='?',
help="alias of --recurse-submodules")
parser.add_argument("-b", "--branch",
help=" checkout <branch> instead of the remote's HEAD")
parser.add_argument("--depth", type=int,
help="create a shallow clone of that depth")
parser.add_argument("--shallow-submodules", action="store_true",
help="any cloned submodules will be shallow")
# positional
parser.add_argument("repo", help="The git repo to clone")
parser.add_argument("dir", nargs='?',help="The location to clone the repo")
# make a fake call to your git clone clone and parse the args
cmdargs = ["--depth=1", "-q", "ohmyzsh/ohmyzsh"]
parsedargs = parser.parse_args(cmdargs)
# now unparse them
unparsed = unparse_args(parser, parsedargs)
print(unparsed)
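For the fake call above, this should print something like --quiet --depth=1 ohmyzsh/ohmyzsh. Note that the keys come from each action's dest, so a multi-word option such as --recurse-submodules would come back with an underscore instead of a dash, which is part of the "imperfect" caveat above.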
I have an application that allows you to send event data to a custom script. You simply lay out the command line arguments and assign what event data goes with what argument. The problem is that there is no real flexibility here. Every option you map out is going to be used, but not every option will necessarily have data. So when the application builds the string to send to the script, some of the arguments are blank, and Python's OptionParser errors out with "error: --someargument option requires an argument".
Given that there are over 200 points of data, it's not like I can write separate scripts to handle each combination of possible arguments (it would take 2^200 scripts). Is there a way to handle empty arguments in Python's OptionParser?
Sorry, I misunderstood the question with my first answer. You can give command-line flags optional arguments by using the callback action type when you define an option. Use the following function as a callback (you will likely wish to tailor it to your needs) and configure it for each of the flags that can optionally receive an argument:
import optparse
def optional_arg(arg_default):
def func(option,opt_str,value,parser):
if parser.rargs and not parser.rargs[0].startswith('-'):
val=parser.rargs[0]
parser.rargs.pop(0)
else:
val=arg_default
setattr(parser.values,option.dest,val)
return func
def main(args):
parser=optparse.OptionParser()
parser.add_option('--foo',action='callback',callback=optional_arg('empty'),dest='foo')
parser.add_option('--file',action='store_true',default=False)
return parser.parse_args(args)
if __name__ == '__main__':
    import sys
    print(main(sys.argv[1:]))
Running from the command line you'll see this:
# python parser.py
(<Values at 0x8e42d8: {'foo': None, 'file': False}>, [])
# python parser.py --foo
(<Values at 0x8e42d8: {'foo': 'empty', 'file': False}>, [])
# python parser.py --foo bar
(<Values at 0x8e42d8: {'foo': 'bar', 'file': False}>, [])
Yes, there is an argument to do so when you add the option:
from optparse import OptionParser
parser = OptionParser()
parser.add_option("--SomeData",action="store", dest="TheData", default='')
Give the default argument the value you want the option to have if it is specified without an argument.
I don't think optparse can do this. argparse is a different module (originally third-party, now in the standard library) that can handle situations like this, where options have optional values.
With optparse you either have to specify the option including its value or leave out both.
Optparse already allows you to pass the empty string as an option argument. So if possible, treat the empty string as "no value". For long options, any of the following work:
my_script --opt= --anotheroption
my_script --opt='' --anotheroption
my_script --opt="" --anotheroption
my_script --opt '' --anotheroption
my_script --opt "" --anotheroption
For short-style options, you can use either of:
my_script -o '' --anotheroption
my_script -o "" --anotheroption
Caveat: this has been tested under Linux and should work the same under other Unixlike systems; Windows handles command line quoting differently and might not accept all of the variants listed above.
Mark Roddy's solution would work, but it requires attribute modification of a parser object during runtime, and has no support for alternative option formats other than - or --.
A slightly less involved solution is to modify the sys.argv array before running optparse and insert an empty string ("") after a switch which doesn't need to have arguments.
The only constraint of this method is that your options default to a predictable value other than the one you are inserting into sys.argv (I chose None for the example below, but it really doesn't matter).
The following code creates an example parser and set of options, extracts an array of allowed switches from the parser (using a little bit of instance variable magic), and then iterates through sys.argv; every time it finds an allowed switch, it checks whether it was given without any arguments following it. If there is no argument after a switch, the empty string is inserted on the command line. After altering sys.argv, the parser is invoked, and you can check for options whose values are "" and act accordingly.
#Instantiate the parser, and add some options; set the options' default values to None, or something predictable that
#can be checked later.
PARSER_DEFAULTVAL = None
parser = OptionParser(usage="%prog -[MODE] INPUT [options]")
#This method doesn't work if interspersed switches and arguments are allowed.
parser.allow_interspersed_args = False
parser.add_option("-d", "--delete", action="store", type="string", dest="to_delete", default=PARSER_DEFAULTVAL)
parser.add_option("-a", "--add", action="store", type="string", dest="to_add", default=PARSER_DEFAULTVAL)
#Build a list of allowed switches, in this case ['-d', '--delete', '-a', '--add'] so that you can check if something
#found on sys.argv is indeed a valid switch. This is trivial to make by hand in a short example, but if a program has
#a lot of options, or if you want an idiot-proof way of getting all added options without modifying a list yourself,
#this way is durable. If you are using OptionGroups, simply run the loop below with each group's option_list field.
allowed_switches = []
for opt in parser.option_list:
#Add the short (-a) and long (--add) form of each switch to the list.
allowed_switches.extend(opt._short_opts + opt._long_opts)
#Insert empty-string values into sys.argv whenever a switch without arguments is found.
for a in range(len(sys.argv)):
arg = sys.argv[a]
#Check if the sys.argv value is a switch
if arg in allowed_switches:
#Check if it doesn't have an accompanying argument (i.e. if it is followed by another switch, or if it is last
#on the command line)
        if a == len(sys.argv) - 1 or sys.argv[a + 1] in allowed_switches:
sys.argv.insert(a + 1, "")
options, args = parser.parse_args()
#If the option is present (i.e. wasn't set to the default value)
if not (options.to_delete == PARSER_DEFAULTVAL):
    if options.to_delete == "":
#The switch was not used with any arguments.
...
else:
#The switch had arguments.
...
After checking that the cp command understands e.g. --backup=simple but not --backup simple, I answered the problem like this:
import sys
from optparse import OptionParser
def add_optval_option(pog, *args, **kwargs):
if 'empty' in kwargs:
empty_val = kwargs.pop('empty')
for i in range(1, len(sys.argv)):
a = sys.argv[i]
if a in args:
sys.argv.insert(i+1, empty_val)
break
pog.add_option(*args, **kwargs)
def main(args):
parser = OptionParser()
add_optval_option(parser,
'--foo', '-f',
default='MISSING',
empty='EMPTY',
help='"EMPTY" if given without a value. Note: '
'--foo=VALUE will work; --foo VALUE will *not*!')
o, a = parser.parse_args(args)
    print('Options:')
    print('  --foo/-f:', o.foo)
    if a[1:]:
        print('Positional arguments:')
        for arg in a[1:]:
            print('  ', arg)
    else:
        print('No positional arguments')
if __name__ == '__main__':
    main(sys.argv)
Self-advertisement: This is part of the opo module of my thebops package ... ;-)