I'm using Python 3.3. Consider this function:
def foo(action, log=False, *args):
    print(action)
    print(log)
    print(args)
    print()
The following call works as expected:
foo("A",True,"C","D","E")
A
True
('C', 'D', 'E')
But this one doesn't.
foo("A",log=True,"C","D","E")
SyntaxError: non-keyword arg after keyword arg
Why is this the case?
Does this somehow introduce ambiguity?
Consider the following:
def foo(bar="baz", bat=False, *args):
    ...
Now if I call
foo(bat=True, "bar")
Where does "bar" go? Either:
bar = "bar", bat = True, args = (), or
bar = "baz", bat = True, args = ("bar",), or even
bar = "baz", bat = "bar", args = ()
and there's no obvious choice (at least between the first two) as to which one it should be. We want bat = True to 'consume' the second argument slot, but it's not clear which order the remaining arguments should be consumed in: treating it as if bat doesn't exist at all and moving everything to the left, or treating it as if bat moved the "cursor" past itself on to the next argument. Or, if we wanted to do something truly strange, we could defend the decision to say that the second argument in the argument tuple always goes with the second positional argument, whether or not other keyword arguments were passed.
Regardless, we're left with something pretty confusing, and someone is going to be surprised which one we picked regardless of which one it is. Python aims to be simple and clean, and it wants to avoid making any language design choices that might be unintuitive. There should be one-- and preferably only one --obvious way to do it.
The function of keyword arguments is twofold:
To provide an interface to functions that does not rely on the order of the parameters.
To provide a way to reduce ambiguity when passing parameters to a function.
Providing a mixture of keyword and ordered arguments is only a problem when you provide the keyword arguments before the ordered arguments. Why is this?
Two reasons:
It is confusing to read. If you're providing ordered parameters, why would you label some of them and not others?
The algorithm to process the arguments would be needlessly complicated. You can provide keyword args after your 'ordered' arguments, and this makes sense because it is clear that everything is ordered up until the point that you employ keywords. However, if you employ keywords between ordered arguments, there is no clear way to determine whether you are still ordering your arguments.
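One way to sidestep the ambiguity in the original question (Python 3 only) is to make log keyword-only by placing it after *args. The call syntax still forbids positional arguments after keyword arguments, but with this signature the keyword form does what the asker intended. A minimal sketch:

```python
def foo(action, *args, log=False):
    # 'log' is keyword-only: positional arguments can never fill it,
    # so there is no ambiguity about which slot each value consumes.
    return action, args, log

result = foo("A", "C", "D", "E", log=True)
# result == ("A", ("C", "D", "E"), True)
```

Here "C", "D", "E" all flow into args, and log can only ever be set by name.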
Related
What I would really like to do is make a function with a default first argument
def do_thing(self, category: str = 'default', sub_category: str, argA: int|float, argB: int|float, argC: int|float):
    ...

obj.do_thing('cat1', 'sub1', 1, 2, 3)
obj.do_thing('sub2', 1, 2, 3)  # In this case 'default' is used for the category
But that isn't possible since all arguments with default values must come after non-default arguments.
I could technically move the category argument to the end and the default would work fine, but category and sub_category are closely related, so I would like to keep them grouped together at the beginning, followed by argA/B/C, since that follows the format [selected thing, arguments to use on selected thing].
Using *args would also technically work: I could check the number of arguments and whether there are one or two strings at the beginning of the list.
But that makes it much less clear what arguments are expected, and it doesn't provide type hints or named arguments, which are useful for IDE auto-suggestions.
def do_thing(self, *args):
Multiple Dispatch seemed like it did sort of what I wanted since it allows different versions of the function to be called based on the arguments that are used.
So I tried doing this
from multipledispatch import dispatch

@dispatch(str, str, object, object, object)  # Use object to allow more than one type since float and int are both valid
def do_thing(self, category: str, sub_category: str, argA: int|float, argB: int|float, argC: int|float):
    ...

@dispatch(str, object, object, object)
def do_thing(self, sub_category: str, argA: int|float, argB: int|float, argC: int|float):
    do_thing('default', sub_category, argA, argB, argC)
This mostly works if all the arguments are entered as positional arguments, however the users of this library regularly (but not always) call the function with argA-argC as keyword arguments, which doesn't work with multiple dispatch.
Is there any way to check if there are at least two positional arguments, and only if the first two are strings run one version of the function, while for any other combination the other version is called?
functools.singledispatchmethod almost does what I want, but in both cases the first argument will always be a string, so I don't think it will work unless there's a way to tell it to look at the second argument instead of the first.
Alternatively, is there a better way to go about doing what I was trying to do to begin with? Basically create a default value for the first argument?
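One hedged sketch of the manual check described above: a thin *args wrapper that inspects the first two positional arguments and falls back to 'default' when the category is omitted. The _do_thing helper name is hypothetical, standing in for the real implementation; keyword arguments for argA-argC pass straight through, unlike the multipledispatch version.

```python
class Thing:
    def do_thing(self, *args, **kwargs):
        # If the first two positional arguments are both strings,
        # treat them as (category, sub_category); otherwise assume
        # the category was omitted and fall back to 'default'.
        if len(args) >= 2 and isinstance(args[0], str) and isinstance(args[1], str):
            category, sub_category, *rest = args
        else:
            category = 'default'
            sub_category, *rest = args
        return self._do_thing(category, sub_category, *rest, **kwargs)

    def _do_thing(self, category, sub_category, argA, argB, argC):
        # Stand-in for the real implementation.
        return (category, sub_category, argA, argB, argC)
```

The cost is that the public signature degenerates to *args, so the IDE-friendliness concern from the question still applies; the string check also misfires if argA itself is a string.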
Under normal circumstances one calls a function with its default arguments by omitting those arguments. However if I'm generating arguments on the fly, omitting one isn't always easy or elegant. Is there a way to use a function's default argument explicitly? That is, to pass an argument which points back to the default argument.
So something like this except with ~use default~ replaced with something intelligent.
def function(arg='default'):
    print(arg)

arg_list = ['not_default', ~use default~]

for arg in arg_list:
    function(arg=arg)

# output:
# not_default
# default
I don't know if it's even possible, and given the term "default argument", all my searches turn up are beginners' tutorials. If this functionality is not supported, that's OK too; I'd just like to know.
Unfortunately there is no such feature in Python. There are workarounds, but none of them are very appealing.
The simple and popular pattern is to move the default into the function body:
def function(arg=None):
    if arg is None:
        arg = 'default'
    ...
Now you can either omit the argument or pass arg=None explicitly to take on the default value.
There is no general purpose way to omit an argument; you can specialize to particular functions by explicitly passing the appropriate default value, but otherwise, your only option is to fail to pass the argument.
The closest you could come is to replace your individual values with tuples or dicts that omit the relevant argument, then unpack them at call time. So for your example, you'd change arglist's definition to:
arg_list = [{'arg': 'not_default'}, {}]
then use it like so:
for arg in arg_list:
    function(**arg)
A slightly uglier approach is to use a sentinel when you don't want to pass the argument, use that in your arg_list, and test for it, e.g.:
USEDEFAULT = object()

arg_list = ['not_default', USEDEFAULT]

for arg in arg_list:
    if arg is USEDEFAULT:
        function()
    else:
        function(arg=arg)
Obviously a bit less clean, but possibly more appropriate for specific use cases.
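If you genuinely need a value that "points back to" the declared default, one further option is to read it off the function's signature with inspect.signature. This only works when the default is a plain value in the def line, but then it gives exactly the behaviour the question sketched:

```python
import inspect

def function(arg='default'):
    return arg

# Read the declared default for 'arg' straight off the signature.
default_for_arg = inspect.signature(function).parameters['arg'].default

arg_list = ['not_default', default_for_arg]
results = [function(arg=a) for a in arg_list]
# results == ['not_default', 'default']
```

Note this passes the default explicitly rather than omitting the argument, so if the function distinguishes "omitted" from "passed the default value", the two are not equivalent.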
I keep getting an error message on my function parameters.
I've seen some other questions that are similar but I struggle to understand the answer and those people also had different situations which were more complex.
def check(user_answer='', list_of_blanks, letters):
    x = 0
    for letter in letters:
        if letter == user_answer:
            list_of_blanks[letters.index(user_answer)] = user_answer
            x += 1
    if x == 0:
        return False
    else:
        str(list_of_blanks)
        list_of_blanks = ''.join(list_of_blanks)
        return list_of_blanks
pycharm highlights "list_of_blanks, letters" (function parameters) and gives me the error which says:
Non-Default parameter follows default parameter.
If I try to default the parameter to an empty list like so:
list_of_blanks=[], letters=[]
then I get this error:
Default argument value is mutable.
You can't have a parameter with a default value (param=value) before parameters without defaults.
def func(positional1, positional2, nonpositional=10):
This is primarily because you are not required to specify the name of such a parameter when calling:
func(10, 50, nonpositional=60) == func(10, 50, 60)
Is it possible to switch positions of the arguments? Then you could just write:
def check(list_of_blanks, letters, user_answer=''):
Giving an argument a default value makes it optional. You can't have optional arguments before non-optional ones because then it becomes unclear which slots you mean to assign and which you don't. You might be able to come up with a complex set of rules to determine how to interpret the arguments, but Python attempts to avoid unnecessary complexity, ambiguity, and counter-intuitive behavior.
You have a number of options available to work around this:
Move the default argument to the end of the list:
def check(list_of_blanks, letters, user_answer=''):
Make all the other arguments optional as well:
def check(user_answer='', list_of_blanks=(), letters=()):
Using tuples, which are immutable, avoids the possibility of accidental (and permanent) modification of the default lists.
Remove the default argument entirely:
def check(user_answer, list_of_blanks, letters):
Frankly this seems to be the best option in your case. Conceptually speaking, the most important arguments of a function come first. Unless your function has side-effects, making these arguments optional seems somehow deficient.
Make the other arguments required, but keyword-only:
def check(user_answer='', *, list_of_blanks, letters):
You will no longer be able to invoke the function as check('abc', ['b', 'c'], ['a']) because list_of_blanks and letters can no longer be specified as positional arguments. You must now invoke with keywords as check('abc', list_of_blanks=['b', 'c'], letters=['a']). But now the first argument is unambiguously optional, so you can do things like check(letters=['a'], list_of_blanks=['b', 'c']).
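A lightly cleaned-up sketch of option 4 in action. The loop below replaces the letters.index lookup from the question (which breaks on repeated letters) with enumerate, so treat it as illustrative rather than a drop-in:

```python
def check(user_answer='', *, list_of_blanks, letters):
    # list_of_blanks and letters are keyword-only, so user_answer is
    # unambiguously the optional first argument.
    filled = 0
    for i, letter in enumerate(letters):
        if letter == user_answer:
            list_of_blanks[i] = user_answer
            filled += 1
    return ''.join(list_of_blanks) if filled else False

print(check('a', list_of_blanks=['_', '_'], letters=['a', 'b']))  # a_
print(check(letters=['a'], list_of_blanks=['_']))                 # False
```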
My personal preference, in decreasing order is 3, 4, 1, 2. I think your best bet is to have no optional arguments, but I can imagine you being in a phase where you'd want to play with keyword-only arguments regardless of the design implications.
Okay this one is confusing. My old piece of code has something like
map(lambda x:x.func1(arg1), other_args_to_be_mapped)
now I would like to make arg1 -> *args
while other_args_to_be_mapped stays unchanged.
in func1, the length of the arguments will be checked to decide between different operations. My questions are
1) which length will be checked? arg1 or other_args_to_be_mapped
2) in func1, how should I set up the default? It was like
def func1(arg1=something)
but now with potential multiple arguments, I don't know what to do with the initialization. I want to be able to do something like
def func1(args*=something, something_else)
Is that even possible?
If I understand your question correctly, you're looking for variable arguments. These can be mixed with fixed arguments, provided you obey a logical ordering (fixed arguments first, then keyword arguments or variable arguments).
For example, the following shows how to map a function that takes one constant argument and one variable argument. If you would like different behaviour, please provide a concrete example of what you are trying to accomplish.
import random

class Foo:
    def get_variable_parameters(self):
        return [1] if random.random() > .5 else [1, 2]

    def foo(self, arg, *args):
        print("Doing stuff with constant arg", arg)
        if len(args) == 1:
            print("Good", args)
        else:
            print("Bad", args)

list(map(lambda x: x.foo('Static Argument', *x.get_variable_parameters()), [Foo(), Foo(), Foo()]))
We don't know how many arguments are going to be passed to foo (in this trivial case, it's one or two), but the "*" notation accepts any number of objects to be passed.
Note I've encapsulated map in list so that it gets evaluated, as in python3 it is a generator. List comprehension may be more idiomatic in python. Also don't forget you can always use a simple for loop - an obfuscated or complex map call is far less pythonic than a clear (but several line) for-loop, imo.
If, rather, you're trying to combine multiple arguments in a map call, I would recommend using the same variable argument strategy with the zip function, e.g.,
def foo(a,*b): ...
map(lambda x : foo(x[0],*x[1]), zip(['a','b'],[ [1], [1,2] ]))
In this case, foo will get called first as foo('a', 1), and then as foo('b', 1, 2).
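For reference, a self-contained version of the zip pattern above, with foo returning its arguments so the calls are easy to see:

```python
def foo(a, *b):
    # Collect the fixed argument and any variable arguments into one tuple.
    return (a,) + b

results = list(map(lambda x: foo(x[0], *x[1]), zip(['a', 'b'], [[1], [1, 2]])))
# results == [('a', 1), ('b', 1, 2)]
```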
I am creating a module in python that can take multiple arguments. What would be the best way to pass the arguments to the definition of a method?
def abc(arg):
    ...

abc({"host": "10.1.0.100", "protocol": "http"})

def abc(host, protocol):
    ...

abc("10.1.0.100", "http")

def abc(**kwargs):
    ...

abc(host="10.1.0.100", protocol="http")
Or something else?
Edit
I will actually have those arguments (username, password, protocol, host, password2) where none of them are required.
def abc(host, protocol):
    ...
abc("10.1.0.100", "http")
abc(host="10.1.0.100", protocol="http")
Whether you call it using positional args or keyword args depends on the number of arguments and whether you skip any. For your example I don't see much use in calling the function with keyword args.
Now some reasons why the other solutions are bad:
abc({"host" : "10.1.0.100", "protocol" : "http"})
This is a workaround for keyword arguments commonly used in languages which lack real keyword args (usually JavaScript). In those languages it is nice, but in Python it is just wrong since you gain absolutely nothing from it. If you ever want to call the functions with args from a dict, you can always do abc(**{...}) anyway.
def abc(**kwargs):
People expect the function to accept lots of - maybe even arbitrary - arguments. Besides that, they don't see what arguments are possible/required and you'd have to write code to require certain arguments on your own.
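For example, requiring certain keys by hand might look like this sketch (the required-key set is illustrative):

```python
def abc(**kwargs):
    # With **kwargs you must enforce required arguments yourself,
    # something an explicit parameter list gives you for free.
    missing = {'host', 'protocol'} - set(kwargs)
    if missing:
        raise TypeError("abc() missing required arguments: %s" % sorted(missing))
    return kwargs['host'], kwargs['protocol']
```

With def abc(host, protocol) the interpreter raises the equivalent TypeError for you, and the signature documents itself.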
If all the arguments are known ahead, use an explicit argument list, optionally with default values, like def abc(arg1="hello", arg2="world",...). This will make the code most readable.
When you call the function, you can use either abc("hello", "world") or abc(arg1="hello", arg2="world"). I use the longer form if there are more than 4 or 5 arguments; it's a matter of taste.
This really depends on the context, the design and the purpose of your method.
If the parameters are defined and compulsory, then the best option is:
def abc(host, protocol):
...
abc('10.1.0.100', 'http')
in case the parameters are defined but optional, then:
def abc(host=None, protocol=None):  # I used None, but put a default value
    ...
You can call it through positional arguments as abc('10.1.0.100', 'http') or by name as abc(protocol='http').
And if you don't know up front what arguments it will receive (for example in string formatting), then the best option is using the **kwargs argument:
def abc(**kwargs):
    ...
And in this way you must call it using named arguments.