What is the "**params" syntax in a Python method definition? - python

So I am trying out the new Python code for the Google App Engine search library and I came across some syntax I hadn't seen before:
cls._createDocument(**params)
where params was a dictionary.
The function this refers to is:
@classmethod
def _createDocument(
        cls, pid=None, category=None, name=None, description=None,
        category_name=None, price=None, **params):
My question is: what does **params signify, and what does it do to the object?
Thanks!
Jon

Consider a function with default arguments:
def func(foo=3):
    print(foo)
The structure of the arguments is (in principle) very similar to a dictionary. The function func has (essentially) a dictionary of default arguments (in this case {'foo': 3}). Now, let's say that you don't want to use the keyword in the function call, but you want to use a dictionary instead -- then you can call func as:
d = {"foo":8}
func(**d)
This allows you to dynamically change what arguments you are passing to the function func.
This becomes a little more interesting if you try the following:
d = {"foo":8, "bar":12}
func(**d)
This doesn't work (it is equivalent to func(foo=8, bar=12), but since bar isn't a valid argument of func, it raises a TypeError).
You can get around that problem by giving those extra arguments a place to go inside the definition of func:
def func(foo=3, **kwargs):
    print(foo, kwargs)
Now, try:
d = {"foo":8, "bar":12}
func(**d)  # prints: 8 {'bar': 12}
All the extra keyword arguments go into the kwargs dictionary inside the function.
This can also be called as:
func(foo=8, bar=12)
with the same result.
This is often useful if funcA calls funcB and you want funcA to accept all of the keywords of funcB (plus a few extra), which is a very common thing when dealing with classes and inheritance:
def funcA(newkey=None, **kwargs):
    funcB(**kwargs)
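For instance, a common pattern (a hypothetical sketch, not from the question above) is a subclass that adds one keyword of its own and forwards everything else to the parent constructor:

class Base:
    def __init__(self, name=None):
        self.name = name

class Child(Base):
    def __init__(self, extra=None, **kwargs):
        # keep the new keyword here, forward everything else to Base
        self.extra = extra
        super().__init__(**kwargs)

c = Child(name="spam", extra=42)
print(c.name, c.extra)  # spam 42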
Finally, here is a link to the documentation

The **params parameter collects any keyword arguments that don't match the named parameters into a dictionary called params inside the function.
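For example (the function and argument names here are made up), the extra keywords end up in params like this:

def create(pid=None, name=None, **params):
    # pid and name are bound normally; everything else lands in params
    print(pid, name, params)

create(pid=1, name="widget", colour="red", size="L")
# prints: 1 widget {'colour': 'red', 'size': 'L'}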

Related

passing keyword argument to inner function when outer function has keyword argument with the same name

I have two functions defined roughly like this:
def func_inner(bar=0):
    print('inner bar:', bar)

def func_outer(bar=-1, *args, **kwargs):
    print('outer bar:', bar)
    func_inner(*args, **kwargs)
Is there a way to call func_outer and provide it with two values of bar - one for func_outer, and the other to be passed over to func_inner? Calling func_outer(bar=1,bar=2) clearly does not work.
It's possible to overcome the issue by specifying bar values as positional arguments, like func_outer(1,2), but I'd like to specify them as keyword arguments for both functions.
Is there a solution entirely at the caller's side (ie. without altering the functions)?
No, there is none
You cannot pass two keyword arguments with the same name in a single call, so there is no way to get a second "bar" key into kwargs. In other words, you cannot pass this argument without modifying the two functions.
There may be a better-suited approach for what you're doing
This is the kind of code that pops up in decorators. In that case, you may want to curry the outer function instead.
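As an illustration of the kind of modification that would be needed (the inner_kwargs parameter is hypothetical, not part of the original functions), the outer function could accept the inner function's keywords as an explicit dictionary:

def func_inner(bar=0):
    print('inner bar:', bar)

def func_outer(bar=-1, inner_kwargs=None):
    print('outer bar:', bar)
    # forward the explicitly supplied dictionary to the inner function
    func_inner(**(inner_kwargs or {}))

func_outer(bar=1, inner_kwargs={'bar': 2})
# outer bar: 1
# inner bar: 2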

Test code such that no function calls kwarg as an arg

In Python, is it possible to check whether a function makes calls in which keyword arguments are passed as positional arguments?
For example:
def a(pos_arg, nam_arg=None, nam_arg2=None):
    return "whatever"

def b():
    return a(1, 2, nam_arg2="whatever")
Here, the function b calls the function a, but the second argument is a keyword argument that is being passed as a positional argument. This might cause confusing problems when object inheritance comes into play.
Better would have been:
def b():
    return a(1, nam_arg=2, nam_arg2="whatever")
For testing purposes (as this is generally bad code in my project), I would like to find out whether the function b() makes such calls.
Is this possible in python?
You tagged your question as python-2.7, but for the sake of future viewers I would point out that Python 3 has a keyword-only arguments feature, which allows you to define a function like this:
def func(pos_arg, *, kw_only_arg):
    pass
In that case kw_only_arg can only be passed as a keyword argument.
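A quick illustration (func here is just a toy example):

def func(pos_arg, *, kw_only_arg=None):
    return pos_arg, kw_only_arg

func(1, kw_only_arg=2)  # fine
func(1, 2)              # TypeError: func() takes 1 positional argument but 2 were given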
If you can't use python 3 then there is a recipe that can help you with that:
http://code.activestate.com/recipes/578993-keyword-only-arguments-in-python-2x/

Python imitate class's constructor only using function dynamically

This is a little bit weird. I want to dynamically initialize part of a function's parameters before I call it. But I don't want to use a class for some reason. Let's say I have a function:
def inner(a, b, c):
    "a, b, c do something"
    return result
Before I formally call it, I'd like to initialize it somewhere:
partInitialAbc = inner(, b, c)
Then I'll use it as:
result=partInitialAbc(a)
Please notice I want to do this dynamically, just as when you initialize a class using its constructor, so a decorator may not be appropriate here...
Anyone have an idea?
It sounds like you're looking for functools.partial:
Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords.
If you pass the arguments to partial as positional arguments, they'll appear at the beginning of the argument list, which isn't exactly what you want. Keyword arguments give you more control over which parameters are passed:
import functools

partInitialAbc = functools.partial(inner, b=b_value, c=c_value)
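Calling the resulting partial then behaves like the original call with those keywords filled in (a small self-contained sketch with placeholder values):

import functools

def inner(a, b, c):
    return (a, b, c)

partInitialAbc = functools.partial(inner, b=2, c=3)
print(partInitialAbc(1))  # (1, 2, 3), i.e. equivalent to inner(1, b=2, c=3)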

"self" inside plain function?

I've got a bunch of functions (outside of any class) where I've set attributes on them, like funcname.fields = 'xxx'. I was hoping I could then access these variables from inside the function with self.fields, but of course it tells me:
global name 'self' is not defined
So... what can I do? Is there some magic variable I can access? Like __this__.fields?
A few people have asked "why?". You will probably disagree with my reasoning, but I have a set of functions that all must share the same signature (accept only one argument). For the most part, this one argument is enough to do the required computation. However, in a few limited cases, some additional information is needed. Rather than forcing every function to accept a long list of mostly unused variables, I've decided to just set them on the function so that they can easily be ignored.
Although, it occurs to me now that you could just use **kwargs as the last argument if you don't care about the additional args. Oh well...
Edit: Actually, some of the functions I didn't write, and I would rather not modify them to accept the extra args. By "passing in" the additional args as attributes, my code can work both with my custom functions that take advantage of the extra args and with third-party code that doesn't require them.
Thanks for the speedy answers :)
self isn't a keyword in Python, it's just a normal variable name. When creating instance methods, you can name the first parameter whatever you want; self is just a convention.
You should almost always prefer passing arguments to functions over setting properties for input, but if you must, you can do so using the actual function's name to access variables within it:
def a():
    if a.foo:
        pass  # blah

a.foo = False
a()
see python function attributes - uses and abuses for when this comes in handy. :D
def foo():
    print(foo.fields)

foo.fields = [1, 2, 3]
foo()
# [1, 2, 3]
There is nothing wrong with adding attributes to functions. Many memoizers use this to cache results in the function itself.
For example, notice the use of func.cache:
from decorator import decorator
@decorator
def memoize(func, *args, **kw):
    # Author: Michele Simoniato
    # Source: http://pypi.python.org/pypi/decorator
    if not hasattr(func, 'cache'):
        func.cache = {}
    if kw:  # frozenset is used to ensure hashability
        key = args, frozenset(kw.iteritems())
    else:
        key = args
    cache = func.cache  # attribute added by memoize
    if key in cache:
        return cache[key]
    else:
        cache[key] = result = func(*args, **kw)
        return result
You can't make that "function accessing its own attributes" work correctly in all situations -- see how can python function access its own attributes? for the details -- but here is a quick demonstration:
>>> def f(): return f.x
...
>>> f.x = 7
>>> f()
7
>>> g = f
>>> g()
7
>>> del f
>>> g()
Traceback (most recent call last):
File "<interactive input>", line 1, in <module>
File "<interactive input>", line 1, in f
NameError: global name 'f' is not defined
Basically, most approaches directly or indirectly rely on accessing the function object through a lookup by name in globals; if the original function name is deleted, this stops working. There are other kludgey ways of accomplishing this, like defining a class or a factory -- but thanks to your explanation it is clear you don't really need that.
Just use the keyword catch-all argument you mentioned, like so:
def fn1(oneArg):
    pass  # do the due

def fn2(oneArg, **kw):
    if 'option1' in kw:
        print('called with option1=', kw['option1'])
    # do the rest

fn2(42)
fn2(42, option1='something')
Not sure what you mean in your comment about handling TypeError -- that won't arise when using **kw. This approach works very well for some Python built-in functions -- check min(), max(), sorted(). Recently sorted(dct, key=dct.get, reverse=True) came in very handy to me in a CodeGolf challenge :)
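For instance, the sorted(...) call mentioned above orders a dictionary's keys by their values:

dct = {'a': 3, 'b': 1, 'c': 2}
print(sorted(dct, key=dct.get, reverse=True))  # ['a', 'c', 'b']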
Example:
>>> def x(): pass
>>> x
<function x at 0x100451050>
>>> x.hello = "World"
>>> x.hello
"World"
You can set attributes on functions, as these are just plain objects, but I have actually never seen something like this in real code.
Plus, self is not a keyword, just another variable name, which happens to be bound to the particular instance of the class. self is passed implicitly, but received explicitly.
If you want globally set parameters for a callable 'thing', you could always create a class and implement the __call__ method.
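A minimal sketch of that idea (the class name and attribute here are made up for illustration):

class ConfiguredFunc:
    def __init__(self, fields):
        self.fields = fields

    def __call__(self, arg):
        # the instance is callable like a function, but carries its own state
        return arg, self.fields

func = ConfiguredFunc(fields='xxx')
print(func(42))  # (42, 'xxx')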
There is no special way, within a function's body, to refer to the function object whose code is executing. Simplest is just to use funcname.field (with funcname being the function's name within the namespace it's in, which you indicate is the case -- it would be harder otherwise).
This isn't something you should do. I can't think of any way to do what you're asking except walking the call stack and doing some weird introspection -- which isn't something that should happen in production code.
That said, I think this actually does what you asked:
import inspect

_code_to_func = dict()

def enable_function_self(f):
    _code_to_func[f.__code__] = f
    return f

def get_function_self():
    f = inspect.currentframe()
    code_obj = f.f_back.f_code
    return _code_to_func[code_obj]

@enable_function_self
def foo():
    me = get_function_self()
    print(me)

foo()
While I agree with the rest that this is probably not good design, the question did intrigue me. Here's my first solution, which I may update once I get decorators working. As it stands, it relies pretty heavily on being able to read the stack, which may not be possible in all implementations (something about sys._getframe() not necessarily being present...)
import sys, inspect

def cute():
    # look up this function by its own name in the module's namespace
    this = sys.modules[__name__].__dict__.get(inspect.stack()[0][3])
    print("My face is..." + this.face)

cute.face = "very cute"
cute()
What do you think? :3
You could use the following (hideously ugly) code:
class Generic_Object(object):
    pass

def foo(a1, a2, self=Generic_Object()):
    self.args = (a1, a2)
    print("len(self.args):", len(self.args))
    return None
... as you can see it would allow you to use "self" as you described. You can't use an "object()" directly because you can't "monkey patch(*)" values into an object() instance. However, normal subclasses of object (such as the Generic_Object() I've shown here) can be "monkey patched"
If you wanted to always call your function with a reference to some object as the first argument, that would be possible. You could put the defaulted argument first, followed by *args and an optional **kwargs parameter (through which any other arguments or dictionaries of options could be passed during calls to this function).
This is, as I said hideously ugly. Please don't ever publish any code like this or share it with anyone in the Python community. I'm only showing it here as a sort of strange educational exercise.
An instance method is like a function in Python. However, it exists within the namespace of a class (thus it must be accessed via an instance ... myobject.foo() for example) and it is called with a reference to "self" (analogous to the "this" pointer in C++) as the first argument. Also, there's a method resolution process which causes the interpreter to search the namespace of the instance, then its class, and then each of the parent classes and so on ... up through the inheritance tree.
An unbound function is called with whatever arguments you pass to it. There can't be any sort of automatically prepended object/instance reference in the argument list. Thus, writing a function with an initial argument named "self" is meaningless. (It's legal because Python doesn't place any special meaning on the name "self." But it's meaningless because callers of your function would have to manually supply some sort of object reference in the argument list, and it's not at all clear what that should be. Just some bizarre "Generic_Object" which then floats around in the global variable space?)
I hope that clarifies things a bit. It sounds like you're suffering from some very fundamental misconceptions about how Python and other object-oriented systems work.
("Monkey patching" is a term used to describe the direct manipulation of an objects attributes -- or "instance variables" by code that is not part of the class hierarchy of which the object is an instance).
As another alternative, you can make the functions into bound class methods like so:
class _FooImpl(object):
    a = "Hello "

    @classmethod
    def foo(cls, param):
        return cls.a + param

foo = _FooImpl.foo

# later...
print(foo("World"))  # yes, Hello World

# and if you have to change an attribute:
foo.__self__.a = "Goodbye "
If you want functions to share attribute namespaces, you just make them part of the same class. If not, give each its own class.
What exactly are you hoping "self" would point to, if the function is defined outside of any class? If your function needs some global information to execute properly, you need to send this information to the function in the form of an argument.
If you want your function to be context aware, you need to declare it within the scope of an object.

Superfluous python parameters

I've noticed a discrepancy in the way that Python functions are called. In every other language I've dealt with, you either have
foo()
meaning either no parameters, or as many parameters as you like, or
foo(arg1, arg2,...,argn)
where you pass in the same number of parameters to define the function and to call it. In Python, however, I've noticed that the function definition and the function call can have two different parameter sets; this usually looks like:
class foo(object):
    def bar(self, arg1, arg2):
        pass
However, when I want to call the function, all I have to do is:
zoo = foo()
zoo.bar(arg1, arg2)
Where did the self parameter go?
Thank you.
Where did the self parameter go?
It's in front of the dot when you call the function, i.e. in your case it's zoo.
Note that you can also call the function as foo.bar(zoo, arg1, arg2). Basically, in Python, object.method(arguments) is a shortcut for objects_class.method(object, arguments).
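A quick check of that equivalence, using a class like the one from the question:

class foo(object):
    def bar(self, arg1, arg2):
        print(self, arg1, arg2)

zoo = foo()
zoo.bar(1, 2)       # self is zoo
foo.bar(zoo, 1, 2)  # exactly the same call, with self passed explicitly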
zoo is the self parameter.
In C++, for example, you get the object passed implicitly as the this pointer. In Python, this parameter is explicit.
zoo is implicitly passed as the first parameter in your example.
As I remember, "zoo.bar" just gives you the attribute "bar" of the object "zoo", which can be called. The binding magic happens at attribute lookup, when the class's method is bound to that instance.
Consider the next example:
zoo = foo()
xbar = zoo.bar
xbar(arg1, arg2)
