It's possible to define a class without using the class keyword.
The following ...
get_i = lambda self: self.i
get_i.__name__ = 'get_i'
get_i.__qualname__ = 'Klass2.get_i'
dct = dict(a=1, i=4, get_i=get_i)
Klass2 = type('Klass2', (SuperK,), dct)
... produces the same end result as:
class Klass1(SuperK):
    a = 1
    i = 4

    def get_i(self):
        return self.i
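For illustration, here is a quick check that both spellings behave identically (assuming a trivial SuperK base defined beforehand; any base class works):

class SuperK(object):
    pass

# Klass1 and Klass2 defined as above
k1, k2 = Klass1(), Klass2()
print(k1.get_i(), k2.get_i())  # 4 4
print(Klass1.a, Klass2.a)      # 1 1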
How can we do something similar for functions? That is, how can we define a function without using the def or lambda keywords? What might a pure-Python implementation of dehf look like if the following two pieces of code created identical foos?
def foo(bar):
    bar += 934
    return bar
foo = dehf(blah, blah, blah, blah, [...])
You can create functions by calling the types.FunctionType constructor. Keep in mind, however, that this constructor is undocumented and implementation-specific. In CPython, we can figure out the constructor arguments by calling help(types.FunctionType):
class function(object)
| function(code, globals[, name[, argdefs[, closure]]])
|
| Create a function object from a code object and a dictionary.
| The optional name string overrides the name from the code object.
| The optional argdefs tuple specifies the default argument values.
| The optional closure tuple supplies the bindings for free variables.
To create a code object, we can use compile:
code = compile('print(5)', 'foo.py', 'exec')
function = types.FunctionType(code, globals())
function() # output: 5
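Putting the two pieces together, here is a minimal sketch of what a dehf-style helper could look like. The name dehf and this particular API are just for illustration; it is CPython-specific, and the source string still spells def internally (we only avoid writing def at the call site):

import types

def dehf(source, name, glbls=None):
    # Hypothetical helper: build a function object from source text.
    module_code = compile(source, '<dehf>', 'exec')
    # The compiled module keeps one code object per def in co_consts;
    # pick the one whose co_name matches the function we want.
    func_code = next(const for const in module_code.co_consts
                     if isinstance(const, types.CodeType) and const.co_name == name)
    return types.FunctionType(func_code, glbls if glbls is not None else globals(), name)

foo = dehf('def foo(bar):\n    bar += 934\n    return bar', 'foo')
print(foo(66))  # 1000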
I recently studied how decorators work in python, and found an example which integrates decorators with nested functions.
The code is here:
def integer_check(method):
    def inner(ref):
        if not isinstance(ref._val1, int) or not isinstance(ref._val2, int):
            raise TypeError('val1 and val2 must be integers')
        else:
            return method(ref)
    return inner

class NumericalOps(object):
    def __init__(self, val1, val2):
        self._val1 = val1
        self._val2 = val2

    @integer_check
    def multiply_together(self):
        return self._val1 * self._val2

    def power(self, exponent):
        return self.multiply_together() ** exponent

y = NumericalOps(1, 2)
print(y.multiply_together())
print(y.power(3))
My question is how the inner function's argument ("ref") accesses the instance attributes (ref._val1 and ref._val2).
It seems like ref equals the instance, but I have no idea how that happens.
Let's first recall how a decorator works:
Decorating the method multiply_together with the decorator @integer_check is equivalent to adding the line multiply_together = integer_check(multiply_together), and by the definition of integer_check, this is equivalent to multiply_together = inner.
Now, when you call the method multiply_together, since this is an instance method, Python implicitly adds the class instance used to invoke the method as its first (and only, in this case) argument. But multiply_together is, actually, inner, so, in fact, inner is invoked with the class instance as an argument. This instance is mapped to the parameter ref, and through this parameter the function gets access to the required instance attributes.
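To make that concrete, here is the decorator line written out by hand (same integer_check as above); the behavior is identical:

class NumericalOps(object):
    def __init__(self, val1, val2):
        self._val1 = val1
        self._val2 = val2

    def multiply_together(self):
        return self._val1 * self._val2
    # what @integer_check does behind the scenes:
    multiply_together = integer_check(multiply_together)

y = NumericalOps(1, 2)
print(y.multiply_together())  # 2: inner(y) validates y._val1/y._val2, then calls the original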
Well, one explanation I found some time ago about the self argument is that this:
y.multiply_together()
is roughly the same as
NumericalOps.multiply_together(y)
So now that you use the decorator, it returns the function inner, which requires the ref argument, and I see that as roughly happening like this (on a lower level):
NumericalOps.inner(y)
because inner "substitutes" multiply_together while also adding the extra functionality.
inner replaces the original function as the value of the class attribute.
@integer_check
def multiply_together(self):
    return self._val1 * self._val2

# def multiply_together(self):
#     ...
#
# multiply_together = integer_check(multiply_together)
first defines a function and binds it to the name multiply_together. That function is then passed as the argument to integer_check, and then the return value of integer_check is bound to the name multiply_together. The original function is now only referenced by the name ref that is local to inner/multiply_together.
The definition of inner implies that integer_check can only be applied to functions whose first argument will have attributes named _val1 and _val2.
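You can verify both points directly, assuming the NumericalOps class from the question:

print(NumericalOps.multiply_together.__name__)  # 'inner': the class attribute is now inner
y = NumericalOps(1, 2)
print(y.multiply_together())   # 2: inner(y) checks the attributes, then calls method(ref)
y2 = NumericalOps(1.5, 2)
y2.multiply_together()         # raises TypeError: val1 and val2 must be integers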
I have a collection of functions with (mostly) shared parameters but different processes. I'd like to use a decorator to add the description for each parameter to a function's headline-level docstring.
I've tried to mimic the structure found in this answer by incorporating a nested function within appender but failed. I've also tried functools.partial but something is slightly off.
My attempt:
def appender(func, *args):
    """Appends additional parameter descriptions to func's __doc__."""
    def _doc(func):
        params = ''.join([defaultdocs[arg] for arg in args])
        func.__doc__ += '\n' + params
        return func
    return _doc
defaultdocs = {
    'a' :
    """
a : int, default 0
    the first parameter
""",
    'b' :
    """
b : int, default 1
    the second parameter
"""
}
@appender('a')
def f(a):
    """Title-level docstring."""
    return a

@appender('a', 'b')
def g(a, b):
    """Title-level docstring."""
    return a + b
This fails, and I believe it fails because the first argument passed to appender is interpreted as func. So when I view the resulting docstring for g, I get:
print(g.__doc__)
Title-level docstring.

b : int, default 1
    the second parameter
because, again, 'a' is interpreted to be 'func' when I want it to be the first element of *args. How can I correct this?
Desired result:
print(g.__doc__)
Title-level docstring.

a : int, default 0
    the first parameter

b : int, default 1
    the second parameter
This happens because the first variable name you pass is actually captured as the func argument.
In order to write callable decorators in Python you need to code the function twice, with an external function accepting the decorator arguments and an internal function accepting the original function. Callable decorators are just higher-order functions that return other decorators. For example:
def appender(*args):  # This is called when a decorator is called,
                      # e.g. @appender('a', 'b')
    """Appends additional parameter descriptions to func's __doc__."""
    def _doc(func):   # This is called when the function is about
                      # to be decorated
        params = ''.join([defaultdocs[arg] for arg in args])
        func.__doc__ += '\n' + params
        return func
    return _doc
The external (appender) function acts as a factory for new decorators, while the _doc function is the actual decorator. Always pass it this way:
Pass decorator args to the external function
Pass the original function to the internal function
Once Python sees this:
@appender('a', 'b')
def foo(): pass
...it will do something like this under the hood:
foo = appender('a', 'b')(foo)
...which expands to this:
decorator = appender('a', 'b')
foo = decorator(foo)
Because of how scopes in Python work, each newly returned _doc function instance will have its own local args value from the external function.
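You can see that per-call capture at work with the corrected appender above: each returned _doc carries its own closure cell holding args:

doc_a = appender('a')
doc_ab = appender('a', 'b')
print(doc_a.__closure__[0].cell_contents)   # ('a',)
print(doc_ab.__closure__[0].cell_contents)  # ('a', 'b')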
An alternate solution that uses inspect.signature to collect the passed function params.
import inspect
import textwrap

def appender(defaultdocs):
    def _doc(func):
        params = inspect.signature(func).parameters
        params = [param.name for param in params.values()]
        params = ''.join([textwrap.dedent(defaultdocs[param])
                          for param in params])
        func.__doc__ += '\n\nParameters\n' + 10 * '=' + params
        return func
    return _doc
Example:
# default docstrings for parameters that are re-used often
# class implementation not a good alternative in my specific case
defaultdocs = {
    'a':
        """
        a : int, default 0
            the first parameter""",
    'b':
        """
        b : int, default 1
            the second parameter"""
}
@appender(defaultdocs)
def f(a):
    """Title-level docstring."""
    return a

@appender(defaultdocs)
def g(a, b):
    """Title-level docstring."""
    return a + b
This appends the descriptions for a and b to g.__doc__ without needing to specify them in the decorator:
help(g)

Help on function g in module __main__:

g(a, b)
    Title-level docstring.

    Parameters
    ==========
    a : int, default 0
        the first parameter
    b : int, default 1
        the second parameter
I have a base decorator that takes arguments but that is also built upon by other decorators. I can't seem to figure out where to put the functools.wraps in order to preserve the full signature of the decorated function.
import inspect
from functools import wraps

# Base decorator
def _process_arguments(func, *indices):
    """ Apply the pre-processing function to each selected parameter """
    @wraps(func)
    def wrap(f):
        @wraps(f)
        def wrapped_f(*args):
            params = inspect.getargspec(f)[0]
            args_out = list()
            for ind, arg in enumerate(args):
                if ind in indices:
                    args_out.append(func(arg))
                else:
                    args_out.append(arg)
            return f(*args_out)
        return wrapped_f
    return wrap
# Function that will be used to process each parameter
def double(x):
    return x * 2

# Decorator called by end user
def double_selected(*args):
    return _process_arguments(double, *args)

# End-user's function
@double_selected(2, 0)
def say_hello(a1, a2, a3):
    """ doc string for say_hello """
    print('{} {} {}'.format(a1, a2, a3))

say_hello('say', 'hello', 'arguments')
The result of this code should be and is:
saysay hello argumentsarguments
However, running help on say_hello gives me:
say_hello(*args, **kwargs)
    doc string for say_hello
Everything is preserved except the parameter names.
It seems like I just need to add another @wraps() somewhere, but where?
I experimented with this:
>>> from functools import wraps
>>> def x(): print(1)
...
>>> #wraps(x)
... def xyz(a,b,c): return x
>>> xyz.__name__
'x'
>>> help(xyz)
Help on function x in module __main__:
x(a, b, c)
AFAIK, this has nothing to do with wraps itself; it's an issue related to help. Because help inspects your objects to provide the information, including __doc__ and other attributes, this is why you get this behavior even though your wrapped function has a different argument list. wraps doesn't update the argument list automatically; what it really updates is this tuple of attributes and the __dict__, which is technically the object's namespace:
WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__',
                       '__annotations__')
WRAPPER_UPDATES = ('__dict__',)
If you aren't sure about how wraps works, it will probably help to read its source code in the standard library: functools.py.
It seems like I just need to add another #wraps() somewhere, but where?
No, you don't need to add another wraps to your code; as I stated above, help works by inspecting your objects. A function's arguments are associated with its code object (__code__), because that is where the arguments are stored/represented, and wraps has no way to update the arguments of the wrapper to be like those of the wrapped function (continuing with the above example):
>>> xyz.__code__.co_varnames
('a', 'b', 'c')
>>> xyz.__code__.co_varnames = x.__code__.co_varnames
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: readonly attribute
If help displayed xyz's argument list as () instead of (a, b, c), that would clearly be wrong! And the same applies to wraps: rewriting the wrapper's argument list to match the wrapped function would be cumbersome, so this should not be a concern at all. You can, however, tell wraps to copy __code__ itself:
>>> @wraps(x, ("__code__",))
... def xyz(a, b, c): pass
...
>>> help(xyz)
Help on function xyz in module __main__:

xyz()
But now calling xyz() actually runs x's code:
>>> xyz()
1
For other references, take a look at this question or the Python documentation:
What does functools.wraps do?
direprobs was correct in that no amount of functools wraps would get me there. bravosierra99 pointed me to somewhat related examples. However, I couldn't find a single example of signature preservation on nested decorators in which the outer decorator takes arguments.
The comments on Bruce Eckel's post on decorators with arguments gave me the biggest hints in achieving my desired result.
The key was in removing the middle function from within my _process_arguments function and placing its parameter in the next, nested function. It kind of makes sense to me now...but it works:
import inspect
from decorator import decorator

# Base decorator
def _process_arguments(func, *indices):
    """ Apply the pre-processing function to each selected parameter """
    @decorator
    def wrapped_f(f, *args):
        params = inspect.getargspec(f)[0]
        args_out = list()
        for ind, arg in enumerate(args):
            if ind in indices:
                args_out.append(func(arg))
            else:
                args_out.append(arg)
        return f(*args_out)
    return wrapped_f
# Function that will be used to process each parameter
def double(x):
    return x * 2

# Decorator called by end user
def double_selected(*args):
    return _process_arguments(double, *args)

# End-user's function
@double_selected(2, 0)
def say_hello(a1, a2, a3):
    """ doc string for say_hello """
    print('{} {} {}'.format(a1, a2, a3))

say_hello('say', 'hello', 'arguments')
print(help(say_hello))
And the result:
saysay hello argumentsarguments
Help on function say_hello in module __main__:

say_hello(a1, a2, a3)
    doc string for say_hello
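For what it's worth, on Python 3.4+ plain functools.wraps can be enough on its own: wraps sets __wrapped__ on the wrapper, and inspect.signature (which help relies on) follows that attribute back to the original function. A minimal sketch of that variant (behavior depends on your Python version):

import inspect
from functools import wraps

def _process_arguments(func, *indices):
    """ Apply the pre-processing function to each selected parameter """
    def wrap(f):
        @wraps(f)  # on Python 3.4+, this also sets wrapped_f.__wrapped__ = f
        def wrapped_f(*args):
            args_out = [func(arg) if ind in indices else arg
                        for ind, arg in enumerate(args)]
            return f(*args_out)
        return wrapped_f
    return wrap

def double(x):
    return x * 2

def double_selected(*indices):
    return _process_arguments(double, *indices)

@double_selected(2, 0)
def say_hello(a1, a2, a3):
    """ doc string for say_hello """
    print('{} {} {}'.format(a1, a2, a3))

print(inspect.signature(say_hello))  # (a1, a2, a3), thanks to __wrapped__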
I would like to do the following:
class A(object): pass
a = A()
a.__int__ = lambda self: 3
i = int(a)
Unfortunately, this throws:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: int() argument must be a string or a number, not 'A'
This only seems to work if I assign the "special" method to the class A instead of an instance of it. Is there any recourse?
One way I thought of was:
def __int__(self):
# No infinite loop
if type(self).__int__.im_func != self.__int__.im_func:
return self.__int__()
raise NotImplementedError()
But that looks rather ugly.
Thanks.
Python always looks up special methods on the class, not the instance (except in the old, aka "legacy", kind of classes -- they're deprecated and have gone away in Python 3, because of the quirky semantics that mostly comes from looking up special methods on the instance, so you really don't want to use them, believe me!-).
To make a special class whose instances can have special methods independent from each other, you need to give each instance its own class -- then you can assign special methods on the instance's (individual) class without affecting other instances, and live happily ever after. If you want to make it look like you're assigning to an attribute of the instance, while actually assigning to an attribute of the individualized per-instance class, you can get that with a special __setattr__ implementation, of course.
Here's the simple case, with explicit "assign to class" syntax:
>>> class Individualist(object):
...     def __init__(self):
...         self.__class__ = type('GottaBeMe', (self.__class__, object), {})
...
>>> a = Individualist()
>>> b = Individualist()
>>> a.__class__.__int__ = lambda self: 23
>>> b.__class__.__int__ = lambda self: 42
>>> int(a)
23
>>> int(b)
42
>>>
and here's the fancy version, where you "make it look like" you're assigning the special method as an instance attribute (while behind the scenes it still goes to the class, of course):
>>> class Sophisticated(Individualist):
...     def __setattr__(self, n, v):
...         if n[:2] == '__' and n[-2:] == '__' and n != '__class__':
...             setattr(self.__class__, n, v)
...         else:
...             object.__setattr__(self, n, v)
...
>>> c = Sophisticated()
>>> d = Sophisticated()
>>> c.__int__ = lambda self: 54
>>> d.__int__ = lambda self: 88
>>> int(c)
54
>>> int(d)
88
The only recourse that works for new-style classes is to have a method on the class that calls the attribute on the instance (if it exists):
class A(object):
    def __int__(self):
        if '__int__' in self.__dict__:
            return self.__int__()
        raise ValueError

a = A()
a.__int__ = lambda: 3
int(a)
Note that a.__int__ will not be a method (only functions that are attributes of the class will become methods) so self is not passed implicitly.
I have nothing to add about the specifics of overriding __int__. But I noticed one thing about your sample that bears discussing.
When you manually assign new methods to an object, "self" is not automatically passed in. I've modified your sample code to make my point clearer:
class A(object): pass
a = A()
a.foo = lambda self: 3
a.foo()
If you run this code, it throws an exception because you passed in 0 arguments to "foo" and 1 is required. If you remove the "self" it works fine.
Python only automatically prepends "self" to the arguments if it had to look up the method in the class of the object and the function it found is a "normal" function. (Examples of "abnormal" functions: class methods, callable objects, bound method objects.) If you stick callables in to the object itself they won't automatically get "self".
If you want self there, use a closure.
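A minimal sketch of that closure idea: capture the instance explicitly, so nothing needs to be passed implicitly:

class A(object): pass

def make_foo(obj):
    # obj is captured in a closure cell and plays the role of self
    return lambda: obj.x * 3

a = A()
a.x = 14
a.foo = make_foo(a)
print(a.foo())  # 42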
In my code I have a class, where one method is responsible for filtering some data. To allow customization for descendants I would like to define filtering function as a class attribute as per below:
def my_filter_func(x):
    return x % 2 == 0

class FilterClass(object):
    filter_func = my_filter_func

    def filter_data(self, data):
        return filter(self.filter_func, data)

class FilterClassDescendant(FilterClass):
    filter_func = my_filter_func2
However, such code leads to TypeError, as filter_func receives "self" as first argument.
What is a pythonic way to handle such use cases? Perhaps, I should define my "filter_func" as a regular class method?
You could just add it as a plain old attribute?
def my_filter_func(x):
    return x % 2 == 0

class FilterClass(object):
    def __init__(self):
        self.filter_func = my_filter_func

    def filter_data(self, data):
        return filter(self.filter_func, data)
Alternatively, force it to be a staticmethod:
def my_filter_func(x):
    return x % 2 == 0

class FilterClass(object):
    filter_func = staticmethod(my_filter_func)

    def filter_data(self, data):
        return filter(self.filter_func, data)
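Either way the usage is the same; a quick check with the filter above:

f = FilterClass()
print(list(f.filter_data([1, 2, 3, 4])))  # [2, 4]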
Python has a lot of magic within. One of those magics has something to do with transforming functions into method objects (when they are assigned to the class, and not to a class's instance).
When you assign a function to a class attribute and later access it through an instance, Python converts it to a method object, i.e. an object which remembers the instance it was retrieved from (in Python 2 the im_self and im_func members hold the instance and the underlying function; in Python 3 they are called __self__ and __func__).
Under normal conditions, you can call the function through the class itself:
def myfunction(a, b):
    return a + b

class A(object):
    a = myfunction

A.a(1, 2)
# returns 3
This will not fail in Python 3, where A.a is just the plain function (in Python 2 it is an unbound method, which additionally type-checks its first argument). However, there's a distinct case when you try to call it from an instance:
A().a(1, 2)
This will fail, since when an instance gets (say, via internal getattr) an attribute which is such a function, it returns a bound method with the instance already stored in it. The function you intended to call is in the __func__ member. When you call this method, you're actually calling __func__ with, additionally, the value in __self__ prepended. So the function needs an additional parameter (the first one, which will stand for self).
To avoid this magic, Python has two possible decorators:
If you want to pass the function as-is, you must use @staticmethod. In this case, the function is not converted to a method object. However, you will not be able to access the calling class, except as a global reference.
If you want the same, but also to be able to access the current class (regardless of whether the function is called from an instance or from the class), then your function should take a different first argument (cls instead of self), which is a reference to the class, and the decorator to use is @classmethod.
Examples:
class A(object):
    a = staticmethod(lambda a, b: a + b)

A.a(1, 2)
A().a(1, 2)
Both will work.
Another example:
def add_print(cls, a, b):
    print(cls.__name__)
    return a + b

class A(object):
    ap = classmethod(add_print)

class B(A):
    pass

A.ap(1, 2)
B.ap(1, 2)
A().ap(1, 2)
B().ap(1, 2)
Check this yourself and enjoy the magic.