I'm trying to code a method in a class that uses a decorator from another class. The problem is that I need information stored in the class that contains the decorator (ClassWithDecorator.decorator_param). To achieve that I'm using partial, injecting self as the first argument, but when I do that the self from the class that uses the decorator "gets lost" somehow and I end up getting an error. Note that this does not happen if I remove partial() from my_decorator(): then "self" is correctly stored inside *args.
See the code sample:
from functools import partial

class ClassWithDecorator:
    def __init__(self):
        self.decorator_param = "PARAM"

    def my_decorator(self, decorated_func):
        def my_callable(ClassWithDecorator_instance, *args, **kwargs):
            # Do something with decorator_param
            print(ClassWithDecorator_instance.decorator_param)
            return decorated_func(*args, **kwargs)

        return partial(my_callable, self)

decorator_instance = ClassWithDecorator()

class WillCallDecorator:
    def __init__(self):
        self.other_param = "WillCallDecorator variable"

    @decorator_instance.my_decorator
    def decorated_method(self):
        pass

WillCallDecorator().decorated_method()
I get
PARAM
Traceback (most recent call last):
  File "****/decorator.py", line 32, in <module>
    WillCallDecorator().decorated_method()
  File "****/decorator.py", line 12, in my_callable
    return decorated_func(*args, **kwargs)
TypeError: decorated_method() missing 1 required positional argument: 'self'
How can I pass the self corresponding to WillCallDecorator() into decorated_method(), but at the same time pass information from its own class to my_callable()?
It seems that you may want to use partialmethod instead of partial:
From the docs:
class functools.partialmethod(func, /, *args, **keywords)
When func is a non-descriptor callable, an appropriate bound method is created dynamically. This behaves like a normal Python function when used as a method: the self argument will be inserted as the first positional argument, even before the args and keywords supplied to the partialmethod constructor.
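A minimal sketch of how that could look with the example from the question (note the argument order: partialmethod inserts the WillCallDecorator instance before the ClassWithDecorator instance that was pre-supplied to its constructor):

from functools import partialmethod

class ClassWithDecorator:
    def __init__(self):
        self.decorator_param = "PARAM"

    def my_decorator(self, decorated_func):
        # the instance of the decorated class arrives first, then the
        # pre-supplied ClassWithDecorator instance
        def my_callable(decorated_self, decorator_self, *args, **kwargs):
            print(decorator_self.decorator_param)
            return decorated_func(decorated_self, *args, **kwargs)

        return partialmethod(my_callable, self)

decorator_instance = ClassWithDecorator()

class WillCallDecorator:
    @decorator_instance.my_decorator
    def decorated_method(self):
        pass

WillCallDecorator().decorated_method()  # prints PARAM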
It's much simpler just to use the self variable you already have. There is absolutely no reason to be using partial or partialmethod here at all:
from functools import partial

class ClassWithDecorator:
    def __init__(self):
        self.decorator_param = "PARAM"

    def my_decorator(self, decorated_func):
        def my_callable(*args, **kwargs):
            # Do something with decorator_param
            print(self.decorator_param)
            return decorated_func(*args, **kwargs)

        return my_callable

decorator_instance = ClassWithDecorator()

class WillCallDecorator:
    def __init__(self):
        self.other_param = "WillCallDecorator variable"

    @decorator_instance.my_decorator
    def decorated_method(self):
        pass

WillCallDecorator().decorated_method()
Also, to answer your question about why your code didn't work, when you access something.decorated_method() the code checks whether decorated_method is a function and if so turns it internally into a call WillCallDecorator.decorated_method(something). But the value returned from partial is a functools.partial object, not a function. So the class lookup binding won't happen here.
In more detail, something.method(arg) is roughly equivalent to SomethingClass.method.__get__(something, SomethingClass)(arg) when something doesn't have an instance attribute named method, its type SomethingClass does have that attribute, and the attribute defines a __get__ method; the full set of steps for attribute lookup is quite a bit more complicated than that.
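A quick way to see the difference is to check for the descriptor protocol directly: plain functions define __get__, while functools.partial objects do not (at least in current CPython versions), so no binding takes place during the class attribute lookup. A small illustrative check, not from the original post:

from functools import partial

def f(a, b):
    return a + b

p = partial(f, 1)

print(hasattr(f, '__get__'))  # True  - functions are descriptors, so they bind
print(hasattr(p, '__get__'))  # False - partial objects are returned unchanged,
                              #         so no self is ever inserted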
I am trying to patch the __new__ method of a class, and it is not working as I expect.
from contextlib import contextmanager

class A:
    def __init__(self, arg):
        print('A init', arg)

@contextmanager
def patch_a():
    new = A.__new__

    def fake_new(cls, *args, **kwargs):
        print('call fake_new')
        return new(cls, *args, **kwargs)

    # here I get error: TypeError: object.__new__() takes exactly one argument (the type to instantiate)
    A.__new__ = fake_new
    try:
        yield
    finally:
        A.__new__ = new

if __name__ == '__main__':
    A('foo')
    with patch_a():
        A('bar')
    A('baz')
I expect the following output:
A init foo
call fake_new
A init bar
A init baz
But after call fake_new I get an error (see comment in the code).
To me it seems like I just decorate the __new__ method and propagate all args unchanged.
It doesn't work, and the reason is obscure to me.
Also, I can write return new(cls) instead, and then the call A('bar') works fine. But then A('baz') breaks.
Can someone explain what is going on?
Python version is 3.8
You've run into a complicated part of Python object instantiation - in which the language opted for a design that would allow one to create a custom __init__ method with parameters, without having to touch __new__.
However, at the base of the class hierarchy, object, both __new__ and __init__ take a single parameter each.
IIRC, it goes this way: if your class has a custom __init__, you did not touch __new__, and there are extra parameters in the class instantiation call that would be passed to both __init__ and __new__, those parameters are stripped from the call to __new__, so you don't have to customize it just to swallow the parameters you consume in __init__. The converse is also true: if your class has a custom __new__ with extra parameters and no custom __init__, these are not passed to object.__init__.
With your design, Python sees a custom __new__ and passes it the same extra arguments that are passed to __init__ - and by using *args, **kw you forward those to object.__new__, which accepts a single parameter - so you get the error you presented us.
The fix is to not pass those extra parameters to the original __new__ method - unless they are needed there - so you have to make the same check Python's type does when initiating an object.
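A small illustration of those rules (the class names here are mine, chosen just for this demo):

class OnlyInit:
    def __init__(self, arg):            # custom __init__, untouched __new__
        pass

class OnlyNew:
    def __new__(cls, arg):              # custom __new__, untouched __init__
        return super().__new__(cls)

OnlyInit("x")    # fine: object.__new__ ignores the extra argument because only __init__ is customized
OnlyNew("x")     # fine: object.__init__ ignores the extra argument because only __new__ is customized

class Both:
    def __init__(self, arg):
        pass

    def __new__(cls, *args, **kwargs):
        # both are customized, so forwarding the extra argument reaches object.__new__
        return super().__new__(cls, *args, **kwargs)

Both("x")        # TypeError: object.__new__() takes exactly one argument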
And an interesting surprise to top it off: while making the example work, I found out that even if A.__new__ is deleted when restoring the patch, it is still considered "touched" by CPython's type instantiation, and the arguments are passed through. In order to get your code working I needed to leave a permanent stub A.__new__ that forwards only the cls argument:
from contextlib import contextmanager

class A:
    def __init__(self, arg):
        print('A init', arg)

@contextmanager
def patch_a():
    new = A.__new__

    def fake_new(cls, *args, **kwargs):
        print('call fake_new')
        if new is object.__new__:
            return new(cls)
        return new(cls, *args, **kwargs)

    A.__new__ = fake_new
    try:
        yield
    finally:
        del A.__new__
        if new is not object.__new__:
            A.__new__ = new
        else:
            A.__new__ = lambda cls, *args, **kw: object.__new__(cls)
        print(A.__new__)

if __name__ == '__main__':
    A('foo')
    with patch_a():
        A('bar')
    A('baz')
(I tried inspecting the original __new__ signature instead of the new is object.__new__ comparison - to no avail: object.__new__ signature is *args, **kwargs - possibly made so that it will never fail on static checking)
I have a decorator to control a time limit; if the function execution exceeds the limit, an error is raised.
def timeout(seconds=10):
    def decorator(func):
        # a timeout decorator
        ...
    return decorator
And I want to build a class, using the constructor to pass the time limit into the class.
class myClass:
    def __init__(self, time_limit):
        self.time_limit = time_limit

    @timeout(self.time_limit)
    def do_something(self):
        # do something
        ...
But this does not work.
File "XX.py", line YY, in myClass
#timeout(self.tlimit)
NameError: name 'self' is not defined
What's the correct way to implement this?
self.time_limit is only available when a method in an instance of your class is called.
The decorator statement, prefixing the methods, on the other hand is run when the class body is parsed.
However, the inner part of your decorator, if it will always be applied to methods, will get self as its first parameter - and there you can simply make use of any instance attribute:
import time

def timeout(**decorator_parms):
    def decorator(func):
        def wrapper(self, *args, **kwargs):
            time_limit = self.time_limit
            now = time.time()
            result = func(self, *args, **kwargs)
            # code to check the timeout
            ...
            return result
        return wrapper
    return decorator
If your decorator is expected to work with time limits other than self.time_limit, you could pass a string or another sentinel object and check for it inside the innermost wrapper with a simple if statement: when the timeout argument is that sentinel, use the instance attribute; otherwise use the value that was passed in.
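As a hedged sketch of that idea (the USE_INSTANCE_LIMIT sentinel and the after-the-fact elapsed-time check are placeholders of mine, not part of the original code):

import time

USE_INSTANCE_LIMIT = object()   # sentinel meaning "use self.time_limit"

def timeout(seconds=USE_INSTANCE_LIMIT):
    def decorator(func):
        def wrapper(self, *args, **kwargs):
            # fall back to the instance attribute when the sentinel is used
            limit = self.time_limit if seconds is USE_INSTANCE_LIMIT else seconds
            start = time.time()
            result = func(self, *args, **kwargs)
            if time.time() - start > limit:
                raise TimeoutError('{} exceeded {} seconds'.format(func.__name__, limit))
            return result
        return wrapper
    return decorator

class MyClass:
    def __init__(self, time_limit):
        self.time_limit = time_limit

    @timeout()              # uses self.time_limit, read at call time
    def do_something(self):
        time.sleep(0.01)

    @timeout(seconds=5)     # uses an explicit limit instead
    def do_other(self):
        pass

MyClass(time_limit=2).do_something()   # passes: well under the instance's limit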
You can also decorate a method in the constructor:
class myClass:
    def __init__(self, time_limit):
        self.do_something = timeout(time_limit)(self.do_something)

    def do_something(self):
        # do something
        ...
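A runnable sketch of that constructor-based approach, assuming a plain timeout decorator and an elapsed-time check of my own; note that self.do_something is already a bound method when it is wrapped, so the wrapper does not take self here:

import time

def timeout(seconds):
    def decorator(func):
        def wrapper(*args, **kwargs):
            start = time.time()
            result = func(*args, **kwargs)
            if time.time() - start > seconds:
                raise TimeoutError('{} exceeded {} seconds'.format(func.__name__, seconds))
            return result
        return wrapper
    return decorator

class MyClass:
    def __init__(self, time_limit):
        # rebind the already-bound method on the instance, wrapped with the limit
        self.do_something = timeout(time_limit)(self.do_something)

    def do_something(self):
        time.sleep(0.01)

MyClass(time_limit=2).do_something()   # runs, well within the limit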
Consider this small example:
import datetime as dt

class Timed(object):
    def __init__(self, f):
        self.func = f

    def __call__(self, *args, **kwargs):
        start = dt.datetime.now()
        ret = self.func(*args, **kwargs)
        time = dt.datetime.now() - start
        ret["time"] = time
        return ret

class Test(object):
    def __init__(self):
        super(Test, self).__init__()

    @Timed
    def decorated(self, *args, **kwargs):
        print(self)
        print(args)
        print(kwargs)
        return dict()

    def call_deco(self):
        self.decorated("Hello", world="World")

if __name__ == "__main__":
    t = Test()
    ret = t.call_deco()
which prints
Hello
()
{'world': 'World'}
Why is the self parameter (which should be the Test obj instance) not passed as first argument to the decorated function decorated?
If I do it manually, like :
def call_deco(self):
    self.decorated(self, "Hello", world="World")
it works as expected. But if I must know in advance whether a function is decorated or not, it defeats the whole purpose of decorators. What is the pattern to use here, or did I misunderstand something?
tl;dr
You can fix this problem by making the Timed class a descriptor and returning a partially applied function from __get__ which applies the Test object as one of the arguments, like this
import datetime as dt

class Timed(object):
    def __init__(self, f):
        self.func = f

    def __call__(self, *args, **kwargs):
        print(self)
        start = dt.datetime.now()
        ret = self.func(*args, **kwargs)
        time = dt.datetime.now() - start
        ret["time"] = time
        return ret

    def __get__(self, instance, owner):
        from functools import partial
        return partial(self.__call__, instance)
The actual problem
Quoting Python documentation for decorator,
The decorator syntax is merely syntactic sugar, the following two function definitions are semantically equivalent:
def f(...):
    ...
f = staticmethod(f)

@staticmethod
def f(...):
    ...
So, when you say,
@Timed
def decorated(self, *args, **kwargs):
it is actually
decorated = Timed(decorated)
only the function object is passed to Timed; the object to which it is actually bound is not passed along with it. So, when you invoke it like this
ret = self.func(*args, **kwargs)
self.func will refer to the unbound function object and it is invoked with Hello as the first argument. That is why self prints as Hello.
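You can confirm this directly, assuming the Test and Timed definitions from the question (before __get__ is added):

t = Test()
print(type(Test.__dict__['decorated']))           # <class '__main__.Timed'>
print(t.decorated is Test.__dict__['decorated'])  # True: the lookup returns the
                                                  # Timed instance as-is, unbound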
How can I fix this?
Since you have no reference to the Test instance inside Timed, the only way to do this is to turn Timed into a descriptor class. Quoting the documentation, Invoking descriptors section,
In general, a descriptor is an object attribute with “binding behavior”, one whose attribute access has been overridden by methods in the descriptor protocol: __get__(), __set__(), and __delete__(). If any of those methods are defined for an object, it is said to be a descriptor.
The default behavior for attribute access is to get, set, or delete the attribute from an object’s dictionary. For instance, a.x has a lookup chain starting with a.__dict__['x'], then type(a).__dict__['x'], and continuing through the base classes of type(a) excluding metaclasses.
However, if the looked-up value is an object defining one of the descriptor methods, then Python may override the default behavior and invoke the descriptor method instead.
We can make Timed a descriptor, by simply defining a method like this
def __get__(self, instance, owner):
    ...
Here, self refers to the Timed object itself, instance refers to the actual object on which the attribute lookup is happening and owner refers to the class corresponding to the instance.
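To see the descriptor protocol in isolation, here is a tiny self-contained example (the Verbose and Holder names are purely illustrative):

class Verbose:
    def __get__(self, instance, owner):
        print('__get__ called with instance={!r}, owner={!r}'.format(instance, owner))
        return 42

class Holder:
    attr = Verbose()

h = Holder()
print(h.attr)       # __get__ runs during the attribute lookup, then 42 is printed
print(Holder.attr)  # instance is None when the attribute is accessed on the class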
Now, when decorated is looked up on a Test instance, the __get__ method of Timed will be invoked. We need to somehow pass the Test instance as the first argument (even before Hello). So, we return a partially applied function whose first argument will be the Test instance, like this
def __get__(self, instance, owner):
    from functools import partial
    return partial(self.__call__, instance)
Now, self.__call__ is a bound method (bound to Timed instance) and the second parameter to partial is the first argument to the self.__call__ call.
So, all of this effectively translates like this
t.call_deco()
self.decorated("Hello", world="World")
Now self.decorated is actually a Timed(decorated) object (referred to as TimedObject from now on). Whenever we access it, the __get__ method defined in it will be invoked and it returns a partial function. You can confirm that like this
def call_deco(self):
    print(self.decorated)
    self.decorated("Hello", world="World")
would print
<functools.partial object at 0x7fecbc59ad60>
...
So,
self.decorated("Hello", world="World")
gets translated to
Timed.__get__(TimedObject, <Test obj>, Test)("Hello", world="World")
Since we return a partial function,
partial(TimedObject.__call__, <Test obj>)("Hello", world="World")
which is actually
TimedObject.__call__(<Test obj>, 'Hello', world="World")
So, <Test obj> also becomes a part of *args, and when self.func is invoked, the first argument will be the <Test obj>.
You first have to understand how functions become methods and how self is "automagically" injected.
Once you know that, the "problem" is obvious: you are decorating the decorated function with a Timed instance - IOW, Test.decorated is a Timed instance, not a function instance - and your Timed class does not mimic the function type's implementation of the descriptor protocol. What you want looks like this:
import datetime as dt
import types

class Timed(object):
    def __init__(self, f):
        self.func = f

    def __call__(self, *args, **kwargs):
        start = dt.datetime.now()
        ret = self.func(*args, **kwargs)
        time = dt.datetime.now() - start
        ret["time"] = time
        return ret

    def __get__(self, instance, cls):
        # In Python 3, MethodType takes (callable, instance); binding the Timed
        # instance to `instance` makes it behave like a bound method.
        return types.MethodType(self, instance)
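With that __get__ in place, the original example behaves as intended. A quick check (assuming the Test class from the question):

t = Test()
print(t.decorated("Hello", world="World"))
# prints the Test instance, ('Hello',) and {'world': 'World'} from inside
# decorated, then a dict containing the measured "time"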
What I am trying to do is write a wrapper around another module so that I can transform the parameters that are being passed to the methods of the other module. That was fairly confusing, so here is an example:
import somemodule

class Wrapper:
    def __init__(self):
        self.transforms = {}
        self.transforms["t"] = "test"

    # This next function is the one I want to exist
    # Please understand the lines below will not compile and are not real code
    def __intercept__(self, item, *args, **kwargs):
        if "t" in args:
            args[args.index("t")] = self.transforms["t"]
        return somemodule.item(*args, **kwargs)
The goal is to allow users of the wrapper class to make simplified calls to the underlying module without having to rewrite all of the functions in the module. So in this case if somemodule had a function called print_uppercase then the user could do
w = Wrapper()
w.print_uppercase("t")
and get the output
TEST
I believe the answer lies in __getattr__ but I'm not totally sure how to use it for this application.
__getattr__ combined with defining a function on the fly should work:
# somemodule
def print_uppercase(x):
    print(x.upper())
Now:
from functools import wraps
import somemodule

class Wrapper:
    def __init__(self):
        self.transforms = {}
        self.transforms["t"] = "test"

    def __getattr__(self, attr):
        func = getattr(somemodule, attr)

        @wraps(func)
        def _wrapped(*args, **kwargs):
            if "t" in args:
                args = list(args)
                args[args.index("t")] = self.transforms["t"]
            return func(*args, **kwargs)
        return _wrapped

w = Wrapper()
w.print_uppercase('Hello')
w.print_uppercase('t')
Output:
HELLO
TEST
I would approach this by calling an intercept method and passing the name of the desired method to execute as a parameter. Then, inside intercept, you can look up the method with that name and execute it.
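That answer describes the idea only in words; below is a rough, hypothetical sketch of what such an intercept method could look like (reusing somemodule and print_uppercase from the question; the intercept name and the string-only transformation are my own choices):

import somemodule

class Wrapper:
    def __init__(self):
        self.transforms = {"t": "test"}

    def intercept(self, method_name, *args, **kwargs):
        # look up the target function on the wrapped module by name
        func = getattr(somemodule, method_name)
        # apply the transformations to matching string arguments
        args = [self.transforms.get(a, a) if isinstance(a, str) else a
                for a in args]
        return func(*args, **kwargs)

w = Wrapper()
w.intercept("print_uppercase", "t")   # prints TEST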
Since your Wrapper object doesn't have any mutable state, it'd be easier to implement without a class. Example wrapper.py:
import somemodule

def func1(*args, **kwargs):
    # do your transformations
    return somemodule.func1(*args, **kwargs)
Then call it like:
import wrapper as w

print(w.func1('somearg'))
I'm writing a Python class to wrap/decorate/enhance another class from a package called petl, a framework for ETL (data movement) workflows. Due to design constraints I can't just subclass it; every method call has to be sent through my own class so I can control what kind of objects are being passed back. So in principle this is a proxy class, but I'm having some trouble using existing answers/recipes out there. This is what my code looks like:
from functools import partial

class PetlTable(object):
    """not really how we construct petl tables, but for illustrative purposes"""
    def hello(name):
        print('Hello, {}!'.format(name))

class DatumTable(object):
    def __init__(self, petl_tbl):
        self.petl_tbl = petl_tbl

    def __getattr__(self, name):
        """this returns a partial referencing the child method"""
        petl_attr = getattr(self.petl_tbl, name, None)
        if petl_attr and callable(petl_attr):
            return partial(self.call_petl_method, func=petl_attr)
        raise NotImplementedError('Not implemented')

    def call_petl_method(self, func, *args, **kwargs):
        func(*args, **kwargs)
Then I try to instantiate a table and call something:
# create a petl table
pt = PetlTable()
# wrap it with our own class
dt = DatumTable(pt)
# try to run the petl method
dt.hello('world')
This gives a TypeError: call_petl_method() got multiple values for argument 'func'.
This only happens with positional arguments; kwargs seem to be fine. I'm pretty sure it has to do with self not being passed in, but I'm not sure what the solution is. Can anyone think of what I'm doing wrong, or a better solution altogether?
This seems to be a common issue with mixing positional and keyword args:
TypeError: got multiple values for argument
To get around it, I took the positional arg func out of call_petl_method and put it in a kwarg that's unlikely to overlap with the kwargs of the child function. A little hacky, but it works.
I ended up writing a Proxy class to do all this generically:
from functools import partial

class Proxy(object):
    def __init__(self, child):
        self.child = child

    def __getattr__(self, name):
        child_attr = getattr(self.child, name)
        return partial(self.call_child_method, __child_fn__=child_attr)

    @classmethod
    def call_child_method(cls, *args, **kwargs):
        """
        This calls a method on the child object and wraps the response as an
        object of its own class.

        Takes a kwarg `__child_fn__` which points to a method on the child
        object.

        Note: this can't take any positional args or they get clobbered by the
        keyword args we're trying to pass to the child. See:
        https://stackoverflow.com/questions/21764770/typeerror-got-multiple-values-for-argument
        """
        # get child method
        fn = kwargs.pop('__child_fn__')

        # call the child method
        r = fn(*args, **kwargs)

        # wrap the response as an object of the same class
        r_wrapped = cls(r)
        return r_wrapped
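Hypothetical usage of the Proxy class, with a corrected stand-in for the illustrative PetlTable (the hello in the question is missing its self parameter, so a fixed version is used here):

class PetlTable(object):
    def hello(self, name):
        print('Hello, {}!'.format(name))

dt = Proxy(PetlTable())
result = dt.hello('world')   # prints: Hello, world!
print(type(result))          # <class '__main__.Proxy'> - the response is re-wrapped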
This will also solve the problem. It doesn't use partial at all.
class PetlTable(object):
    """not really how we construct petl tables, but for illustrative purposes"""
    def hello(name):
        print('Hello, {}!'.format(name))

class DatumTable(object):
    def __init__(self, petl_tbl):
        self.petl_tbl = petl_tbl

    def __getattr__(self, name):
        """Looks up the named attribute in the class of the petl_tbl object."""
        petl_attr = self.petl_tbl.__class__.__dict__.get(name, None)
        if petl_attr and callable(petl_attr):
            return petl_attr
        raise NotImplementedError('Not implemented')

if __name__ == '__main__':
    # create a petl table
    pt = PetlTable()

    # wrap it with our own class
    dt = DatumTable(pt)

    # try to run the petl method
    dt.hello('world')  # -> Hello, world!