I need to augment the behavior of a class using external methods, so I am leveraging the strategy pattern.
First I define an interface for the method signature:
from abc import ABC, abstractmethod

class ILabel(ABC):
    @abstractmethod
    def get_label(self, obj):
        pass
and an implementation of that interface:
class Label(ILabel):
    def __init__(self, prefix):
        self.prefix = prefix

    def get_label(self, merchandise, obj):
        return self.prefix + str(obj) + merchandise.name
And the class whose algorithm I would like to augment:
import types

class Merchandise:
    def __init__(self):
        self.name = "__a_name"

    def get_label(self, obj):
        return str(obj) + self.name

    def display(self, obj, get_label=None):
        if get_label:
            self.get_label = types.MethodType(get_label, self)
        print(self.get_label(obj))
And finally the caller:
# default behavior
Merchandise().display("an_obj")
# augmented behavior
label = Label("a_prefix__")
Merchandise().display("an_obj", label.get_label)

the output should be:

an_obj__a_name
a_prefix__an_obj__a_name
Two questions:
Given that instead of an "orphan" method (for lack of a better word) I am passing a method defined within a class, with a reference to self: is this still considered the strategy pattern, or is a different pattern closer to this design?
Since I pass a reference to the Merchandise instance when registering the method (i.e., types.MethodType(get_label, self)), the get_label method in the Label class receives a reference to a Merchandise instance, i.e.:
def get_label(self, merchandise, obj):
The question is: is there a better naming convention for the merchandise parameter?
Update
In an endeavor to provide a minimal working example, a fair amount of context was stripped, which may lead to thinking the get_label method can be stateless (i.e., without a reference to a Merchandise instance). Label.get_label has been updated to clarify this point.
Given that instead of an "orphan" method (for lack of a better word) I am passing a method defined within a class, with a reference to self: is this still considered the strategy pattern, or is a different pattern closer to this design?
I would still call this Strategy, but keep reading.
Since I pass a reference to the Merchandise instance when registering the method (i.e., types.MethodType(get_label, self)), the correct definition of get_label in the Label class is:
This confusion is why you should take a different approach.
Your logic for implementing default behaviour of the strategy is backwards. There is no reason why "a Strategy for getting a label for obj" should need to be a method of the Merchandise class, except that you happen to have the default implementation stored there. Even that doesn't need to be an ordinary method, since it doesn't do anything with self.
This means the code is too complex (because you're needlessly using the types.MethodType machinery and dynamically patching the class) and also has unexpected stateful behaviour: when you call display with a non-None value for get_label, that Strategy will affect future calls to display where None is passed.
If you don't want stateful behaviour, then you want the default-setting logic the other way around - set a local rather than modifying the class:
class Merchandise:
    @staticmethod
    def get_label(obj):
        return str(obj)

    def display(self, obj, get_label=None):
        if get_label is None:
            get_label = Merchandise.get_label
        print(get_label(obj))
Although we don't actually need the "replace None with a default value" pattern here, since we aren't going to mutate the parameter:
class Merchandise:
    @staticmethod
    def get_label(obj):
        return str(obj)

    # `Merchandise` is not bound yet while its own class body is executing,
    # so grab the function underneath the staticmethod defined just above.
    def display(self, obj, get_label=get_label.__func__):
        print(get_label(obj))
And in this toy example it could be even simpler:
class Merchandise:
    def display(self, obj, get_label=str):
        print(get_label(obj))
    # although *this* doesn't rely on `self`, either....
If you do want stateful behaviour, then you should set the state either at initialization, or explicitly later, or both:
class Merchandise:
    def __init__(self, get_label=str):
        self.get_label = get_label

    @property
    def get_label(self):
        return self._get_label

    @get_label.setter
    def get_label(self, value):
        # may as well do a little verification
        if not callable(value):
            raise TypeError("get_label strategy must be callable")
        self._get_label = value

    def display(self, obj):
        print(self.get_label(obj))
Notice that self.get_label(obj) is not an ordinary method call: the attribute lookup finds the get_label property, which returns the stored callable; having found a callable object, Python then calls it.
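For illustration, a small usage sketch of this stateful version (the lambda is just an example strategy):

m = Merchandise()    # defaults to the built-in str strategy
m.display("an_obj")  # prints: an_obj

m.get_label = lambda obj: "a_prefix__" + str(obj)  # swap strategies explicitly
m.display("an_obj")  # prints: a_prefix__an_obj (and persists across calls)

# m.get_label = 3    # would raise TypeError via the setter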
Related
Task:
Implement a class that accepts at least one argument and can be initialized either with original data or with an existing instance of itself.
Minimal example of usage:
arg = {} # whatever necessary for the real object
instance1 = NewClass(arg)
instance2 = NewClass(instance1)
assert instance2 is instance1 # or at least, ==
More complex example of usage:
from typing import Mapping, Union

class NewClass:
    """
    Incomplete.
    Should somehow act as described in the task.
    """
    def __init__(self, data: Mapping):
        self.data = data

    def cool_method(self):
        assert isinstance(self.data, Mapping)
        # do smth with self.data
        return ...
    ...
class AnotherClass:
    """
    Accepts both mappings and NewClass instances,
    but needs NewClass internally
    """
    def __init__(self, obj: Union[Mapping, NewClass]):
        self.cool = NewClass(obj).cool_method()
    ...
One just has to make use of the __new__ method on the class, instead of __init__, to be able to change what is instantiated.
In this case, all you need is to write your NewClass like this:
from typing import Union, Mapping, Self

class NewClass:
    """
    Acts as described in the task.
    """
    # typing.Self is available from Python 3.11.
    # For previous versions, just put the class name quoted
    # in a string: `"NewClass"` instead of `Self`.
    def __new__(cls, data: Union[Mapping, Self]):
        if isinstance(data, NewClass):
            return data
        self = super().__new__(cls)
        self.data = data
        return self

    def cool_method(self):
        assert isinstance(self.data, Mapping)
        # do smth with self.data
        return ...
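A quick usage sketch, mirroring the task's minimal example:

arg = {}
instance1 = NewClass(arg)        # builds a fresh instance and stores arg
instance2 = NewClass(instance1)  # __new__ short-circuits and returns it
assert instance2 is instance1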
Avoiding a metaclass is interesting because it avoids metaclass conflicts in larger projects, and it is an abstraction level most projects simply do not need. Besides, static type checkers such as Mypy cannot even follow behavior changes coded into metaclasses.
On the other hand, __new__ is an ordinary special method, a sibling to __init__, readily available; it is just used less often because Python also provides __init__, which suffices when the default behavior of __new__, always creating a new instance, is the desired one.
For some reason I do not know, using a metaclass to create a "singleton" has become wildly popular in tutorials and answers. It is a design pattern far less important and less used in Python than in languages which do not allow "stand-alone" functions. Metaclasses are not needed for singletons either, by the way: one can just create a top-level instance of whatever class should have a single instance, and use that instance from that point on, instead of creating new instances. Other languages restrict the existence of top-level, importable instances, a need that was artificially imported into Python.
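For illustration, a sketch of such a top-level instance (config and Config are hypothetical names):

# config.py
class Config:
    def __init__(self):
        self.debug = False

config = Config()  # the single, importable instance

# elsewhere: `from config import config` - every importer shares it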
Metaclass solution:
Works on Python 3.8+ (the positional-only parameter syntax below requires it):
class SelfWrapperMeta(type):
    """
    Metaclass allowing a user class to return a previously created
    instance when that instance is passed as the first positional
    argument. Other arguments are simply ignored in that self-wrapping
    case; otherwise, the user class __init__ is called normally.
    """
    def __call__(cls, arg, /, *args, **kwargs):
        if isinstance(arg, cls):
            return arg
        return super().__call__(arg, *args, **kwargs)
Example of usage:
class A(metaclass=SelfWrapperMeta):
    def __init__(self, data):
        self.data = data

example = {}
a = A(example)
b = A(a)
c = A(example)

assert a is b
assert c is not a
I have subclassed the built-in property class (call it SpecialProperty) in order to add more fields to it:
class SpecialProperty(property):
    extra_field_1 = None
    extra_field_2 = None

    def __init__(self, fget=None, fset=None, fdel=None, doc=None):
        super().__init__(fget, fset, fdel, doc)

def make_special_property(func):
    prop = SpecialProperty(fget=func)
    return prop
and I am able to use it in the same fashion as the built-in property() decorator:
@my_module.make_special_property
def get_my_property(self):
    return self._my_property
I now want to further specialise my SpecialProperty instances by populating one of the extra fields I have added to the class with an arbitrary value.
Is it possible, in Python, to write a decorator that returns a property while also accepting extra parameters?
I'd like to do it via the decorator because this is where and when the information is most relevant; however, I'm finding myself stuck. I suspect this falls under the domain of decorators with arguments, which have been well documented (Decorators with arguments? (Stack Overflow), or Python Decorators II: Decorator Arguments (artima.com), to cite only a couple of sources), but I find myself unable to apply the same pattern to my case.
Here's how I'm trying to write it:
@my_module.make_special_property("example string")
def get_my_property(self):
    return self._my_property
And on the class declaring get_my_property:
>>> DeclaringClass.my_property
<SpecialProperty object at 0x...>
>>> DeclaringClass.my_property.extra_field_1
'example string'
Since I am making properties, the decorated class member gets swapped with an instance of SpecialProperty, and hence should not be a callable anymore -- thus, I am unable to apply the "nested wrapper" pattern that allows a decorator to take arguments.
Non-working example:
def make_special_property(custom_arg_1):
    def wrapper(func):
        prop = SpecialProperty(fget=func)
        prop.extra_field_1 = custom_arg_1
        return prop
    return wrapper # this returns a callable (function)
I shouldn't have a callable be returned here, if I want a property I should have a SpecialProperty instance be returned, but I can't call return wrapper(func) for obvious reasons.
Your decorator doesn't return a callable. Your decorator factory returns a decorator, which returns a property. You might understand better if you rename the functions:
def make_decorator(custom_arg_1):
    def decorator(func):
        prop = SpecialProperty(fget=func)
        prop.extra_field_1 = custom_arg_1
        return prop
    return decorator
When you decorate with make_decorator("..."), the factory is called with that argument, and decorator is returned and then called on the decorated function.
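To make the flow concrete, here is a usage sketch with the SpecialProperty and field value from your question (DeclaringClass is hypothetical):

class DeclaringClass:
    def __init__(self):
        self._my_property = 42

    @make_decorator("example string")
    def my_property(self):
        return self._my_property

print(DeclaringClass.my_property.extra_field_1)  # example string
print(DeclaringClass().my_property)              # 42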
I'm trying to understand how to change an object's attribute temporarily when it is called and have the original value persist when the object is not called.
Let me describe the problem with some code:
class DateCalc:
    DEFAULT = "1/1/2001"

    def __init__(self, day=DEFAULT):
        self.day = day

    def __call__(self, day=DEFAULT):
        self.day = day
        return self

    def getday(self):
        return self.day
When a user calls the object with another value, e.g. 2/2/2002, self.day is set to 2/2/2002. However, I want to be able to revert self.day to the original value of 1/1/2001 after the call:
>>> d_obj = DateCalc()
>>> d_obj.getday() == "1/1/2001"
True
>>> d_obj().getday() == "1/1/2001"
True
>>> another_day_str = "2/2/2002"
>>> d_obj(another_day_str).getday()
'2/2/2002'
But when I run the following, it returns:
>>> d_obj.getday()
'2/2/2002'
I was wondering what's the right way to revert the value, without needing to include code at every method call. Secondly, this should also be true when the object is called. For example:
d_obj().getday()
should return
"1/1/2001"
I thought a decorator on the call magic method would work here, but I'm not really sure where to start.
Any help would be much appreciated
Since you probably don't really want to modify the attributes of your object for a poorly defined interval, you need to return or otherwise create a different object.
The simplest case would be one in which you had two separate objects, and no __call__ method at all:
d1_obj = DateCalc()
d2_obj = DateCalc('2/2/2002')
print(d1_obj.getday()) # 1/1/2001
print(d2_obj.getday()) # 2/2/2002
If you know where you want to use d_obj vs d_obj() in the original case, you clearly know where to use d1_obj vs d2_obj in this version as well.
This may not be adequate for cases where DateCalc actually represents a very complex object that has many attributes that you do not want to change. In that case, you can have the __call__ method return a separate object that intelligently copies the portions of the original that you want.
For a simple case, this could be just
def __call__(self, day=DEFAULT):
    return type(self)(day)
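A usage sketch of this variant; the original object keeps its own state:

d_obj = DateCalc()
d2_obj = d_obj('2/2/2002')  # a brand-new, independent DateCalc
print(d2_obj.getday())      # 2/2/2002
print(d_obj.getday())       # 1/1/2001 - the original is untouched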
If the object becomes complex enough, you will want to create a proxy. A proxy is an object that forwards most of the implementation details to another object. super() is an example of a proxy that has a very highly customized __getattribute__ implementation, among other things.
In your particular case, you have a couple of requirements:
The proxy must store all overridden attributes.
The proxy must get all non-overridden attributes from the original object.
The proxy must pass itself as the self parameter to any (at least non-special) methods that are invoked.
You can get as complicated with this as you want (in which case look up how to properly implement proxy objects like here). Here is a fairly simple example:
# Assume that there are many fields like `day` that you want to modify
class DateCalc:
    DEFAULT = "1/1/2001"

    def __init__(self, day=DEFAULT):
        self.day = day

    def getday(self):
        return self.day

    def __call__(self, **kwargs):
        class Proxy:
            def __init__(self, original, **kwargs):
                self._self_ = original
                self.__dict__.update(kwargs)

            def __getattribute__(self, name):
                # Don't forward any overridden, dunder or quasi-private attributes
                if name.startswith('_') or name in self.__dict__:
                    return object.__getattribute__(self, name)
                # This part is simplified:
                # it does not take into account __slots__
                # or attributes shadowing methods
                t = type(self._self_)
                if name in t.__dict__:
                    try:
                        return t.__dict__[name].__get__(self, t)
                    except AttributeError:
                        pass
                return getattr(self._self_, name)
        return Proxy(self, **kwargs)
The proxy would work exactly as you would want: it forwards any values that you did not override in __call__ from the original object. The interesting thing is that it binds instance methods to the proxy object instead of the original, so that getday gets called with a self that has the overridden value in it:
d_obj = DateCalc()
print(type(d_obj)) # __main__.DateCalc
print(d_obj.getday()) # 1/1/2001
d2_obj = d_obj(day='2/2/2002')
print(type(d2_obj)) # __main__.DateCalc.__call__.<locals>.Proxy
print(d2_obj.getday()) # 2/2/2002
Keep in mind that the proxy object shown here has very limited functionality implemented, and will not work properly in many situations. That being said, it likely covers many of the use cases that you will have out of the box. A good example is if you chose to make day a property instead of having a getter (it is the more Pythonic approach):
class DateCalc:
    DEFAULT = "1/1/2001"

    def __init__(self, day=DEFAULT):
        self.__dict__['day'] = day

    @property
    def day(self):
        return self.__dict__['day']

    # __call__ same as above
...
d_obj = DateCalc()
print(d_obj(day='2/2/2002').day) # 2/2/2002
The catch here is that the proxy's version of day is just a regular writable attribute instead of a read-only property. If this is a problem for you, implementing __setattr__ appropriately on the proxy will be left as an exercise for the reader.
It seems that you want a behavior like a context manager: to modify an attribute for a limited time, use the updated attribute and then revert to the original. You can do this by having __call__ return a context manager, which you can then use in a with block like this:
d_obj = DateCalc()
print(d_obj.getday()) # 1/1/2001

with d_obj('2/2/2002'):
    print(d_obj.getday()) # 2/2/2002

print(d_obj.getday()) # 1/1/2001
There are a couple of ways of creating such a context manager. The simplest would be to use a nested method in __call__ and decorate it with contextlib.contextmanager:
from contextlib import contextmanager
...
def __call__(self, day=DEFAULT):
    @contextmanager
    def context():
        orig = self.day
        self.day = day
        try:
            yield
        finally:
            # revert even if the with-block raises
            self.day = orig
    return context()
You could also use a fully-fledged nested class for this, but I would not recommend it unless you have some really complex requirements. I am just providing it for completeness:
def __call__(self, day=DEFAULT):
    class Context:
        def __init__(self, inst, new):
            self.inst = inst
            self.old = inst.day
            self.new = new

        def __enter__(self):
            self.inst.day = self.new

        def __exit__(self, *args):
            self.inst.day = self.old
    return Context(self, day)
Also, you should consider making getday a property, especially if it is really read-only.
Another alternative would be to have your methods accept different values:
def getday(self, day=None):
    if day is None:
        day = self.day
    return day
This is actually a fairly common idiom.
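A quick sketch of how that reads at the call site:

d_obj = DateCalc()
print(d_obj.getday())            # 1/1/2001 - falls back to self.day
print(d_obj.getday('2/2/2002'))  # 2/2/2002 - nothing stored on the object
print(d_obj.getday())            # 1/1/2001 - still the original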
Ok, here is the real world scenario: I'm writing an application, and I have a class that represents a certain type of files (in my case this is photographs but that detail is irrelevant to the problem). Each instance of the Photograph class should be unique to the photo's filename.
The problem is, when a user tells my application to load a file, I need to be able to identify when files are already loaded, and use the existing instance for that filename, rather than create duplicate instances on the same filename.
To me this seems like a good situation to use memoization, and there's a lot of examples of that out there, but in this case I'm not just memoizing an ordinary function, I need to be memoizing __init__(). This poses a problem, because by the time __init__() gets called it's already too late as there's a new instance created already.
In my research I found Python's __new__() method, and I was actually able to write a working trivial example, but it fell apart when I tried to use it on my real-world objects, and I'm not sure why (the only thing I can think of is that my real world objects were subclasses of other objects that I can't really control, and so there were some incompatibilities with this approach). This is what I had:
class Flub(object):
    instances = {}

    def __new__(cls, flubid):
        try:
            self = Flub.instances[flubid]
        except KeyError:
            self = Flub.instances[flubid] = super(Flub, cls).__new__(cls)
            print('making a new one!')
            self.flubid = flubid
        print(id(self))
        return self

    @staticmethod
    def destroy_all():
        for flub in Flub.instances.values():
            print('killing', flub)
a = Flub('foo')
b = Flub('foo')
c = Flub('bar')

print(a)
print(b)
print(c)
print(a is b, b is c)

Flub.destroy_all()
Which output this:
making a new one!
139958663753808
139958663753808
making a new one!
139958663753872
<__main__.Flub object at 0x7f4aaa6fb050>
<__main__.Flub object at 0x7f4aaa6fb050>
<__main__.Flub object at 0x7f4aaa6fb090>
True False
killing <__main__.Flub object at 0x7f4aaa6fb050>
killing <__main__.Flub object at 0x7f4aaa6fb090>
It's perfect! Only two instances were made for the two unique id's given, and Flub.instances clearly only has two listed.
But when I tried to take this approach with the objects I was using, I got all kinds of nonsensical errors about how __init__() took only 0 arguments, not 2. So I'd change some things around and then it would tell me that __init__() needed an argument. Totally bizarre.
After a while of fighting with it, I basically just gave up and moved all the __new__() black magic into a staticmethod called get, such that I could call Photograph.get(filename) and it would only call Photograph(filename) if filename wasn't already in Photograph.instances.
Does anybody know where I went wrong here? Is there some better way to do this?
Another way of thinking about it is that it's similar to a singleton, except it's not globally singleton, just singleton-per-filename.
Here's my real-world code using the staticmethod get if you want to see it all together.
Let us see two points about your question.
Using memoize
You can use memoization, but you should decorate the class, not the __init__ method. Suppose we have this memoizator:
def get_id_tuple(f, args, kwargs, mark=object()):
    """
    Some quick'n'dirty way to generate a unique key for a specific call.
    """
    l = [id(f)]
    for arg in args:
        l.append(id(arg))
    l.append(id(mark))
    for k, v in kwargs.items():
        l.append(k)
        l.append(id(v))
    return tuple(l)
_memoized = {}

def memoize(f):
    """
    Some basic memoizer
    """
    def memoized(*args, **kwargs):
        key = get_id_tuple(f, args, kwargs)
        if key not in _memoized:
            _memoized[key] = f(*args, **kwargs)
        return _memoized[key]
    return memoized
Now you just need to decorate the class:
@memoize
class Test(object):
    def __init__(self, somevalue):
        self.somevalue = somevalue
Let us see a test?
tests = [Test(1), Test(2), Test(3), Test(2), Test(4)]
for test in tests:
    print(test.somevalue, id(test))
The output is below. Note that the same parameters yield the same id of the returned object:
1 3072319660
2 3072319692
3 3072319724
2 3072319692
4 3072319756
Anyway, I would prefer to create a function to generate the objects and memoize it. It seems cleaner to me, but that may be an irrelevant pet peeve:
class Test(object):
    def __init__(self, somevalue):
        self.somevalue = somevalue

@memoize
def get_test_from_value(somevalue):
    return Test(somevalue)
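Used the same way, with the same caching guarantee (a quick sketch):

t1 = get_test_from_value(1)
t2 = get_test_from_value(1)
assert t1 is t2  # same argument, same cached Test instance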
Using __new__:
Or, of course, you can override __new__. Some days ago I posted an answer about the ins, outs and best practices of overriding __new__ that can be helpful. Basically, it says to always pass *args, **kwargs to your __new__ method.
I, for one, would prefer to memoize a function which creates the objects, or even write a specific function which takes care of never recreating an object for the same parameters. Of course, however, this is mostly an opinion of mine, not a rule.
The solution that I ended up using is this:
class memoize(object):
    def __init__(self, cls):
        self.cls = cls
        # Note: the decorated class must define an `instances` dict attribute.
        self.__dict__.update(cls.__dict__)
        # This bit allows staticmethods to work as you would expect.
        for attr, val in cls.__dict__.items():
            if type(val) is staticmethod:
                self.__dict__[attr] = val.__func__

    def __call__(self, *args):
        key = '//'.join(map(str, args))
        if key not in self.cls.instances:
            self.cls.instances[key] = self.cls(*args)
        return self.cls.instances[key]
And then you decorate the class with this, not __init__. Although brandizzi provided me with that key piece of information, his example decorator didn't function as desired.
I found this concept quite subtle, but basically when you're using decorators in Python, you need to understand that the thing that gets decorated (whether it's a method or a class) is actually replaced by the decorator itself. So for example when I'd try to access Photograph.instances or Camera.generate_id() (a staticmethod), I couldn't actually access them because Photograph doesn't actually refer to the original Photograph class, it refers to the memoized function (from brandizzi's example).
To get around this, I had to create a decorator class that actually takes all the attributes and static methods from the decorated class and exposes them as its own. Almost like a subclass, except that the decorator class doesn't know ahead of time what classes it will be decorating, so it has to copy the attributes over after the fact.
The end result is that any instance of the memoize class becomes an almost transparent wrapper around the actual class that it has decorated, with the exception that attempting to instantiate it (but really calling it) will provide you with cached copies when they're available.
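For completeness, here is a usage sketch; the decorated class must define the instances dict this memoize class relies on (Photograph stands in for my real class):

@memoize
class Photograph:
    instances = {}

    def __init__(self, filename):
        self.filename = filename

a = Photograph('photo.jpg')
b = Photograph('photo.jpg')
c = Photograph('other.jpg')
assert a is b       # same filename -> same cached instance
assert a is not c   # different filename -> new instance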
The parameters to __new__ also get passed to __init__, so:
def __init__(self, flubid):
...
You need to accept the flubid argument there, even if you don't use it in __init__.
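Put together, a minimal sketch of the pairing (the body of __new__ is taken from your example):

class Flub(object):
    instances = {}

    def __new__(cls, flubid):
        try:
            self = Flub.instances[flubid]
        except KeyError:
            self = Flub.instances[flubid] = super(Flub, cls).__new__(cls)
            self.flubid = flubid
        return self

    def __init__(self, flubid):
        # Called on *every* Flub(...) invocation, even cache hits,
        # so it must accept flubid even though it ignores it.
        pass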
Here is the relevant comment taken from typeobject.c in Python 2.7.3:
/* You may wonder why object.__new__() only complains about arguments
when object.__init__() is not overridden, and vice versa.
Consider the use cases:
1. When neither is overridden, we want to hear complaints about
excess (i.e., any) arguments, since their presence could
indicate there's a bug.
2. When defining an Immutable type, we are likely to override only
__new__(), since __init__() is called too late to initialize an
Immutable object. Since __new__() defines the signature for the
type, it would be a pain to have to override __init__() just to
stop it from complaining about excess arguments.
3. When defining a Mutable type, we are likely to override only
__init__(). So here the converse reasoning applies: we don't
want to have to override __new__() just to stop it from
complaining.
4. When __init__() is overridden, and the subclass __init__() calls
object.__init__(), the latter should complain about excess
arguments; ditto for __new__().
Use cases 2 and 3 make it unattractive to unconditionally check for
excess arguments. The best solution that addresses all four use
cases is as follows: __init__() complains about excess arguments
unless __new__() is overridden and __init__() is not overridden
(IOW, if __init__() is overridden or __new__() is not overridden);
symmetrically, __new__() complains about excess arguments unless
__init__() is overridden and __new__() is not overridden
(IOW, if __new__() is overridden or __init__() is not overridden).
However, for backwards compatibility, this breaks too much code.
Therefore, in 2.6, we'll *warn* about excess arguments when both
methods are overridden; for all other cases we'll use the above
rules.
*/
Was trying to figure this out as well and I put together a solution that combines some tips from other StackOverflow questions (links in the code comments).
If anyone still needs, try this out:
import functools

def memoize(f):
    class Memoized:
        # Sentinel separating positional from keyword arguments in the key
        _kwargs_mark = object()

        def __init__(self, func):
            self._f = func
            self._cache = {}
            # Make the Memoized class masquerade as the object we are memoizing.
            # Preserve class attributes
            functools.update_wrapper(self, func)
            # Preserve static methods
            # From https://stackoverflow.com/questions/11174362
            for k, v in func.__dict__.items():
                self.__dict__[k] = v.__func__ if type(v) is staticmethod else v

        def __call__(self, *args, **kwargs):
            # Generate a hashable key from the arguments
            key = args
            if kwargs:
                key += (self._kwargs_mark,)
                for k, v in sorted(kwargs.items()):
                    key += (hash(k), hash(v))
            key = hash(key)
            if key in self._cache:
                return self._cache[key]
            else:
                self._cache[key] = self._f(*args, **kwargs)
                return self._cache[key]

        def __get__(self, instance, owner):
            """
            From https://stackoverflow.com/questions/30104047/how-can-i-decorate-an-instance-method-with-a-decorator-class
            """
            return functools.partial(self.__call__, instance)

        def __instancecheck__(self, other):
            """Make isinstance() work"""
            return isinstance(other, self._f)

    return Memoized(f)
Then you can use it like so:
@memoize
class Test:
    def __init__(self, value):
        self._value = value

    @property
    def value(self):
        return self._value
Uploaded the full thing with documentation to: https://github.com/spoorn/nemoize
I am new to Python decorators (wow, great feature!), and I have trouble getting the following to work because the self argument gets sort of mixed up.
# this is the decorator
class cacher(object):
    def __init__(self, f):
        self.f = f
        self.cache = {}

    def __call__(self, *args):
        fname = self.f.__name__
        if fname not in self.cache:
            self.cache[fname] = self.f(self, *args)
        else:
            print("using cache")
        return self.cache[fname]

class Session(p.Session):
    def __init__(self, user, passw):
        self.pl = p.Session(user, passw)

    @cacher
    def get_something(self):
        print("get_something called with self = %s" % self)
        return self.pl.get_something()
s = Session(u,p)
s.get_something()
When I run this, I get:
get_something called with self = <__main__.cacher object at 0x020870F0>
Traceback:
...
AttributeError: 'cacher' object has no attribute 'pl'
for the line where I do self.cache[fname] = self.f(self,*args)
The problem - Obviously, the problem is that self is the cacher object instead of a Session instance, which indeed doesn't have a pl attribute. However I can't find how to solve this.
Solutions I've considered, but can't use - I thought of making the decorator class return a function instead of a value (like in section 2.1 of this article) so that self is evaluated in the right context, but that isn't possible since my decorator is implemented as a class and uses the built-in __call__ method. Then I thought of not using a class for my decorator, so that I don't need the __call__ method, but I can't do that because I need to keep state between decorator calls (i.e., for keeping track of what is in the self.cache attribute).
Question - So, apart from using a global cache dictionary variable (which I didn't try, but assume will work), is there any other way to make this decorator work?
Edit: this SO question seems similar Decorating python class methods, how do I pass the instance to the decorator?
Use the descriptor protocol like this:
import functools

class cacher(object):
    def __init__(self, f):
        self.f = f
        self.cache = {}

    def __call__(self, *args):
        fname = self.f.__name__
        if fname not in self.cache:
            # args already starts with the instance, thanks to __get__ below
            self.cache[fname] = self.f(*args)
        else:
            print("using cache")
        return self.cache[fname]

    def __get__(self, instance, instancetype):
        """Implement the descriptor protocol to make decorating instance
        methods possible.
        """
        # Return a partial function with the first argument bound to the
        # instance of the decorated class.
        return functools.partial(self.__call__, instance)
Edit: how does it work?
Using the descriptor protocol in the decorator allows us to access the decorated method with the correct instance as self; maybe some code can help explain better.
Now when we write:
class Session(p.Session):
    ...
    @cacher
    def get_something(self):
        print("get_something called with self = %s" % self)
        return self.pl.get_something()
this is equivalent to:
class Session(p.Session):
    ...
    def get_something(self):
        print("get_something called with self = %s" % self)
        return self.pl.get_something()

    get_something = cacher(get_something)
So now get_something is an instance of cacher, and when we access it through an instance, the descriptor protocol translates the lookup like this:

session = Session(user, passw)
session.get_something
# <==>
Session.__dict__['get_something'].__get__(session, Session)
# N.B.: Session.__dict__['get_something'] is an instance of the cacher class.

and because __get__ returns a partial function with the instance bound as the first argument:

session.get_something(*args)
# <==>
Session.__dict__['get_something'].__call__(session, *args)

Hopefully this explains how it works :)
Closures are often a better way to go, since you don't have to muck about with the descriptor protocol. Saving mutable state across calls is even easier than with a class, since you just stick the mutable object in the containing scope (references to immutable objects can be handled either via the nonlocal keyword, or by stashing them in a mutable object like a single-entry list).
# this is the decorator
from functools import wraps

def cacher(f):
    # No point using a dict, since we only ever cache one value
    # If you meant to create cache entries for different arguments
    # check the memoise decorator linked in other answers
    print("cacher called")
    cache = []

    @wraps(f)
    def wrapped(*args, **kwds):
        print("wrapped called")
        if not cache:
            print("calculating and caching result")
            cache.append(f(*args, **kwds))
        return cache[0]
    return wrapped

class C:
    @cacher
    def get_something(self):
        print("get_something called with self = %s" % self)

C().get_something()
C().get_something()
If you aren't completely familiar with the way closures work, adding more print statements (as I have above) can be illustrative. You will see that cacher is only called as the function is defined, but wrapped is called each time the method is called.
This does highlight how you need to be careful with memoisation techniques and instance methods though - if you aren't careful to account for changes in the value of self, you will end up sharing cached answers across instances, which may not be what you want.
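For instance, one way to account for self is to key the cache on the instance. This is just a sketch (keying on id(self) assumes the instances stay alive, since ids can be recycled after garbage collection):

from functools import wraps

def cacher(f):
    cache = {}  # one cached result per instance

    @wraps(f)
    def wrapped(self, *args, **kwds):
        if id(self) not in cache:
            cache[id(self)] = f(self, *args, **kwds)
        return cache[id(self)]
    return wrapped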
First, you explicitly pass the cacher object as the first argument in the following line:

self.cache[fname] = self.f(self, *args)

Python automatically adds self to the argument list for methods only. It converts functions (but not other callables, such as your cacher object!) defined in a class namespace into methods. To get such behavior, I see two ways:

Change your decorator to return a function by using closures.
Implement the descriptor protocol to pass the self argument yourself, as is done in the memoize decorator recipe.