I'm attempting to implement a decorator on certain methods in a class so that if the value has NOT been calculated yet, the method will calculate the value, otherwise it will just return the precomputed value, which is stored in an instance defaultdict. I can't seem to figure out how to access the instance defaultdict from inside of a decorator declared outside of the class. Any ideas on how to implement this?
Here are the imports (for a working example):
from collections import defaultdict
from math import sqrt
Here is my decorator:
class CalcOrPass:
    def __init__(self, func):
        self.f = func

    #if the value is already in the instance dict from SimpleData,
    #don't recalculate the values, instead return the value from the dict
    def __call__(self, *args, **kwargs):
        # can't figure out how to access/pass dict_from_SimpleData to here :(
        res = dict_from_SimpleData[self.f.__name__]
        if not res:
            res = self.f(*args, **kwargs)
            dict_from_SimpleData[self.f.__name__] = res
        return res
And here's the SimpleData class with decorated methods:
class SimpleData:
    def __init__(self, data):
        self.data = data
        self.stats = defaultdict() #here's the dict I'm trying to access

    @CalcOrPass
    def mean(self):
        return sum(self.data)/float(len(self.data))

    @CalcOrPass
    def se(self):
        return [i - self.mean() for i in self.data]

    @CalcOrPass
    def variance(self):
        return sum(i**2 for i in self.se()) / float(len(self.data) - 1)

    @CalcOrPass
    def stdev(self):
        return sqrt(self.variance())
So far, I've tried declaring the decorator inside of SimpleData, trying to pass multiple arguments with the decorator (apparently you can't do this), and spinning around in my swivel chair while trying to toss paper airplanes into my scorpion tank. Any help would be appreciated!
The way you define your decorator, the target object information is lost. Use a function wrapper instead:
from functools import wraps

def CalcOrPass(func):
    @wraps(func)
    def result(self, *args, **kwargs):
        res = self.stats[func.__name__]
        if not res:
            res = func(self, *args, **kwargs)
            self.stats[func.__name__] = res
        return res
    return result
wraps is from functools and not strictly necessary here, but very convenient.
Side note: defaultdict takes a factory function argument:
defaultdict(lambda: None)
But since you're testing for the existence of the key anyway, you should prefer a simple dict.
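As a quick sanity check, here's the fixed decorator in action on the SimpleData class from the question (a sketch with made-up data; it assumes self.stats = defaultdict(lambda: None) as suggested above, so the missing-key lookup returns None instead of raising KeyError):

sd = SimpleData([1, 2, 3, 4, 5])
print(sd.mean())    # computes 3.0 and stores it under 'mean'
print(sd.mean())    # second call comes straight from sd.stats
print(sd.stats)     # shows {'mean': 3.0} inside the defaultdict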
You can't do what you want when your function is defined, because it is unbound. Here's a way to achieve it in a generic fashion at runtime:
class CalcOrPass(object):
    def __init__(self, func):
        self.f = func

    def __get__(self, obj, type=None):  # Cheat.
        return self.__class__(self.f.__get__(obj, type))

    #if the value is already in the instance dict from SimpleData,
    #don't recalculate the values, instead return the value from the dict
    def __call__(self, *args, **kwargs):
        # I'll concede that this doesn't look very pretty.
        # TODO handle KeyError here
        res = self.f.__self__.stats[self.f.__name__]
        if not res:
            res = self.f(*args, **kwargs)
            self.f.__self__.stats[self.f.__name__] = res
        return res
A short explanation:
Our decorator defines __get__ (and is hence said to be a descriptor). Whereas the default behaviour for an attribute access is to get it from the object's dictionary, if the descriptor method is defined, Python will call that instead.
The case with objects is that object.__getattribute__ transforms an access like b.x into type(b).__dict__['x'].__get__(b, type(b))
This way we can access the bound class and its type from the descriptor's parameters.
Then we create a new CalcOrPass object which now decorates (wraps) a bound method instead of the old unbound function.
Note the new-style class definition. I'm not sure if this will work with old-style classes, as I haven't tried it; just don't use those. :) This will work for both functions and methods, however.
What happens to the "old" decorated functions is left as an exercise.
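For instance, with SimpleData from the question (and stats built as defaultdict(lambda: None), so the first lookup returns None rather than hitting the KeyError flagged in the TODO):

sd = SimpleData([1, 2, 3])
print(sd.mean())          # first call: computes 2.0 and stores it in sd.stats
print(sd.stats['mean'])   # 2.0 -- cached, so later calls skip the computation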
It is known that in Python, due to optimization concerns, we cannot add or modify member functions of a built-in class, e.g., adding a sed function to the built-in str class to perform re.sub(). Thus, the only way to achieve this is to subclass it, i.e.,
import re

class String(str):
    def __init__(self, value='', **kwargs):
        super().__init__()

    def sed(self, src, tgt):
        return String(re.sub(src, tgt, self))
The problem with this is that after subclassing, member functions return base-class instances instead of inherited-class instances. For example, I would like to chain String edits: String(' A b C d E [!] ').sed(...).lower().sed(...).strip().sed('\[.*\]', '').split() and so on. However, functions such as .lower() and .strip() return an str instead of a String, so I cannot call .sed(...) afterwards. And I do not want to keep casting to String after every function call.
So I did a manual override of every base-class method as follows:
class String(str):
    for func in dir(str):
        if not func.startswith('_'):
            exec(f'{func}=lambda *args, **kwargs: [(String(i) if type(i)==str else i) for i in [str.{func}(*args, **kwargs)]][0]')

    def __init__(self, value='', **kwargs):
        super().__init__()

    def sed(self, src, tgt):
        return String(re.sub(src, tgt, self))
However, not every member function returns a simple str object: functions such as .split() return a list of str, and others like .isalpha() or .find() return a boolean or an integer. In general, I want to add more string-morphing functions, and I do not want to manually override member functions for each return type just to get inherited-class objects rather than base-class objects. So is there a more elegant way of doing this? Thanks!
Python's built-in classes are not designed to support that style of inheritance easily. Also, the whole idea seems flawed to my eye. Even if you do figure out a way to solve the problem as you've framed it, what's the advantage over good old functions?
# Special String objects with new methods.
s = String('foo bar')
result = s.sed('...', '...')
# Regular str instances passed to ordinary functions.
s = 'foo bar'
result = sed(s, '...', '...')
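For illustration, the function-based alternative can be as simple as this (a sketch; sed here is an ordinary module-level function mirroring the question's method):

import re

def sed(s, src, tgt):
    # An ordinary function works on any str; no subclassing required.
    return re.sub(src, tgt, s)

print(sed('foo bar', 'foo', 'bzz'))  # bzz bar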
That said, here's one way to try. I have not tested it extensively, it might have a flaw, and I would never use it in real code. The basic idea is to capture objects returned during low-level attribute access, and if the object is callable, return a wrapped version of it that will perform the needed data conversions.
import re
from functools import wraps

class String(str):
    def __getattribute__(self, attr):
        obj = object.__getattribute__(self, attr)
        return wrapped(obj) if callable(obj) else obj

    def __init__(self, value='', **kwargs):
        super().__init__()

    def sed(self, src, tgt):
        return re.sub(src, tgt, self)

def wrapped(func):
    @wraps(func)
    def wrapper(*xs, **kws):
        obj = func(*xs, **kws)
        return convert(obj)
    return wrapper

def convert(obj):
    if isinstance(obj, str):
        return String(obj)
    elif isinstance(obj, list):
        return [convert(x) for x in obj]
    elif isinstance(obj, tuple):
        return tuple(convert(x) for x in obj)
    else:
        return obj
Demo:
s = String('foo bar')
got = s.sed('foo', 'bzz').upper().split()
print(got)
print(type(got))
print(type(got[0]))
Output:
['BZZ', 'BAR']
<class 'list'>
<class '__main__.String'>
I have this class:
class SomeClass(object):
    def __init__(self):
        self.cache = {}

    def check_cache(method):
        def wrapper(self):
            if method.__name__ in self.cache:
                print('Got it from the cache!')
                return self.cache[method.__name__]
            print('Got it from the api!')
            self.cache[method.__name__] = method(self)
            return self.cache[method.__name__]
        return wrapper

    @check_cache
    def expensive_operation(self):
        return get_data_from_api()

def get_data_from_api():
    "This would call the api."
    return 'lots of data'
The idea is that I can use the @check_cache decorator to keep the expensive_operation method from calling the api additional times if the result is already cached.
This works fine, it seems.
>>> sc = SomeClass()
>>> sc.expensive_operation()
Got it from the api!
'lots of data'
>>> sc.expensive_operation()
Got it from the cache!
'lots of data'
But I would love to be able to test it with another decorator:
import unittest

class SomeClassTester(SomeClass):
    def counted(f):
        def wrapped(self, *args, **kwargs):
            wrapped.calls += 1
            return f(self, *args, **kwargs)
        wrapped.calls = 0
        return wrapped

    @counted
    def expensive_operation(self):
        return super().expensive_operation()

class TestSomeClass(unittest.TestCase):
    def test_api_is_only_called_once(self):
        sc = SomeClassTester()
        sc.expensive_operation()
        self.assertEqual(sc.expensive_operation.calls, 1)  # is 1
        sc.expensive_operation()
        self.assertEqual(sc.expensive_operation.calls, 1)  # but this goes to 2

unittest.main()
The problem is that the counted decorator counts the number of times the wrapper function is called, not this inner function.
How do I count that from SomeClassTester?
There's no easy way to do this. Your current test applies the decorators in the wrong order. You want check_cache(counted(expensive_operation)), but you're getting the counted decorator on the outside instead: counted(check_cache(expensive_operation)).
There's no easy way to fix this within the counted decorator, because by the time it gets called, the original function is already wrapped up by the check_cache decorator, and there's no easy way to change the wrapper (it holds its reference to the original function in a closure cell, which is read-only from the outside).
One possible way to make it work is to rebuild the whole method with the decorators in the desired order. You can get a reference to the original method from the closure cell:
class SomeClassTester(SomeClass):
    def counted(f):
        def wrapped(self, *args, **kwargs):
            wrapped.calls += 1
            return f(self, *args, **kwargs)
        wrapped.calls = 0
        return wrapped

    expensive_operation = SomeClass.check_cache(
        counted(SomeClass.expensive_operation.__closure__[0].cell_contents)
    )
This is of course far from ideal, since you need to know exactly what decorators are being applied to the method in SomeClass in order to apply them again properly. You also need to know the internals of those decorators so that you can get the right closure cell (the [0] index may not be correct if the other decorator is changed).
Another (perhaps better) approach might be to change SomeClass in such a way that you can inject your counting code in between the cached method and the expensive bit you want to count. For example, you could have the real expensive part be in _expensive_method_implementation, while the decorated expensive_method is just a simple wrapper that calls it. The test class can override the _implementation method with its own decorated version (which might even skip the actually expensive part and just return dummy data). It doesn't need to override the regular method or mess with its decorators, as sketched below.
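A minimal sketch of that layout (names adjusted to the question's expensive_operation; _expensive_operation_implementation is just for illustration, and the caching wrapper's prints are omitted):

def check_cache(method):
    # Caching wrapper, as in the question (prints omitted for brevity).
    def wrapper(self):
        if method.__name__ not in self.cache:
            self.cache[method.__name__] = method(self)
        return self.cache[method.__name__]
    return wrapper

def get_data_from_api():
    return 'lots of data'

class SomeClass(object):
    def __init__(self):
        self.cache = {}

    def _expensive_operation_implementation(self):
        # The genuinely expensive part lives here, undecorated.
        return get_data_from_api()

    @check_cache
    def expensive_operation(self):
        return self._expensive_operation_implementation()

class SomeClassTester(SomeClass):
    implementation_calls = 0

    def _expensive_operation_implementation(self):
        # The test counts real calls; the caching wrapper stays untouched.
        self.implementation_calls += 1
        return super()._expensive_operation_implementation()

sc = SomeClassTester()
sc.expensive_operation()
sc.expensive_operation()
assert sc.implementation_calls == 1  # the cache prevented a second real call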
It is impossible to do this without modifying the base class to provide hooks, or without replacing the whole decorated function in the derived class based on internal knowledge of the base class. There is, though, a third way, based on the internal workings of the cache decorator: change your cache dict so that it counts writes.
class CounterDict(dict):
    def __init__(self, *args):
        super().__init__(*args)
        self.count = {}

    def __setitem__(self, key, value):
        try:
            self.count[key] += 1
        except KeyError:
            self.count[key] = 1
        return super().__setitem__(key, value)

class SomeClassTester(SomeClass):
    def __init__(self):
        self.cache = CounterDict()

class TestSomeClass(unittest.TestCase):
    def test_api_is_only_called_once(self):
        sc = SomeClassTester()
        sc.expensive_operation()
        self.assertEqual(sc.cache.count['expensive_operation'], 1)  # is 1
        sc.expensive_operation()
        self.assertEqual(sc.cache.count['expensive_operation'], 1)  # is 1
I am learning from the Django source code. While reading the functional module in Django, I couldn't understand the lazy function: what it is for, and how its implementation works. This is my first time using Stack Overflow; if I've missed any rules here, please remind me. Thanks.
The code:
class Promise(object):
    """
    This is just a base class for the proxy class created in
    the closure of the lazy function. It can be used to recognize
    promises in code.
    """
    pass

def lazy(func, *resultclasses):
    """
    Turns any callable into a lazy evaluated callable. You need to give result
    classes or types -- at least one is needed so that the automatic forcing of
    the lazy evaluation code is triggered. Results are not memoized; the
    function is evaluated on every access.
    """
    @total_ordering
    class __proxy__(Promise):
        """
        Encapsulate a function call and act as a proxy for methods that are
        called on the result of that function. The function is not evaluated
        until one of the methods on the result is called.
        """
        __dispatch = None

        def __init__(self, args, kw):
            self.__args = args
            self.__kw = kw
            if self.__dispatch is None:
                self.__prepare_class__()

        def __reduce__(self):
            return (
                _lazy_proxy_unpickle,
                (func, self.__args, self.__kw) + resultclasses
            )

        @classmethod
        def __prepare_class__(cls):
            cls.__dispatch = {}
            for resultclass in resultclasses:
                cls.__dispatch[resultclass] = {}
                for type_ in reversed(resultclass.mro()):
                    for (k, v) in type_.__dict__.items():
                        # All __promise__ return the same wrapper method, but
                        # they also do setup, inserting the method into the
                        # dispatch dict.
                        meth = cls.__promise__(resultclass, k, v)
                        if hasattr(cls, k):
                            continue
                        setattr(cls, k, meth)
            cls._delegate_bytes = bytes in resultclasses
            cls._delegate_text = six.text_type in resultclasses
            assert not (cls._delegate_bytes and cls._delegate_text), "Cannot call lazy() with both bytes and text return types."
            if cls._delegate_text:
                if six.PY3:
                    cls.__str__ = cls.__text_cast
                else:
                    cls.__unicode__ = cls.__text_cast
            elif cls._delegate_bytes:
                if six.PY3:
                    cls.__bytes__ = cls.__bytes_cast
                else:
                    cls.__str__ = cls.__bytes_cast

        @classmethod
        def __promise__(cls, klass, funcname, method):
            # Builds a wrapper around some magic method and registers that
            # magic method for the given type and method name.
            def __wrapper__(self, *args, **kw):
                # Automatically triggers the evaluation of a lazy value and
                # applies the given magic method of the result type.
                res = func(*self.__args, **self.__kw)
                for t in type(res).mro():
                    if t in self.__dispatch:
                        return self.__dispatch[t][funcname](res, *args, **kw)
                raise TypeError("Lazy object returned unexpected type.")
            if klass not in cls.__dispatch:
                cls.__dispatch[klass] = {}
            cls.__dispatch[klass][funcname] = method
            return __wrapper__

        def __text_cast(self):
            return func(*self.__args, **self.__kw)

        def __bytes_cast(self):
            return bytes(func(*self.__args, **self.__kw))

        def __cast(self):
            if self._delegate_bytes:
                return self.__bytes_cast()
            elif self._delegate_text:
                return self.__text_cast()
            else:
                return func(*self.__args, **self.__kw)

        def __ne__(self, other):
            if isinstance(other, Promise):
                other = other.__cast()
            return self.__cast() != other

        def __eq__(self, other):
            if isinstance(other, Promise):
                other = other.__cast()
            return self.__cast() == other

        def __lt__(self, other):
            if isinstance(other, Promise):
                other = other.__cast()
            return self.__cast() < other

        def __hash__(self):
            return hash(self.__cast())

        def __mod__(self, rhs):
            if self._delegate_bytes and six.PY2:
                return bytes(self) % rhs
            elif self._delegate_text:
                return six.text_type(self) % rhs
            return self.__cast() % rhs

        def __deepcopy__(self, memo):
            # Instances of this class are effectively immutable. It's just a
            # collection of functions. So we don't need to do anything
            # complicated for copying.
            memo[id(self)] = self
            return self

    @wraps(func)
    def __wrapper__(*args, **kw):
        # Creates the proxy object, instead of the actual value.
        return __proxy__(args, kw)
    return __wrapper__
This function takes a function and any number of classes. Simplifying a bit: it returns a wrapper (let's call it a "lazy function") instead of that function. At that point we can say we have turned the function into a lazy function.
After that we can call this lazy function. Once called, it returns an instance of the proxy class instead of the result of the initial function, without calling the initial function.
The initial function is called only once we invoke some method on that result (the proxy instance).
*resultclasses here are the classes whose instances are expected as results of the initial function.
For example:
def func(text):
    return text.title()

lazy_func = lazy(func, str)
# a lazy function, prepared to dispatch any method of a str instance

res = lazy_func('test')  # an instance of the __proxy__ class instead of the string 'Test'
res.find('T')  # only at this point do we call the initial function
I'll try to explain how it works overall:
def lazy(func, *resultclasses):  # On decoration
    @total_ordering
    class __proxy__(Promise):
        __dispatch = None

        def __init__(self, args, kw):  # On call
            # 3) The __proxy__ instance stores the original call's args and
            # kwargs. args = ('Test', ) for our example
            self.__args = args
            self.__kw = kw
            if self.__dispatch is None:
                self.__prepare_class__()
                # 4) If it's the first call of the lazy function, we should
                # prepare the __proxy__ class

        # On the first call of the __wrapper__ function we should prepare the
        # class. Class preparation in this case means that we'll fill the
        # __dispatch class attribute with links to all methods of each result
        # class. We need to prepare the class only on the first call.
        @classmethod
        def __prepare_class__(cls):
            cls.__dispatch = {}
            for resultclass in resultclasses:
                # 5) Looping through the resultclasses. In our example it's
                # only str
                cls.__dispatch[resultclass] = {}
                for type_ in reversed(resultclass.mro()):
                    # 6) Looping through each superclass of each resultclass
                    # in reversed order, so that'll be (object, str) for our
                    # example
                    for (k, v) in type_.__dict__.items():
                        # 7) Looping through each attribute of each
                        # superclass. For example k = 'find', v = str.find
                        meth = cls.__promise__(resultclass, k, v)
                        if hasattr(cls, k):
                            continue
                        setattr(cls, k, meth)
                        # 9) If the __proxy__ class doesn't have an attribute
                        # 'find', for example, we set __wrapper__ as that
                        # attribute. So class __proxy__ will have the
                        # __wrapper__ method in __proxy__.__dict__['find'],
                        # and so on for all methods.

        @classmethod
        def __promise__(cls, klass, funcname, method):
            # Builds a wrapper around some magic method and registers that
            # magic method for the given type and method name.
            def __wrapper__(self, *args, **kw):  # On each call of a method of the result class (str)
                # Automatically triggers the evaluation of a lazy value and
                # applies the given magic method of the result type.
                res = func(*self.__args, **self.__kw)
                # 10) Finally we call the original function
                for t in type(res).mro():
                    # 11) We're looping through all the superclasses of the
                    # result's class from the bottom to the top; that'll be
                    # (str, object) for our example
                    if t in self.__dispatch:
                        # 12) If the class is dispatched, we pass the result
                        # with args and kwargs to
                        # __proxy__.__dispatch[str]['find'], which is the
                        # unbound method 'find' of the str class. For our
                        # example res = 'Test', args = ('T', )
                        return self.__dispatch[t][funcname](res, *args, **kw)
                raise TypeError("Lazy object returned unexpected type.")
            if klass not in cls.__dispatch:
                cls.__dispatch[klass] = {}
            cls.__dispatch[klass][funcname] = method
            # 7) Adds __proxy__.__dispatch[str]['find'] = str.find, for
            # example, which is the unbound method 'find' of the str class,
            # and so on with each method of each superclass of each
            # resultclass
            # 8) Returns a new __wrapper__ method for each method of each
            # resultclass. This wrapper method has the funcname variable in
            # its closure.
            return __wrapper__

    @wraps(func)  # makes the lazy function look like the initial one
    def __wrapper__(*args, **kw):
        # Creates the proxy object, instead of the actual value.
        return __proxy__(args, kw)
        # 2) On a call of the lazy function we get a __proxy__ instance
        # instead of the actual value
    return __wrapper__
    # 1) As the result of the lazy(func, *resultclasses) call we get the
    # __wrapper__ function, which looks like the initial function because of
    # the @wraps decorator
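To make the laziness visible, here's a small demo (assuming lazy is available, e.g. from django.utils.functional import lazy; the print statement marks the actual evaluation):

def func(text):
    print('evaluating func!')
    return text.title()

lazy_func = lazy(func, str)
res = lazy_func('test')   # nothing printed: func has not run yet
print(res.find('T'))      # prints 'evaluating func!' and then 0
print(res.find('T'))      # evaluates again -- results are not memoized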
I apologize in advance for the rather long question.
I'm implementing callable objects and would like them to behave somewhat like (mathematical) functions. I have a base class whose __call__ method raises NotImplementedError so users must subclass to define __call__. My question is: how can I define the special method __neg__ in the base class so subclasses immediately have the expected behavior, without having to implement __neg__ in each subclass? My sense of the expected behavior is that if f is an instance of (a subclass of) the base class with a properly defined __call__, then -f should be an instance of the same class as f, possessing all the same attributes as f, except for __call__, which should return the negative of f's __call__.
Here's an example of what I mean:
class Base(object):
    def __call__(self, *args, **kwargs):
        raise NotImplementedError, 'Please subclass'

    def __neg__(self):
        def call(*args, **kwargs):
            return -self(*args, **kwargs)
        mBase = type('mBase', (Base,), {'__call__': call})
        return mBase()

class One(Base):
    def __init__(self, data=None):
        self.data = data

    def __call__(self, *args, **kwargs):
        return 1
This has the expected behavior:
one = One()
print one() # Prints 1
minus_one = -one
print minus_one() # Prints -1
though it's not exactly what I'd like since minus_one is not an instance of the same class as one (but I could live with that).
Now I'd like the new instance minus_one to inherit all attributes and methods of one; only the __call__ method should change. So I could change __neg__ to
def __neg__(self):
    def call(*args, **kwargs):
        return -self(*args, **kwargs)
    mBase = type('mBase', (Base,), {'__call__': call})
    new = mBase()
    for n, v in inspect.getmembers(self):
        if n != '__call__':
            setattr(new, n, v)
    return new
This seems to work. My question is: are there cons to this strategy? Implementing a generic __neg__ must be a standard exercise but I couldn't find anything on it on the web. Are there recommended alternatives?
Thanks in advance for any comments.
Your approach has several downsides. One example is that you copy all members of the original instance to the new instance -- this won't work if your class overrides any special methods other than __call__, since special methods are only looked up in the dictionary of the object's type when called implicitly. Moreover, it copies a lot of stuff that is actually inherited from object and doesn't need to go in the instance's __dict__.
An easier approach that satisfies your exact requirements is to make the new type a subclass of the instance's original type. This can be done by defining a local class inside the __neg__() method:
def __neg__(self):
    class Neg(self.__class__):
        def __call__(self_, *args, **kwargs):
            return -self(*args, **kwargs)
    neg = Base.__new__(Neg)
    neg.__dict__ = self.__dict__.copy()
    return neg
This defines a new class Neg derived from the original function's type and overrides its __call__() method. It creates an instance of this class using Base's constructor -- this is to cover the case where self's class takes constructor arguments. Finally, we copy everything that is stored directly in the instance self to the new instance.
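A quick check against the One class from the question (assuming this __neg__ replaces the one in Base):

one = One('some data')
minus_one = -one
print(minus_one())                 # -1
print(isinstance(minus_one, One))  # True: Neg derives from One
print(minus_one.data)              # 'some data' -- the instance dict was copied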
If I were to design the system, I'd take a completely different approach. I'd fix the interface for a function and would only rely on this fixed interface for every function. I wouldn't bother to copy all attributes of an instance to the negated function, but rather do this:
class Function(object):
    def __neg__(self):
        return NegatedFunction(self)

    def __add__(self, other):
        return SumFunction(self, other)

class NegatedFunction(Function):
    def __init__(self, f):
        self.f = f

    def __call__(self, *args, **kwargs):
        return -self.f(*args, **kwargs)

class SumFunction(Function):
    def __init__(self, *funcs):
        self.funcs = funcs

    def __call__(self, *args, **kwargs):
        return sum(f(*args, **kwargs) for f in self.funcs)
This approach does not fulfil your requirement that the function returned by __neg__() has all the attributes and methods of the original function, but I think this requirement is rather questionable as far as design is concerned. I think dropping this requirement will give you a much cleaner and more general approach (as demonstrated by including an __add__() operator in the example above).
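Usage under this design might look like this (the One subclass is just for illustration):

class One(Function):
    def __call__(self, *args, **kwargs):
        return 1

one = One()
print((-one)())          # -1
print((one + one)())     # 2
print((-(one + one))())  # -2; the operators compose freely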
The basic problem you're running into is that __xxx__ methods are only looked up on the class, which means all instances of the same class will use the same __xxx__ methods. This suggests using a method similar to what Cat Plus Plus suggested; however, you also don't want your users to have to worry about even more special names (such as _call_impl and _negate).
If you don't mind the possibly mind-melting power of metaclasses, that is the route to take. A metaclass can add in the _negate attribute automatically (and name mangle it to avoid clashes), as well as take the __call__ that your user wrote and rename it to _call, then create a new __call__ that calls the old __call__ (now called _call ;) and then negates the result, if necessary, before returning it.
Here's the code:
import copy
import inspect

class MetaFunction(type):
    def __new__(metacls, cls_name, cls_bases, cls_dict):
        result_class = type.__new__(metacls, cls_name, cls_bases, cls_dict)
        if '__call__' in cls_dict:
            original_call = cls_dict['__call__']
            args, varargs, kwargs, defaults = inspect.getargspec(original_call)
            args = args[1:]
            if defaults is None:
                defaults = [''] * len(args)
            else:
                defaults = [''] * (len(args) - len(defaults)) + list(defaults)
            signature = []
            for arg, default in zip(args, defaults):
                if default:
                    signature.append('%s=%s' % (arg, default))
                else:
                    signature.append(arg)
            if varargs is not None:
                signature.append(varargs)
            if kwargs is not None:
                signature.append(kwargs)
            signature = ', '.join(signature)
            passed_args = ', '.join(args)
            new_call = (
"""def __call__(self, %(signature)s):
    result = self._call(%(passed_args)s)
    if self._%(cls_name)s__negate:
        result = -result
    return result"""
                % {
                    'cls_name': cls_name,
                    'signature': signature,
                    'passed_args': passed_args,
                })
            eval_dict = {}
            exec new_call in eval_dict
            new_call = eval_dict['__call__']
            new_call.__doc__ = original_call.__doc__
            new_call.__module__ = original_call.__module__
            new_call.__dict__ = original_call.__dict__
            setattr(result_class, '__call__', new_call)
            setattr(result_class, '_call', original_call)
            setattr(result_class, '_%s__negate' % cls_name, False)
            negate = (
"""def __neg__(self):
    "returns an instance of the same class that returns the negation of __call__"
    negated = copy.copy(self)
    negated._%(cls_name)s__negate = not self._%(cls_name)s__negate
    return negated"""
                % {'cls_name': cls_name})
            eval_dict = {'copy': copy}
            exec negate in eval_dict
            negate = eval_dict['__neg__']
            negate.__module__ = new_call.__module__
            setattr(result_class, '__neg__', eval_dict['__neg__'])
        return result_class

class Base(object):
    __metaclass__ = MetaFunction

class Power(Base):
    def __init__(self, power):
        "power = the power to raise to"
        self.power = power

    def __call__(self, number):
        "raises number to power"
        return number ** self.power
and an example:
--> square = Power(2)
--> neg_square = -square
--> square(9)
81
--> neg_square(9)
-81
While the metaclass code itself can be complex, the resulting objects can be very easy to use. To be fair, most of the code, and the complexity, in MetaFunction is due to re-writing __call__ in order to preserve the call signature and make introspection useful... so instead of seeing __call__(*args, **kwargs) in help, you see this:
Help on Power in module test object:
class Power(Base)
| Method resolution order:
| Power
| Base
| __builtin__.object
|
| Methods defined here:
|
| __call__(self, number)
| raises number to power
|
| __init__(self, power)
| power = the power to raise to
|
| __neg__(self)
| returns an instance of the same class that returns the negation of __call__
Instead of creating a new type, you can keep a flag on the instance that says whether the call result should be negated or not. Then you can offload the actual overridable call behaviour to a separate (non-special) method, as part of your own protocol.
import copy

class Base(object):
    def __init__(self):
        self._negate_call = False

    def call_impl(self, *args, **kwargs):
        raise NotImplementedError

    def __call__(self, *args, **kwargs):
        result = self.call_impl(*args, **kwargs)
        return -result if self._negate_call else result

    def __neg__(self):
        other = copy.copy(self)
        other._negate_call = not other._negate_call
        return other
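A subclass then only implements call_impl, and negation comes for free; for example (Two is just an illustrative subclass):

class Two(Base):
    def call_impl(self, *args, **kwargs):
        return 2

two = Two()
minus_two = -two
print(two())        # 2
print(minus_two())  # -2
print(two())        # still 2; copy.copy left the original untouched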
I'd like to provide the capability for users of one of my modules to extend its capabilities by providing an interface to call a user's function. For example, I want to give users the capability to be notified when an instance of a class is created and given the opportunity to modify the instance before it is used.
The way I've implemented it is to declare a module-level factory function that does the instantiation:
# in mymodule.py
def factory(cls, *args, **kwargs):
    return cls(*args, **kwargs)
Then when I need an instance of a class in mymodule, I do factory(cls, arg1, arg2) rather than cls(arg1, arg2).
To extend it, a programmer would write in another module a function like this:
def myFactory(cls, *args, **kwargs):
    instance = myFactory.chain(cls, *args, **kwargs)
    # do something with the instance here if desired
    return instance
Installation of the above callback looks like this:
myFactory.chain, mymodule.factory = mymodule.factory, myFactory
This seems straightforward enough to me, but I was wondering if you, as a Python programmer, would expect a function to register a callback rather than doing it with an assignment, or if there were other methods you would expect. Does my solution seem workable, idiomatic, and clear to you?
I am looking to keep it as simple as possible; I don't think most applications will actually need to chain more than one user callback, for example (though unlimited chaining comes "for free" with the above pattern). I doubt they will need to remove callbacks or specify priorities or order. Modules like python-callbacks or PyDispatcher seem to me like overkill, especially the latter, but if there are compelling benefits to a programmer working with my module, I'm open to them.
Taking aaronsterling's idea a bit further:
class C(object):
    _oncreate = []

    def __new__(cls):
        return reduce(lambda x, y: y(x), cls._oncreate, super(C, cls).__new__(cls))

    @classmethod
    def oncreate(cls, func):
        cls._oncreate.append(func)

c = C()
print hasattr(c, 'spew')

@C.oncreate
def spew(obj):
    obj.spew = 42
    return obj

c = C()
print c.spew
Combining Aaron's idea of using a decorator and Ignacio's idea of a class that maintains a list of attached callbacks, plus a concept borrowed from C#, I came up with this:
class delegate(object):
    def __init__(self, func):
        self.callbacks = []
        self.basefunc = func

    def __iadd__(self, func):
        if callable(func):
            self.__isub__(func)
            self.callbacks.append(func)
        return self

    def callback(self, func):
        if callable(func):
            self.__isub__(func)
            self.callbacks.append(func)
        return func

    def __isub__(self, func):
        try:
            self.callbacks.remove(func)
        except ValueError:
            pass
        return self

    def __call__(self, *args, **kwargs):
        result = self.basefunc(*args, **kwargs)
        for func in self.callbacks:
            newresult = func(result)
            result = result if newresult is None else newresult
        return result
Decorating a function with @delegate allows other functions to be "attached" to it.
@delegate
def intfactory(num):
    return int(num)
Functions can be added to the delegate with += (and removed with -=). You can also decorate with @funcname.callback to add a callback function.
@intfactory.callback
def notify(num):
    print "notify:", num

def increment(num):
    return num + 1

intfactory += increment
intfactory += lambda num: num * 2
print intfactory(3) # outputs 8
Does this feel Pythonic?
I might use a decorator so that the user could just write:
@new_factory
def myFactory(cls, *args, **kwargs):
    instance = myFactory.chain(cls, *args, **kwargs)
    # do something with the instance here if desired
    return instance
Then in your module,
import sys

def new_factory(f):
    mod = sys.modules[__name__]
    f.chain = mod.factory
    mod.factory = f
    return f
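Putting the pieces together, a user's module might then read (a sketch; mymodule stands for your module with factory and new_factory defined, and the created_by_hook attribute is purely illustrative):

import mymodule

@mymodule.new_factory
def myFactory(cls, *args, **kwargs):
    instance = myFactory.chain(cls, *args, **kwargs)
    instance.created_by_hook = True  # illustrative modification
    return instance

Because new_factory resolves sys.modules[__name__] in its own globals, the rebinding happens in mymodule, so every subsequent factory(...) call inside mymodule goes through the user's wrapper.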