Implementing __neg__ in a generic way for all subclasses in Python

I apologize in advance for the rather long question.
I'm implementing callable objects and would like them to behave somewhat like (mathematical) functions. I have a base class whose __call__ method raises NotImplementedError, so users must subclass it to define __call__. My question is: how can I define the special method __neg__ in the base class so that subclasses immediately have the expected behavior, without having to implement __neg__ in each subclass? My sense of the expected behavior is that if f is an instance of (a subclass of) the base class with a properly defined __call__, then -f should be an instance of the same class as f, possessing all the same attributes as f, except for __call__, which should return the negative of f's __call__.
Here's an example of what I mean:
class Base(object):
    def __call__(self, *args, **kwargs):
        raise NotImplementedError, 'Please subclass'
    def __neg__(self):
        def call(*args, **kwargs):
            return -self(*args, **kwargs)
        mBase = type('mBase', (Base,), {'__call__': call})
        return mBase()

class One(Base):
    def __init__(self, data):
        self.data = data
    def __call__(self, *args, **kwargs):
        return 1
This has the expected behavior:
one = One()
print one() # Prints 1
minus_one = -one
print minus_one() # Prints -1
though it's not exactly what I'd like since minus_one is not an instance of the same class as one (but I could live with that).
Now I'd like the new instance minus_one to inherit all attributes and methods of one; only the __call__ method should change. So I could change __neg__ to
import inspect

def __neg__(self):
    def call(*args, **kwargs):
        return -self(*args, **kwargs)
    mBase = type('mBase', (Base,), {'__call__': call})
    new = mBase()
    for n, v in inspect.getmembers(self):
        if n != '__call__':
            setattr(new, n, v)
    return new
This seems to work. My question is: are there cons to this strategy? Implementing a generic __neg__ must be a standard exercise but I couldn't find anything on it on the web. Are there recommended alternatives?
Thanks in advance for any comments.

Your approach has several downsides. One example is that you copy all members of the original instance to the new instance -- this won't work if your class overrides any special methods other than __call__, since special methods are only looked up in the dictionary of the object's type when called implicitly. Moreover, it copies a lot of stuff that is actually inherited from object and doesn't need to go in the instance's __dict__.
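A quick sketch (Python 3 syntax, invented names) of why copying special methods onto an instance fails: an implicit invocation like -f looks up __neg__ on type(f), never in f's own __dict__.

```python
class F:
    def __call__(self):
        return 1

f = F()
f.__neg__ = lambda: -1  # stored in f's __dict__, but unary minus ignores it
try:
    result = -f
except TypeError:
    result = 'TypeError'
print(result)  # TypeError: __neg__ is only looked up on type(f)
```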
An easier approach that satisfies your exact requirements is to make the new type a subclass of the instance's original type. This can be done by defining a local class inside the __neg__() method:
def __neg__(self):
    class Neg(self.__class__):
        def __call__(self_, *args, **kwargs):
            return -self(*args, **kwargs)
    neg = Base.__new__(Neg)
    neg.__dict__ = self.__dict__.copy()
    return neg
This defines a new class Neg derived from the original function's type and overrides its __call__() method. It creates an instance of this class using Base's constructor, to cover the case that self's class takes constructor arguments. Finally, we copy everything that is stored directly in the instance self over to the new instance.
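A runnable version of this subclass-based __neg__ in Python 3 syntax (the One subclass and its data attribute are just for the demo); note that self in the closure is the original instance, while self_ is the new one:

```python
class Base:
    def __call__(self, *args, **kwargs):
        raise NotImplementedError('Please subclass')
    def __neg__(self):
        # self (the original instance) is captured by the closure
        class Neg(self.__class__):
            def __call__(self_, *args, **kwargs):
                return -self(*args, **kwargs)
        neg = Base.__new__(Neg)
        neg.__dict__ = self.__dict__.copy()
        return neg

class One(Base):
    def __init__(self, data):
        self.data = data
    def __call__(self, *args, **kwargs):
        return 1

one = One('payload')
minus_one = -one
print(minus_one())                 # -1
print(minus_one.data)              # payload: instance attributes survive
print(isinstance(minus_one, One))  # True: Neg derives from One
```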
If I were to design the system, I'd take a completely different approach. I'd fix the interface for a function and would only rely on this fixed interface for every function. I wouldn't bother to copy all attributes of an instance to the negated function, but rather do this:
class Function(object):
    def __neg__(self):
        return NegatedFunction(self)
    def __add__(self, other):
        return SumFunction(self, other)

class NegatedFunction(Function):
    def __init__(self, f):
        self.f = f
    def __call__(self, *args, **kwargs):
        return -self.f(*args, **kwargs)

class SumFunction(Function):
    def __init__(self, *funcs):
        self.funcs = funcs
    def __call__(self, *args, **kwargs):
        return sum(f(*args, **kwargs) for f in self.funcs)
This approach does not fulfil your requirement that the function returned by __neg__() has all the attributes and methods of the original function, but I think this requirement is rather questionable as far as design is concerned. I think dropping this requirement will give you a much cleaner and more general approach (as demonstrated by including an __add__() operator in the example above).
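A self-contained Python 3 demo of this composition-based design; the Constant class is an invented concrete function purely for illustration:

```python
class Function:
    def __neg__(self):
        return NegatedFunction(self)
    def __add__(self, other):
        return SumFunction(self, other)

class NegatedFunction(Function):
    def __init__(self, f):
        self.f = f
    def __call__(self, *args, **kwargs):
        return -self.f(*args, **kwargs)

class SumFunction(Function):
    def __init__(self, *funcs):
        self.funcs = funcs
    def __call__(self, *args, **kwargs):
        return sum(f(*args, **kwargs) for f in self.funcs)

class Constant(Function):
    # hypothetical concrete function for the demo
    def __init__(self, value):
        self.value = value
    def __call__(self, *args, **kwargs):
        return self.value

f = Constant(3) + Constant(4)
print(f())                   # 7
print((-f)())                # -7
print((f + -Constant(2))())  # 5: the operators compose freely
```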

The basic problem you're running into is that __xxx__ methods are only looked up on the class, which means all instances of the same class will use the same __xxx__ methods. This suggests using a method similar to what Cat Plus Plus suggested; however, you also don't want your users to have to worry about even more special names (such as _call_impl and _negate).
If you don't mind the possibly mind-melting power of metaclasses, that is the route to take. A metaclass can add in the _negate attribute automatically (and name mangle it to avoid clashes), as well as take the __call__ that your user wrote and rename it to _call, then create a new __call__ that calls the old __call__ (now called _call ;) and then negates the result, if necessary, before returning it.
Here's the code:
import copy
import inspect

class MetaFunction(type):
    def __new__(metacls, cls_name, cls_bases, cls_dict):
        result_class = type.__new__(metacls, cls_name, cls_bases, cls_dict)
        if '__call__' in cls_dict:
            original_call = cls_dict['__call__']
            args, varargs, kwargs, defaults = inspect.getargspec(original_call)
            args = args[1:]
            if defaults is None:
                defaults = [''] * len(args)
            else:
                defaults = [''] * (len(args) - len(defaults)) + list(defaults)
            signature = []
            for arg, default in zip(args, defaults):
                if default:
                    signature.append('%s=%s' % (arg, default))
                else:
                    signature.append(arg)
            if varargs is not None:
                signature.append('*' + varargs)
            if kwargs is not None:
                signature.append('**' + kwargs)
            signature = ', '.join(signature)
            passed_args = ', '.join(args)
            new_call = (
                """def __call__(self, %(signature)s):
    result = self._call(%(passed_args)s)
    if self._%(cls_name)s__negate:
        result = -result
    return result"""
                % {
                    'cls_name': cls_name,
                    'signature': signature,
                    'passed_args': passed_args,
                })
            eval_dict = {}
            exec new_call in eval_dict
            new_call = eval_dict['__call__']
            new_call.__doc__ = original_call.__doc__
            new_call.__module__ = original_call.__module__
            new_call.__dict__ = original_call.__dict__
            setattr(result_class, '__call__', new_call)
            setattr(result_class, '_call', original_call)
            setattr(result_class, '_%s__negate' % cls_name, False)
            negate = """def __neg__(self):
    "returns an instance of the same class that returns the negation of __call__"
    negated = copy.copy(self)
    negated._%(cls_name)s__negate = not self._%(cls_name)s__negate
    return negated""" % {'cls_name': cls_name}
            eval_dict = {'copy': copy}
            exec negate in eval_dict
            negate = eval_dict['__neg__']
            negate.__module__ = new_call.__module__
            setattr(result_class, '__neg__', eval_dict['__neg__'])
        return result_class

class Base(object):
    __metaclass__ = MetaFunction

class Power(Base):
    def __init__(self, power):
        "power = the power to raise to"
        self.power = power
    def __call__(self, number):
        "raises number to power"
        return number ** self.power
and an example:
--> square = Power(2)
--> neg_square = -square
--> square(9)
81
--> neg_square(9)
-81
While the metaclass code itself can be complex, the resulting objects can be very easy to use. To be fair, most of the code, and the complexity, in MetaFunction is due to re-writing __call__ in order to preserve the call signature and make introspection useful... so instead of seeing __call__(*args, **kwargs) in help, you see this:
Help on Power in module test object:
class Power(Base)
| Method resolution order:
| Power
| Base
| __builtin__.object
|
| Methods defined here:
|
| __call__(self, number)
| raises number to power
|
| __init__(self, power)
| power = the power to raise to
|
| __neg__(self)
| returns an instance of the same class that returns the negation of __call__

Instead of creating a new type, you can keep a flag on the instance that says whether the call result should be negated or not. You can then offload the actual overridable call behaviour to a separate (non-special) method, as part of your own protocol.
import copy

class Base(object):
    def __init__(self):
        self._negate_call = False
    def call_impl(self, *args, **kwargs):
        raise NotImplementedError
    def __call__(self, *args, **kwargs):
        result = self.call_impl(*args, **kwargs)
        return -result if self._negate_call else result
    def __neg__(self):
        other = copy.copy(self)
        other._negate_call = not other._negate_call
        return other
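A self-contained demo of this flag-based approach (Python 3; the Two subclass is invented for illustration):

```python
import copy

class Base:
    def __init__(self):
        self._negate_call = False
    def call_impl(self, *args, **kwargs):
        raise NotImplementedError
    def __call__(self, *args, **kwargs):
        result = self.call_impl(*args, **kwargs)
        return -result if self._negate_call else result
    def __neg__(self):
        # copy the instance and flip the flag; the class never changes
        other = copy.copy(self)
        other._negate_call = not other._negate_call
        return other

class Two(Base):
    # hypothetical subclass: always returns 2
    def call_impl(self):
        return 2

two = Two()
print(two())         # 2
print((-two)())      # -2
print((-(-two))())   # 2: double negation cancels via the flag
```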

Related

How to write factory functions for subclasses?

Suppose there is a class A and a factory function make_A
class A():
    ...

def make_A(*args, **kwargs):
    # returns an object of type A
    ...
both defined in some_package.
Suppose also that I want to expand the functionality of A, by subclassing it,
without overriding the constructor:
from some_package import A, make_A

class B(A):
    def extra_method(self, ...):
        # adds extra functionality
What I also need is to write a new factory function make_B for subclass B.
The solution I have found so far is
def make_B(*args, **kwargs):
    """
    same as make_A except that it returns an object of type B
    """
    out = make_A(*args, **kwargs)
    out.__class__ = B
    return out
This seems to work, but I am a bit worried about directly modifying the
__class__ attribute, as it feels to me like a hack. I am also worried about
unexpected side-effects this modification may have. Is this the recommended
solution or is there a "cleaner" pattern to achieve the same result?
I think I finally found something that is not verbose yet still works. You need to replace inheritance with composition: B consumes an A object by storing it as self.a = ....
To mimic the methods (and fields) of A, you can overload __getattr__ to delegate those lookups to self.a.
The next snippet works for me
class A:
    def __init__(self, val):
        self.val = val
    def method(self):
        print(f"A={self.val}")

def make_A():
    return A(42)

class B:
    def __init__(self, *args, consume_A=None, **kwargs):
        if consume_A is None:
            self.a = A(*args, **kwargs)
        else:
            self.a = consume_A
    def __getattr__(self, name):
        return getattr(self.a, name)
    def my_extension(self):
        print(f"B={self.val * 100}")

def make_B(*args, **kwargs):
    return B(consume_A=make_A(*args, **kwargs))

b = make_B()
b.method()        # A=42
b.my_extension()  # B=4200
What makes this approach preferable is that modifying __class__ is probably not harmless, whereas __getattr__ and __getattribute__ are the mechanisms specifically provided for customizing attribute lookup on an object. For more details, see this tutorial.
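The key detail is that __getattr__ is only invoked when normal attribute lookup fails, so the delegated names never shadow the wrapper's own attributes. A minimal illustration (names invented):

```python
class Inner:
    def ping(self):
        return 'pong'

class Outer:
    def __init__(self):
        self.inner = Inner()
    def __getattr__(self, name):
        # reached only when normal lookup on Outer fails
        return getattr(self.inner, name)

o = Outer()
print(o.ping())                # pong: delegated to Inner
print(type(o.inner).__name__)  # Inner: found normally, no delegation
```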
Make your original factory function more general by accepting a class as parameter: remember, everything is an object in Python, even classes.
def make(class_type, *args, **kwargs):
    return class_type(*args, **kwargs)

a = make(A)
b = make(B)
Since B has the same parameters as A, you don't need to make an A and then turn it into B: B inherits from A, so it "is an A" and will have the same functionality, plus the extra method that you added.
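For example (a minimal sketch with invented A/B bodies):

```python
class A:
    def __init__(self, val):
        self.val = val

class B(A):
    # inherits A's constructor unchanged
    def extra_method(self):
        return self.val * 2

def make(class_type, *args, **kwargs):
    # generic factory: the class itself is just another argument
    return class_type(*args, **kwargs)

b = make(B, 21)
print(b.extra_method())  # 42
print(isinstance(b, A))  # True: B "is an" A
```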

How can I return self and another variable in a python class method while method chaining?

I understand what I am asking here is probably not the best code design, but the reason for me asking is strictly academic. I am trying to understand how to make this concept work.
Typically, I will return self from a class method so that the following methods can be chained together. My understanding is by returning self, I am simply returning an instance of the class, for the following methods to work on.
But in this case, I am trying to figure out how to return both self and another value from the method. The idea is if I do not want to chain, or I do not call any class attributes, I want to retrieve the data from the method being called.
Consider this example:
class Test(object):
    def __init__(self):
        self.hold = None
    def methoda(self):
        self.hold = 'lol'
        return self, 'lol'
    def newmethod(self):
        self.hold = self.hold * 2
        return self, 2

t = Test()
t.methoda().newmethod()
print(t.hold)
In this case, I will get an AttributeError: 'tuple' object has no attribute 'newmethod' which is to be expected because the methoda method is returning a tuple which does not have any methods or attributes called newmethod.
My question is not about unpacking multiple returns, but more about how can I continue to chain methods when the preceding methods are returning multiple values. I also understand that I can control the methods return with an argument to it, but that is not what I am trying to do.
As mentioned previously, I do realize this is probably a bad question, and I am happy to delete the post if the question doesn't make any sense.
Following the suggestion by @JohnColeman, you can return a special tuple with attribute lookup delegated to your object if it is not a normal tuple attribute. That way it acts like a normal tuple except when you are chaining methods.
You can implement this as follows:
class ChainResult(tuple):
    def __new__(cls, *args):
        return super(ChainResult, cls).__new__(cls, args)
    def __getattribute__(self, name):
        try:
            return getattr(super(), name)
        except AttributeError:
            return getattr(super().__getitem__(0), name)

class Test(object):
    def __init__(self):
        self.hold = None
    def methoda(self):
        self.hold = 'lol'
        return ChainResult(self, 'lol')
    def newmethod(self):
        self.hold = self.hold * 2
        return ChainResult(self, 2)
Testing:
>>> t = Test()
>>> t.methoda().newmethod()
>>> print(t.hold)
lollol
The returned result does indeed act as a tuple:
>>> t, res = t.methoda().newmethod()
>>> print(res)
2
>>> print(isinstance(t.methoda().newmethod(), tuple))
True
You could imagine all sorts of semantics with this, such as forwarding the returned values to the next method in the chain using closure:
class ChainResult(tuple):
    def __new__(cls, *args):
        return super(ChainResult, cls).__new__(cls, args)
    def __getattribute__(self, name):
        try:
            return getattr(super(), name)
        except AttributeError:
            attr = getattr(super().__getitem__(0), name)
            if callable(attr):
                chain_results = super().__getitem__(slice(1, None))
                return lambda *args, **kw: attr(*(chain_results + args), **kw)
            else:
                return attr
For example,
class Test:
    ...
    def methodb(self, *args):
        print(*args)
would produce
>>> t = Test()
>>> t.methoda().methodb('catz')
lol catz
It would be nice if you could make ChainResult invisible. You can almost do it by initializing the tuple base class with the normal results and saving your object in a separate attribute used only for chaining. Then use a class decorator that wraps every method with ChainResult(self, self.method(*args, **kw)). It will work okay for methods that return a tuple, but a single value return will act like a length-1 tuple, so you will need something like obj.method()[0] or result, = obj.method() to work with it. I played a bit with delegating to tuple for a multiple return, or to the value itself for a single return; maybe it could be made to work, but it introduces so many ambiguities that I doubt it could work well.

Subclassing method decorators in python

I am having trouble thinking of a way, consistent with good Python and the OOP principles I've been taught, to create a family of related method decorators.
The mutually inconsistent goals seem to be that I want to be able to access both decorator attributes AND attributes of the instance on which the decorated method is bound. Here's what I mean:
from functools import wraps

class AbstractDecorator(object):
    """
    This seems like the more natural way, but won't work
    because the instance to which the wrapped function
    is attached will never be in scope.
    """
    def __new__(cls, f, *args, **kwargs):
        return wraps(f)(object.__new__(cls, *args, **kwargs))
    def __init__(decorator_self, f):
        decorator_self.f = f
        decorator_self.punctuation = "..."
    def __call__(decorator_self, *args, **kwargs):
        decorator_self.very_important_prep()
        return decorator_self.f(decorator_self, *args, **kwargs)

class SillyDecorator(AbstractDecorator):
    def very_important_prep(decorator_self):
        print "My apartment was infested with koalas%s" % (decorator_self.punctuation)

class UsefulObject(object):
    def __init__(useful_object_self, noun):
        useful_object_self.noun = noun
    @SillyDecorator
    def red(useful_object_self):
        print "red %s" % (useful_object_self.noun)

if __name__ == "__main__":
    u = UsefulObject("balloons")
    u.red()
which of course produces
My apartment was infested with koalas...
AttributeError: 'SillyDecorator' object has no attribute 'noun'
Note that of course there is always a way to get this to work. A factory with enough arguments, for example, will let me attach methods to some created instance of SillyDecorator, but I was kind of wondering whether there is a reasonable way to do this with inheritance.
@miku got the key idea of using the descriptor protocol. Here is a refinement that keeps the decorator object separate from the "useful object" -- it doesn't store the decorator info on the underlying object.
import functools
from functools import wraps

class AbstractDecorator(object):
    """
    This seems like the more natural way, but won't work
    because the instance to which the wrapped function
    is attached will never be in scope.
    """
    def __new__(cls, f, *args, **kwargs):
        return wraps(f)(object.__new__(cls, *args, **kwargs))
    def __init__(decorator_self, f):
        decorator_self.f = f
        decorator_self.punctuation = "..."
    def __call__(decorator_self, obj_self, *args, **kwargs):
        decorator_self.very_important_prep()
        return decorator_self.f(obj_self, *args, **kwargs)
    def __get__(decorator_self, obj_self, objtype):
        return functools.partial(decorator_self.__call__, obj_self)

class SillyDecorator(AbstractDecorator):
    def very_important_prep(decorator_self):
        print "My apartment was infested with koalas%s" % (decorator_self.punctuation)

class UsefulObject(object):
    def __init__(useful_object_self, noun):
        useful_object_self.noun = noun
    @SillyDecorator
    def red(useful_object_self):
        print "red %s" % (useful_object_self.noun)

>>> u = UsefulObject("balloons")
... u.red()
My apartment was infested with koalas...
red balloons
The descriptor protocol is the key here, since it is the thing that gives you access to both the decorated method and the object on which it is bound. Inside __get__, you can extract the useful object identity (obj_self) and pass it on to the __call__ method.
Note that it's important to use functools.partial (or some such mechanism) rather than simply storing obj_self as an attribute of decorator_self. Since the decorated method is on the class, only one instance of SillyDecorator exists. You can't use this SillyDecorator instance to store useful-object-instance-specific information --- that would lead to strange errors if you created multiple UsefulObjects and accessed their decorated methods without immediately calling them.
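A compact Python 3 sketch of the same descriptor pattern with an invented decorator (LoggedCall); __get__ is the only place where both the decorator and the bound instance are in scope, and functools.partial carries the instance without storing it on the decorator:

```python
import functools

class LoggedCall:
    # hypothetical minimal analogue of AbstractDecorator
    def __init__(self, f):
        self.f = f
        functools.update_wrapper(self, f)
    def __call__(self, obj, *args, **kwargs):
        print('calling', self.f.__name__)
        return self.f(obj, *args, **kwargs)
    def __get__(self, obj, objtype=None):
        # descriptor protocol: bind the instance here, don't store it
        return functools.partial(self.__call__, obj)

class Thing:
    def __init__(self, noun):
        self.noun = noun
    @LoggedCall
    def red(self):
        return 'red ' + self.noun

t = Thing('balloons')
out = t.red()
print(out)  # prints "calling red", then: red balloons
```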
It's worth pointing out, though, that there may be an easier way. In your example, you're only storing a small amount of information in the decorator, and you don't need to change it later. If that's the case, it might be simpler to just use a decorator-maker function: a function that takes an argument (or arguments) and returns a decorator, whose behavior can then depend on those arguments. Here's an example:
from functools import wraps

def decoMaker(msg):
    def deco(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            print msg
            return func(*args, **kwargs)
        return wrapper
    return deco

class UsefulObject(object):
    def __init__(useful_object_self, noun):
        useful_object_self.noun = noun
    @decoMaker('koalas...')
    def red(useful_object_self):
        print "red %s" % (useful_object_self.noun)
>>> u = UsefulObject("balloons")
... u.red()
koalas...
red balloons
You can use the decoMaker ahead of time to make a decorator to reuse later, if you don't want to retype the message every time you make the decorator:
sillyDecorator = decoMaker("Some really long message about koalas that you don't want to type over and over")

class UsefulObject(object):
    def __init__(useful_object_self, noun):
        useful_object_self.noun = noun
    @sillyDecorator
    def red(useful_object_self):
        print "red %s" % (useful_object_self.noun)
>>> u = UsefulObject("balloons")
... u.red()
Some really long message about koalas that you don't want to type over and over
red balloons
You can see that this is much less verbose than writing a whole class inheritance tree for different kinds of decorators. Unless you're writing super-complicated decorators that store all sorts of internal state (which is likely to get confusing anyway), this decorator-maker approach might be an easier way to go.
Adapted from http://metapython.blogspot.de/2010/11/python-instance-methods-how-are-they.html. Note that this variant sets attributes on the target instance, hence, without checks, it is possible to overwrite target instance attributes. The code below does not contain any checks for this case.
Also note that this example sets the punctuation attribute explicitly; a more general class could auto-discover its attributes.
from types import MethodType

class AbstractDecorator(object):
    """Designed to work as function or method decorator."""
    def __init__(self, function):
        self.func = function
        self.punctuation = '...'
    def __call__(self, *args, **kw):
        self.setup()
        return self.func(*args, **kw)
    def __get__(self, instance, owner):
        # TODO: protect against 'overwrites'
        setattr(instance, 'punctuation', self.punctuation)
        return MethodType(self, instance, owner)

class SillyDecorator(AbstractDecorator):
    def setup(self):
        print('[setup] silly init %s' % self.punctuation)

class UsefulObject(object):
    def __init__(self, noun='cat'):
        self.noun = noun
    @SillyDecorator
    def d(self):
        print('Hello %s %s' % (self.noun, self.punctuation))

obj = UsefulObject()
obj.d()
# [setup] silly init ...
# Hello cat ...

Accessing self from outside of a class

I'm attempting to implement a decorator on certain methods in a class so that if the value has NOT been calculated yet, the method will calculate the value, otherwise it will just return the precomputed value, which is stored in an instance defaultdict. I can't seem to figure out how to access the instance defaultdict from inside of a decorator declared outside of the class. Any ideas on how to implement this?
Here are the imports (for a working example):
from collections import defaultdict
from math import sqrt
Here is my decorator:
class CalcOrPass:
    def __init__(self, func):
        self.f = func
    #if the value is already in the instance dict from SimpleData,
    #don't recalculate the values, instead return the value from the dict
    def __call__(self, *args, **kwargs):
        # can't figure out how to access/pass dict_from_SimpleData to here :(
        res = dict_from_SimpleData[self.f.__name__]
        if not res:
            res = self.f(*args, **kwargs)
            dict_from_SimpleData[self.f.__name__] = res
        return res
And here's the SimpleData class with decorated methods:
class SimpleData:
    def __init__(self, data):
        self.data = data
        self.stats = defaultdict()  #here's the dict I'm trying to access
    @CalcOrPass
    def mean(self):
        return sum(self.data) / float(len(self.data))
    @CalcOrPass
    def se(self):
        return [i - self.mean() for i in self.data]
    @CalcOrPass
    def variance(self):
        return sum(i**2 for i in self.se()) / float(len(self.data) - 1)
    @CalcOrPass
    def stdev(self):
        return sqrt(self.variance())
So far, I've tried declaring the decorator inside of SimpleData, trying to pass multiple arguments with the decorator (apparently you can't do this), and spinning around in my swivel chair while trying to toss paper airplanes into my scorpion tank. Any help would be appreciated!
The way you define your decorator the target object information is lost. Use a function wrapper instead:
from functools import wraps

def CalcOrPass(func):
    @wraps(func)
    def result(self, *args, **kwargs):
        res = self.stats[func.__name__]
        if not res:
            res = func(self, *args, **kwargs)
            self.stats[func.__name__] = res
        return res
    return result
wraps is from functools and not strictly necessary here, but very convenient.
Side note: defaultdict takes a factory function argument:
defaultdict(lambda: None)
But since you're testing for the existence of the key anyway, you should prefer a simple dict.
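A quick sketch of why the factory-based defaultdict is a mixed blessing here: the lookup never raises, but it silently inserts the key as a side effect.

```python
from collections import defaultdict

d = defaultdict(lambda: None)
print(d['missing'])    # None: no KeyError is raised...
print('missing' in d)  # True: ...but the lookup inserted the key
```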
You can't do what you want when your function is defined, because it is unbound. Here's a way to achieve it in a generic fashion at runtime:
class CalcOrPass(object):
    def __init__(self, func):
        self.f = func
    def __get__(self, obj, type=None):  # Cheat.
        return self.__class__(self.f.__get__(obj, type))
    #if the value is already in the instance dict from SimpleData,
    #don't recalculate the values, instead return the value from the dict
    def __call__(self, *args, **kwargs):
        # I'll concede that this doesn't look very pretty.
        # TODO handle KeyError here
        res = self.f.__self__.stats[self.f.__name__]
        if not res:
            res = self.f(*args, **kwargs)
            self.f.__self__.stats[self.f.__name__] = res
        return res
A short explanation:
Our decorator defines __get__ (and is hence said to be a descriptor). Whereas the default behaviour for an attribute access is to get it from the object's dictionary, if the descriptor method is defined, Python will call that instead.
The case with objects is that object.__getattribute__ transforms an access like b.x into type(b).__dict__['x'].__get__(b, type(b))
This way we can access the bound class and its type from the descriptor's parameters.
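A tiny descriptor makes that transformation concrete (invented example, Python 3):

```python
class Ten:
    # a minimal descriptor: attribute access is routed through __get__
    def __get__(self, obj, objtype=None):
        return 10

class C:
    x = Ten()

c = C()
print(c.x)  # 10
# c.x is equivalent to type(c).__dict__['x'].__get__(c, type(c)):
print(type(c).__dict__['x'].__get__(c, type(c)))  # 10
```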
Then we create a new CalcOrPass object which now decorates (wraps) a bound method instead of the old unbound function.
Note the new style class definition. I'm not sure if this will work with old-style classes, as I haven't tried it; just don't use those. :) This will work for both functions and methods, however.
What happens to the "old" decorated functions is left as an exercise.

What's the preferred way to implement a hook or callback in Python?

I'd like to provide the capability for users of one of my modules to extend its capabilities by providing an interface to call a user's function. For example, I want to give users the capability to be notified when an instance of a class is created and given the opportunity to modify the instance before it is used.
The way I've implemented it is to declare a module-level factory function that does the instantiation:
# in mymodule.py
def factory(cls, *args, **kwargs):
    return cls(*args, **kwargs)
Then when I need an instance of a class in mymodule, I do factory(cls, arg1, arg2) rather than cls(arg1, arg2).
To extend it, a programmer would write in another module a function like this:
def myFactory(cls, *args, **kwargs):
    instance = myFactory.chain(cls, *args, **kwargs)
    # do something with the instance here if desired
    return instance
Installation of the above callback looks like this:
myFactory.chain, mymodule.factory = mymodule.factory, myFactory
This seems straightforward enough to me, but I was wondering if you, as a Python programmer, would expect a function to register a callback rather than doing it with an assignment, or if there were other methods you would expect. Does my solution seem workable, idiomatic, and clear to you?
I am looking to keep it as simple as possible; I don't think most applications will actually need to chain more than one user callback, for example (though unlimited chaining comes "for free" with the above pattern). I doubt they will need to remove callbacks or specify priorities or order. Modules like python-callbacks or PyDispatcher seem to me like overkill, especially the latter, but if there are compelling benefits to a programmer working with my module, I'm open to them.
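A single-module sketch of the install-by-assignment pattern described above (Widget and the tagged attribute are invented for the demo; the original installs across modules via mymodule.factory):

```python
def factory(cls, *args, **kwargs):
    # stands in for mymodule.factory
    return cls(*args, **kwargs)

class Widget:
    pass

def myFactory(cls, *args, **kwargs):
    instance = myFactory.chain(cls, *args, **kwargs)
    instance.tagged = True  # "do something with the instance"
    return instance

# installation: the old factory becomes the new callback's chain
myFactory.chain, factory = factory, myFactory

w = factory(Widget)
print(isinstance(w, Widget), w.tagged)  # True True
```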
Taking aaronsterling's idea a bit further:
class C(object):
    _oncreate = []
    def __new__(cls):
        return reduce(lambda x, y: y(x), cls._oncreate, super(C, cls).__new__(cls))
    @classmethod
    def oncreate(cls, func):
        cls._oncreate.append(func)

c = C()
print hasattr(c, 'spew')

@C.oncreate
def spew(obj):
    obj.spew = 42
    return obj

c = C()
print c.spew
Combining Aaron's idea of using a decorator and Ignacio's idea of a class that maintains a list of attached callbacks, plus a concept borrowed from C#, I came up with this:
class delegate(object):
    def __init__(self, func):
        self.callbacks = []
        self.basefunc = func
    def __iadd__(self, func):
        if callable(func):
            self.__isub__(func)
            self.callbacks.append(func)
        return self
    def callback(self, func):
        if callable(func):
            self.__isub__(func)
            self.callbacks.append(func)
        return func
    def __isub__(self, func):
        try:
            self.callbacks.remove(func)
        except ValueError:
            pass
        return self
    def __call__(self, *args, **kwargs):
        result = self.basefunc(*args, **kwargs)
        for func in self.callbacks:
            newresult = func(result)
            result = result if newresult is None else newresult
        return result
Decorating a function with @delegate allows other functions to be "attached" to it.
@delegate
def intfactory(num):
    return int(num)
Functions can be added to the delegate with += (and removed with -=). You can also decorate with funcname.callback to add a callback function.
@intfactory.callback
def notify(num):
    print "notify:", num

def increment(num):
    return num + 1

intfactory += increment
intfactory += lambda num: num * 2

print intfactory(3)  # outputs 8
Does this feel Pythonic?
I might use a decorator so that the user could just write:
@new_factory
def myFactory(cls, *args, **kwargs):
    instance = myFactory.chain(cls, *args, **kwargs)
    # do something with the instance here if desired
    return instance
Then in your module,
import sys

def new_factory(f):
    mod = sys.modules[__name__]
    f.chain = mod.factory
    mod.factory = f
    return f
