Better to provide an example, I guess (a little bit pseudo-code-ish...)
from django.db import transaction
from somewhere import some_job
from functools import partial

class Foo:
    def do_something(self, key, value):
        return some_job(key, value)

    @property
    def modifier(self):
        pass

f = Foo()
f.do_something(key='a', value=1) -> result
f.modifier.do_something(key='a', value=1) -> transaction.on_commit(partial(do_something, key='a', value=1))
Normally, if do_something is called, it does its regular thing and returns some result,
but when it is chained via modifier it should return transaction.on_commit(partial(do_something, key='a', value=1)) instead of the regular result. modifier might be a property or something else inside the class. The problem is that this instance is a singleton and should not be changed permanently, as it will be used later by other code.
I cannot wrap my head around how to do this.
Any ideas?
As pointed out in the comments, you can have the modifier property return a quick-and-dirty wrapper class that implements do_something itself but does something different with the underlying Foo instance.
class Foo:
    def do_something(self, key, value):
        print(f"Called unmodified on {self} :)")

    @property
    def modifier(self):
        return ModifiedFoo(self)


class ModifiedFoo:
    def __init__(self, foo):
        self.foo = foo

    def do_something(self, key, value):
        print(f"Called modified on {self.foo} :)")


f = Foo()
f.do_something(key='a', value=1)
f.modifier.do_something(key='a', value=1)
The same idea also works as a subclass that defers the original method via super():

class Modifier(Foo):
    def do_something(self, key, value):
        return transaction.on_commit(partial(super().do_something, key=key, value=value))
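For the Django case in the question, the wrapper's do_something can simply defer the original call via transaction.on_commit. A minimal sketch along the lines of the wrapper answer above (note that on_commit itself returns None; it only schedules the callable):

from functools import partial
from django.db import transaction

class ModifiedFoo:
    def __init__(self, foo):
        self.foo = foo

    def do_something(self, key, value):
        # Schedule the real call for when the surrounding transaction commits;
        # the wrapped singleton instance itself is never modified.
        return transaction.on_commit(partial(self.foo.do_something, key=key, value=value))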
This does modify the foo object, but it should work if you rename _lock so it doesn't collide with other attributes.
from functools import partial

class Foo:
    _lock = False

    def do_something(self, key, value):
        if self._lock:
            transaction.on_commit(partial(some_job, key, value))
            self._lock = False
        else:
            return some_job(key, value)

    def lock(self):
        self._lock = True
        return self
# some mocks
class Transaction:
    @staticmethod
    def on_commit(func):
        print("Transaction commit success.")
        func()

some_job = lambda x, y: print("Some job", x, y)
transaction = Transaction()
# end mocks
foo = Foo()
print(">>> foo.do_something(...)")
foo.do_something('key1', 'value1')
# Some job key1 value1
print(">>> foo.lock().do_something(...)")
foo.lock().do_something('key2', 'value2')
# Transaction commit success.
# Some job key2 value2
So I have a class with a method which takes a string. Something like this:

class A():
    def func(self, name):
        # do some stuff with it

I have a finite number of possible values, [val1, val2, val3] for example, all strings. I want to use them like this:

a = A()
a.val1()  # actually a.func(val1)
I tried to combine decorators and setattr:
class A():
    def func(self, val):
        # do some stuff with it

    def register(self, val):
        def wrapper(self):
            self.func(val)
        setattr(self, val, wrapper)

So I can iterate through all possible values at run time:

a = A()
for val in vals:
    a.register(val)
And it has zero effect. Usually setattr adds a new attribute with value None, but in this case nothing happens. Can somebody explain why it is this way and what I can do?
register() isn't a decorator, it's mostly just a "function factory" with side-effects. Also, as I said in a comment, setattr() needs to know what name to assign to the value.
Here's a way to get your code to work:
class A():
    def func(self, val):
        # do some stuff with it
        print('func({}) called'.format(val))

    def register(self, val, name):
        def wrapper():
            self.func(val)
        wrapper.__name__ = name
        setattr(self, name, wrapper)

vals = 10, 20, 30

a = A()
for i, val in enumerate(vals, 1):
    a.register(val, 'val' + str(i))  # Creates name argument.

a.val1()  # -> func(10) called
a.val2()  # -> func(20) called
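If the values themselves should double as the attribute names (as with the val1, val2, val3 strings in the question), a sketch of the same idea without a separate name argument:

class A():
    def func(self, val):
        print('func({}) called'.format(val))

    def register(self, val):
        def wrapper():
            # No self parameter: wrapper closes over both self and val.
            self.func(val)
        wrapper.__name__ = val
        setattr(self, val, wrapper)

a = A()
for val in ('val1', 'val2', 'val3'):
    a.register(val)

a.val1()  # -> func(val1) called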
This question is not about the observer pattern in general; it is focused on the use of decorators in that pattern. The question is based on the answer to a similar question.
#!/usr/bin/env python3
class Observable:
    """
    The object that needs to be observed. An alternative name is 'Subject'.
    In most cases it is a data object.
    """
    def __init__(self):
        self._observers = []

    def register_observer(self, callback):
        self._observers.append(callback)
        return callback

    def _broadcast_observers(self, *args, **kwargs):
        for callback in self._observers:
            callback(*args, **kwargs)


class TheData(Observable):
    """
    Example of a data class, just for demonstration.
    """
    def __init__(self, data):
        Observable.__init__(self)
        self._data = data

    @property
    def data(self):
        return self._data

    @data.setter
    def data(self, data):
        self._data = data
        self._broadcast_observers()
class TheGUIElement:
    """
    Example of a GUI class (widget), just for demonstration.
    E.g. it could be a text field in the GUI.
    """
    def __init__(self, data):
        self._data = data
        # data.register_observer(self._data_updated)
        self._redraw()

    def _redraw(self):
        print('in _redraw(): ' + data.data)

    @Observable.register_observer
    def _data_updated(self, **kwargs):
        """
        This is the callback that is called by the Observable if the
        data changed.
        """
        print('in _data_updated() - kwargs: {}'.format(kwargs))
        self._redraw()


if __name__ == '__main__':
    data = TheData('DATA')
    gui = TheGUIElement(data)
    data.data = 'SECOND DATA'
This code doesn't work because of this error:

Traceback (most recent call last):
  File "./o.py", line 42, in <module>
    class TheGUIElement:
  File "./o.py", line 55, in TheGUIElement
    @Observable.register_observer
TypeError: register_observer() missing 1 required positional argument: 'callback'
It is unclear to me how to use a decorator to register the observers (e.g. TheGUIElement).
To register the callback, you need to have an actual object. In your code, how is @Observable.register_observer supposed to find which instance it should register on?
Please drop that Observable thing; it's a Javaism and cumbersome in Python.
Look at this.
#!/usr/bin/env python
class SomeData(object):
    def __init__(self, value):
        self.callbacks = []
        self.foo = value

    def register(self, callback):
        self.callbacks.append(callback)
        return callback

    def notify(self, *args, **kwargs):
        for callback in self.callbacks:
            callback(self, *args, **kwargs)


class SomeGUI(object):
    def redraw(self, obj, key, newvalue):
        print('redrawing %s with value %s' % (self, newvalue))


if __name__ == '__main__':
    my_data = SomeData(42)

    # Register some function using decorator syntax
    @my_data.register
    def print_it(obj, key, value):
        print('Key %s changed to %s' % (key, value))

    # Register the SomeGUI element
    my_gui = SomeGUI()
    my_data.register(my_gui.redraw)

    # Try changing it. Note my_data is dumb for now, notify manually.
    my_data.foo = 10
    my_data.notify("foo", 10)
I intentionally removed automatic notifications to illustrate registration by itself.
Let's add it back. But there is no point using that Observable class. Let's make it lighter, simply defining an event class.
#!/usr/bin/env python3
class Event(object):
    def __init__(self):
        self.callbacks = []

    def notify(self, *args, **kwargs):
        for callback in self.callbacks:
            callback(*args, **kwargs)

    def register(self, callback):
        self.callbacks.append(callback)
        return callback


class SomeData(object):
    def __init__(self, foo):
        self.changed = Event()
        self._foo = foo

    @property
    def foo(self):
        return self._foo

    @foo.setter
    def foo(self, value):
        self._foo = value
        self.changed.notify(self, 'foo', value)


class SomeGUI(object):
    def redraw(self, obj, key, newvalue):
        print('redrawing %s with value %s' % (self, newvalue))


if __name__ == '__main__':
    my_data = SomeData(42)

    # Register some function using decorator syntax
    @my_data.changed.register
    def print_it(obj, key, value):
        print('Key %s changed to %s' % (key, value))

    # Register the SomeGUI element
    my_gui = SomeGUI()
    my_data.changed.register(my_gui.redraw)

    # Try changing it.
    my_data.foo = 10
As you probably noted by now, the decorator syntax is useful in these circumstances:
- You have a single registry: either a singleton, or the class itself (classes are first-class objects, and most of them are singletons).
- You dynamically define the function and register it as you go.
Now, those manual getters/setters you have are cumbersome as well; if you have many, why not factor them out?
#!/usr/bin/env python3
class Event(object):
    def __init__(self):
        self.callbacks = []

    def notify(self, *args, **kwargs):
        for callback in self.callbacks:
            callback(*args, **kwargs)

    def register(self, callback):
        self.callbacks.append(callback)
        return callback

    @classmethod
    def watched_property(cls, event_name, key):
        actual_key = '_%s' % key

        def getter(obj):
            return getattr(obj, actual_key)

        def setter(obj, value):
            event = getattr(obj, event_name)
            setattr(obj, actual_key, value)
            event.notify(obj, key, value)

        return property(fget=getter, fset=setter)


class SomeData(object):
    foo = Event.watched_property('changed', 'foo')

    def __init__(self, foo):
        self.changed = Event()
        self.foo = foo


class SomeGUI(object):
    def redraw(self, obj, key, newvalue):
        print('redrawing %s with value %s' % (self, newvalue))


if __name__ == '__main__':
    my_data = SomeData(42)

    # Register some function using decorator syntax
    @my_data.changed.register
    def print_it(obj, key, value):
        print('Key %s changed to %s' % (key, value))

    # Register the SomeGUI element
    my_gui = SomeGUI()
    my_data.changed.register(my_gui.redraw)

    # Try changing it.
    my_data.foo = 10
For reference, all three programs output the exact same thing:
$ python3 test.py
Key foo changed to 10
redrawing <__main__.SomeGUI object at 0x7f9a90d55fd0> with value 10
Even though the thread is kinda old (probably the problem is already solved), I would like to share a solution of mine to the "Decorated Observer Pattern" problem:
https://pypi.org/project/notifyr/
I created a package that implements decorators which add the observer-observed methods/attributes to python classes. I managed to use the package in a Django project too, but with a few adaptations (the .observers attribute is not persisted in the database, so I had to load the list of observers into it every time I expected to notify them).
Here is an implementation example:
Original Code:
class Dog(object):
    def __init__(self, name):
        self.name = name

    def bark(self):
        print('Woof')

    def sleep(self):
        print(self.name, 'is now asleep: ZZzzzzZzzZ...')


class Person(object):
    def __init__(self, name):
        self.name = name

    def educate_dog(self, dog):
        print(self.name + ':', 'Sleep,', dog.name)
        dog.sleep()
Suppose we want a person to educate a dog every time the animal barks:
from notifyr.agents import observed, observer
from notifyr.functions import target

@observed
class Dog(object):
    def __init__(self, name):
        self.name = name

    @target
    def bark(self):
        print('Woof')

    def sleep(self):
        print(self.name, 'is now asleep: ZZzzzzZzzZ...')


@observer('educate_dog')
class Person(object):
    def __init__(self, name):
        self.name = name

    def educate_dog(self, dog):
        print(self.name + ':', 'Sleep,', dog.name)
        dog.sleep()
Given the decorated classes, it is possible to achieve the following result:
d = Dog('Tobby')
p = Person('Victor')
d.attach(p) # Victor is now observing Tobby
d.bark()
# Woof
# Victor: Sleep, Tobby
# Tobby is now asleep: ZZzzzzZzzZ...
The package is still very primitive, but it presents a working solution to this type of situation.
I was recently looking for something similar and here's what I came up with. It works by intercepting the __setattr__ method -- a useful stunt I plan on keeping in my pocket for later.
def watchableClass(cls):
    """
    Class Decorator!
    * If the class has a "dirty" member variable, then it will be
      automatically set whenever any class value changes
    * If the class has an "onChanged()" method, it will be called
      automatically whenever any class value changes
    * All this only takes place if the value is different from what it was,
      that is, if myObject.x is already 10 and you set myObject.x=10
      nothing happens
    * DOES NOT work with getter/setter functions. But then, you are
      already in a function, so do what you want!
    EXAMPLE:
        @watchableClass
        class MyClass:
            def __init__(self):
                self.dirty = False
            def onChanged(self):
                print('class has changed')
    """
    if hasattr(cls, '__setattr__'):
        cls.__setattr_unwatched__ = cls.__setattr__
        cls.__setattr__ = _setObjValueWatchedCascade
    else:
        cls.__setattr__ = _setObjValueWatched
    return cls


def _setObjValueWatched(ob, k, v):
    """
    called when an object value is set
    """
    different = k not in ob.__dict__ or ob.__dict__[k] != v
    if different:
        ob.__dict__[k] = v
        if k not in ('dirty',):
            if hasattr(ob, 'dirty'):
                ob.dirty = True
            if hasattr(ob, 'onChanged'):
                ob.onChanged()


def _setObjValueWatchedCascade(ob, k, v):
    """
    called when an object value is set
    IF the class had its own __setattr__ member defined!
    """
    different = k not in ob.__dict__ or ob.__dict__[k] != v
    ob.__setattr_unwatched__(k, v)
    if different:
        if k not in ('dirty',):
            if hasattr(ob, 'dirty'):
                ob.dirty = True
            if hasattr(ob, 'onChanged'):
                ob.onChanged()
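A small usage sketch of the decorator above (the Point class and its fields are made up for illustration):

@watchableClass
class Point:
    def __init__(self):
        self.dirty = False  # assigning 'dirty' itself never triggers a notification
        self.x = 0          # any other attribute marks the object dirty and calls onChanged()

    def onChanged(self):
        print('class has changed')

p = Point()  # prints 'class has changed' once, for the initial self.x = 0
p.x = 10     # value differs -> prints 'class has changed' again; p.dirty is True
p.x = 10     # same value -> nothing happens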
I'm using functools.partial to create a closure, and using setattr to make it callable from a class instance. The idea here is to create a set of methods at runtime.
#!/usr/bin/python
from functools import partial

class MyClass(object):
    def __init__(self, val):
        self.val = val

    @classmethod
    def generateMethods(self):
        def dummy(conf1, self):
            print "conf1:", conf1
            print "self.val:", self.val
            print

        for s in ('dynamic_1', 'dynamic_2'):
            closed = partial(dummy, s)
            setattr(self, "test_{0}".format(s), closed)
It seems to me that partial would bind the current value of s to dummy's first arg, which would free up self to be passed when this is called from an instance.
It's not working how I'd expect:
if __name__ == '__main__':
    # Dynamically create some methods
    MyClass.generateMethods()

    # Create an instance
    x = MyClass('FOO')

    # The dynamically created methods aren't callable from the instance :(
    # x.test_dynamic_1()
    # TypeError: dummy() takes exactly 2 arguments (1 given)

    # .. but these work just fine
    MyClass.test_dynamic_1(x)
    MyClass.test_dynamic_2(x)
Is it possible to dynamically create methods which are closures, but callable from instances of the class?
I think the new functools.partialmethod is for this exact use case.
Straight from the docs:
>>> class Cell(object):
...     def __init__(self):
...         self._alive = False
...     @property
...     def alive(self):
...         return self._alive
...     def set_state(self, state):
...         self._alive = bool(state)
...     set_alive = partialmethod(set_state, True)
...     set_dead = partialmethod(set_state, False)
...
>>> c = Cell()
>>> c.alive
False
>>> c.set_alive()
>>> c.alive
True
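Applied to the generateMethods example from the question, a sketch might look like this (assuming Python 3.4+; note that dummy's argument order is flipped so self comes first, which is what partialmethod expects):

from functools import partialmethod

class MyClass(object):
    def __init__(self, val):
        self.val = val

    @classmethod
    def generateMethods(cls):
        def dummy(self, conf1):
            print("conf1:", conf1)
            print("self.val:", self.val)

        for s in ('dynamic_1', 'dynamic_2'):
            # partialmethod freezes conf1 and leaves self to be bound at call time.
            setattr(cls, "test_{0}".format(s), partialmethod(dummy, s))

MyClass.generateMethods()
x = MyClass('FOO')
x.test_dynamic_1()  # conf1: dynamic_1 / self.val: FOO
x.test_dynamic_2()  # conf1: dynamic_2 / self.val: FOO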
The issue is that when you call them via the instance they are actually not bound methods, i.e. they have no knowledge of the instance. A bound method inserts self into the arguments of the underlying function automatically when it is called; the instance is stored in the bound method's __self__ attribute.
So, override __getattribute__ and check whether the object being fetched is an instance of partial; if it is, convert it to a bound method using types.MethodType.
Code:
#!/usr/bin/python
from functools import partial
import types

class MyClass(object):
    def __init__(self, val):
        self.val = val

    @classmethod
    def generateMethods(self):
        def dummy(conf1, self):
            print "conf1:", conf1
            print "self.val:", self.val
            print

        for s in ('dynamic_1', 'dynamic_2'):
            closed = partial(dummy, s)
            setattr(self, "test_{0}".format(s), closed)

    def __getattribute__(self, attr):
        # Here we do have access to the much needed instance (self)
        obj = object.__getattribute__(self, attr)
        if isinstance(obj, partial):
            return types.MethodType(obj, self, type(self))
        else:
            return obj


if __name__ == '__main__':
    MyClass.generateMethods()
    x = MyClass('FOO')
    x.test_dynamic_1()
    x.test_dynamic_2()
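Note that types.MethodType(obj, self, type(self)) is the Python 2 signature. On Python 3 (where the print statements above would also need to become print() calls), MethodType takes just the callable and the instance, so the override would look roughly like this:

def __getattribute__(self, attr):
    obj = object.__getattribute__(self, attr)
    if isinstance(obj, partial):
        # Python 3: MethodType(callable, instance)
        return types.MethodType(obj, self)
    return obj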
Currently, when I want to define a setter and leave the getter alone, I do this:

@property
def my_property(self):
    return self._my_property

@my_property.setter
def my_property(self, value):
    value.do_some_magic()
    self._my_property = value

Is there any way to make it shorter? I'd like to skip this part, as it always looks the same:

@property
def my_property(self):
    return self._my_property
There's no out of the box solution, but you can try something like this:
def defprop(name):
    def getter(self):
        return getattr(self, name)
    return property(getter)

class C(object):
    # ...
    my_dictionary = defprop('_my_dictionary')
    # ...
That does not save you that many keystrokes though; you still have to duplicate the attribute name. Besides, it's less explicit.
Update: after thinking a bit, I've come up with this descriptor-based hackish trick (disclaimer: this is done just for a demonstration, I don't imply it's a good practice unless you have a damn good reason to do so):
class with_default_getter(object):
    def __init__(self, func):
        self._attr_name = '_{0.__name__}'.format(func)
        self._setter = func

    def __get__(self, obj, type):
        return getattr(obj, self._attr_name)

    def __set__(self, obj, value):
        return self._setter(obj, value)
Usage:
class C(object):
    @with_default_getter
    def my_property(self, value):
        print 'setting %s' % value
        self._my_property = value

>>> c = C()
>>> c.my_property = 123
setting 123
>>> c.my_property
123
This is pretty much the same as @georg suggests, just unfolding the implementation down to descriptors.
You can make a decorator that auto-creates the getter, following the underscores convention:
def setter(fn):
    def _get(self):
        return getattr(self, '_' + fn.__name__)
    def _set(self, val):
        return fn(self, val)
    return property(_get, _set)

or more concisely, if you like this style more:

def setter(fn):
    return property(
        lambda self: getattr(self, '_' + fn.__name__),
        fn)
Usage:
class X(object):
    @setter
    def my_property(self, value):
        self._my_property = value + 1

x = X()
x.my_property = 42
print x.my_property  # 43
There is no shortcut that I am aware of; remember, explicit is better than implicit (from the Zen of Python).
It could be that in your code so far a property always looks like that, but you could at some point write a property getter which fetches an entirely calculated value, in which case your property getter and setter won't look like that at all.
Having said that, you could write a wrapper which provides those simple default methods as part of the wrapper, if you wish.
def set_my_property(self, value):
    value.do_some_magic()
    self._my_property = value

my_property = property(fset=set_my_property)
I've created an object called 'Frame':

import cv2

class Frame:
    def __init__(self, image):
        self.image = image

    def gray(self):
        return cv2.cvtColor(self.image, cv2.COLOR_BGR2GRAY)
Some operations, e.g. gray(), are expensive. I'd like to cache the result in the instance so that subsequent calls don't have to recalculate this. What's the cleanest way to do that?
Pyramid uses this fantastic @reify decorator:
class reify(object):
    """ Use as a class method decorator. It operates almost exactly like the
    Python ``@property`` decorator, but it puts the result of the method it
    decorates into the instance dict after the first call, effectively
    replacing the function it decorates with an instance variable. It is, in
    Python parlance, a non-data descriptor. An example:

    .. code-block:: python

       class Foo(object):
           @reify
           def jammy(self):
               print('jammy called')
               return 1

    And usage of Foo:

    >>> f = Foo()
    >>> v = f.jammy
    'jammy called'
    >>> print(v)
    1
    >>> f.jammy
    1
    >>> # jammy func not called the second time; it replaced itself with 1
    """
    def __init__(self, wrapped):
        self.wrapped = wrapped
        try:
            self.__doc__ = wrapped.__doc__
        except:  # pragma: no cover
            pass

    def __get__(self, inst, objtype=None):
        if inst is None:
            return self
        val = self.wrapped(inst)
        setattr(inst, self.wrapped.__name__, val)
        return val
The docstring speaks for itself. =)
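Applied to the Frame class from the question, a sketch (note that the cached value is then read as an attribute, frame.gray, not called as a method):

import cv2

class Frame:
    def __init__(self, image):
        self.image = image

    @reify
    def gray(self):
        # Computed once; the result then shadows this descriptor in the instance dict.
        return cv2.cvtColor(self.image, cv2.COLOR_BGR2GRAY)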
The best go-to for problems like this is a memoize-style decorator; a simple solution can be found here: http://code.activestate.com/recipes/578231-probably-the-fastest-memoization-decorator-in-the-/
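A minimal sketch of such a memoizing decorator (my own simplification rather than the linked recipe; it keys the cache on the positional arguments, including self, and assumes they are hashable):

import cv2
from functools import wraps

def memoize(func):
    cache = {}
    @wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    return wrapper

class Frame:
    def __init__(self, image):
        self.image = image

    @memoize
    def gray(self):
        return cv2.cvtColor(self.image, cv2.COLOR_BGR2GRAY)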
Could this not simply be done on instantiation?
class Frame:
    def __init__(self, image):
        self.image = image
        self.gray = cv2.cvtColor(self.image, cv2.COLOR_BGR2GRAY)
EDIT - Seems like a good time for a property?
class Frame(object):
    def __init__(self, image):
        self.image = image
        self._grey = None

    @property
    def grey(self):
        if self._grey is None:
            self._grey = solve_for_grey(self.image, stuff)
        return self._grey
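For what it's worth, Python 3.8+ ships functools.cached_property, which behaves much like the reify decorator above; a sketch:

import cv2
from functools import cached_property

class Frame:
    def __init__(self, image):
        self.image = image

    @cached_property
    def gray(self):
        # Computed on first access, then stored in the instance dict.
        return cv2.cvtColor(self.image, cv2.COLOR_BGR2GRAY)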