I have something like this:
class SuperClass(object):
    def __init__(self):
        ...  # initialization stuff

    def always_do_this_last(self):
        ...  # cleanup stuff


class SubClass(SuperClass):
    def __init__(self):
        super().__init__()
        # intermediate stuff
        self.always_do_this_last()
Is it possible to automatically call that last line? Every subclass of SuperClass needs to perform the cleanup.
Instead of overriding __init__, define a method that SuperClass.__init__ will call.
class SuperClass(object):
    def __init__(self):
        # do some stuff
        self.child_init()
        self.cleanup()

    def cleanup(self):
        ...

    def child_init(self):
        pass


class SubClass(SuperClass):
    def child_init(self):
        ...
You can define SuperClass.__init_subclass__ to ensure child_init is overridden, or use the abc module to make SuperClass.child_init an abstract method.
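For instance, __init_subclass__ (Python 3.6+) can reject any subclass that forgets to override child_init. This is only a sketch, reusing the class names from above:

```python
class SuperClass:
    def __init__(self):
        # initialization stuff
        self.child_init()
        self.cleanup()

    def cleanup(self):
        pass  # cleanup stuff

    def child_init(self):
        pass

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # refuse to create subclasses that don't override child_init
        if cls.child_init is SuperClass.child_init:
            raise TypeError(f"{cls.__name__} must override child_init()")


class SubClass(SuperClass):
    def child_init(self):
        ...  # intermediate stuff
```

The check runs at class-creation time, so a forgetful subclass fails as soon as its class statement executes, not when it is first instantiated.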
One option could be to use a method that the subclasses could override without overriding __init__(). Maybe like this:
class SuperClass:
    def __init__(self):
        # initialization stuff
        self.setup_subclass()
        self.always_do_this_last()

    def setup_subclass(self):
        pass

    def always_do_this_last(self):
        ...  # cleanup stuff


class SubClass(SuperClass):
    def setup_subclass(self):
        ...  # intermediate stuff
Would that work for you?
You have 2 options:
Use a different method as your initializer and call always_do_this_last afterwards
class SuperClass(object):
    def __init__(self):
        self._init()                # initialize
        self.always_do_this_last()  # clean up

    def _init(self):
        pass  # initialization stuff

    def always_do_this_last(self):
        pass  # cleanup stuff


class SubClass(SuperClass):
    def _init(self):
        super()._init()
        # intermediate stuff
Use a metaclass
class CleanupMeta(type):
    def __call__(cls, *args, **kwargs):
        obj = super().__call__(*args, **kwargs)
        obj.always_do_this_last()
        return obj


class SuperClass(metaclass=CleanupMeta):
    def __init__(self):
        pass  # initialization stuff

    def always_do_this_last(self):
        pass  # cleanup stuff


class SubClass(SuperClass):
    def __init__(self):
        super().__init__()
        # intermediate stuff
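As a quick sanity check of the metaclass variant, you can record the call order. This is a minimal sketch of the same classes, with a calls list added purely for illustration:

```python
calls = []


class CleanupMeta(type):
    def __call__(cls, *args, **kwargs):
        # runs the whole __init__ chain first, then the cleanup hook
        obj = super().__call__(*args, **kwargs)
        obj.always_do_this_last()
        return obj


class SuperClass(metaclass=CleanupMeta):
    def __init__(self):
        calls.append('init')

    def always_do_this_last(self):
        calls.append('cleanup')


class SubClass(SuperClass):
    def __init__(self):
        super().__init__()
        calls.append('intermediate')


obj = SubClass()
print(calls)  # ['init', 'intermediate', 'cleanup']
```

Note that the subclass never mentions always_do_this_last; the metaclass guarantees it runs after every __init__.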
The other answers here are more than sufficient. I will add that you might want to have a look at the abstract base class if you are implementing a class that requires certain member functions to be implemented.
In the example below the parent requires the initialize and cleanup methods to be defined in each child (try removing one of them to verify an error is raised).
import abc


class SuperClass(object, metaclass=abc.ABCMeta):
    def __init__(self):
        print("Instantiating Class")
        self.initialize()
        self.cleanup()

    @abc.abstractmethod
    def initialize(self):
        pass

    @abc.abstractmethod
    def cleanup(self):
        pass


class SubClass(SuperClass):
    def __init__(self):
        super(SubClass, self).__init__()

    def initialize(self):
        print("initializing...")

    def cleanup(self):
        print("... cleanup.")


a = SubClass()
I would like to know how I could take an object from a function and place it and all its attributes into another object.
class Something:
    def create(self):
        print('Creating')


class Foo(Something):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

    def do_something(self):
        print('Do somthing')


def bar():
    # Can not change this function
    return Something()


s = bar()
s.create()  # 'Creating'

-- I want to do this --

f = Foo(s)
f.create()
f.do_something()
Limitations:
I can't alter bar(). I need to be able to access all of Something's methods and attributes from Foo. I would like to stay away from composition so that I can call Foo.create() directly (not like Foo.something.create()).
Change __init__(self, *args, **kwargs) to __init__(self, _, *args, **kwargs):
>>> Foo(Something()).create()
Creating
>>> Foo(Something()).do_something()
Do somthing
I honestly don't see the problem here. Or why you want to supply an instance of Something when creating an instance of Foo, but there you go.
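Spelled out, the suggested change just accepts and ignores the extra positional argument; since Foo subclasses Something, it already has all of Something's methods. A sketch (the `_=None` default is my addition, so Foo() also works without an instance):

```python
class Something:
    def create(self):
        print('Creating')


class Foo(Something):
    def __init__(self, _=None, *args, **kwargs):
        # the Something instance passed in is simply ignored: Foo already
        # inherits every method and class attribute from Something
        super().__init__(*args, **kwargs)

    def do_something(self):
        print('Do somthing')


f = Foo(Something())
f.create()        # Creating
f.do_something()  # Do somthing
```

Caveat: this only works because Something carries no per-instance state here. If bar() set attributes on the instance it returns, you would need to copy them over, e.g. f.__dict__.update(s.__dict__).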
I came up with this solution, which I'm not very happy with, as it requires me to call the function twice.
class Something:
    def create(self):
        print('Creating')


class Foo:
    def __init__(self, something):
        self.something = something

    def __getattr__(self, attr):
        return getattr(self.something, attr)

    # some_special_decorator
    def create(self):
        return self.something.create()

    def do_something(self):
        print('Do somthing')


def bar():
    # Can not change this function
    return Something()


s = bar()
s.create()  # 'Creating'
f = Foo(s)
f.create()
f.do_something()
I'm looking to do the following:
class A(object):
    def first_method(self):
        print("I'm the first method!")

    @some_decorator(A.first_method)
    def second_method(self):
        print("I'm the second method!")
But I'm running into the problem that A is undefined within itself at the time that the decorator is parsed. Is there any way for me to reference A in the decorator? Alternatively, if I just pass the decorator the bound method first_method is it possible for me to recover that first_method belongs to A?
In Python 3, you can just say @some_decorator(first_method), and it will work, as all methods are normal functions inside the class that contains them.
In Python 2, there is a complicated system of bound & unbound instance-, class-, and static methods. Due to this, you cannot access first_method while inside of a class definition (until the class is fully formed).
A little workaround would be to split that class into two classes:
class BaseA(object):
    def first_method(self):
        print("I'm the first method!")


class A(BaseA):
    @some_decorator(BaseA.first_method)
    def second_method(self):
        print("I'm the second method!")
Not the best solution for all cases, but will work.
Also, keep in mind that in both cases (py2 & py3), the decorator will refer to first_method as it was declared here. If any descendant class redefines the method, the new method will NOT be used in the decorator; only the parent one will be used.
Probably, you should not refer to first_method at all. Instead, just accept self/cls first positional argument in the decorator's wrapper, and use self.first_method/cls.first_method there:
import functools


def some_decorator(fn):
    @functools.wraps(fn)
    def wrapper(self, *args, **kwargs):
        first_method = self.first_method
        first_method()
        return fn(self, *args, **kwargs)
    return wrapper


class A(object):
    def first_method(self):
        print("I'm the first method of A!")

    @some_decorator
    def second_method(self):
        print("I'm the second method!")


class B(A):
    def first_method(self):
        print("I'm the first method of B!")


A().second_method()
# I'm the first method of A!
# I'm the second method!
B().second_method()
# I'm the first method of B!
# I'm the second method!
If you want to make that method configurable:
def some_decorator(method_name):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(self, *args, **kwargs):
            first_method = getattr(self, method_name)
            first_method()
            return fn(self, *args, **kwargs)
        return wrapper
    return decorator


class A(object):
    def first_method(self):
        print("I'm the first method of A!")

    @some_decorator('first_method')
    def second_method(self):
        print("I'm the second method!")


class B(A):
    def first_method(self):
        print("I'm the first method of B!")
You can use your decorator as a classic function (without the @):
def some_decorator(arg):
    def decorator(fn):
        # ... do something with arg and fn ...
        return fn
    return decorator


class A(object):
    def first_method(self):
        print("I'm the first method!")

    def second_method(self):
        print("I'm the second method!")


A.second_method = some_decorator(A.first_method)(A.second_method)
The basic way of creating decorators is
def my_decorator(f):
    def _f(*args, **kwargs):
        # do something using f
        pass
    return _f


@my_decorator
def f(*args, **kwargs):
    ...
But that way you cannot define decorators like @property.setter, because the name of the property (and thus the name of the decorator) is different every time.
How is @property.setter defined then? Is it possible to do something similar in Python, or is it a built-in feature available only from the C (implementation) level?
What you are looking for is something called a descriptor:
class Descriptor(object):
    def __get__(self, instance, _type=None):
        pass

    def __set__(self, obj, value):
        pass
You are free to implement things as you see fit. The property decorator is just an instance of a descriptor (more or less), and you can see how it would be implemented in the documentation describing descriptors.
Here's an example:
class _Wrapper(object):
    def __init__(self, caller, instance):
        self.caller = caller
        self.instance = instance

    def __call__(self, *args, **kwargs):
        print("I've been wrapped!")
        return self.caller(self.instance, *args, **kwargs)


class Announcer(object):
    def __init__(self, method):
        self.method = method

    def __get__(self, instance, _type=None):
        return _Wrapper(self.method, instance)


def vocal(func):
    return Announcer(func)


class Ha(object):
    @vocal
    def stuff(self):
        return 1
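For comparison with the real thing: property itself is a descriptor whose setter method returns a new descriptor, which is why the decorator's name tracks the property. Here is a pared-down pure-Python approximation (the actual built-in is implemented in C, but behaves like this sketch):

```python
class my_property:
    """A minimal stand-in for the built-in property, to show how .setter works."""

    def __init__(self, fget=None, fset=None):
        self.fget = fget
        self.fset = fset

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return self.fget(obj)

    def __set__(self, obj, value):
        if self.fset is None:
            raise AttributeError("can't set attribute")
        self.fset(obj, value)

    def setter(self, fset):
        # .setter is just a method that returns a *new* descriptor
        # combining the old getter with the new setter -- this is why
        # the "decorator name" is different for every property
        return type(self)(self.fget, fset)


class C:
    @my_property
    def x(self):
        return self._x

    @x.setter
    def x(self, value):
        self._x = value


c = C()
c.x = 5
print(c.x)  # 5
```

So @x.setter works because, inside the class body, x is already bound to the descriptor created by the first decorator, and calling its setter method produces the replacement descriptor.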
I understand from this question that if I want to have a set which is thread-safe I have to implement the thread-safety part on my own.
Therefore I could come up with:
from threading import Lock


class LockedSet(set):
    """A set where add() and remove() are thread-safe"""

    def __init__(self, *args, **kwargs):
        # Create a lock
        self._lock = Lock()
        # Call the original __init__
        super(LockedSet, self).__init__(*args, **kwargs)

    def add(self, elem):
        self._lock.acquire()
        try:
            super(LockedSet, self).add(elem)
        finally:
            self._lock.release()

    def remove(self, elem):
        self._lock.acquire()
        try:
            super(LockedSet, self).remove(elem)
        finally:
            self._lock.release()
So, of course only add() and remove() are thread-safe in this implementation. The other methods are not, because they were not overridden in the subclass.
Now, the pattern is pretty simple: acquire lock, call original method, release lock.
If I follow the logic above, I would have to overwrite all methods exposed by set in essentially the same way, e.g.:
(pseudo-code)
def <method>(<args>):
    1. acquire lock
    2. try:
    3.     call original method passing <args>
    4. finally:
    5.     release lock
(/pseudo-code)
This is not only tedious but also prone to errors. So, any ideas/suggestions on how to approach this in a better way?
You can use Python's metaprogramming facilities to accomplish this. (Note: written quickly and not thoroughly tested.) I prefer to use a class decorator.
I also think you may need to lock more than add and remove to make a set thread-safe, but I'm not sure. I'll ignore that problem and just concentrate on your question.
Also consider whether delegation (proxying) is a better fit than subclassing. Wrapping objects is the usual approach in Python.
Finally, there is no "magic wand" of metaprogramming that will magically add fine-grained locking to any mutable Python collection. The safest thing to do is to lock any method or attribute access using RLock, but this is very coarse-grained and slow and probably still not a guarantee that your object will be thread-safe in all cases. (For example, you may have a collection that manipulates another non-threadsafe object accessible to other threads.) You really do need to examine each and every data structure and think about what operations are atomic or require locks and which methods might call other methods using the same lock (i.e., deadlock itself).
That said, here are some techniques at your disposal in increasing order of abstraction:
Delegation
from threading import RLock


class LockProxy(object):
    def __init__(self, obj):
        self.__obj = obj
        self.__lock = RLock()
        # RLock because object methods may call own methods

    def __getattr__(self, name):
        def wrapped(*a, **k):
            with self.__lock:
                return getattr(self.__obj, name)(*a, **k)
        return wrapped


lockedset = LockProxy(set([1, 2, 3]))
Context manager
class LockedSet(set):
    """A set where add(), remove(), and 'in' operator are thread-safe"""

    def __init__(self, *args, **kwargs):
        self._lock = Lock()
        super(LockedSet, self).__init__(*args, **kwargs)

    def add(self, elem):
        with self._lock:
            super(LockedSet, self).add(elem)

    def remove(self, elem):
        with self._lock:
            super(LockedSet, self).remove(elem)

    def __contains__(self, elem):
        with self._lock:
            return super(LockedSet, self).__contains__(elem)
Decorator
def locked_method(method):
    """Method decorator. Requires a lock object at self._lock"""
    def newmethod(self, *args, **kwargs):
        with self._lock:
            return method(self, *args, **kwargs)
    return newmethod


class DecoratorLockedSet(set):
    def __init__(self, *args, **kwargs):
        self._lock = Lock()
        super(DecoratorLockedSet, self).__init__(*args, **kwargs)

    @locked_method
    def add(self, *args, **kwargs):
        return super(DecoratorLockedSet, self).add(*args, **kwargs)

    @locked_method
    def remove(self, *args, **kwargs):
        return super(DecoratorLockedSet, self).remove(*args, **kwargs)
Class Decorator
I think this is the cleanest and easiest to understand of these approaches, so I've expanded it to allow one to specify the methods to lock and a lock object factory.
def lock_class(methodnames, lockfactory):
    return lambda cls: make_threadsafe(cls, methodnames, lockfactory)


def lock_method(method):
    if getattr(method, '__is_locked', False):
        raise TypeError("Method %r is already locked!" % method)

    def locked_method(self, *arg, **kwarg):
        with self._lock:
            return method(self, *arg, **kwarg)
    locked_method.__name__ = '%s(%s)' % ('lock_method', method.__name__)
    locked_method.__is_locked = True
    return locked_method


def make_threadsafe(cls, methodnames, lockfactory):
    init = cls.__init__

    def newinit(self, *arg, **kwarg):
        init(self, *arg, **kwarg)
        self._lock = lockfactory()
    cls.__init__ = newinit

    for methodname in methodnames:
        oldmethod = getattr(cls, methodname)
        newmethod = lock_method(oldmethod)
        setattr(cls, methodname, newmethod)
    return cls


@lock_class(['add', 'remove'], Lock)
class ClassDecoratorLockedSet(set):
    @lock_method  # if you double-lock a method, a TypeError is raised
    def frobnify(self):
        pass
Override Attribute access with __getattribute__
class AttrLockedSet(set):
    def __init__(self, *args, **kwargs):
        self._lock = Lock()
        super(AttrLockedSet, self).__init__(*args, **kwargs)

    def __getattribute__(self, name):
        if name in ['add', 'remove']:
            # note: makes a new callable object "lockedmethod" on every call
            # best to add a layer of memoization
            lock = self._lock

            def lockedmethod(*args, **kwargs):
                with lock:
                    return super(AttrLockedSet, self).__getattribute__(name)(*args, **kwargs)
            return lockedmethod
        else:
            return super(AttrLockedSet, self).__getattribute__(name)
Dynamically-added wrapper methods with __new__
class NewLockedSet(set):
    def __new__(cls, *args, **kwargs):
        # modify the class by adding new unbound methods
        # you could also attach a single __getattribute__ like above
        for membername in ['add', 'remove']:
            def scoper(membername=membername):
                # You can also return the function or use a class
                def lockedmethod(self, *args, **kwargs):
                    with self._lock:
                        m = getattr(super(NewLockedSet, self), membername)
                        return m(*args, **kwargs)
                lockedmethod.__name__ = membername
                setattr(cls, membername, lockedmethod)
            scoper()
        self = super(NewLockedSet, cls).__new__(cls, *args, **kwargs)
        self._lock = Lock()
        return self
Dynamically-added wrapper methods with __metaclass__
def _lockname(classname):
    return '_%s__%s' % (classname, 'lock')


class LockedClass(type):
    def __new__(mcls, name, bases, dict_):
        # we'll bind these after we add the methods
        cls = None

        def lockmethodfactory(methodname, lockattr):
            def lockedmethod(self, *args, **kwargs):
                with getattr(self, lockattr):
                    m = getattr(super(cls, self), methodname)
                    return m(*args, **kwargs)
            lockedmethod.__name__ = methodname
            return lockedmethod

        lockattr = _lockname(name)
        for methodname in ['add', 'remove']:
            dict_[methodname] = lockmethodfactory(methodname, lockattr)
        cls = type.__new__(mcls, name, bases, dict_)
        return cls

    def __call__(self, *args, **kwargs):
        # self is a class--i.e. an "instance" of the LockedClass type
        instance = super(LockedClass, self).__call__(*args, **kwargs)
        setattr(instance, _lockname(self.__name__), Lock())
        return instance


class MetaLockedSet(set):
    __metaclass__ = LockedClass  # Python 2 syntax; Python 3 uses metaclass=LockedClass
Dynamically-created Metaclasses
def LockedClassMetaFactory(wrapmethods):
    class LockedClass(type):
        def __new__(mcls, name, bases, dict_):
            # we'll bind these after we add the methods
            cls = None

            def lockmethodfactory(methodname, lockattr):
                def lockedmethod(self, *args, **kwargs):
                    with getattr(self, lockattr):
                        m = getattr(super(cls, self), methodname)
                        return m(*args, **kwargs)
                lockedmethod.__name__ = methodname
                return lockedmethod

            lockattr = _lockname(name)
            for methodname in wrapmethods:
                dict_[methodname] = lockmethodfactory(methodname, lockattr)
            cls = type.__new__(mcls, name, bases, dict_)
            return cls

        def __call__(self, *args, **kwargs):
            # self is a class--i.e. an "instance" of the LockedClass type
            instance = super(LockedClass, self).__call__(*args, **kwargs)
            setattr(instance, _lockname(self.__name__), Lock())
            return instance

    return LockedClass


class MetaFactoryLockedSet(set):
    __metaclass__ = LockedClassMetaFactory(['add', 'remove'])
I'll bet using a simple, explicit try...finally doesn't look so bad now, right?
Exercise for the reader: let the caller pass in their own Lock() object (dependency injection) using any of these methods.
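Picking up that exercise with the class-decorator technique: letting the caller inject the lock only requires accepting a factory argument. A self-contained sketch (simplified from the lock_class code above; InjectedLockSet is a hypothetical name for illustration):

```python
import threading


def lock_class(methodnames, lockfactory):
    """Class decorator: wrap the named methods so they run under self._lock."""
    def decorate(cls):
        init = cls.__init__

        def newinit(self, *args, **kwargs):
            self._lock = lockfactory()  # the caller chooses the lock type
            init(self, *args, **kwargs)
        cls.__init__ = newinit

        def make_locked(method):
            def locked(self, *args, **kwargs):
                with self._lock:
                    return method(self, *args, **kwargs)
            return locked

        for name in methodnames:
            setattr(cls, name, make_locked(getattr(cls, name)))
        return cls
    return decorate


# the caller injects threading.RLock here, but could equally pass
# threading.Lock, or lambda: some_shared_lock to share one lock
@lock_class(['add', 'remove'], threading.RLock)
class InjectedLockSet(set):
    pass


s = InjectedLockSet()
s.add(1)
s.remove(1)
```

Because lockfactory is any zero-argument callable, dependency injection falls out for free: per-instance locks, a shared lock, or a mock lock for testing are all just different factories.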
This is my first attempt to play with decorators (although my code doesn't actually use the @decorator syntax), and I don't have much experience with multi-threading/multiprocessing. With that disclaimer, though, here's an attempt I made:
from multiprocessing import Lock


def decorate_all(obj):
    lock = Lock()
    # you'll want to make this more robust:
    fnc_names = [fnctn for fnctn in dir(obj) if '__' not in fnctn]
    for name in fnc_names:
        print('decorating ' + name)
        fnc = getattr(obj, name)
        setattr(obj, name, decorate(fnc, lock))
    return obj


def decorate(fnctn, lock):
    def decorated(*args):
        print('acquiring lock')
        lock.acquire()
        try:
            print('calling decorated function')
            return fnctn(*args)
        finally:
            print('releasing lock')
            lock.release()
    return decorated


def thread_safe(superclass):
    class Thread_Safe(superclass):
        def __init__(self, *args, **kwargs):
            super(Thread_Safe, self).__init__(*args, **kwargs)
    return decorate_all(Thread_Safe)
>>> thread_safe_set = thread_safe(set)
decorating add
decorating clear
decorating copy
decorating difference
decorating difference_update
decorating discard
decorating intersection
decorating intersection_update
decorating isdisjoint
decorating issubset
decorating issuperset
decorating pop
decorating remove
decorating symmetric_difference
decorating symmetric_difference_update
decorating union
decorating update
>>> s = thread_safe_set()
>>> s.add(1)
acquiring lock
calling decorated function
releasing lock
>>> s.add(4)
acquiring lock
calling decorated function
releasing lock
>>> s.pop()
acquiring lock
calling decorated function
releasing lock
1
>>> s.pop()
acquiring lock
calling decorated function
releasing lock
4
>>>
[Indeed, see the comments, it is not true]
If you are running CPython, you can see from the set source code (http://hg.python.org/cpython/file/db20367b20de/Objects/setobject.c) that it doesn't release the GIL, so all of its operations should be atomic.
If that is all you need, and you are sure your code will run on CPython, you can just use set directly.
You can implement your own context manager:
class LockableSet:
    def __enter__(self):
        self.lock()
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Do what you want with the error
        self.unlock()


with LockableSet() as s:
    s.whatever()
    raise Exception()
No matter what, the object's __exit__ method will be called at the end. More detailed information is available in the official Python docs.
Another use for this could be a lock decorator for methods, like this:
def lock(func):
    def safe_func(self, *args, **kwargs):
        with self:
            return func(self, *args, **kwargs)
    return safe_func
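Putting the two pieces together might look like this. Since LockableSet leaves lock()/unlock() unspecified, this sketch backs the context manager with a threading.Lock, and safe_add is a hypothetical method added just to show the decorator in use:

```python
import threading


def lock(func):
    def safe_func(self, *args, **kwargs):
        with self:  # __enter__/__exit__ do the locking
            return func(self, *args, **kwargs)
    return safe_func


class LockableSet(set):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._lock = threading.Lock()

    def __enter__(self):
        self._lock.acquire()  # stands in for the unspecified self.lock()
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self._lock.release()

    @lock
    def safe_add(self, elem):
        self.add(elem)


s = LockableSet()
s.safe_add(1)
print(1 in s)  # True
```

The decorator and the with statement share the same lock, so callers can either use individual locked methods or hold the lock across several operations with a with block.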