I have a class called CacheObject, and many classes extend from it.
Now I need to add something common to all of these classes, so I wrote this:
class CacheObject(object):
    def __init__(self):
        self.updatedict = dict()
But the child classes don't obtain the updatedict attribute. I know calling the super init function is optional in Python, but is there an easy way to force all of them to call the init, rather than walking through all the classes and modifying them one by one?
I was in a situation where I wanted classes to always call their base classes' constructor in order before they call their own. The following is Python3 code that should do what you want:
class meta(type):
    def __init__(cls, name, bases, dct):
        def auto__call__init__(self, *a, **kw):
            for base in cls.__bases__:
                base.__init__(self, *a, **kw)
            cls.__init__child_(self, *a, **kw)
        cls.__init__child_ = cls.__init__
        cls.__init__ = auto__call__init__

class A(metaclass=meta):
    def __init__(self):
        print("Parent")

class B(A):
    def __init__(self):
        print("Child")
To illustrate, it will behave as follows:
>>> B()
Parent
Child
<__main__.B object at 0x000001F8EF251F28>
>>> A()
Parent
<__main__.A object at 0x000001F8EF2BB2B0>
I suggest a non-code fix:
Document that super().__init__() should be called by your subclasses before they use any other methods defined in it.
This is not an uncommon restriction. See, for instance, the documentation for threading.Thread in the standard library, which says:
If the subclass overrides the constructor, it must make sure to invoke the base class constructor (Thread.__init__()) before doing anything else to the thread.
There are probably many other examples, I just happened to have that doc page open.
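Applied to the question's CacheObject, the documented contract might read like this (Child is just an illustration):
class CacheObject(object):
    """Base class for cacheable objects.

    Subclasses that override __init__ must call super().__init__()
    before doing anything else with the instance.
    """
    def __init__(self):
        self.updatedict = dict()

class Child(CacheObject):
    def __init__(self):
        super(Child, self).__init__()  # honour the documented contract first
        self.updatedict['source'] = 'child'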
You can override __new__. As long as your base classes don't override __new__ without calling super().__new__, you'll be fine.
class CacheObject(object):
    def __new__(cls, *args, **kwargs):
        # object.__new__() takes no extra arguments, so don't forward them
        instance = super().__new__(cls)
        instance.updatedict = {}
        return instance

class Foo(CacheObject):
    def __init__(self):
        pass
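With this in place, even though Foo.__init__ never calls super().__init__(), the attribute should still show up:
>>> f = Foo()
>>> f.updatedict
{}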
However, as some commenters said, the motivation for this seems a little shady. You should perhaps just add the super calls instead.
This isn't what you asked for, but how about making updatedict a property, so that it doesn't need to be set in __init__:
class CacheObject(object):
    @property
    def updatedict(self):
        try:
            return self._updatedict
        except AttributeError:
            self._updatedict = dict()
            return self._updatedict
Hopefully this achieves the real goal, that you don't want to have to touch every subclass (other than to make sure none uses an attribute called updatedict for something else, of course).
There are some odd gotchas, though, because it is different from setting updatedict in __init__ as in your question. For example, the content of CacheObject().__dict__ is different. It has no key updatedict because I've put that key in the class, not in each instance.
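For instance, with the property version above:
>>> obj = CacheObject()
>>> obj.__dict__                 # no 'updatedict' key on the instance
{}
>>> obj.updatedict['k'] = 'v'    # first access creates _updatedict lazily
>>> obj.__dict__
{'_updatedict': {'k': 'v'}}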
Regardless of motivation, another option is to use __init_subclass__() (Python 3.6+) to get this kind of behavior. (For example, I'm using it because I want users not familiar with the intricacies of Python to be able to inherit from a class to create specific engineering models, and I'm trying to keep the structure of the class they have to define very basic.)
In the case of your example,
from functools import wraps

class CacheObject:
    def __init__(self) -> None:
        self.updatedict = dict()

    def __init_subclass__(cls) -> None:
        orig_init = cls.__init__

        @wraps(orig_init)
        def __init__(self, *args, **kwargs):
            orig_init(self, *args, **kwargs)
            # use cls (closed over) rather than self.__class__, which would
            # recurse forever once the hierarchy grows past one level
            super(cls, self).__init__()

        cls.__init__ = __init__
What this does: any class that subclasses CacheObject will, at class creation time, have its __init__ function wrapped by the parent class. We replace it with a new function that calls the original and then calls the parent's __init__. So even if the child class overrides the parent's __init__, at instance creation time its __init__ is wrapped by a function that calls it and then calls its parent's.
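A quick check of the behaviour (Child here is a hypothetical subclass):
class Child(CacheObject):
    def __init__(self):
        self.value = 42  # deliberately does not call super().__init__()

c = Child()
print(c.updatedict)  # {} -- the wrapper called CacheObject.__init__ anyway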
You can add a decorator to your classes:
def my_decorator(cls):
    old_init = cls.__init__
    def new_init(self):
        self.updatedict = dict()
        old_init(self)
    cls.__init__ = new_init
    return cls

@my_decorator
class SubClass(CacheObject):
    pass
If you want to add the decorator to all the subclasses automatically, use a metaclass:
class myMeta(type):
    def __new__(cls, name, parents, dct):
        return my_decorator(super().__new__(cls, name, parents, dct))

class CacheObject(object, metaclass=myMeta):
    pass
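With the metaclass version, subclasses pick up the behaviour without any decoration (SubClass here is just for demonstration):
class SubClass(CacheObject):
    def __init__(self):
        print("SubClass init")

s = SubClass()       # prints "SubClass init"
print(s.updatedict)  # {} -- injected by new_init before the original __init__ ran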
Related
I am trying to understand the relationship between a Python metaclass and a class. I was trying to create a singleton class and found this code:
class SingleTon(type):
    def __call__(self, *args, **kwargs):
        if self._instances is None:
            self._instances = super(SingleTon, self).__call__(*args, **kwargs)
        return self._instances

class Counter:
    __metaclass__ = SingleTon
    _instances = None

    def __init__(self):
        self.count = 1

c = Counter()
My question is how the Counter instance gets created using the metaclass. I know the metaclass's __call__ method gets called whenever we create an instance, but the confusion is about what super(SingleTon, self).__call__(*args, **kwargs) does here. Please explain.
super will just forward the arguments to type.__call__, which is responsible for creating the instance.
It's like calling super in a 'normal' class hierarchy, only now you're calling it in a metaclass. Since SingleTon is a subclass of type, type.__call__ is what gets called. In a plain class scenario, you'd (normally) forward calls to the base class, object.
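For what it's worth, the posted code relies on Python 2's __metaclass__ hook; a Python 3 sketch of the same idea would be:
class SingleTon(type):
    def __call__(cls, *args, **kwargs):
        if cls._instances is None:
            cls._instances = super(SingleTon, cls).__call__(*args, **kwargs)
        return cls._instances

class Counter(metaclass=SingleTon):
    _instances = None

    def __init__(self):
        self.count = 1

print(Counter() is Counter())  # True: type.__call__ built only one instance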
I have a class whose job is to wrap another class (code I don't control), intercept all calls to the wrapped class, perform some logic, and pass the call along to the underlying class. Here's an example:
class GithubRepository(object):
    def get_commit(self, sha):
        return 'Commit {}'.format(sha)

    def get_contributors(self):
        return ['bobbytables']

class LoggingGithubRepositoryWrapper(object):
    def __init__(self, github_repository):
        self._github_repository = github_repository

    def __getattr__(self, name):
        base_func = getattr(self._github_repository, name)
        def log_wrap(*args, **kwargs):
            print("Calling {}".format(name))
            return base_func(*args, **kwargs)
        return log_wrap

if __name__ == '__main__':
    git_client = LoggingGithubRepositoryWrapper(GithubRepository())
    print(git_client.get_commit('abcdef1245'))
    print(git_client.get_contributors())
As you can see, the way that I do this is by implementing __getattr__ on the wrapping class and delegating to the underlying class. The downside to this approach is that users of LoggingGithubRepositoryWrapper don't know which attributes/methods the underlying GithubRepository actually has.
This leads me to my question: is there a way to define or document the calls handled by __getattr__? Ideally, I'd like to be able to autocomplete on git_client. and be provided a list of supported methods. Thanks for your help in advance!
You can do this a few different ways, but they won't involve the use of __getattr__.
What you really need to do is dynamically create your class, or at least dynamically create the wrapped functions on your class. There are a few ways to do this in python.
You could build the class definition using type() or metaclasses, or build it on class instantiation using the __new__ method.
Every time you call LoggingGithubRepositoryWrapper(), the __new__ method will be called. Here, it looks at all the attributes on the github_repository argument and finds all the non-private methods. It then creates a function on the instantiated LoggingGithubRepositoryWrapper class instance that wraps the repo call in a logging statement.
At the end, it passes back the modified class instance. Then __init__ is called.
from types import MethodType

class LoggingGithubRepositoryWrapper(object):
    def __new__(cls, github_repository):
        self = super(LoggingGithubRepositoryWrapper, cls).__new__(cls)
        for name in dir(github_repository):
            if name.startswith('__'):
                continue
            func = getattr(github_repository, name)
            if isinstance(func, MethodType):
                setattr(self, name, cls.log_wrap(func))
        return self

    @staticmethod
    def log_wrap(func):
        def wrap(*args, **kwargs):
            print('Calling {0}'.format(func.__name__))
            return func(*args, **kwargs)
        return wrap

    def __init__(self, github_repository):
        ...  # this is all the same
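Because the wrapped methods are now real instance attributes, they show up to introspection (and hence to most autocomplete tooling):
git_client = LoggingGithubRepositoryWrapper(GithubRepository())
print(git_client.get_commit('abcdef1245'))  # "Calling get_commit", then "Commit abcdef1245"
print('get_commit' in dir(git_client))      # True -- a real attribute this time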
ClassA inherits from the Base class, which inherits from the built-in dict class.
'name' and 'id' are Base class attributes. 'win' and 'mac' are ClassA attributes.
How should I put logic in this code so that a ClassA instance can be declared as easily as:
myInstance = ClassA(myDictArg)
===============================
class Base(dict):
    """Base is the base class from which all other classes derive.
    The Base class inherits from the built-in dict type.
    """
    id = 'id'
    name = 'name'

    def __init__(self, arg=None):
        """Initialise Base class"""
        dict.__init__(self)
        self[Base.id] = -1
        self[Base.name] = None
        if 'id' in arg.keys() and arg['id']: self['id'] = arg['id']
        if 'name' in arg.keys() and arg['name']: self['name'] = arg['name']

class ClassA(Base):
    """ClassA is a class inherited from the Base class."""
    def __init__(self, arg=None):
        if arg == None: raise Exception('arg==None')
        Base.__init__(self)
        self.arg = arg
        # set generic ClassA attrs
        self['win'] = None
        self['mac'] = None

myDictArg = {'id': 1, 'name': 'MyName', 'win': 'c:/windows', 'mac': '/Volumes/mac/'}
myInstance = ClassA(myDictArg)
print(myInstance)
This class structure has the advantage that it keeps the signature of dict, which is pretty flexible, and sets default values only if they aren't provided (which I think was the original goal). It also (due to judicious use of super) is well set up to support cooperative multiple inheritance (Hooray!).
class Base(dict):
    def __init__(self, *args, **kwargs):
        super(Base, self).__init__(*args, **kwargs)
        self.setdefault('id', -1)
        self.setdefault('name', None)

class ClassA(Base):
    """ClassA is a class inherited from a Base class."""
    def __init__(self, *args, **kwargs):
        if not (args or kwargs):
            raise Exception('you need to give me *something*!')
        super(ClassA, self).__init__(*args, **kwargs)
        self.setdefault('win', None)
        self.setdefault('mac', None)
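Using the asker's example input, this should behave as hoped (key order may vary on older Pythons):
myDictArg = {'id': 1, 'name': 'MyName', 'win': 'c:/windows', 'mac': '/Volumes/mac/'}
myInstance = ClassA(myDictArg)
print(myInstance)
# {'id': 1, 'name': 'MyName', 'win': 'c:/windows', 'mac': '/Volumes/mac/'}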
What you've written looks like it should work (not that I've run it myself), so I am assuming you are looking for the bug in this situation.
One possible problem is the fact that you are replacing the arg attribute after 'id' and 'name' have been set, effectively erasing them. I think a better idea would be to merge the args. The following code may not be the most Pythonic, but it might look something like this:
for key in arg.keys():
    self.arg[key] = arg[key]
Another problem is that you aren't passing your arg object into the base class's constructor. I suggest you change that to
Base.__init__(self, arg)
Otherwise, arg in the Base class will revert to the default, None.
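Putting both suggestions together, one way the corrected classes might look (a sketch; here the merge loop lives in Base.__init__):
class Base(dict):
    def __init__(self, arg=None):
        dict.__init__(self)
        self['id'] = -1
        self['name'] = None
        if arg:  # merge the caller's dict instead of discarding it
            for key in arg.keys():
                self[key] = arg[key]

class ClassA(Base):
    def __init__(self, arg=None):
        if arg is None:
            raise Exception('arg==None')
        Base.__init__(self, arg)  # pass arg through, as suggested above
        self.setdefault('win', None)
        self.setdefault('mac', None)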
I'd like to automatically run some code upon class creation that can call other class methods. I have not found a way of doing so from within the class declaration itself, so I end up creating a @classmethod called __clsinit__ and calling it from the defining scope immediately after the class declaration. Is there a method I can define such that it will get automatically called after the class object is created?
You can do this with a metaclass or a class decorator.
A class decorator (since 2.6) is probably easier to understand:
def call_clsinit(cls):
    cls._clsinit()
    return cls

@call_clsinit
class MyClass:
    @classmethod
    def _clsinit(cls):
        print("MyClass._clsinit()")
Metaclasses are more powerful; they can call code and modify the ingredients of the class before it is created as well as afterwards (also, they can be inherited):
def call_clsinit(*args, **kwargs):
    cls = type(*args, **kwargs)
    cls._clsinit()
    return cls

class MyClass(object):
    __metaclass__ = call_clsinit

    @classmethod
    def _clsinit(cls):
        print("MyClass._clsinit()")
This doesn't work:
def register_method(name=None):
    def decorator(method):
        # The next line assumes the decorated method is bound
        # (which of course it isn't at this point)
        cls = method.im_class
        cls.my_attr = 'FOO BAR'
        def wrapper(*args, **kwargs):
            method(*args, **kwargs)
        return wrapper
    return decorator
Decorators are like the movie Inception; the more levels in you go, the more confusing they are. I'm trying to access the class that defines a method (at definition time) so that I can set an attribute (or alter an attribute) of the class.
Version 2 also doesn't work:
def register_method(name=None):
    def decorator(method):
        # The next line assumes the decorated method is bound
        # (of course it isn't bound at this point).
        cls = method.__class__  # I don't really understand this.
        cls.my_attr = 'FOO BAR'
        def wrapper(*args, **kwargs):
            method(*args, **kwargs)
        return wrapper
    return decorator
The point of putting my broken code above when I already know why it's broken is that it conveys what I'm trying to do.
I don't think you can do what you want to do with a decorator (quick edit: with a decorator of the method, anyway). The decorator gets called when the method gets constructed, which is before the class is constructed. The reason your code isn't working is because the class doesn't exist when the decorator is called.
jldupont's comment is the way to go: if you want to set an attribute of the class, you should either decorate the class or use a metaclass.
EDIT: okay, having seen your comment, I can think of a two-part solution that might work for you. Use a decorator of the method to set an attribute of the method, and then use a metaclass to search for methods with that attribute and set the appropriate attribute of the class:
def TaggingDecorator(method):
    "Decorate the method with an attribute to let the metaclass know it's there."
    method.my_attr = 'FOO BAR'
    return method  # No need for a wrapper; we haven't changed
                   # what method actually does; your mileage may vary

class TaggingMetaclass(type):
    "Metaclass to check for tags from TaggingDecorator and add them to the class."
    def __new__(cls, name, bases, dct):
        # Check for tagged members
        has_tag = False
        for member in dct.itervalues():
            if hasattr(member, 'my_attr'):
                has_tag = True
                break
        if has_tag:
            # Set the class attribute
            dct['my_attr'] = 'FOO BAR'
        # Now let 'type' actually allocate the class object and go on with life
        return type.__new__(cls, name, bases, dct)
That's it. Use as follows:
class Foo(object):
    __metaclass__ = TaggingMetaclass
    pass

class Baz(Foo):
    "It's enough for a base class to have the right metaclass"

    @TaggingDecorator
    def Bar(self):
        pass

>>> Baz.my_attr
'FOO BAR'
Honestly, though? Use the supported_methods = [...] approach. Metaclasses are cool, but people who have to maintain your code after you will probably hate you.
Rather than use a metaclass, in python 2.6+ you should use a class decorator. You can wrap the function and class decorators up as methods of a class, like this real-world example.
I use this example with djcelery; the important aspects for this problem are the "task" method and the line args, kw = self.marked[klass.__dict__[attr]], which implicitly checks for klass.__dict__[attr] in self.marked. If you want to use @methodtasks.task instead of @methodtasks.task() as a decorator, you could remove the nested def and use a set instead of a dict for self.marked. The use of self.marked, instead of setting a marking attribute on the function as the other answer did, allows this to work for classmethods and staticmethods which, because they use slots, won't allow setting arbitrary attributes. The downside of doing it this way is that the function decorator MUST go above other decorators, and the class decorator MUST go below, so that the functions are not modified / re-wrapped between one and the other.
class DummyClass(object):
    """Just a holder for attributes."""
    pass

class MethodTasksHolder(object):
    """Register tasks with class AND method decorators, then use as a dispatcher, like so:

    methodtasks = MethodTasksHolder()

    @methodtasks.serve_tasks
    class C:
        @methodtasks.task()
        # other_decorators_come_below
        def some_task(self, *args):
            pass

        @methodtasks.task()
        @classmethod
        def classmethod_task(self, *args):
            pass

        def not_a_task(self):
            pass

    # ..later
    methodtasks.C.some_task.delay(c_instance, *args)  # always treat as unbound
    # analogous to c_instance.some_task(*args) (or C.some_task(c_instance, *args))
    # ...
    methodtasks.C.classmethod_task.delay(C, *args)  # treat as unbound classmethod!
    # analogous to C.classmethod_task(*args)
    """
    def __init__(self):
        self.marked = {}

    def task(self, *args, **kw):
        def mark(fun):
            self.marked[fun] = (args, kw)
            return fun
        return mark

    def serve_tasks(self, klass):
        setattr(self, klass.__name__, DummyClass())
        for attr in klass.__dict__:
            try:
                args, kw = self.marked[klass.__dict__[attr]]
                # 'task' here is djcelery's task decorator, imported elsewhere
                setattr(getattr(self, klass.__name__), attr,
                        task(*args, **kw)(getattr(klass, attr)))
            except KeyError:
                pass
        # reset for next class
        self.marked = {}
        return klass