Find out overridden method in parent - python

I am familiar with OOP, and understand we can inherit from a base class and extend a method like use_call_api in a child class, adding more behavior to it. But I'm wondering, is there a way that, in the parent class, we could find out:
- which methods are overridden (by child classes)
- the names of the (child) classes that have overridden the method
class Parent:
    def call_api(self):
        print("API is called")

    def use_call_api(self):
        # if it's overridden, find out in parent,
        # do something and then
        self.call_api()
        # if it's not overridden
        self.call_api()

class Child(Parent):
    def call_api(self):
        print("call_api")

class Child2(Parent):
    def call_api(self):
        print("call_api2")

class Child3(Parent):
    def call_api(self):
        print("call_api3")

    def use_call_api(self):
        print("custom call_api")

You can have a metaclass that overrides the __new__ dunder method and stores the necessary information (each method name and the classes that override it) in a class-level dictionary.
import re

class Inspect(type):
    implementations = {}

    def __new__(mcs, name, bases, namespace):
        # record every non-dunder attribute together with the class that defines it
        for attr in namespace:
            if not re.match(r"^__\w+__$", attr):
                mcs.implementations[attr] = (*mcs.implementations.get(attr, ()), name)
        return super().__new__(mcs, name, bases, namespace)
The classes (primarily the child classes of Parent) should use the Inspect metaclass.
Inspect.implementations reaches its final state once the module is imported and all classes have been defined. So you can declare an additional method on Parent that returns the list of classes overriding the current method, or simply tells you whether it was overridden at all.
import inspect

class Parent:
    @staticmethod
    def overridden() -> tuple:
        # inspect.stack()[1].function is the name of the calling method
        return Inspect.implementations.get(inspect.stack()[1].function, ())

    def call_api(self):
        print("API is called")

    def use_call_api(self):
        # if it's overridden, find out in parent,
        if self.overridden():
            print("use_call_api has been overridden in", self.overridden())
        # do something and then
        self.call_api()
        # if it's not overridden
        self.call_api()

class Child(Parent, metaclass=Inspect):
    def call_api(self):
        print("call_api")

    def use_call_api(self):
        pass

if __name__ == "__main__":
    p = Parent()
    p.use_call_api()
If you run the code above, you will see that when Child overrides the use_call_api method, the overridden() call made inside Parent's version of that method returns a tuple containing "Child", indicating that it overrides the method. If Child did not implement use_call_api, overridden() would return an empty tuple, the if self.overridden() condition would be false, and that branch would be skipped.
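If you only need the first check (was a method overridden at all), a metaclass may be overkill. A minimal sketch of an alternative, using the fact that an overriding class shadows the parent's function object (the class and method names here mirror the question; the return values replace the prints so the behavior is easy to check):

```python
class Parent:
    def call_api(self):
        return "parent api"

    def use_call_api(self):
        # call_api is overridden iff the attribute the instance's class
        # resolves is not Parent's own function object
        overridden = type(self).call_api is not Parent.call_api
        return overridden, self.call_api()

class Child(Parent):
    def call_api(self):
        return "child api"
```

Since type(self).call_api walks the MRO, the comparison also works for grandchildren; it cannot, however, list every class that overrides the method the way the metaclass registry can.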

Related

Inheritance: How to make parent method work with other parent methods?

So I have two classes, where one inherits from another and overrides all parent methods:
class Parent:
    def name(self):
        return 'parent'

    def run(self):
        print(f'calling method from parent: I am {self.name()}')

class Child(Parent):
    def name(self):
        return 'child'

    def run(self):
        print(f'calling method from child: I am {self.name()}')
        return super().run()
Running Child().run() triggers the run method for both child and parent, and the output is:
calling method from child: I am child
calling method from parent: I am child
The result is clear: since we have redefined the name method, the new version is used in both run methods (I am child on both lines).
And that's the main problem: the way I want it to work is for the parent's run method to use the parent's name.
What I have accomplished so far is replacing super with self.__class__.__mro__[1], so that the method looks like:
def run(self):
    print(f'calling method from child: I am {self.name()}')
    return self.__class__.__mro__[1]().run()
The way it works is that it gets the parent class via the method resolution order and creates an instance of it. It works, and the result is now:
calling method from child: I am child
calling method from parent: I am parent
But I don't like this solution:
- Single inheritance only: since we hardcode the parent class index, we cannot make it work with several parent classes
- Using the MRO doesn't feel right for this case
- We assume __init__ doesn't take extra arguments
I think the clue is in changing self.name() in the parent method so that it uses the parent's method explicitly, but I don't know how to achieve this.
You can find the current class that the method is defined in by using __class__:
class Parent:
    def name(self):
        return 'parent'

    def run(self):
        print(f'calling method from parent: I am {__class__.name(self)}')
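Filled out with a child class (a sketch; return values replace the prints so the behavior is easy to verify), this gives the desired output, because __class__ inside a method body is a closure reference to the class the method is textually defined in, not type(self):

```python
class Parent:
    def name(self):
        return 'parent'

    def run(self):
        # __class__ here is always Parent, even when run() is reached via super()
        return f'I am {__class__.name(self)}'

class Child(Parent):
    def name(self):
        return 'child'

    def run(self):
        return f'I am {self.name()}; ' + super().run()
```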
I'd suggest making name a name-mangled (__) attribute that's private to the class:
class Parent:
    __name = 'parent'

    def run(self):
        print(f'calling method from parent: I am {self.__name}')

class Child(Parent):
    __name = 'child'

    def run(self):
        print(f'calling method from child: I am {self.__name}')
        return super().run()

Child().run()
# calling method from child: I am child
# calling method from parent: I am parent

Prevent a method that is called in the parent constructor from being called in the child constructor

Suppose I have a parent class and a child class that inherits from the parent.
class Parent:
    def __init__(self):
        self.stubborn()

class Child(Parent):
    def __init__(self):
        super().__init__()
I do not want the stubborn method to be called anytime I call the parent constructor
in the child class. How do I approach this?
You can define a classmethod on Parent that checks whether or not you are in Parent, then use that to determine whether to call stubborn:
class Parent:
    def __init__(self):
        if self.is_parent():
            self.stubborn()

    @classmethod
    def is_parent(cls):
        return cls is Parent

    def stubborn(self):
        print("stubborn called")

class Child(Parent):
    pass

p = Parent()  # stubborn called
c = Child()   # no output
You wouldn't be able to do anything about it in Parent.__init__() without actually changing that function or stubborn().
But, as the child, you could stop the stubborn() method from doing anything important by temporarily making it a stub:
class Child(Parent):
    def __init__(self):
        old_stubborn = self.stubborn
        self.stubborn = lambda: None  # stub function that does nothing
        super().__init__()
        # put stubborn() back to normal
        self.stubborn = old_stubborn
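The save/patch/restore dance above can also be written with unittest.mock.patch.object, which restores the attribute automatically, even if the parent constructor raises. A sketch (the Parent/stubborn names follow the question; the calls list is an addition just to make the behavior observable):

```python
from unittest import mock

class Parent:
    def __init__(self):
        self.calls = []
        self.stubborn()

    def stubborn(self):
        self.calls.append("stubborn")

class Child(Parent):
    def __init__(self):
        # temporarily replace stubborn with a no-op for the duration of the
        # parent constructor; patch.object undoes the patch on exit
        with mock.patch.object(self, "stubborn", lambda: None):
            super().__init__()
```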

In Python2, how to force a child class method to call a parent method without explicitly requiring the end user to include it?

A parent class I'm writing requires some specific internal cleanup after usage. The child class has its own cleanup to do, but the parent's cleanup function must be run afterward. Obviously, a call to super would solve this, but I'd like this to be as simple as possible on the child class's side.
I tried decorating the parent method. This did not work.
# The parent class whose inner workings I don't expect the end user to understand
class ParentClass(object):
    def __init__(self, *args, **kwargs):
        self._personal_message = "Parent class says:"
        self._important_message = "I'm important!"

    # The method that NEEDS to be run in all instances of ParentClass and its subclasses
    def _important_method(self):
        print(self._important_message)

    # The decorator I thought would work
    def _pretty_decoration(func):
        def func_wrapper(self):
            func_self = func(self)
            self._important_method()
            return func_self
        return func_wrapper

    # The decorated function that will be overridden by the child class
    @_pretty_decoration
    def do_something(self):
        print(self._personal_message)

    # Make the decorator static
    _pretty_decoration = staticmethod(_pretty_decoration)

# The blissfully naive child class
class ChildClass(ParentClass):
    def __init__(self, *args, **kwargs):
        super(ChildClass, self).__init__(*args, **kwargs)
        self._personal_message = "Child class says:"

    # The overriding method
    def do_something(self):
        print(self._personal_message)
        self.do_something_else()

    def do_something_else(self):
        print("I am blissfully naive.")

# The test drive
parent = ParentClass()
parent.do_something()
child = ChildClass()
child.do_something()
In this example, I'm getting:
Parent class says:
I'm important!
Child class says:
I am blissfully naive.
whereas I was hoping to get:
Parent class says:
I'm important!
Child class says:
I am blissfully naive.
I'm important!
What should I be doing for the expected result?
Rather than overriding the method, defer the real work to a callback method called from do_something (the classic template method pattern). Then there is no reason to override do_something, and you can put the call to _important_method directly in its body.
class ParentClass(object):
    def __init__(self, *args, **kwargs):
        self._personal_message = "Parent class says:"
        self._important_message = "I'm important!"

    # The method that NEEDS to be run in all instances
    # of ParentClass and its subclasses
    def _important_method(self):
        print(self._important_message)

    # This doesn't get overridden; it's a fixed entry point to do_body
    def do_something(self):
        self.do_body()
        self._important_method()

    # This shouldn't (need to) be called directly
    def do_body(self):
        print(self._personal_message)

class ChildClass(ParentClass):
    def do_body(self):
        print(self._personal_message)  # or super().do_body()
        self.do_something_else()

    def do_something_else(self):
        print("I am blissfully naive.")
Then the following still works:
child = ChildClass()
child.do_something()

Parent method with extra arguments in Python

The parent class has a static, abstract method called deserialize that takes one argument. Each child class implements that method. Now I have a situation where one child class needs more than one argument. When I add options=None to the parent class, the other child classes warn that they have a different signature. I would have to add options=None to each class, which is a refactoring. I want to know whether I can suppress the warning and continue, or whether there is a better solution. Or do I have to refactor?
from abc import ABCMeta, abstractmethod

class Serializable:
    __metaclass__ = ABCMeta

    @staticmethod
    @abstractmethod
    def deserialize(json_obj, options=None):
        pass

class ChildWithNoExtraArguments(Serializable):
    # warning is here...
    @staticmethod
    def deserialize(json_obj):
        pass  # some implementation

class ChildWithExtraArguments(Serializable):
    @staticmethod
    def deserialize(json_obj, options):
        pass  # some implementation, I need options
You need to decorate your child classes' deserialize implementations with @staticmethod too. The warning you're seeing is because Python automatically adds self to each method call; decorating them with @staticmethod stops this behavior.
Additionally, your second implementation needs to define options as a keyword argument. Keyword arguments have default values, for instance options=None.
from abc import ABCMeta, abstractmethod

class Serializable:
    __metaclass__ = ABCMeta

    @staticmethod
    @abstractmethod
    def deserialize(json_obj, options=None):
        pass

class ChildWithNoExtraArguments(Serializable):
    @staticmethod
    def deserialize(json_obj, options=None):
        pass  # some implementation

class ChildWithExtraArguments(Serializable):
    @staticmethod
    def deserialize(json_obj, options=None):
        pass  # some implementation, I need options
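If refactoring is on the table, another signature-stable option (a sketch, not the only way; the child class names and the upper_keys option are hypothetical) is to accept the extras as **options everywhere, so every override has the same shape:

```python
from abc import ABCMeta, abstractmethod

class Serializable(metaclass=ABCMeta):
    @staticmethod
    @abstractmethod
    def deserialize(json_obj, **options):
        raise NotImplementedError

class PlainChild(Serializable):
    @staticmethod
    def deserialize(json_obj, **options):
        # ignores any options it doesn't understand
        return dict(json_obj)

class OptionAwareChild(Serializable):
    @staticmethod
    def deserialize(json_obj, **options):
        data = dict(json_obj)
        if options.get("upper_keys"):
            data = {key.upper(): value for key, value in data.items()}
        return data
```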

how to make child class call parent class __init__ automatically?

I had a class called CacheObject, and many classes extend from it.
Now I need to add something common to all classes derived from it, so I wrote this:
class CacheObject(object):
    def __init__(self):
        self.updatedict = dict()
but the child classes didn't obtain the updatedict attribute. I know calling the super init function is optional in Python, but is there an easy way to force all of them to call it, rather than walking through all the classes and modifying them one by one?
I was in a situation where I wanted classes to always call their base classes' constructors, in order, before their own. The following Python 3 code should do what you want:
class meta(type):
    def __init__(cls, name, bases, dct):
        def auto__call__init__(self, *a, **kw):
            for base in cls.__bases__:
                base.__init__(self, *a, **kw)
            cls.__init__child_(self, *a, **kw)
        cls.__init__child_ = cls.__init__
        cls.__init__ = auto__call__init__

class A(metaclass=meta):
    def __init__(self):
        print("Parent")

class B(A):
    def __init__(self):
        print("Child")
To illustrate, it will behave as follows:
>>> B()
Parent
Child
<__main__.B object at 0x000001F8EF251F28>
>>> A()
Parent
<__main__.A object at 0x000001F8EF2BB2B0>
I suggest a non-code fix:
Document that super().__init__() should be called by your subclasses before they use any other methods defined in it.
This is not an uncommon restriction. See, for instance, the documentation for threading.Thread in the standard library, which says:
If the subclass overrides the constructor, it must make sure to invoke the base class constructor (Thread.__init__()) before doing anything else to the thread.
There are probably many other examples, I just happened to have that doc page open.
You can override __new__. As long as your base classes don't override __new__ without calling super().__new__, you'll be fine.
class CacheObject(object):
    def __new__(cls, *args, **kwargs):
        # object.__new__ takes no extra arguments, so don't forward them
        instance = super().__new__(cls)
        instance.updatedict = {}
        return instance

class Foo(CacheObject):
    def __init__(self):
        pass
However, as some commenters said, the motivation for this seems a little shady. You should perhaps just add the super calls instead.
This isn't what you asked for, but how about making updatedict a property, so that it doesn't need to be set in __init__:
class CacheObject(object):
    @property
    def updatedict(self):
        try:
            return self._updatedict
        except AttributeError:
            self._updatedict = dict()
            return self._updatedict
Hopefully this achieves the real goal, that you don't want to have to touch every subclass (other than to make sure none uses an attribute called updatedict for something else, of course).
There are some odd gotchas, though, because it is different from setting updatedict in __init__ as in your question. For example, the content of CacheObject().__dict__ is different. It has no key updatedict because I've put that key in the class, not in each instance.
Regardless of motivation, another option is to use __init_subclass__() (Python 3.6+) to get this kind of behavior. (For example, I'm using it because I want users not familiar with the intricacies of Python to be able to inherit from a class to create specific engineering models, and I'm trying to keep the structure of the class they have to define very basic.)
In the case of your example,
from functools import wraps

class CacheObject:
    def __init__(self) -> None:
        self.updatedict = dict()

    def __init_subclass__(cls) -> None:
        orig_init = cls.__init__

        @wraps(orig_init)
        def __init__(self, *args, **kwargs):
            orig_init(self, *args, **kwargs)
            super(self.__class__, self).__init__()

        cls.__init__ = __init__
What this does is: any class that subclasses CacheObject will, when created, have its __init__ function wrapped by the parent class. We replace it with a new function that calls the original and then calls the parent's __init__ via super(). So now, even if the child class overrides the parent's __init__, at instance creation time its __init__ is wrapped by a function that calls it and then calls its parent's.
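One caveat with the snippet above: super(self.__class__, self) recurses forever if the subclass is itself subclassed. A variant sketch that sidesteps this by naming the base explicitly (Model is a hypothetical subclass added for illustration):

```python
from functools import wraps

class CacheObject:
    def __init__(self):
        self.updatedict = dict()

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        orig_init = cls.__init__

        @wraps(orig_init)
        def wrapped_init(self, *args, **kw):
            orig_init(self, *args, **kw)
            CacheObject.__init__(self)  # explicit base call, no MRO lookup
        cls.__init__ = wrapped_init

class Model(CacheObject):
    def __init__(self):
        self.name = "model"  # note: never calls super().__init__()
```

Because the base initializer is named directly, deeper subclasses wrap their own __init__ without re-entering the wrapper through the MRO.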
You can add a decorator to your classes:
def my_decorator(cls):
    old_init = cls.__init__

    def new_init(self):
        self.updatedict = dict()
        old_init(self)
    cls.__init__ = new_init
    return cls

@my_decorator
class SubClass(CacheObject):
    pass
If you want to add the decorator to all the subclasses automatically, use a metaclass:
class myMeta(type):
    def __new__(cls, name, parents, dct):
        return my_decorator(super().__new__(cls, name, parents, dct))

class CacheObject(object, metaclass=myMeta):
    pass
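Putting the decorator and the metaclass together, a self-contained sketch (here new_init forwards *args so subclasses with their own constructor arguments still work; SubClass and its label parameter are hypothetical):

```python
def my_decorator(cls):
    old_init = cls.__init__

    def new_init(self, *args, **kwargs):
        self.updatedict = dict()  # the common attribute every class should get
        old_init(self, *args, **kwargs)
    cls.__init__ = new_init
    return cls

class myMeta(type):
    def __new__(mcs, name, parents, dct):
        # decorate every class created with this metaclass, subclasses included
        return my_decorator(super().__new__(mcs, name, parents, dct))

class CacheObject(metaclass=myMeta):
    pass

class SubClass(CacheObject):
    def __init__(self, label):
        self.label = label  # no explicit super().__init__() needed
```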
