I have a base class that looks something like this:
class myBaseClass:
    def __init__(self):
        self.name = None  # All subclasses must define this

    def foo(self):  # All subclasses must define this
        raise NotImplementedError

    def bar(self):  # Optional -- not all subclasses will define this
        raise NotImplementedError
My API specification stipulates that anyone creating a subclass of myBaseClass must provide a meaningful value for .name and a meaningful implementation of .foo(). However, .bar() is optional, and calling code should be able to handle the case where calling it raises NotImplementedError.
When and how should I check that subclasses contributed by third parties meet these requirements?
The options seem to be:
Build subclasses exclusively via metaclasses. However, this approach will be unfamiliar and potentially confusing to most of the contributors to my project, who tend not to be expert developers.
Add an __init_subclass__ method to the base class and use this to infer whether the subclass has overridden everything it is supposed to override (see the sketch after this list). Seems to work, but feels a bit 'kludgy'.
Write build-time tests to instantiate each subclass, call each 'required' method, and verify that they do not raise a NotImplementedError. Seems like an excessive computational effort to answer such a simple question (calling .foo() may be expensive).
Ignore the issue. Deal with it if and when it causes something else to break.
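For reference, option 2 can be quite compact. A minimal __init_subclass__ sketch (reusing the names above; the .name check is omitted because an instance attribute set in __init__ isn't visible at class-creation time):

class myBaseClass:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Reject subclasses that inherit foo() unchanged.
        if cls.foo is myBaseClass.foo:
            raise TypeError(f"{cls.__name__} must override foo()")

    def foo(self):
        raise NotImplementedError

    def bar(self):  # optional -- subclasses may leave this unimplemented
        raise NotImplementedError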
I'm sure I'm not the only person who needs to deal with this issue - is there a 'correct' approach here?
Here's how I would structure it.
First off, what you're looking for here is an abstract base class. Using the built-in abc module you can easily define it as such and force methods to be implemented; otherwise the class will raise an exception when instantiated.
If the name attribute needs to be set always, then you should make it part of the constructor arguments.
Because bar is not always required I wouldn't define it as a method in the base class you have. Instead I would make a child class that is also abstract and define it there as required. When checking to see if the method is available you can use isinstance.
This is what my final code would look like:
from abc import ABC, abstractmethod

class FooBaseClass(ABC):
    def __init__(self, name):
        self.name = name

    @abstractmethod
    def foo(self):
        """Some useful docs for foo"""

class FooBarBaseClass(FooBaseClass, ABC):
    @abstractmethod
    def bar(self):
        """Some useful docs for bar"""
When creating instances you can pick the base class you want and will be forced to define the methods.
class FooClass(FooBaseClass):
    def __init__(self):
        super().__init__("foo")

    def foo(self):
        print("Calling foo from FooClass")

class FooBarClass(FooBarBaseClass):
    def __init__(self):
        super().__init__("foobar")

    def foo(self):
        print("Calling foo from FooBarClass")

    def bar(self):
        print("Calling bar from FooBarClass")
Example checking if bar is callable:
def do_operation(obj: FooBaseClass):
    obj.foo()
    if isinstance(obj, FooBarBaseClass):
        obj.bar()
Example:
do_operation(FooClass())
do_operation(FooBarClass())
Calling foo from FooClass
Calling foo from FooBarClass
Calling bar from FooBarClass
An example of invalid code:
class InvalidClass(FooBaseClass):
    def __init__(self):
        super().__init__("foo")

InvalidClass()
Traceback (most recent call last):
  File "C:\workspace\so\test.py", line 52, in <module>
    InvalidClass()
TypeError: Can't instantiate abstract class InvalidClass with abstract method foo
There are two main ways for a derived class to call a base class's methods.
Base.method(self):
class Derived(Base):
    def method(self):
        Base.method(self)
        ...
or super().method():
class Derived(Base):
    def method(self):
        super().method()
        ...
Suppose I now do this:
obj = Derived()
obj.method()
As far as I know, both Base.method(self) and super().method() do the same thing. Both will call Base.method with a reference to obj. In particular, super() doesn't do the legwork to instantiate an object of type Base. Instead, it creates a new object of type super and grafts the instance attributes from obj onto it, then it dynamically looks up the right attribute from Base when you try to get it from the super object.
The super() method has the advantage of minimizing the work you need to do when you change the base for a derived class. On the other hand, Base.method uses less magic and may be simpler and clearer when a class inherits from multiple base classes.
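A quick way to see that super() hands back a proxy object rather than a Base instance (a small sketch):

class Base:
    def method(self):
        return "Base.method"

class Derived(Base):
    def method(self):
        s = super()
        print(type(s))   # <class 'super'>
        return s.method()

print(Derived().method())  # Base.method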
Most of the discussions I've seen recommend calling super(), but is this an established standard among Python coders? Or are both of these methods widely used in practice? For example, answers to this stackoverflow question go both ways, but generally use the super() method. On the other hand, the Python textbook I am teaching from this semester only shows the Base.method approach.
Using super() implies the idea that whatever follows should be delegated to the base class, no matter what it is. It's about the semantics of the statement. Referring explicitly to Base on the other hand conveys the idea that Base was chosen explicitly for some reason (perhaps unknown to the reader), which might have its applications too.
Apart from that however there is a very practical reason for using super(), namely cooperative multiple inheritance. Suppose you've designed the following class hierarchy:
class Base:
    def test(self):
        print('Base.test')

class Foo(Base):
    def test(self):
        print('Foo.test')
        Base.test(self)

class Bar(Base):
    def test(self):
        print('Bar.test')
        Base.test(self)
Now you can use both Foo and Bar and everything works as expected. However, these two classes won't work together in a multiple-inheritance scheme:
class Test(Foo, Bar):
    pass

Test().test()
# Output:
# Foo.test
# Base.test
That last call to test skips over Bar's implementation since Foo didn't specify that it wants to delegate to the next class in method resolution order but instead explicitly specified Base. Using super() resolves this issue:
class Base:
    def test(self):
        print('Base.test')

class Foo(Base):
    def test(self):
        print('Foo.test')
        super().test()

class Bar(Base):
    def test(self):
        print('Bar.test')
        super().test()

class Test(Foo, Bar):
    pass

Test().test()
# Output:
# Foo.test
# Bar.test
# Base.test
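The method resolution order makes that delegation chain explicit:

print(Test.__mro__)
# (<class '__main__.Test'>, <class '__main__.Foo'>, <class '__main__.Bar'>,
#  <class '__main__.Base'>, <class 'object'>)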
I want to figure out the type of the class in which a certain method is defined (in essence, the enclosing static scope of the method), from within the method itself, and without specifying it explicitly, e.g.
class SomeClass:
    def do_it(self):
        cls = enclosing_class()  # <-- I need this.
        print(cls)

class DerivedClass(SomeClass):
    pass

obj = DerivedClass()
# I want this to print 'SomeClass'.
obj.do_it()
Is this possible?
If you need this in Python 3.x, please see my other answer—the closure cell __class__ is all you need.
If you need to do this in CPython 2.6-2.7, RickyA's answer is close, but it doesn't work, because it relies on the fact that this method is not overriding any other method of the same name. Try adding a Foo.do_it method in his answer, and it will print out Foo, not SomeClass.
The way to solve that is to find the method whose code object is identical to the current frame's code object:
import inspect

def do_it(self):
    mro = inspect.getmro(self.__class__)
    method_code = inspect.currentframe().f_code
    method_name = method_code.co_name
    for base in reversed(mro):
        try:
            if getattr(base, method_name).func_code is method_code:
                print(base.__name__)
                break
        except AttributeError:
            pass
(Note that the AttributeError could be raised either by base not having something named do_it, or by base having something named do_it that isn't a function, and therefore doesn't have a func_code. But we don't care which; either way, base is not the match we're looking for.)
This may work in other Python 2.6+ implementations. Python does not require frame objects to exist, and if they don't, inspect.currentframe() will return None. And I'm pretty sure it doesn't require code objects to exist either, which means func_code could be None.
Meanwhile, if you want to use this in both 2.7+ and 3.0+, change that func_code to __code__, but that will break compatibility with earlier 2.x.
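If you need a single body that works on both sides of that divide, a hedged sketch is to look the attribute up at runtime:

def get_code(func):
    # __code__ exists on CPython 2.6+ and 3.x; func_code covers older 2.x.
    return getattr(func, '__code__', None) or getattr(func, 'func_code', None)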
If you need CPython 2.5 or earlier, you can just replace the inspect calls with the implementation-specific CPython attributes:
import sys

def do_it(self):
    mro = self.__class__.mro()
    method_code = sys._getframe().f_code
    method_name = method_code.co_name
    for base in reversed(mro):
        try:
            if getattr(base, method_name).func_code is method_code:
                print(base.__name__)
                break
        except AttributeError:
            pass
Note that this use of mro() will not work on classic classes; if you really want to handle those (which you really shouldn't want to…), you'll have to write your own mro function that just walks the hierarchy old-school… or just copy it from the 2.6 inspect source.
This will only work in Python 2.x implementations that bend over backward to be CPython-compatible… but that includes at least PyPy. inspect should be more portable, but then if an implementation is going to define frame and code objects with the same attributes as CPython's so it can support all of inspect, there's not much good reason not to make them attributes and provide sys._getframe in the first place…
First, this is almost certainly a bad idea, and not the way you want to solve whatever you're trying to solve but refuse to tell us about…
That being said, there is a very easy way to do it, at least in Python 3.0+. (If you need 2.x, see my other answer.)
Notice that Python 3.x's super pretty much has to be able to do this somehow. How else could super() mean super(THISCLASS, self), where that THISCLASS is exactly what you're asking for?*
Now, there are lots of ways that super could be implemented… but PEP 3135 spells out a specification for how to implement it:
Every function will have a cell named __class__ that contains the class object that the function is defined in.
This isn't part of the Python reference docs, so some other Python 3.x implementation could do it a different way… but at least as of 3.2+, they still have to have __class__ on functions, because Creating the class object explicitly says:
This class object is the one that will be referenced by the zero-argument form of super(). __class__ is an implicit closure reference created by the compiler if any methods in a class body refer to either __class__ or super. This allows the zero argument form of super() to correctly identify the class being defined based on lexical scoping, while the class or instance that was used to make the current call is identified based on the first argument passed to the method.
(And, needless to say, this is exactly how at least CPython 3.0-3.5 and PyPy3 2.0-2.1 implement super anyway.)
In [1]: class C:
   ...:     def f(self):
   ...:         print(__class__)

In [2]: class D(C):
   ...:     pass

In [3]: D().f()
<class '__main__.C'>
Of course this gets the actual class object, not the name of the class, which is apparently what you were after. But that's easy; you just need to decide whether you mean __class__.__name__ or __class__.__qualname__ (in this simple case they're identical) and print that.
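For a case where the two differ, consider a nested class (illustrative names):

class Outer:
    class Inner:
        def which(self):
            print(__class__.__name__)      # Inner
            print(__class__.__qualname__)  # Outer.Inner

Outer.Inner().which()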
* In fact, this was one of the arguments against it: that the only plausible way to do this without changing the language syntax was to add a new closure cell to every function, or to require some horrible frame hacks which may not even be doable in other implementations of Python. You can't just use compiler magic, because there's no way the compiler can tell that some arbitrary expression will evaluate to the super function at runtime…
If you can use @abarnert's method, do it.
Otherwise, you can use some hardcore introspection (for python2.7):
import inspect

# getMethodClass comes from https://stackoverflow.com/a/22898743/2096752
# (the "import" below is pseudo-code from the original answer; paste that
# helper in manually):
# from http://stackoverflow.com/a/22898743/2096752 import getMethodClass

def enclosing_class():
    frame = inspect.currentframe().f_back
    caller_self = frame.f_locals['self']
    caller_method_name = frame.f_code.co_name
    return getMethodClass(caller_self.__class__, caller_method_name)
class SomeClass:
    def do_it(self):
        print(enclosing_class())

class DerivedClass(SomeClass):
    pass

DerivedClass().do_it()  # prints 'SomeClass'
Obviously, this is likely to raise an error if:
called from a regular function / staticmethod / classmethod
the calling function has a different name for self (as aptly pointed out by @abarnert, this can be solved by using frame.f_code.co_varnames[0])
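That second fix is a one-line change to enclosing_class (a sketch, reusing the frame variable from above):

# Use the first positional parameter's name instead of assuming 'self':
caller_self = frame.f_locals[frame.f_code.co_varnames[0]]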
Sorry for writing yet another answer, but here's how to do what you actually want to do, rather than what you asked for:
this is about adding instrumentation to a code base to be able to generate reports of method invocation counts, for the purpose of checking certain approximate runtime invariants (e.g. "the number of times that method ClassA.x() is executed is approximately equal to the number of times that method ClassB.y() is executed in the course of a run of a complicated program").
The way to do that is to make your instrumentation function inject the information statically. After all, it has to know the class and method it's injecting code into.
I will have to instrument many classes by hand, and to prevent mistakes I want to avoid typing the class names everywhere. In essence, it's the same reason why typing super() is preferable to typing super(ClassX, self).
If your instrumentation plan is "do it manually", the very first thing you want to do is turn it into an actual function instead of doing it manually. Since you obviously only need static injection, using a decorator, either on the class (if you want to instrument every method) or on each method (if you don't), would make this nice and readable. (Or, if you want to instrument every method of every class, you might want to define a metaclass and have your root classes use it, instead of decorating every class.)
For example, here's an easy way to instrument every method of a class:
import collections
import functools
import inspect

_calls = {}

def inject(cls):
    cls._calls = collections.Counter()
    _calls[cls.__name__] = cls._calls
    for name, method in cls.__dict__.items():
        if inspect.isfunction(method):
            # Bind the loop variables as defaults so each wrapper keeps
            # its own name/method instead of closing over the last ones.
            @functools.wraps(method)
            def wrapper(*args, _name=name, _method=method, **kwargs):
                cls._calls[_name] += 1
                return _method(*args, **kwargs)
            setattr(cls, name, wrapper)
    return cls
@inject
class A(object):
    def f(self):
        print('A.f here')

@inject
class B(A):
    def f(self):
        print('B.f here')

@inject
class C(B):
    pass

@inject
class D(C):
    def f(self):
        print('D.f here')

d = D()
d.f()
B.f(d)
print(_calls)
The output:
{'A': Counter(),
'C': Counter(),
'B': Counter({'f': 1}),
'D': Counter({'f': 1})}
Exactly what you wanted, right?
You can either do what @mgilson suggested or take another approach.
class SomeClass:
    pass

class DerivedClass(SomeClass):
    pass
This makes SomeClass the base class for DerivedClass.
When you try to get __class__.__name__ in the usual way, it will refer to the derived class rather than the parent.
When you call do_it(), it's really passing DerivedClass as self, which is why you are most likely getting DerivedClass being printed.
Instead, try this:
class SomeClass:
    pass

class DerivedClass(SomeClass):
    def do_it(self):
        for base in self.__class__.__bases__:
            print base.__name__

obj = DerivedClass()
obj.do_it()  # Prints SomeClass
Edit:
After reading your question a few more times I think I understand what you want.
class SomeClass:
    def do_it(self):
        cls = self.__class__.__bases__[0].__name__
        print cls

class DerivedClass(SomeClass):
    pass

obj = DerivedClass()
obj.do_it()  # prints SomeClass
[Edited]
A somewhat more generic solution:
import inspect

class Foo:
    pass

class SomeClass(Foo):
    def do_it(self):
        mro = inspect.getmro(self.__class__)
        method_name = inspect.currentframe().f_code.co_name
        for base in reversed(mro):
            if hasattr(base, method_name):
                print(base.__name__)
                break

class DerivedClass(SomeClass):
    pass

class DerivedClass2(DerivedClass):
    pass
DerivedClass().do_it()
>> 'SomeClass'
DerivedClass2().do_it()
>> 'SomeClass'
SomeClass().do_it()
>> 'SomeClass'
This fails when some other class in the MRO has an attribute named "do_it", since that name is the signal to stop walking the MRO.
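To make the failure mode concrete (hypothetical): if the unrelated root class happened to define the same name, the walk would stop there first:

class Foo:
    def do_it(self):  # unrelated method that happens to share the name
        pass

# Now DerivedClass().do_it() prints 'Foo' instead of 'SomeClass', because
# hasattr(Foo, 'do_it') is true and Foo comes earlier in reversed(mro).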
Suppose I want to create an abstract class in Python with some methods to be implemented by subclasses, for example:
class Base():
    def f(self):
        print "Hello."
        self.g()
        print "Bye!"

class A(Base):
    def g(self):
        print "I am A"

class B(Base):
    def g(self):
        print "I am B"
I'd like that, if the base class is instantiated and its f() method called, the call to self.g() throws an exception telling you that a subclass should have implemented method g().
What's the usual thing to do here? Should I raise a NotImplementedError? or is there a more specific way of doing it?
In Python 2.6 and better, you can use the abc module to make Base an "actually" abstract base class:
import abc

class Base:
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def g(self):
        pass

    def f(self):  # &c
this guarantees that Base cannot be instantiated -- and neither can any subclass which fails to override g -- while meeting @Aaron's target of allowing subclasses to use super in their g implementations. Overall, a much better solution than what we used to have in Python 2.5 and earlier!
Side note: having Base inherit from object would be redundant, because the metaclass needs to be set explicitly anyway.
Make a method that does nothing, but still has a docstring explaining the interface. Getting a NameError is confusing, and raising NotImplementedError (or any other exception, for that matter) will break proper usage of super.
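A minimal sketch of the super() breakage Aaron describes, assuming cooperative subclasses that delegate upward:

class Root:
    def g(self):
        raise NotImplementedError

class A(Root):
    def g(self):
        print("A.g")
        super().g()  # the cooperative chain ends at Root.g and blows up

A().g()  # prints "A.g", then raises NotImplementedError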
Peter Norvig has given a solution for this in his Python Infrequently Asked Questions list. I'll reproduce it here. Do check out the IAQ, it is very useful.
## Python
class MyAbstractClass:
    def method1(self): abstract()

class MyClass(MyAbstractClass):
    pass

def abstract():
    import inspect
    caller = inspect.getouterframes(inspect.currentframe())[1][3]
    raise NotImplementedError(caller + ' must be implemented in subclass')
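Calling the unimplemented method then reports the offender by name:

MyClass().method1()
# NotImplementedError: method1 must be implemented in subclass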
In Java, for example, the @Override annotation not only provides compile-time checking of an override but makes for excellent self-documenting code.
I'm just looking for documentation (although if it's an indicator to some checker like pylint, that's a bonus). I can add a comment or docstring somewhere, but what is the idiomatic way to indicate an override in Python?
Based on this and fwc's answer I created a pip installable package https://github.com/mkorpela/overrides
From time to time I end up here looking at this question.
Mainly this happens after (again) seeing the same bug in our code base: someone has forgotten some "interface"-implementing class while renaming a method in the "interface".
Well Python ain't Java but Python has power -- and explicit is better than implicit -- and there are real concrete cases in the real world where this thing would have helped me.
So here is a sketch of an overrides decorator. It will check that the class given as a parameter has an attribute with the same name as the method being decorated.
If you can think of a better solution please post it here!
def overrides(interface_class):
    def overrider(method):
        assert method.__name__ in dir(interface_class)
        return method
    return overrider
It works as follows:
class MySuperInterface(object):
    def my_method(self):
        print 'hello world!'

class ConcreteImplementer(MySuperInterface):
    @overrides(MySuperInterface)
    def my_method(self):
        print 'hello kitty!'
and if you do a faulty version it will raise an assertion error during class loading:
class ConcreteFaultyImplementer(MySuperInterface):
    @overrides(MySuperInterface)
    def your_method(self):
        print 'bye bye!'

>> AssertionError!!!!!!!
Here's an implementation that doesn't require specification of the interface_class name.
import inspect
import re

def overrides(method):
    # actually can't do this because a method is really just a function while inside a class def'n
    # assert(inspect.ismethod(method))

    stack = inspect.stack()
    base_classes = re.search(r'class.+\((.+)\)\s*\:', stack[2][4][0]).group(1)

    # handle multiple inheritance
    base_classes = [s.strip() for s in base_classes.split(',')]
    if not base_classes:
        raise ValueError('overrides decorator: unable to determine base class')

    # stack[0]=overrides, stack[1]=inside class def'n, stack[2]=outside class def'n
    derived_class_locals = stack[2][0].f_locals

    # replace each class name in base_classes with the actual class type
    for i, base_class in enumerate(base_classes):
        if '.' not in base_class:
            base_classes[i] = derived_class_locals[base_class]
        else:
            components = base_class.split('.')
            # obj is either a module or a class
            obj = derived_class_locals[components[0]]
            for c in components[1:]:
                assert inspect.ismodule(obj) or inspect.isclass(obj)
                obj = getattr(obj, c)
            base_classes[i] = obj

    assert any(hasattr(cls, method.__name__) for cls in base_classes)
    return method
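Usage then drops the explicit base-class argument (a sketch reusing MySuperInterface from the earlier example):

class ConcreteImplementer(MySuperInterface):
    @overrides
    def my_method(self):
        pass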
If you want this for documentation purposes only, you can define your own override decorator:
def override(f):
    return f

class MyClass(BaseClass):
    @override
    def method(self):
        pass
This is really nothing but eye candy, unless you create override(f) in such a way that it actually checks for an override.
But then, this is Python, why write it like it was Java?
Improvising on @mkorpela's great answer, here is a version with more precise checks, naming, and raised Error objects:
def overrides(interface_class):
    """
    Function override annotation.
    Corollary to @abc.abstractmethod where the override is not of an
    abstractmethod.
    Modified from answer https://stackoverflow.com/a/8313042/471376
    """
    def confirm_override(method):
        if method.__name__ not in dir(interface_class):
            raise NotImplementedError('function "%s" is an @override but that'
                                      ' function is not implemented in base'
                                      ' class %s'
                                      % (method.__name__, interface_class))

        def func():
            pass

        attr = getattr(interface_class, method.__name__)
        if type(attr) is not type(func):
            raise NotImplementedError('function "%s" is an @override'
                                      ' but that is implemented as type %s'
                                      ' in base class %s, expected implemented'
                                      ' type %s'
                                      % (method.__name__, type(attr),
                                         interface_class, type(func)))
        return method
    return confirm_override
Here is what it looks like in practice:
NotImplementedError "not implemented in base class"
class A(object):
    # ERROR: `a` is not implemented!
    pass

class B(A):
    @overrides(A)
    def a(self):
        pass
results in a more descriptive NotImplementedError:
function "a" is an @override but that function is not implemented in base class <class '__main__.A'>
full stack
Traceback (most recent call last):
  …
  File "C:/Users/user1/project.py", line 135, in <module>
    class B(A):
  File "C:/Users/user1/project.py", line 136, in B
    @overrides(A)
  File "C:/Users/user1/project.py", line 110, in confirm_override
    interface_class)
NotImplementedError: function "a" is an @override but that function is not implemented in base class <class '__main__.A'>
NotImplementedError "expected implemented type"
class A(object):
    # ERROR: `a` is not a function!
    a = ''

class B(A):
    @overrides(A)
    def a(self):
        pass

results in a more descriptive NotImplementedError:
function "a" is an @override but that is implemented as type <class 'str'> in base class <class '__main__.A'>, expected implemented type <class 'function'>
full stack
Traceback (most recent call last):
  …
  File "C:/Users/user1/project.py", line 135, in <module>
    class B(A):
  File "C:/Users/user1/project.py", line 136, in B
    @overrides(A)
  File "C:/Users/user1/project.py", line 125, in confirm_override
    type(func))
NotImplementedError: function "a" is an @override but that is implemented as type <class 'str'> in base class <class '__main__.A'>, expected implemented type <class 'function'>
The great thing about @mkorpela's answer is that the check happens during some initialization phase. The check does not need to be "run". Referring to the prior examples, class B is never instantiated (B()), yet the NotImplementedError will still raise. This means overrides errors are caught sooner.
Python ain't Java. There's of course no such thing really as compile-time checking.
I think a comment in the docstring is plenty. This allows any user of your method to type help(obj.method) and see that the method is an override.
You can also explicitly extend an interface with class Foo(Interface), which will allow users to type help(Interface.method) to get an idea about the functionality your method is intended to provide.
Like others have said, unlike Java there is no @Override tag; however, as shown above, you can create your own using decorators. I would suggest using the getattr() built-in instead of poking at the class's internal dict, so you get something like the following:

def Override(superClass):
    def method(func):
        getattr(superClass, func.__name__)  # raises AttributeError if absent
        return func
    return method

If you want, you can catch getattr() in your own try/except and raise your own error, but I think the getattr approach is better in this case.
Also, this catches all items bound to a class, including class methods and variables.
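For example (hypothetical classes), a misspelled override then fails at class-definition time:

class Base(object):
    def run(self):
        pass

class Impl(Base):
    @Override(Base)
    def runn(self):  # AttributeError: type object 'Base' has no attribute 'runn'
        pass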
Based on @mkorpela's great answer, I've written a similar package (ipromise pypi github) that does many more checks:
Suppose A inherits from B and C, B inherits from C.
Module ipromise checks that:
If A.f overrides B.f, B.f must exist, and A must inherit from B. (This is the check from the overrides package).
You don't have the pattern A.f declares that it overrides B.f, which then declares that it overrides C.f. A should say that it overrides from C.f since B might decide to stop overriding this method, and that should not result in downstream updates.
You don't have the pattern A.f declares that it overrides C.f, but B.f does not declare its override.
You don't have the pattern A.f declares that it overrides C.f, but B.f declares that it overrides from some D.f.
It also has various features for marking and checking implementing an abstract method.
You can use protocols from PEP 544. With this method, the interface-implementation relation is declared only at the use site.
Assuming you already have an implementation (let's call it MyFoobar), you define an interface (a Protocol), which has the signatures of all the methods and fields of your implementation, let's call that IFoobar.
Then, at the use site, you declare the implementation instance binding to have the interface type e.g. myFoobar: IFoobar = MyFoobar(). Now, if you use a field/method that is missing in the interface, Mypy will complain at the use site (even if it would work at runtime!). If you failed to implement a method from the interface in the implementation, Mypy will also complain. Mypy won't complain if you implement something that doesn't exist in the interface. But that case is rare, since the interface definition is compact and easy to review. You wouldn't be able to actually use that code, since Mypy would complain.
Now, this won't cover cases where you have implementations both in the superclass and the implementing class, like some uses of ABC. But override is used in Java even with no implementation in the interface. This solution covers that case.
from typing import Protocol

class A(Protocol):
    def b(self):
        ...

    def d(self):  # we forgot to implement this in C
        ...

class C:
    def b(self):
        return 0

bob: A = C()
Type checking results in:
test.py:13: error: Incompatible types in assignment (expression has type "C", variable has type "A")
test.py:13: note: 'C' is missing following 'A' protocol member:
test.py:13: note: d
Found 1 error in 1 file (checked 1 source file)
As of Python 3.6 and above, the functionality provided by @override can be easily implemented using the descriptor protocol of Python, namely the __set_name__ dunder method:
from functools import update_wrapper

class override:
    def __init__(self, func):
        self._func = func
        update_wrapper(self, func)

    def __get__(self, obj, obj_type):
        if obj is None:
            return self
        # Bind the wrapped function like a normal method.
        return self._func.__get__(obj, obj_type)

    def __set_name__(self, obj_type, name):
        self.validate_override(obj_type, name)

    def validate_override(self, obj_type, name):
        for parent in obj_type.__bases__:
            func = parent.__dict__.get(name, None)
            if callable(func):
                return
        # Only raise after every base class has been checked.
        raise NotImplementedError(f"{obj_type.__name__} does not override {name}")
Note that __set_name__ is called once the wrapped class is defined, and we can get the parent classes of the wrapped class through its __bases__ dunder attribute. For each of its parent classes, we check whether the wrapped function is implemented in the class by checking that the function name is in the class __dict__ and that it is a callable. Using it is as simple as:
class AbstractShoppingCartService:
    def add_item(self, request: AddItemRequest) -> Cart:
        ...

class ShoppingCartService(AbstractShoppingCartService):
    @override
    def add_item(self, request: AddItemRequest) -> Cart:
        ...
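And a failing case (hypothetical method name): if no base class defines the name, the check fires while the class body is being built (on Pythons before 3.12 the NotImplementedError surfaces wrapped in a RuntimeError raised from __set_name__):

class BrokenCartService(AbstractShoppingCartService):
    @override
    def remove_item(self, request):  # no such method in the base class
        ...
# NotImplementedError: BrokenCartService does not override remove_item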
Not only does the decorator I made check whether the name of the overriding attribute is in any superclass of the class the attribute is defined in, without your having to specify a superclass; it also checks that the overriding attribute is the same type as the overridden attribute. Class methods are treated like methods and static methods are treated like functions. This decorator works for callables, class methods, static methods, and properties.
For source code see: https://github.com/fireuser909/override
This decorator only works for classes that are instances of override.OverridesMeta, but if your class is an instance of a custom metaclass, use the create_custom_overrides_meta function to create a metaclass that is compatible with the override decorator. For tests, run the override.__init__ module.
In Python 2.6+ and Python 3.2+ you can do it (actually, simulate it: Python doesn't support function overloading, and a child class automatically overrides the parent's method). We can use decorators for this. But first, note that Python's @decorators and Java's @Annotations are totally different things: the former is a wrapper with concrete code, while the latter is a flag to the compiler.
For this, first do pip install multipledispatch
from multipledispatch import dispatch as Override
# using alias 'Override' just to give you some feel :)

class A:
    def foo(self):
        print('foo in A')

    # More methods here

class B(A):
    @Override()
    def foo(self):
        print('foo in B')

    @Override(int)
    def foo(self, a):
        print('foo in B; arg =', a)

    @Override(str, float)
    def foo(self, a, b):
        print('foo in B; arg =', (a, b))

a = A()
b = B()
a.foo()
b.foo()
b.foo(4)
b.foo('Wheee', 3.14)
output:
foo in A
foo in B
foo in B; arg = 4
foo in B; arg = ('Wheee', 3.14)
Note that you must use the decorator with parentheses here.
One thing to remember: since Python doesn't have function overloading directly, even if class B doesn't inherit from class A but needs all those foos, you still need to use @Override (though the alias 'Overload' would look better in that case).
Here is a different solution without annotation.
It has a slightly different goal in mind: while the other proposed solutions check whether the given method actually overrides a parent, this one checks whether all parent methods were overridden.
You don't have to raise an AssertionError; you can print a warning instead, or disable the check in production by checking for the environment in __init__ and returning before the check.
class Parent:
    def a(self):
        pass

    def b(self):
        pass

class Child(Overrides, Parent):
    def a(self):
        pass

Child()  # raises an error, as b() is not overridden
class Overrides:
    def __init__(self):
        # collect all defined methods of all base-classes
        bases = [b for b in self.__class__.__bases__ if b != Overrides]
        required_methods = set()
        for base in bases:
            required_methods = required_methods.union(
                set([f for f in dir(base) if not f.startswith('_')]))

        # check for each method in each base class (in required_methods)
        # if the class that inherits `Overrides` implements them all
        missing = []

        # me is the fully qualified name of the CLASS which inherits
        # `Overrides`
        me = self.__class__.__qualname__
        for required_method in required_methods:
            # The method can be either defined in the parent or the child
            # class. To check it, we get a reference to the method via
            # getattr
            try:
                found = getattr(self, required_method)
            except AttributeError:
                # this should not happen, as getattr returns the method in
                # the parent class if it is not defined in the child class.
                # It has to be in a parent class, as required_methods
                # is a union of all base-class methods.
                missing.append(required_method)
                continue

            # here is where the magic happens.
            # found is a reference to a method, and found.__qualname__ is
            # the full name of the METHOD. Remember that me is the full
            # name of the class.
            # We want to check where the method is defined. If it is
            # defined in a parent class, we did not override it, thus it
            # is missing.
            # If we did not override it, the __qualname__ is Parent.method
            # If we did override it, the __qualname__ is Child.method
            # With this fact, we can determine whether the class which uses
            # `Overrides` did implement it.
            if not found.__qualname__.startswith(me + '.'):
                missing.append(required_method)

        # Maybe a warning would be enough here
        if missing != []:
            raise AssertionError(f'{me} did not override these methods: {missing}')
Here is the simplest version, working under Jython with Java classes:

class MyClass(SomeJavaClass):
    def __init__(self):
        setattr(self, "name_of_method_to_override", self.__method_override__)

    def __method_override__(self, some_args):
        some_thing_to_do()