import abc

class F(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def f(self, a, b): print("F")

class FF(F):
    def f(self): print("FF")

f = FF()
f.f()
Here, I define an abstract method f with two arguments. I want to restrict subclasses so that their implementation of f has the same arguments as the superclass's.
How can I do that?
There isn't any trivial way to check that the method signatures match.
However, assuming:
Your subclass derives only from the abstract class in question, or the abstract class comes first in its MRO,
And that the abstract method's implementation is empty, to rule out unwanted side effects such as binding new attributes to the subclass,
you could make a super() call to the abstract method, passing along the parameters from the subclass method. The call serves only to enforce a signature match.
import abc

class F(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def f(self, a, b): pass

class FF(F):
    def f(self, *args, **kwargs):
        super(FF, self).f(*args, **kwargs)
        ...

f = FF()
f.f()
Traceback (most recent call last):
  File "python", line 17, in <module>
  File "python", line 12, in f
TypeError: f() takes exactly 3 arguments (1 given)
Python abstract methods (unlike abstract methods in some other languages) can have implementations and can be called via super(); so maybe this is one legitimate use case after all.
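To illustrate that point, here is a minimal Python 3 sketch (the class names are hypothetical, not taken from the answer above):
import abc

class Greeter(abc.ABC):
    @abc.abstractmethod
    def greet(self):
        # an abstract method may still provide a body for subclasses to reuse
        print("common greeting from the abstract base")

class LoudGreeter(Greeter):
    def greet(self):
        super().greet()                     # runs the abstract method's body
        print("extra greeting from the subclass")

LoudGreeter().greet()
# common greeting from the abstract base
# extra greeting from the subclass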
You can use the abcmeta library: https://github.com/mortymacs/abcmeta. It lets you place more restrictions on derived classes.
Consider the following code example:
import abc

class ABCtest(abc.ABC):
    @abc.abstractmethod
    def foo(self):
        raise RuntimeError("Abstract method was called, this should be impossible")

class ABCtest_B(ABCtest):
    pass

test = ABCtest_B()
This correctly raises the error:
Traceback (most recent call last):
  File "/.../test.py", line 10, in <module>
    test = ABCtest_B()
TypeError: Can't instantiate abstract class ABCtest_B with abstract methods foo
However, when the subclass of ABCtest also inherits from a built-in type like str or list, there is no error and test.foo() calls the abstract method:
class ABCtest_C(ABCtest, str):
    pass
>>> test = ABCtest_C()
>>> test.foo()
Traceback (most recent call last):
  File "<pyshell#0>", line 1, in <module>
    test.foo()
  File "/.../test.py", line 5, in foo
    raise RuntimeError("Abstract method was called, this should be impossible")
RuntimeError: Abstract method was called, this should be impossible
This seems to happen when inheriting from any class defined in C, including itertools.chain and numpy.ndarray, but it still correctly raises errors with classes defined in Python. Why would inheriting from a built-in type break the functionality of abstract classes?
Surprisingly, the test that prevents instantiating abstract classes happens in object.__new__, rather than in anything defined by the abc module itself:
static PyObject *
object_new(PyTypeObject *type, PyObject *args, PyObject *kwds)
{
    ...
    if (type->tp_flags & Py_TPFLAGS_IS_ABSTRACT) {
        ...
        PyErr_Format(PyExc_TypeError,
                     "Can't instantiate abstract class %s "
                     "with abstract methods %U",
                     type->tp_name,
                     joined);
(Almost?) all built-in types that aren't object supply a different __new__ that overrides object.__new__ and does not call object.__new__. When you multiple-inherit from a non-object built-in type, you inherit its __new__ method, bypassing the abstract method check.
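A quick check, reusing the ABCtest classes from the question, makes the bypass visible:
# ABCtest_B gets object.__new__ (where the abstract-method check lives),
# but ABCtest_C inherits str.__new__, so the check never runs.
print(ABCtest_B.__new__ is object.__new__)  # True
print(ABCtest_C.__new__ is object.__new__)  # False
print(ABCtest_C.__new__ is str.__new__)     # True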
I don't see anything about __new__ or multiple inheritance from built-in types in the abc documentation. The documentation could use enhancement here.
It seems kind of strange that they'd use a metaclass for the ABC implementation, making it a mess to use other metaclasses with abstract classes, and then put the crucial check in core language code that has nothing to do with abc and runs for both abstract and non-abstract classes.
There's a report for this issue on the issue tracker that's been languishing since 2009.
I asked a similar question and, based on the bug report linked by user2357112 supports Monica, I came up with this workaround (based on the suggestion from Xiang Zhang):
from abc import ABC, abstractmethod

class Base(ABC):
    @abstractmethod
    def foo(self):
        pass

    @abstractmethod
    def bar(self):
        pass

    def __new__(cls, *args, **kwargs):
        abstractmethods = getattr(cls, '__abstractmethods__', None)
        if abstractmethods:
            msg = "Can't instantiate abstract class {name} with abstract method{suffix} {methods}"
            suffix = 's' if len(abstractmethods) > 1 else ''
            raise TypeError(msg.format(name=cls.__name__, suffix=suffix, methods=', '.join(abstractmethods)))
        return super().__new__(cls, *args, **kwargs)

class Derived(Base, tuple):
    pass

Derived()
This raises TypeError: Can't instantiate abstract class Derived with abstract methods bar, foo, which is the original behaviour.
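As a sanity check (Concrete is a hypothetical class, not part of the original workaround), a subclass that implements every abstract method is still constructible:
class Concrete(Base, tuple):
    def foo(self):
        return "foo"

    def bar(self):
        return "bar"

c = Concrete()                  # no TypeError: __abstractmethods__ is empty
print(isinstance(c, tuple))     # True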
So, here is a problem:
I want to define an abstract class, let's say AbstractA, which does not require subclasses to implement any of its methods, but rather to extend its functionality. In Java terms, that would be an interface.
Moreover, I want to be able to create an abstract subclass, let's say AbstractB, of the AbstractA with the same properties, but some methods redefining or extending base class methods.
I don't want, though, to make the class (AbstractA) abstract e.g. through a check of the class name in __init__ or __new__, because that would require the abstract subclass (AbstractB) to redefine that method along with its main functionality, i.e. construction or initialization of a new instance, or to call super().__init__(...), which I'd prefer to avoid as well (maybe I'm wrong here).
So, I want something like this:
class AbstractA:
    def __init__(self):
        # do initialization stuff

    def very_common_method(self, ...):
        # do very common stuff

class AbstractB(AbstractA):
    # do not duplicate initialization stuff here, inherit instead
    def less_common_method(self, ...):
        # do less common stuff

class AX(AbstractA):
    def specific_method_1(self, ...):

class BX(AbstractB):
    def specific_method_2(self, ...):

# Instantiating AbstractA or AbstractB should result in an error.
# Instantiating AX or BX should not.
Below I have a possible solution.
Are there any disadvantages I've overlooked? Is there a better solution?
Thanks!
Here's a possible solution:
class AbstractA:
    _is_abstract = True

    def __init__(self):
        if self._is_abstract:
            raise RuntimeError("Abstract class instantiation.")
        # do initialization stuff

    def __init_subclass__(cls):    # is called every time the class is subclassed
        cls._is_abstract = False   # thus makes sure the abstract check fails on a subclass

class AbstractMixin:
    def __init_subclass__(cls):
        cls._is_abstract = True

class AbstractB(AbstractMixin, AbstractA):  # AbstractMixin takes precedence in the MRO,
    # inherit __init__                      # so the abstract check on AbstractB returns True.
    def __init_subclass__(cls):
        cls._is_abstract = False

class A(AbstractA):
    pass

class B(AbstractB):
    pass
>>> AbstractA()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 5, in __init__
RuntimeError: Abstract class instantiation.
>>> AbstractB()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 5, in __init__
RuntimeError: Abstract class instantiation.
>>> A()
<__main__.A object at 0x7f0bba5112e8>
>>> B()
<__main__.B object at 0x7f0bba511438>
class A(object):
    def __init__(self):
        if self.__class__ == A:
            raise RuntimeError("Abstract class instantiation.")
        print(self.__class__.__name__)

class B(A):
    pass

>>> A()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 5, in __init__
RuntimeError: Abstract class instantiation.
>>> B()
B
<__main__.B object at 0x7f8c816a58d0>
In Python, you would probably implement the "abstract" base classes as mix-ins. There's nothing special you need to do; by convention, you would add Mixin to the name to indicate that it isn't meant to be instantiated directly, but simply used as a base class for other classes.
class AMixin:
    def __init__(self):
        # do initialization stuff

    def very_common_method(self, ...):
        # do very common stuff

class BMixin(AMixin):
    # do not duplicate initialization stuff here, inherit instead
    def less_common_method(self, ...):
        # do less common stuff

class AX(AMixin):
    def specific_method_1(self, ...):

class BX(BMixin):
    def specific_method_2(self, ...):

class Foo:
    ...

class Bar(Foo, BMixin):
    ...
I have written the following code to demonstrate abstract methods that must be implemented by subclasses. I read that when a method in a parent class is decorated as abstract, its subclasses must implement it, otherwise they cannot be instantiated. However, in the following code, the Slug subclass does not implement the abstract method, yet it can still be instantiated without errors being thrown. I thought Python would complain in this case?
Have I misunderstood something?
Thanks
import abc

class Animal(object):
    __metaclass__ = abc.ABCMeta

    def __init__(self, species):
        self._species = species

    def get_species(self):
        return self._species

    @abc.abstractmethod
    def eat(self):
        pass

class Slug(Animal):
    def __init__(self, species):
        super(Slug, self).__init__(species)

    def run(self):
        print("running")

    # def eat(self):
    #     pass

sl = Slug("slug")
print(sl.get_species())
No, you understood it perfectly well! It is just that the Python 3 syntax for abc.ABCMeta is:
class Animal(metaclass=abc.ABCMeta):
    ...
instead of __metaclass__ = abc.ABCMeta, which was used in Python 2. Alternatively, you can just inherit from abc.ABC: class Animal(abc.ABC):. See the docs.
And then:
class Slug(Animal):
    def __init__(self, species):
        super().__init__(species)
    ...
This will result in:
Traceback (most recent call last):
  File "/home/.../...py", line 33, in <module>
    sl = Slug("slug")
TypeError: Can't instantiate abstract class Slug with abstract methods eat
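And once eat is actually implemented (the two commented-out lines in the question), Slug can be instantiated again. A full Python 3 version, for reference:
import abc

class Animal(abc.ABC):
    def __init__(self, species):
        self._species = species

    def get_species(self):
        return self._species

    @abc.abstractmethod
    def eat(self):
        pass

class Slug(Animal):
    def __init__(self, species):
        super().__init__(species)

    def run(self):
        print("running")

    def eat(self):              # implementing the abstract method
        pass

sl = Slug("slug")               # no TypeError any more
print(sl.get_species())         # slug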
I have been experimenting a little with the abc module in Python, à la:
>>> import abc
In the normal case you expect your ABC class not to be instantiable if it contains an unimplemented abstractmethod, as follows:
>>> class MyClass(metaclass=abc.ABCMeta):
...     @abc.abstractmethod
...     def mymethod(self):
...         return -1
...
>>> MyClass()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class MyClass with abstract methods mymethod
Or for any derived class. It all seems to work fine until you inherit from something... say dict or list, as in the following:
>>> class YourClass(list, metaclass=abc.ABCMeta):
...     @abc.abstractmethod
...     def yourmethod(self):
...         return -1
...
>>> YourClass()
[]
This is surprising, because type is probably the primary factory or metaclass-ish thing anyway, or so I assume from the following:
>>> type(abc.ABCMeta)
<class 'type'>
>>> type(list)
<class 'type'>
From some investigation I found out that it boils down to something as simple as adding an __abstractmethods__ attribute to the class, and the rest happens by itself:
>>> class AbstractClass:
...     pass
...
>>> AbstractClass.__abstractmethods__ = {'abstractmethod'}
>>> AbstractClass()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class AbstractClass with abstract methods abstractmethod
So one can simply avoid the check by intentionally overriding the __new__ method and clearing out __abstractmethods__, as below:
>>> class SupposedlyAbstractClass(metaclass=abc.ABCMeta):
...     def __new__(cls):
...         cls.__abstractmethods__ = {}
...         return super(SupposedlyAbstractClass, cls).__new__(cls)
...     @abc.abstractmethod
...     def abstractmethod(self):
...         return -1
...
>>> SupposedlyAbstractClass()
<__main__.SupposedlyAbstractClass object at 0x000001FA6BF05828>
This behaviour is the same in Python 2.7 and in Python 3.7, as I have personally checked. I am not aware whether it is the same in all other Python implementations.
Finally, down to the question: why has this been made to behave like this? Is it wise to simply never make abstract classes out of list, tuple or dict? Or should I just go ahead and add a __new__ method that checks for __abstractmethods__ before instantiation?
The problem
If you have the following class:
from abc import ABC, abstractmethod

class Foo(list, ABC):
    @abstractmethod
    def yourmethod(self):
        pass
the problem is that an object of Foo can be created without any error, because Foo.__new__(Foo) delegates the call directly to list.__new__(Foo) instead of ABC.__new__(Foo) (which is responsible for checking that all abstract methods are implemented in the class that is about to be instantiated).
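A quick check with the Foo just defined makes the delegation visible:
print(Foo.__new__ is list.__new__)    # True: __new__ is inherited from list
print(Foo.__new__ is object.__new__)  # False: the abstract-method check never runs
print(Foo())                          # [] -- the "abstract" class instantiates anyway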
We could implement __new__ on Foo and try to call ABC.__new__:
class Foo(list, ABC):
    def __new__(cls, *args, **kwargs):
        return ABC.__new__(cls)

    @abstractmethod
    def yourmethod(self):
        pass

Foo()
But the next error is raised:
TypeError: object.__new__(Foo) is not safe, use list.__new__()
This is because ABC.__new__(Foo) invokes object.__new__(Foo), which is not allowed when Foo inherits from list.
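This can be checked directly: ABC has no __new__ of its own, so it falls back to object.__new__, and object.__new__ refuses to build instances of a list subclass.
print(ABC.__new__ is object.__new__)   # True
object.__new__(Foo)                    # raises: object.__new__(Foo) is not safe, use list.__new__()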
A possible solution
You can add additional code in Foo.__new__ in order to check that all abstract methods in the class to be instantiated are implemented (basically doing the job of ABC.__new__).
Something like this:
class Foo(list, ABC):
    def __new__(cls, *args, **kwargs):
        if hasattr(cls, '__abstractmethods__') and len(cls.__abstractmethods__) > 0:
            raise TypeError(f"Can't instantiate abstract class {cls.__name__} with abstract methods {', '.join(cls.__abstractmethods__)}")
        return super(Foo, cls).__new__(cls)

    @abstractmethod
    def yourmethod(self):
        return -1
Now Foo() raises an error. But the next code runs without any issue:
class Bar(Foo):
    def yourmethod(self):
        pass

Bar()
I'm trying to figure out how to provide a base class to plugin writers so that they provide definitions for several static methods.
A plugin class is a collection of static methods which will never be instantiated.
I know how to use ABC to prevent instantiation of a class that is missing method implementations, but since plugin classes are never instantiated, that will not provide the safety I would like. Is there a pattern to prevent definition instead?
You can do it by writing your own metaclass similar to ABCMeta, which checks for abstract methods at class-definition time and raises an error if it finds any. Here's an example:
import abc

class ClassABC(type):
    def __init__(cls, name, bases, attrs):
        abstracts = set()
        for base in bases:
            abstracts.update(getattr(base, '__abstractclassmethods__', set()))
        for abstract in abstracts:
            if getattr(getattr(cls, abstract), '__isabstractmethod__', False):
                raise TypeError("Your class doesn't define {0}".format(abstract))
        for attr in attrs:
            if getattr(attrs[attr], '__isabstractmethod__', False):
                abstracts.add(attr)
        cls.__abstractclassmethods__ = abstracts

class BaseClass(object):
    __metaclass__ = ClassABC

    @abc.abstractmethod
    def foo(self):
        print("I am base foo")
Then:
>>> class Derived(BaseClass):
...     pass
Traceback (most recent call last):
  File "<pyshell#10>", line 1, in <module>
    class Derived(BaseClass):
  File "<pyshell#8>", line 8, in __init__
    raise TypeError("Your class doesn't define {0}".format(abstract))
TypeError: Your class doesn't define foo
My example is fairly quick and dirty and only lightly tested, and you might want to refine it to check various sorts of edge cases. What I did is raise the error if an abstract method inherited from a base class isn't overridden in the class currently being defined. (Otherwise an error would be raised for the abstract base class itself, because it doesn't define concrete implementations of its own abstract methods.)
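On Python 3.6+, a similar definition-time check can be sketched without a custom metaclass by using __init_subclass__; the class names below are hypothetical and not part of the answer above:
import abc

class PluginBase:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Reject the subclass at definition time if any inherited
        # abstract method is still unimplemented.
        missing = [name for name in dir(cls)
                   if getattr(getattr(cls, name), '__isabstractmethod__', False)]
        if missing:
            raise TypeError("{} doesn't define {}".format(cls.__name__, ', '.join(missing)))

    @staticmethod
    @abc.abstractmethod
    def run():
        ...

class GoodPlugin(PluginBase):       # fine: run is implemented
    @staticmethod
    def run():
        print("running")

# class BadPlugin(PluginBase):      # would raise TypeError at class-definition time
#     pass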