I am adding abstract classes to my python package like this:
from abc import ABC, abstractmethod

class AbstractClass(ABC):
    @abstractmethod
    def do_something(self):
        pass
There will be multiple subclasses that inherit from AbstractClass like this:
class SubClass(AbstractClass):
    def do_something(self):
        pass
I am wondering if there are any conventions for python packages regarding abstract classes that I am unaware of.
Should the abstract classes be separated from the subclasses, or should they all live in the same directory?
What about the naming of abstract methods; are there any conventions there?
I realize this is fairly subjective. I am asking for opinions, not what is 'right' or 'wrong'.
Thanks!
No convention comes to mind. If this is all for just one project, I'd be looking to put the abstract class in with the concrete subclasses at whatever level they're at. If all the subclasses were in one file, then I'd put the abstract class in that file too. If all the subclasses were in individual files in a dir, I'd put the abstract class right next to them in its own file. The reason I'd put the abstract class somewhere else is if I'm separating off utility code that can be reused with other projects. In short, an abstract class is just a piece of code. Just another class. Treat it like any other.
As far as naming, the natural naming of the class given what it is/does is usually enough. If your abstract class were Animal, and your subclasses were Goat, Horse, etc., I'd see no reason to call your abstract class AbstractAnimal, as it's pretty clear already what's going on...that you wouldn't instantiate Animal directly. Also, if you're looking at a class thinking of reusing it, and you see an abstract method in it, then you know it's abstract.
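For example, a minimal sketch of that naming point (the speak method is just an illustration):

from abc import ABC, abstractmethod

class Animal(ABC):              # no need to call it AbstractAnimal
    @abstractmethod
    def speak(self):
        ...

class Goat(Animal):
    def speak(self):
        return "bleat"

class Horse(Animal):
    def speak(self):
        return "neigh"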
Depending on your style guide, there might be. The only convention I'd suggest is that abstract methods should raise an error if they are not implemented, and that this should be documented in the abstract method's docstring.
For example, if you're using numpy-style docstrings, something like this is ideal:
class AbstractClass(ABC):
    @abstractmethod
    def do_something(self):
        """Does something

        Raises
        ------
        NotImplementedError
            Raised if the method is not implemented in a subclass.
        """
        raise NotImplementedError
Related
What is the python way of defining abstract class constants?
For example, if I have this abstract class:
class MyBaseClass(SomeOtherClass, metaclass=ABCMeta):
    CONS_A: str
    CONS_B: str
So, the type checker/"compiler" (at least PyCharm's) doesn't complain about the above.
But if I inherit it to another class and don't implement CONS_A and CONS_B, it doesn't complain either:
class MyChildClass(MyBaseClass):
    pass
What is the Python way of enforcing (as much as we can in a duck-typed language) that implementations of MyBaseClass actually define CONS_A and CONS_B?
Bonus question: once I can enforce this, how can I enforce it only for non-abstract classes? I suppose it has the same answer as the main question, but I might be wrong. So, for example:
# "compiler" should not complain
class MyChildAbstractClass(MyBaseClass):
pass
# here it must complain if I don't implement `CONS_A` and `CONS_B`
class MyChildImplementationClass(MyChildAbstractClass):
pass
Note: I am aware of @abstractmethod, but I didn't find a solution for abstract class constants.
Thank you.
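For illustration only (this is just a sketch, not an established convention, and the REQUIRED_CONSTANTS helper is a made-up name), one runtime approach is to validate the constants when an instance is created; intermediate abstract classes are never instantiated, so they are naturally skipped:

from abc import ABC

class MyBaseClass(ABC):
    CONS_A: str
    CONS_B: str
    REQUIRED_CONSTANTS = ("CONS_A", "CONS_B")   # hypothetical helper attribute

    def __init__(self):
        # Annotated-but-unassigned names never become class attributes,
        # so hasattr() is False until a subclass actually defines them.
        missing = [name for name in self.REQUIRED_CONSTANTS
                   if not hasattr(type(self), name)]
        if missing:
            raise TypeError(f"{type(self).__name__} must define: {', '.join(missing)}")

class MyChildAbstractClass(MyBaseClass):        # never instantiated, never checked
    pass

class MyChildImplementationClass(MyChildAbstractClass):
    CONS_A = "a"
    CONS_B = "b"

MyChildImplementationClass()                    # fine
# MyChildAbstractClass()                        # would raise TypeError at runtime

Note that a static type checker will not flag the missing constants with this sketch; the failure only happens at runtime.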
Forgive my ignorance, but does anyone know of a language that strictly enforces the condition given in the title? For example, using Python syntax, we can extend a class with a new method like this:
class A:
    pass

class B(A):
    def foo(self):
        pass
But is there a language that needs an additional keyword, let's say new, to specify that this method is unique to the child class and is not an override of the methods of its parent class/es? For example:
class A:
    pass

class B(A):
    def new foo(self):
        pass
I am asking this because, when I am working on a project that requires multiple inheritance such as class B(A, C, D) and I see a method defined in B, I need to check whether that method comes from one of its parent classes or is B's own method, and I find that extremely tedious.
The closest I can think of is the @Override annotation in Java, which can be applied to a method declaration so that the compiler checks that it overrides an inherited method (or implements an interface method).
When used in conjunction with a linter that checks that all method overrides are annotated with @Override, your IDE will give you a linter warning when you omit the annotation. IntelliJ IDEA and SonarSource both have linter rules for this, for example.
So long as you are strict about obeying the linter warning, then it's "strict" in that sense, but of course linter warnings don't actually prevent your code from being compiled or executed. Nonetheless, I don't know of a closer example from a real programming language. Unfortunately Java doesn't have multiple inheritance so it's not directly applicable to your problem.
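On the Python side, the closest analogue I know of is the @override decorator from PEP 698 (typing.override in Python 3.12+, typing_extensions.override before that); a minimal sketch:

from typing_extensions import override   # or: from typing import override (3.12+)

class A:
    def foo(self) -> None:
        pass

class B(A):
    @override
    def foo(self) -> None:   # OK: really does override A.foo
        pass

    @override
    def bar(self) -> None:   # type checker error: no base class defines bar
        pass

Some type checkers can additionally be configured to warn when an override is missing the decorator, so that every method is explicitly either a declared override or a genuinely new method - roughly the inverse of the hypothetical new keyword above.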
Consider the following class and mixin:
class Target(ClassThatUsesAMetaclass):
    def foo(self):
        pass

class Mixin:
    def __init__(self):
        self.foo()  # type error: type checker doesn't know Mixin will have
                    # access to foo once in use.

class Combined(Mixin, Target):
    def __init__(self):
        Target.__init__(self)
        Mixin.__init__(self)
I'm trying to avoid the type checker error in the above scenario. One option is this:
from typing import Protocol

class Fooable(Protocol):
    def foo(self): ...

class Mixin(Fooable):
    def __init__(self):
        self.foo()
Would've worked great, except that Target inherits from a class that uses a metaclass, so Combined can't inherit from both Target and Mixin.
So now I'm trying an alternative, annotating self in Mixin:
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from .this import Mixin, Target
    Mixin_T = type('Mixin_T', (Mixin, Target), {})

class Mixin:
    def __init__(self: Mixin_T):
        self.foo()  # No longer an error

class Combined(Mixin, Target):
    def __init__(self):
        Target.__init__(self)
        Mixin.__init__(self)  # Now this is an error: "Type[Mixin]" is not
                              # assignable to parameter "self"
                              # "Mixin" is incompatible with "Mixin_T"
So how am I supposed to win this aside from using # type: ignore?
I found a very simple solution:
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from .this import Target
    Mixin_T = Target
else:
    Mixin_T = object

class Mixin(Mixin_T):
    ...
Now all of Target's methods are recognized within Mixin by the type checker, and there's no need to override the type of self into something incompatible with Mixin. This might be a little awkward if the mixin is meant for all kinds of Target classes, but for my purposes this is perfectly acceptable, since my case is a group of mixins extending one very specific target class.
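A self-contained sketch of this pattern (the class names here are illustrative, not the ones from the snippet above):

from typing import TYPE_CHECKING

class Target:
    def foo(self) -> None:
        print("Target.foo")

# At type-checking time Mixin "inherits" from Target, so self.foo is known;
# at runtime the base is plain object, so no real extra inheritance happens.
if TYPE_CHECKING:
    _MixinBase = Target
else:
    _MixinBase = object

class Mixin(_MixinBase):
    def use_foo(self) -> None:
        self.foo()   # accepted by the type checker, resolved via the MRO at runtime

class Combined(Mixin, Target):
    pass

Combined().use_foo()   # prints "Target.foo"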
Other than that, there is too little code, and some misconceptions above make this question hard to answer beyond providing some clarifications.
To start, are you sure you are "inheriting from a metaclass"? It does not make sense to inherit from a metaclass unless you are creating another metaclass. Your snippets show you inheriting from a supposed metaclass (with no code given) to create Target, and then attempting to use Target as a parent of a normal (non-meta) class. That makes no sense.
You might just have confused the terms: the hidden InheritFromMetaclass class probably just uses the metaclass and does not "inherit" from it. In that case your problem does not have to do with metaclasses at all.
So, the real visible problem in the snippet is that the static checker does not "see" a self.foo method in the Mixin class - and guess what? There is no self.foo method in Mixin. The checker is just throwing a cold truth in your face: while Python does allow one to reference methods and attributes that are not available in a class, knowing that it will be used alongside other classes that do have those attributes, that is poor, error-prone design - the kind of design static type checking exists to weed out.
So, what you need is to give Mixin a base that is an abstract class and have foo as an abstract method (or have Mixin itself be that abstract class).
If, due to the other metaclass, you can't have Mixin inherit from abc.ABC because of a metaclass conflict, you have to either create a combined metaclass from the metaclass actually used by InheritFromMetaclass and ABCMeta, and use that as the metaclass for Mixin, or just create a stub foo method in Mixin as is (which could raise a NotImplementedError - giving the same behavior as an abstract method without really having to inherit from ABC).
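A rough sketch of that combined-metaclass route (SomeMeta is a hypothetical stand-in for whatever metaclass is actually behind ClassThatUsesAMetaclass):

import abc

class SomeMeta(type):                       # hypothetical stand-in metaclass
    pass

class ClassThatUsesAMetaclass(metaclass=SomeMeta):
    pass

class Target(ClassThatUsesAMetaclass):
    def foo(self):
        print("Target.foo")

# Merging the two metaclasses avoids the "metaclass conflict" error that
# plain abc.ABC would trigger next to SomeMeta-based classes.
class CombinedMeta(SomeMeta, abc.ABCMeta):
    pass

class Mixin(metaclass=CombinedMeta):
    @abc.abstractmethod
    def foo(self): ...

    def use_foo(self):
        self.foo()                          # foo is declared here, so no checker error

class Combined(Target, Mixin):              # Target first, so its concrete foo wins
    pass

Combined().use_foo()                        # prints "Target.foo"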
The important part to keep in mind is that any methods and attributes you access in code inside a class body have to exist in that class, without depending on attributes that will only exist in a subclass of it.
If that does not solve your problem, you need to provide more data - including a complete reproducible example involving your actual metaclass (and it might be solved just by combining the metaclasses as mentioned above).
I've written a mixin class that's designed to be layered on top of a new-style class, for example via
class MixedClass(MixinClass, BaseClass):
    pass
What's the smoothest way to apply this mixin to an old-style class? It is using a call to super in its __init__ method, so this will presumably (?) have to change, but otherwise I'd like to make as few changes as possible to MixinClass. I should be able to derive a subclass that makes the necessary changes.
I'm considering using a class decorator on top of a class derived from BaseClass, e.g.
@old_style_mix(MixinOldSchoolRemix)
class MixedWithOldStyleClass(OldStyleClass):
    pass
where MixinOldSchoolRemix is derived from MixinClass and just re-implements methods that use super to instead use a class variable that contains the class it is mixed with, in this case OldStyleClass. This class variable would be set by old_style_mix as part of the mixing process.
old_style_mix would just update the class dictionary of e.g. MixedWithOldStyleClass with the contents of the mixin class (e.g. MixinOldSchoolRemix) dictionary.
Is this a reasonable strategy? Is there a better way? It seems like this would be a common problem, given that there are numerous available modules still using old-style classes.
"This class variable would be set by old_style_mix as part of the mixing process."
...I assume you mean: "...on the class it's decorating..." as opposed to "on the class that is its argument" (the latter would be a disaster).
"old_style_mix would just update the class dictionary of e.g. MixedWithOldStyleClass with the contents of the mixin class (e.g. MixinOldSchoolRemix) dictionary."
No good -- the information that MixinOldSchoolRemix derives from MixinClass, for example, is not in the former's dictionary. So, old_style_mix must take a different strategy: for example, build a new class (which I believe has to be a new-style one, because old-style ones do not accept new-style ones as __bases__) with the appropriate sequence of bases, as well as a suitably tweaked dictionary.
"Is this a reasonable strategy?"
With the above provisos.
"It seems like this would be a common problem, given that there are numerous available modules still using old-style classes."
...but mixins with classes that were never designed to take mixins are definitely not a common design pattern, so the problem isn't common at all (I don't remember seeing it even once in the many years since new-style classes were born, and I was actively consulting, teaching advanced classes, and helping people with Python problems for many of those years, as well as doing a lot of software development myself -- I do tend to have encountered any "reasonably common" problem that people may have with features which have been around long enough!-).
Here's example code for what your class decorator could do (if you prefer to have it in a class decorator rather than directly inline...):
>>> class Mixo(object):
...     def foo(self):
...         print 'Mixo.foo'
...         self.thesuper.foo(self)
...
>>> class Old:
...     def foo(self):
...         print 'Old.foo'
...
>>> class Mixed(Mixo, Old):
...     thesuper = Old
...
>>> m = Mixed()
>>> m.foo()
Mixo.foo
Old.foo
If you want to build Mixed under the assumed name/binding of Mixo in your decorator, you could do it with a call to type, or by setting Mixed.__name__ = cls.__name__ (where cls is the class you're decorating). I think the latter approach is simpler (warning, untested code -- the above interactive shell session is a real one, but I have not tested the following code):
def oldstylemix(mixin):
    def makemix(cls):
        class Mixed(mixin, cls):
            thesuper = cls
        Mixed.__name__ = cls.__name__
        return Mixed
    return makemix
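A hypothetical usage sketch (Python 2, and just as untested as the function above) - applying the decorator to a class derived from Old produces the mixed class under the decorated class's name:

@oldstylemix(Mixo)
class MixedViaDecorator(Old):
    pass

m = MixedViaDecorator()
m.foo()          # expected to print Mixo.foo, then Old.foo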
What is the difference between abstract class and interface in Python?
What you'll see sometimes is the following:
class Abstract1:
    """Some description that tells you it's abstract,
    often listing the methods you're expected to supply."""
    def aMethod(self):
        raise NotImplementedError("Should have implemented this")
Because Python doesn't have (and doesn't need) a formal Interface contract, the Java-style distinction between abstraction and interface doesn't exist. If someone goes through the effort to define a formal interface, it will also be an abstract class. The only differences would be in the stated intent in the docstring.
And the difference between abstract and interface is a hairsplitting thing when you have duck typing.
Java uses interfaces because it doesn't have multiple inheritance.
Because Python has multiple inheritance, you may also see something like this
class SomeAbstraction:
    pass  # lots of stuff - but missing something

class Mixin1:
    def something(self):
        pass  # one implementation

class Mixin2:
    def something(self):
        pass  # another

class Concrete1(SomeAbstraction, Mixin1):
    pass

class Concrete2(SomeAbstraction, Mixin2):
    pass
This uses a kind of abstract superclass with mixins to create concrete subclasses that are disjoint.
What is the difference between abstract class and interface in Python?
An interface, for an object, is a set of methods and attributes on that object.
In Python, we can use an abstract base class to define and enforce an interface.
Using an Abstract Base Class
For example, say we want to use one of the abstract base classes from the collections module:
import collections

class MySet(collections.Set):  # note: on Python 3.3+ this ABC lives in collections.abc
    pass
If we try to use it, we get a TypeError, because the class we created does not support the expected behavior of sets:
>>> MySet()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class MySet with abstract methods
__contains__, __iter__, __len__
So we are required to implement at least __contains__, __iter__, and __len__. Let's use this implementation example from the documentation:
class ListBasedSet(collections.Set):
    """Alternate set implementation favoring space over speed
    and not requiring the set elements to be hashable.
    """
    def __init__(self, iterable):
        self.elements = lst = []
        for value in iterable:
            if value not in lst:
                lst.append(value)

    def __iter__(self):
        return iter(self.elements)

    def __contains__(self, value):
        return value in self.elements

    def __len__(self):
        return len(self.elements)

s1 = ListBasedSet('abcdef')
s2 = ListBasedSet('defghi')
overlap = s1 & s2
Implementation: Creating an Abstract Base Class
We can create our own Abstract Base Class by setting the metaclass to abc.ABCMeta and using the abc.abstractmethod decorator on the relevant methods. The metaclass will add the decorated methods to the __abstractmethods__ attribute, preventing instantiation until they are defined.
import abc
For example, "effable" is defined as something that can be expressed in words. Say we wanted to define an abstract base class that is effable, in Python 2:
class Effable(object):
    __metaclass__ = abc.ABCMeta
    @abc.abstractmethod
    def __str__(self):
        raise NotImplementedError('users must define __str__ to use this base class')
Or in Python 3, with the slight change in metaclass declaration:
class Effable(object, metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def __str__(self):
        raise NotImplementedError('users must define __str__ to use this base class')
Now if we try to create an effable object without implementing the interface:
class MyEffable(Effable):
    pass
and attempt to instantiate it:
>>> MyEffable()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class MyEffable with abstract methods __str__
We are told that we haven't finished the job.
Now if we comply by providing the expected interface:
class MyEffable(Effable):
    def __str__(self):
        return 'expressable!'
we are then able to use the concrete version of the class derived from the abstract one:
>>> me = MyEffable()
>>> print(me)
expressable!
There are other things we could do with this, like registering virtual subclasses that already implement these interfaces, but I think that is beyond the scope of this question. The other approaches demonstrated here would have to be adapted with the abc module to do so, however.
Conclusion
We have demonstrated that the creation of an Abstract Base Class defines interfaces for custom objects in Python.
Python >= 2.6 has Abstract Base Classes.
Abstract Base Classes (abbreviated ABCs) complement duck-typing by providing a way to define interfaces when other techniques like hasattr() would be clumsy. Python comes with many builtin ABCs for data structures (in the collections module), numbers (in the numbers module), and streams (in the io module). You can create your own ABC with the abc module.
There is also the Zope Interface module, which is used by projects outside of zope, like twisted. I'm not really familiar with it, but there's a wiki page here that might help.
In general, you don't need the concept of abstract classes, or interfaces in python (edited - see S.Lott's answer for details).
To explain it in more basic terms:
An interface is sort of like an empty muffin pan.
It's a class file with a set of method definitions that have no code.
An abstract class is the same thing, but not all functions need to be empty. Some can have code. It's not strictly empty.
Why differentiate:
There's not much practical difference in Python, but on the planning level for a large project, it could be more common to talk about interfaces, since there's no code. Especially if you're working with Java programmers who are accustomed to the term.
Python doesn't really have either concept.
It uses duck typing, which removes the need for interfaces (at least for the computer :-))
Python <= 2.5:
Base classes obviously exist, but there is no explicit way to mark a method as 'pure virtual', so the class isn't really abstract.
Python >= 2.6:
Abstract base classes do exist (http://docs.python.org/library/abc.html) and allow you to specify methods that must be implemented in subclasses. I don't much like the syntax, but the feature is there. Most of the time it's probably better to use duck typing from the 'using' client side.
In general, interfaces are used only in languages that use the single-inheritance class model. In these single-inheritance languages, interfaces are typically used if any class could use a particular method or set of methods. Also in these single-inheritance languages, abstract classes are used either to define class variables in addition to zero or more methods, or to exploit the single-inheritance model to limit the range of classes that could use a set of methods.
Languages that support the multiple-inheritance model tend to use only classes or abstract base classes and not interfaces. Since Python supports multiple inheritance, it does not use interfaces and you would want to use base classes or abstract base classes.
http://docs.python.org/library/abc.html
Abstract classes are classes that contain one or more abstract methods. Along with abstract methods, abstract classes can have static, class, and instance methods.
An interface, by contrast, has only abstract methods and nothing else. Hence it is not compulsory to implement everything an abstract class provides, but it is compulsory to implement everything an interface declares.
For completeness, we should mention PEP 3119, in which ABCs were introduced and compared with interfaces, along with Talin's original comment.
An abstract class is not a perfect interface:
- it belongs to the inheritance hierarchy
- it is mutable
But if you consider writing it your own way:
def some_function(self):
    raise NotImplementedError()

interface = type(
    'your_interface', (object,),
    {'extra_func': some_function,
     '__slots__': ['extra_func', ...]
     ...
     '__instancecheck__': your_instance_checker,
     '__subclasscheck__': your_subclass_checker
     ...
    }
)
OK, rather as a class,
or as a metaclass,
and fighting with Python to achieve an immutable object,
and doing refactoring,
...
you'll quickly realize that you're reinventing the wheel,
only to eventually arrive at
abc.ABCMeta
abc.ABCMeta was proposed as a useful addition to supply the missing interface functionality, and that's fair enough in a language like Python.
Certainly, it could have been enhanced further while version 3 was being written, by adding new syntax and an immutable interface concept...
Conclusion:
abc.ABCMeta IS the "Pythonic" interface in Python.