I'm trying to pick up Python for a project, and I'm a bit confused about how to use abstraction and classes. (I'm not a very experienced programmer, so apologies for the basic level of this question.) I come from a Java/OCaml background, and what I've been trying to do is as follows: I have abstract classes for a graph and a graphadvanced (a graph with some fancier methods), which look something like this:
class AbstractGraph:
    def method1(self):
        raise NotImplementedError
    ...

class AbstractAdvanced:
    def method2(self):
        raise NotImplementedError
    ...
I then have an implementation of a graph:
class Graph(AbstractGraph):
    def method1(self):
        ...  # actual code
Now my question is: can I do something like this?
class Advanced(AbstractAdvanced, AbstractGraph):
    def method2(self):
        ...  # actual code, using the methods from AbstractGraph
In other words, how can I define the methods of Advanced abstractly in terms of the methods of AbstractGraph, and then somehow pass Graph into a constructor to get an instance of Advanced that uses Advanced's definitions with Graph's implementation?
In terms of OCaml, I'm trying to treat AbstractAdvanced and AbstractGraph as module types, but I've played around a little with Python and I'm not sure how to get this to work.
If you want to create abstract base classes, you can, but they are of limited utility. It's more normal to start your class hierarchy (after inheriting from object, or some other third-party class) with concrete classes.
If you want to create a class that pieces together various classes that partially implement some protocol, then just inherit from your implementing classes:
# Always inherit from object, or some subtype thereof, unless you want your
# code to behave differently in Python 2 and Python 3
class AbstractGraph(object):
    def method1(self):
        raise NotImplementedError

class Graph(AbstractGraph):
    def method1(self):
        ...  # actual code

class GraphToo(AbstractGraph):
    def method1(self):
        ...  # actual code

class AbstractAdvanced(AbstractGraph):
    def method2(self):
        raise NotImplementedError

class Advanced(Graph, AbstractAdvanced):
    def method2(self):
        ...  # actual code, using the methods from Graph

# Order of the classes in the inheritance list matters - it will affect the
# method resolution order
class AdvancedToo(GraphToo, Advanced): pass
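As a quick check of the ordering the last comment mentions (a sketch, assuming the placeholder bodies are filled in; the list shown is what CPython computes for this exact layout):

print([cls.__name__ for cls in AdvancedToo.__mro__])
# ['AdvancedToo', 'GraphToo', 'Advanced', 'Graph',
#  'AbstractAdvanced', 'AbstractGraph', 'object']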
I have a pretty big class that I want to break down into smaller classes that each handle a single part of the whole, so each child takes care of only one aspect.
Each of these child classes still needs to communicate with the others.
For example, Data Access creates a dictionary that Plotting Controller needs to have access to.
And then Plotting Controller needs to update stuff on Main GUI Controller. The children have several more inter-communication functions like these.
How do I achieve this?
I've read Metaclasses, Cooperative Multiple Inheritance and Wonders of Cooperative Multiple Inheritance, but I cannot figure out how to do this.
The closest I've come is the following code:
class A:
    def __init__(self):
        self.myself = 'ClassA'

    def method_ONE_from_class_A(self, caller):
        print(f"I am method ONE from {self.myself} called by {caller}")
        self.method_ONE_from_class_B(self.myself)

    def method_TWO_from_class_A(self, caller):
        print(f"I am method TWO from {self.myself} called by {caller}")
        self.method_TWO_from_class_B(self.myself)

class B:
    def __init__(self):
        self.me = 'ClassB'

    def method_ONE_from_class_B(self, caller):
        print(f"I am method ONE from {self.me} called by {caller}")
        self.method_TWO_from_class_A(self.me)

    def method_TWO_from_class_B(self, caller):
        print(f"I am method TWO from {self.me} called by {caller}")

class C(A, B):
    def __init__(self):
        A.__init__(self)
        B.__init__(self)

    def children_start_talking(self):
        self.method_ONE_from_class_A('Big Poppa')

poppa = C()
poppa.children_start_talking()
which results correctly in:
I am method ONE from ClassA called by Big Poppa
I am method ONE from ClassB called by ClassA
I am method TWO from ClassA called by ClassB
I am method TWO from ClassB called by ClassA
But... even though class A and class B correctly call each other's functions at runtime, my editor doesn't actually find their declarations. Nor do I "see" them when I'm typing the code, which is both frustrating and worrisome, since I might be doing something wrong.
Is there a good way to achieve this? Or is it an actually bad idea?
EDIT: Python 3.7 if it makes any difference.
Inheritance
When you break up a class hierarchy like this, the individual "partial" classes, which we call "mixins", will "see" only what is declared directly on them and on their base classes. In your example, when writing class A, Python does not know anything about class B - you, as the author, know that the methods from class B will be present, because methods from class A will only ever be called from class C, which inherits from both.
Your programming tools, including the IDE, can't know that. (Whether you should know better than your programming aid is a side track.) It would work if run, but it is a poor design.
If all the methods are to be present directly on a single instance of your final class, they all have to be declared on a super-class common to them all - you can even write the independent subclasses in different files, and then a single subclass that inherits from all of them:
from abc import abstractmethod, ABC

class Base(ABC):
    @abstractmethod
    def method_A_1(self):
        pass

    @abstractmethod
    def method_A_2(self):
        pass

    @abstractmethod
    def method_B_1(self):
        pass

class A(Base):
    def __init__(self, *args, **kwargs):
        # pop consumed named parameters from "kwargs"
        ...
        super().__init__(*args, **kwargs)
        # This call ensures all __init__ in bases are called
        # because Python linearizes the base classes on multiple inheritance

    def method_A_1(self):
        ...

    def method_A_2(self):
        ...

class B(Base):
    def __init__(self, *args, **kwargs):
        # pop consumed named parameters from "kwargs"
        ...
        super().__init__(*args, **kwargs)
        # This call ensures all __init__ in bases are called
        # because Python linearizes the base classes on multiple inheritance

    def method_B_1(self):
        ...

    ...

class C(A, B):
    pass
(The "ABC" and "abstractmethod" are a bit of sugar - they will work, but this design would work without any of that - thought their presence help whoever is looking at your code to figure out what is going on, and will raise an earlier runtime error if you per mistake create an instance of one of the incomplete base classes)
Composite
This works, but if your methods actually deal with wildly different domains, you should try the "composite" design pattern instead of multiple inheritance.
There is no need for multiple inheritance if it does not arise naturally.
In this case, you instantiate objects of the classes that drive the different domains in the __init__ of the shell class, and pass its own instance to those children, which will keep a reference to it (in a self.parent attribute, for example). Chances are your IDE still won't know what you are talking about, but you will have a saner design.
class Parent:
    def __init__(self):
        self.a_domain = A(self)
        self.b_domain = B(self)

class A:
    def __init__(self, parent):
        self.parent = parent
        # no need to call any "super...init", this is called
        # as part of the initialization of the parent class

    def method_A_1(self):
        ...

    def method_A_2(self):
        ...

class B:
    def __init__(self, parent):
        self.parent = parent

    def method_B_1(self):
        # need result from 'A' domain:
        a_value = self.parent.a_domain.method_A_1()
        ...
This example uses only basic language features, but if you decide to go for it in a complex application, you can make it more sophisticated - there are interface patterns that would allow you to swap the classes used for the different domains in specialized subclasses, and so on. But typically the pattern above is all you need.
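A quick, hedged usage sketch of the composite layout above (the method bodies are placeholders, so the calls simply return None here):

parent = Parent()
parent.a_domain.method_A_1()   # work that belongs to the A domain
parent.b_domain.method_B_1()   # B reaches back through self.parent for A's result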
I'm creating a class hierarchy with a base type that will have 2 (and probably more in the future) subclass implementations.
My first idea is to create an abstract base class (inheriting from abc.ABC) with @abstractmethod where necessary (for methods that will be different in my concrete subclasses) but also with common method implementations (for methods used by all concrete subclasses).
Here's an example of what I mean:
from abc import ABC, abstractmethod

class BaseClass(ABC):
    def __init__(self, var1, var2):
        self.var1 = var1
        self.var2 = var2

    def common_method(self):
        """This method is used as-is by all subclasses."""
        return self.var1 + 1

    @abstractmethod
    def specific_method(self):
        """This method needs specific implementations."""
Is this good practice (not "best practice"; I'm looking for whether this is an appropriate use of these constructs) for writing a base class? Is using instance methods in my BaseClass appropriate?
I would say: yes.
There is a design pattern called Template Method which captures what you describe in your question; see https://sourcemaking.com/design_patterns/template_method.
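To make the link concrete, here is a hedged sketch with invented names (Report, SalesReport): the common method plays the role of the template, and the abstract method is the step each subclass fills in.

from abc import ABC, abstractmethod

class Report(ABC):
    def render(self):                   # the "template method": a fixed skeleton
        return f"== report ==\n{self.body()}"

    @abstractmethod
    def body(self):                     # the varying step subclasses must provide
        ...

class SalesReport(Report):
    def body(self):
        return "sales went up"

print(SalesReport().render())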
Abstract Base Classes are used to ensure that any derived class provides some specified behavior (https://www.python.org/dev/peps/pep-3119 -> rationale). For example, say you want to create a custom data manager that must be able to save/load data and must support iteration over elementary pieces of data:
import abc

class SavableLoadable(abc.ABC):
    @abc.abstractmethod
    def save(self):
        raise NotImplementedError

    @abc.abstractmethod
    def load(self):
        raise NotImplementedError

class Iteratable(abc.ABC):
    @abc.abstractmethod
    def __iter__(self):
        raise NotImplementedError

class MyDataManager(Iteratable, SavableLoadable):
    def __init__(self):
        pass

    def __iter__(self):
        pass

    def save(self):
        pass

    def load(self):
        pass
This is not really a class hierarchy; it is a way to define a behavioural layer through inheritance. ABCs are very useful for large-scale projects where different parts are created by different authors and have to run together.
So you do not need ABCs to create a class hierarchy; ABCs serve another (although closely related) purpose.
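As a small, hypothetical illustration of why that helps when parts come from different authors (using the classes sketched above): code that receives an object can check for the behavioural layer rather than for a concrete type.

def archive(obj):
    # accept anything that promises the save/load behaviour, whoever wrote it
    if not isinstance(obj, SavableLoadable):
        raise TypeError("expected a SavableLoadable object")
    obj.save()

archive(MyDataManager())   # fine: MyDataManager implements the layer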
I have a public class which the module exports, and 3 implementations of it.
At certain points in the program, the implementation used will be changed dynamically,
something along the lines of:
class PublicClass(object):
    _IMPLEMENTATION_TO_USE = _Imp1

    def func1(self, arg1):
        _IMPLEMENTATION_TO_USE.func1(arg1)

class _Imp1(PublicClass):
    def func1(self, arg1): pass

class _Imp2(PublicClass):
    def func1(self, arg1): pass
What's the best (Pythonic) way of achieving it?
Instead of composition, you could consider using an Abstract Base Class, with three different subclasses.
In Python, duck typing is favoured over explicitly checking types.
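A minimal sketch of that ABC approach, with invented names (Backend, set_backend); the module keeps one current implementation and swaps it at runtime:

from abc import ABC, abstractmethod

class Backend(ABC):               # hypothetical public interface
    @abstractmethod
    def func1(self, arg1): ...

class _Imp1(Backend):
    def func1(self, arg1):
        return f"imp1: {arg1}"

class _Imp2(Backend):
    def func1(self, arg1):
        return f"imp2: {arg1}"

_backend: Backend = _Imp1()       # the module-level current implementation

def set_backend(backend: Backend) -> None:
    global _backend
    _backend = backend

def func1(arg1):                  # the module's public function just delegates
    return _backend.func1(arg1)

print(func1("x"))                 # imp1: x
set_backend(_Imp2())
print(func1("x"))                 # imp2: x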
What behaviour does Python provide if you inherit from multiple classes which implement the same method?
class A:
    def method(self):
        pass

class B:
    def method(self):
        pass

class C(A, B):
    pass
The question is quite complex... there have been at least three different algorithms for Python method resolution order.
For simple cases it does what you expect; for the subtle differences, see
http://python-history.blogspot.it/2010/06/method-resolution-order.html
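For the simple case in the question, the left-most base wins; a quick way to see what Python will do is to inspect the MRO directly (using the classes from the question):

print([cls.__name__ for cls in C.__mro__])   # ['C', 'A', 'B', 'object']
C().method()                                  # resolves to A.method: A comes first in the MRO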
I need something like an abstract protected method in Python (3.2):
class Abstract:
    def use_concrete_implementation(self):
        print(self._concrete_method())

    def _concrete_method(self):
        raise NotImplementedError()

class Concrete(Abstract):
    def _concrete_method(self):
        return 2 * 3
Is it actually useful to define an "abstract" method only to raise a NotImplementedError?
Is it good style to use an underscore for abstract methods, that would be protected in other languages?
Would an abstract base class (abc) improve anything?
In Python, you usually avoid having such abstract methods altogether. You define an interface by the documentation, and simply assume the objects that are passed in fulfil that interface ("duck typing").
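A hedged sketch of that duck-typing style, with an invented class name (JustQuacks): the function just documents what it expects and calls the method, without any base class.

def use_concrete_implementation(obj):
    """Works with any object that has a _concrete_method() - documented, not enforced."""
    print(obj._concrete_method())

class JustQuacks:                 # hypothetical class, no abstract base at all
    def _concrete_method(self):
        return 2 * 3

use_concrete_implementation(JustQuacks())   # prints 6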
If you really want to define an abstract base class with abstract methods, this can be done using the abc module:
from abc import ABCMeta, abstractmethod

class Abstract(metaclass=ABCMeta):
    def use_concrete_implementation(self):
        print(self._concrete_method())

    @abstractmethod
    def _concrete_method(self):
        pass

class Concrete(Abstract):
    def _concrete_method(self):
        return 2 * 3
Again, that is not the usual Python way to do things. One of the main objectives of the abc module was to introduce a mechanism to overload isinstance(), but isinstance() checks are normally avoided in favour of duck typing. Use it if you need it, but not as a general pattern for defining interfaces.
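As a small illustration (a sketch, with invented names Drawable and Circle) of the isinstance() overloading that the abc module enables: a completely unrelated class can be registered as a virtual subclass and will then pass isinstance() checks.

from abc import ABCMeta

class Drawable(metaclass=ABCMeta):
    pass

class Circle:                            # does not inherit from Drawable
    def draw(self):
        print("o")

Drawable.register(Circle)                # register Circle as a virtual subclass

print(isinstance(Circle(), Drawable))    # True - isinstance() has been "overloaded"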
When in doubt, do as Guido does.
No underscore. Just define the "abstract method" as a one-liner which raises NotImplementedError:
class Abstract():
    def ConcreteMethod(self):
        raise NotImplementedError("error message")
Basically, an empty method in the base class is not necessary here. Just do it like this:
class Abstract:
    def use_concrete_implementation(self):
        print(self._concrete_method())

class Concrete(Abstract):
    def _concrete_method(self):
        return 2 * 3
In fact, you usually don't even need the base class in Python. Since all calls are resolved dynamically, if the method is present it will be invoked; if not, an AttributeError will be raised.
Attention: it is important to mention in the documentation that _concrete_method needs to be implemented in subclasses.
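A short sketch (with an invented subclass name) of the failure mode that makes that documentation note important: if a subclass forgets _concrete_method, the error only shows up at call time, as an AttributeError.

class Forgetful(Abstract):        # hypothetical subclass that forgot _concrete_method
    pass

f = Forgetful()                   # creating it still works...
try:
    f.use_concrete_implementation()
except AttributeError as exc:
    print(exc)                    # ...the missing method is only detected here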