Different behavior in base class __init__ depending on derived class - python

I have an abstract class Base containing a boolean member do_thing which either triggers a one-time action on start-up or does nothing. This variable can be overridden by a derived class Derived, but calling super().__init__() at the beginning of Derived's __init__ means the one-time action is always based on whatever do_thing is set to in Base.
I only see two options for getting around this, neither of which seems ideal to me:
Call super().__init__() at the end of every derived class's __init__ rather than the beginning, which means I can't rely on other default variables set in Base.
Explicitly call the one-time action at the end of every derived class's __init__, which means either duplicated code, or an extra function in Base that will only ever be called at startup.
Some example code:
from abc import ABC

class Base(ABC):
    def __init__(self):
        self.do_thing = False
        # Want to wait for child class init before running this
        if self.do_thing:
            configuration.set(do_thing_parameters)

class Derived(Base):
    def __init__(self):
        super().__init__()
        # Should have configs properly set based on this being true
        self.do_thing = True

class RegularDerived(Base):
    def __init__(self):
        super().__init__()
        # Don't modify the config
Is there a better way of doing this that I'm missing?

From your description, it sounds like your do_thing functionality is to do with your classes, not your instances. If that's so, it doesn't seem right to have it as a parameter to __init__. You have other options, and I'd go with
A class attribute
class Base:
    _do_thing = False

    def __init__(self):
        if self._do_thing:
            configuration.set(do_thing_parameters)

class Derived(Base):
    _do_thing = True

class RegularDerived(Base):
    pass
Then you don't even need to define __init__ in the subclasses.
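For illustration, a minimal runnable sketch of the class-attribute approach, with a stand-in configuration object and parameters (both hypothetical, since the question's real configuration module isn't shown):

from abc import ABC

# Stand-in for the question's configuration object (hypothetical).
class _Config:
    def set(self, params):
        print(f"configuration set with {params}")

configuration = _Config()
do_thing_parameters = {"enabled": True}

class Base(ABC):
    _do_thing = False  # class attribute; subclasses may override it

    def __init__(self):
        # self._do_thing resolves through the instance's class before Base,
        # so a subclass override is already visible while Base.__init__ runs
        if self._do_thing:
            configuration.set(do_thing_parameters)

class Derived(Base):
    _do_thing = True

class RegularDerived(Base):
    pass

Derived()         # prints: configuration set with {'enabled': True}
RegularDerived()  # prints nothing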

Try setting the "do_thing" variable as a default parameter, as shown below...
from abc import ABC

class Base(ABC):
    def __init__(self, do_thing=False):
        if do_thing:
            configuration.set(do_thing_parameters)

class Derived(Base):
    def __init__(self):
        super().__init__(True)

Related

How to modify the same method in a set of sibling classes?

I have two classes (Table and Button) that inherit from the same class Widget. Both subclasses have their own keyEvent() methods and both call Widget.keyEvent() when necessary. I want to modify the keyEvent() behaviour for both classes in the same way (for example, make the A and D keys trigger the LEFT and RIGHT keys).
This code works exactly as I want:
class KeyModifier:
    def keyEvent(self):
        # some lines of code
        super().keyEvent()

class MyTable(KeyModifier, Table):
    pass

class MyButton(KeyModifier, Button):
    pass
But Pylance is angry because super() of KeyModifier doesn't have any keyEvent() method (which is true).
Is there a way to do it better? Also, I would like Pylance to warn me when using the KeyModifier with something not inherited from Widget.
This example comes from a PyQT app, but the question is more general.
Edit:
Making KeyModifier a subclass of Widget makes KeyModifier's super().keyEvent() call Widget.keyEvent(), and I want to call the child class method (Table.keyEvent() or Button.keyEvent()) instead.
Does this help?
from abc import abstractmethod

class Table:
    pass

class Button:
    pass

class KeyModifier:
    @abstractmethod
    def custom_operation(self):
        pass

    def key_event(self, condition):
        if condition:
            self.custom_operation()

class MyTable(KeyModifier, Table):
    def __init__(self):
        super(MyTable, self).__init__()

    def custom_operation(self):
        pass

class MyButton(KeyModifier, Button):
    def custom_operation(self):
        pass
If you make KeyModifier inherit from Widget, the warning will be gone because keyEvent will actually be defined for the object. If you also add super().keyEvent() calls to your modified classes, all the proper events will fire thanks to something called MRO - Method Resolution Order.
class Base:
    def event(self):
        print("Base")

class A(Base):
    def event(self):
        print("A")

class B(Base):
    def event(self):
        print("B")

class Modifier(Base):
    def event(self):
        print("Modified")
        super().event()

class ModifiedA(Modifier, A):
    def event(self):
        print("ModifiedA")
        super().event()

class ModifiedB(Modifier, B):
    def event(self):
        print("ModifiedB")
        super().event()

ModifiedA().event()
Output:
ModifiedA
Modified
A
It is important to note that if A and B do not call super on their own (I'm fairly certain that PyQt widgets DO call their parent, though), Modifier has to be the first class inherited: that puts it first in the MRO, so it gets the chance to call the other class's method in turn.
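To illustrate why that order matters, here is a small sketch reusing the classes above (ReversedA is a hypothetical class added only for this demonstration): with the bases reversed, A.event() is reached first, and since it never calls super(), Modifier.event() is skipped entirely.

class ReversedA(A, Modifier):
    def event(self):
        print("ReversedA")
        super().event()

ReversedA().event()
# Output:
# ReversedA
# A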
I've found a workaround, since I don't really have to manage any class methods, only the event that is being handled.
def KeyModifier(event: Event) -> Event:
    # some lines of code and edit 'event' if necessary
    return event

class MyButton(Button):
    def keyEvent(self, event: Event):
        super().keyEvent(KeyModifier(event))
I think this is the simplest way to write it. Thank you all for your suggestions :)

Python - Child Class to call a function from another Child Class

I have a pretty big class that I want to break down into smaller classes that each handle a single part of the whole, so each child takes care of only one aspect of the whole.
Each of these child classes still needs to communicate with the others.
For example, the Data Access class creates a dictionary that the Plotting Controller needs access to, and the Plotting Controller in turn needs to update things on the Main GUI Controller. These children have several more inter-communication functions like this.
How do I achieve this?
I've read Metaclasses, Cooperative Multiple Inheritence and Wonders of Cooperative Multiple Inheritence, but I cannot figure out how to do this.
The closest I've come is the following code:
class A:
    def __init__(self):
        self.myself = 'ClassA'

    def method_ONE_from_class_A(self, caller):
        print(f"I am method ONE from {self.myself} called by {caller}")
        self.method_ONE_from_class_B(self.myself)

    def method_TWO_from_class_A(self, caller):
        print(f"I am method TWO from {self.myself} called by {caller}")
        self.method_TWO_from_class_B(self.myself)

class B:
    def __init__(self):
        self.me = 'ClassB'

    def method_ONE_from_class_B(self, caller):
        print(f"I am method ONE from {self.me} called by {caller}")
        self.method_TWO_from_class_A(self.me)

    def method_TWO_from_class_B(self, caller):
        print(f"I am method TWO from {self.me} called by {caller}")

class C(A, B):
    def __init__(self):
        A.__init__(self)
        B.__init__(self)

    def children_start_talking(self):
        self.method_ONE_from_class_A('Big Poppa')

poppa = C()
poppa.children_start_talking()
which results correctly in:
I am method ONE from ClassA called by Big Poppa
I am method ONE from ClassB called by ClassA
I am method TWO from ClassA called by ClassB
I am method TWO from ClassB called by ClassA
But... even though class B and class A correctly call each other's functions at runtime, they don't actually find their declarations. Nor do I "see" them while I'm typing the code, which is both frustrating and makes me worry that I might be doing something wrong.
Is there a good way to achieve this? Or is it an actually bad idea?
EDIT: Python 3.7 if it makes any difference.
Inheritance
When breaking up a class hierarchy like this, the individual "partial" classes, which we call "mixins", will "see" only what is declared directly on them and on their base classes. In your example, when writing class A, it does not know anything about class B - you, as the author, can know that methods from class B will be present, because methods from class A will only ever be called from class C, which inherits from both.
Your programming tools, the IDE included, can't know that. (Whether you should know better than your programming aid is a side track.) It would work if run, but it is a poor design.
If all methods are to be present directly on a single instance of your final class, all of them have to be declared in a super-class common to them all - you can even write the independent subclasses in different files, and then a single subclass that inherits all of them:
from abc import abstractmethod, ABC

class Base(ABC):
    @abstractmethod
    def method_A_1(self):
        pass

    @abstractmethod
    def method_A_2(self):
        pass

    @abstractmethod
    def method_B_1(self):
        pass

class A(Base):
    def __init__(self, *args, **kwargs):
        # pop consumed named parameters from "kwargs"
        ...
        super().__init__(*args, **kwargs)
        # This call ensures all __init__ in bases are called
        # because Python linearizes the base classes on multiple inheritance

    def method_A_1(self):
        ...

    def method_A_2(self):
        ...

class B(Base):
    def __init__(self, *args, **kwargs):
        # pop consumed named parameters from "kwargs"
        ...
        super().__init__(*args, **kwargs)
        # This call ensures all __init__ in bases are called
        # because Python linearizes the base classes on multiple inheritance

    def method_B_1(self):
        ...

    ...

class C(A, B):
    pass
(The "ABC" and "abstractmethod" are a bit of sugar - they will work, but this design would work without any of that - thought their presence help whoever is looking at your code to figure out what is going on, and will raise an earlier runtime error if you per mistake create an instance of one of the incomplete base classes)
Composite
This works, but if your methods are actually for wildly different domains, then instead of multiple inheritance you should try the "composite" design pattern. There is no need for multiple inheritance if it does not arise naturally.
In this case, you instantiate objects of the classes that drive the different domains in the __init__ of the shell class, and pass its own instance to those children, which keep a reference to it (in a self.parent attribute, for example). Chances are your IDE still won't know what you are talking about, but you will have a saner design.
class Parent:
    def __init__(self):
        self.a_domain = A(self)
        self.b_domain = B(self)

class A:
    def __init__(self, parent):
        self.parent = parent
        # no need to call any "super...init", this is called
        # as part of the initialization of the parent class

    def method_A_1(self):
        ...

    def method_A_2(self):
        ...

class B:
    def __init__(self, parent):
        self.parent = parent

    def method_B_1(self):
        # need result from 'A' domain:
        a_value = self.parent.a_domain.method_A_1()
        ...
This example uses only basic language features, but if you decide to go for it in a complex application you can make it more sophisticated - there are interface patterns that would allow you to swap the classes used for the different domains in specialized subclasses, and so on. But typically the pattern above is all you need.
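As a usage sketch of that composite wiring, with hypothetical stub bodies standing in for the elided methods above:

class A:
    def __init__(self, parent):
        self.parent = parent

    def method_A_1(self):
        return "value computed in the A domain"

class B:
    def __init__(self, parent):
        self.parent = parent

    def method_B_1(self):
        # reach into the sibling domain through the shared parent
        a_value = self.parent.a_domain.method_A_1()
        return f"B used: {a_value}"

class Parent:
    def __init__(self):
        self.a_domain = A(self)
        self.b_domain = B(self)

parent = Parent()
print(parent.b_domain.method_B_1())  # -> B used: value computed in the A domain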

What is the proper way to deal with Python overriding of class methods with a mixin in multiple inheritance without redundant code?

I'm having a minor (I hope) issue with theory and the proper way to deal with a problem. It's easier for me to show an example than to explain, as I seem to be failing with my vocabulary.
class Original_1:
    def __init__(self):
        pass

    def meth1(self):
        pass

    def meth2(self):
        pass

class Original_2(Original_1):
    def __init__(self):
        Original_1.__init__(self)

    def meth3(self):
        pass

class Mixin:
    def __init__(self):
        pass

    def meth4(self):
        ...
        meth1(self)
        meth2(self)

class NewClass_1(Original_1, Mixin):
    def __init__(self):
        Original_1.__init__(self)
        Mixin.__init__(self)

class NewClass_2(Original_2, Mixin):
    def __init__(self):
        Original_2.__init__(self)
        Mixin.__init__(self)
Now the goal is to extend Original_1 or Original_2 with new methods in the Mixin, but I run into some questions if I use meth1(), meth2(), or meth3() in the mixin:
1. I'm not referencing Original_1 or Original_2 in the mixin. (At this point it runs, but I don't like it.)
2. If I make Mixin a child of Original_1, it breaks. I could make two separate NewClass_X classes, but then I'm duplicating all of that code.
Mixins are used to add functionality (usually methods) to classes by using multiple inheritance.
For example, let's say you want to make a class's __str__ method return everything in uppercase. There are two ways you can do this:
Manually change every single class's __str__ method:
class SomeClass(SomeBase):
    def __str__(self):
        return super(SomeClass, self).__str__().upper()
Create a mixin class that does only this and inherit from it:
class UpperStrMixin(object):
    def __str__(self):
        return super(UpperStrMixin, self).__str__().upper()

class SomeClass(SomeBase, UpperStrMixin):
    ...
In the second example, notice how UpperStrMixin is completely useless as a standalone class. Its only purpose is to be used with multiple inheritance as a base class and to override your class's __str__ method.
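For instance, here is a small self-contained sketch of that idea. SomeBase is a made-up stand-in, and the mixin is listed first in the bases here so that its __str__ is found before SomeBase's in the MRO and can delegate to it via super():

class SomeBase:
    def __str__(self):
        return "hello from SomeBase"

class UpperStrMixin:
    def __str__(self):
        return super().__str__().upper()

class SomeClass(UpperStrMixin, SomeBase):
    pass

print(SomeClass())  # -> HELLO FROM SOMEBASE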
In your particular case, the following will work:
class Mixin:
    def __init__(self, option):
        ...

    def meth4(self):
        ...
        self.meth1()
        self.meth2()

class NewClass_1(Original_1, Mixin):
    def __init__(self, option):
        Original_1.__init__(self)
        Mixin.__init__(self, option)
        ...

class NewClass_2(Original_2, Mixin):
    def __init__(self, option):
        Original_2.__init__(self)
        Mixin.__init__(self, option)
        ...
Even though Mixin.meth1 and Mixin.meth2 aren't defined, this isn't an issue because an instance of Mixin is never created directly and it's only used indirectly through multiple inheritance.
Since Mixin is not a standalone class, you can just write it to assume that the necessary methods exist, and it will find them on self assuming the self in question provides, or derives from another class which provides, meth1 and meth2.
If you want to ensure the methods exist, you can either document it in the Mixin docstring, or for programmatic enforcement, use the abc module to make Mixin an ABC and specify what methods must be defined; if a given class doesn't provide them (directly or via inheritance) then you'll get an error if you attempt to instantiate it (because the class is still abstract until those methods are defined):
from abc import ABCMeta, abstractmethod

class Mixin(metaclass=ABCMeta):
    def __init__(self):
        pass

    @abstractmethod
    def meth1(self): pass

    @abstractmethod
    def meth2(self): pass

    def meth4(self):
        ...
        self.meth1()  # Method call on self will dispatch to other class's meth1 dynamically
        self.meth2()  # Method call on self will dispatch to other class's meth2 dynamically
Beyond that, you can simplify your code significantly by using super appropriately, which would remove the need to explicitly call the __init__s for each parent class; they'd be called automatically so long as all classes use super appropriately (note: for safety, in cooperative inheritance like this, you usually accept the current class's recognized arguments plus varargs, passing the varargs you don't recognize up the call chain blindly):
from abc import ABCMeta, abstractmethod

class Original_1:
    def __init__(self, orig1arg, *args, **kwargs):
        self.orig1val = orig1arg              # Use what you know
        super().__init__(*args, **kwargs)     # Pass what you don't

    def meth1(self):
        pass

    def meth2(self):
        pass

class Original_2(Original_1):
    def __init__(self, orig2arg, *args, **kwargs):
        self.orig2val = orig2arg              # Use what you know
        super().__init__(*args, **kwargs)     # Pass what you don't

    def meth3(self):
        pass

class Mixin(metaclass=ABCMeta):
    # If Mixin, or any class in your hierarchy, doesn't need to do anything to
    # be initialized, just omit __init__ entirely, and the super from other
    # classes will skip over it entirely
    def __init__(self, mixinarg, *args, **kwargs):
        self.mixinval = mixinarg              # Use what you know
        super().__init__(*args, **kwargs)     # Pass what you don't

    @abstractmethod
    def meth1(self): pass

    @abstractmethod
    def meth2(self): pass

    def meth4(self):
        ...
        self.meth1()  # Method call on self will dispatch to other class's meth1
        self.meth2()  # Method call on self will dispatch to other class's meth2

class NewClass_1(Original_1, Mixin):
    def __init__(self, newarg1, *args, **kwargs):
        self.newval1 = newarg1                # Use what you know
        super().__init__(*args, **kwargs)     # Pass what you don't

class NewClass_2(Original_2, Mixin):
    def __init__(self, newarg2, *args, **kwargs):
        self.newval2 = newarg2                # Use what you know
        super().__init__(*args, **kwargs)     # Pass what you don't
Note that using super everywhere means you don't need to explicitly call each __init__ for your parents; it automatically linearizes the calls, so for example, in NewClass_2, that single super().__init__ will delegate to the first parent (Original_2), which then delegates to Original_1, which then delegates to Mixin (even though Original_1 knows nothing about Mixin).
In more complicated multiple inheritance (say, you inherit from Mixin through two different parent classes that both inherit from it), using super is the only way to handle it reasonably; super naturally linearizes and deduplicates the parent class tree, so even though two parents derive from it, Mixin.__init__ would still only be called once, preventing subtle errors from initializing Mixin more than once.
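Here is a brief sketch of that deduplication with made-up class names: both parents inherit from the mixin, but its __init__ still runs only once.

class LoudMixin:
    def __init__(self, *args, **kwargs):
        print("LoudMixin.__init__ ran")
        super().__init__(*args, **kwargs)

class ParentA(LoudMixin):
    def __init__(self, *args, **kwargs):
        print("ParentA.__init__ ran")
        super().__init__(*args, **kwargs)

class ParentB(LoudMixin):
    def __init__(self, *args, **kwargs):
        print("ParentB.__init__ ran")
        super().__init__(*args, **kwargs)

class Child(ParentA, ParentB):
    pass

Child()
# Output:
# ParentA.__init__ ran
# ParentB.__init__ ran
# LoudMixin.__init__ ran   (only once, despite two inheritance paths)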
Note: You didn't specify which version of Python you're using. Metaclasses and super are both better and simpler in Python 3, so I've used Python 3 syntax. For Python 2, you'd need to set the metaclass a different way, and call super providing the current class object and self explicitly, which makes it less nice, but then, Python 2 is generally less nice at this point, so consider writing new code for Python 3?

Updating Classes that inherit from abstract classes

I have an abstract class ship.
from abc import ABC, abstractmethod

class ship(ABC):
    def __init__(self):
        ...

    @abstractmethod
    def do_stuff(self, stuff, things):
        pass
I have multiple classes that inherit from it (destroyer, cruiser, patrol_boat, etc.):
class carrier(ship):
    def __init__(self):
        ...

    def do_stuff(self, stuff, things):
        ...
Currently, if I were to add, let's say, def do_more_stuff(self) to ship:
class ship(ABC):
    def __init__(self):
        ...

    @abstractmethod
    def do_stuff(self, stuff, things):
        pass

    @abstractmethod
    def do_more_stuff(self, stuff, things):
        pass
The changes would not affect any of the subclasses until I reentered them into the console. How do I change this?
If you actually redefine a class from scratch, it's a different class; the subclasses are still inheriting from the old version of ship. You can't just define a new class named ship and expect the subclasses to find it magically.
Normally, if you wanted to monkey-patch ship after creation to add new methods, you could just do something like:
def do_more_stuff(self, stuff, things):
    pass

ship.do_more_stuff = do_more_stuff
But unfortunately, abstractmethods for ABCs are an explicit exception to this rule:
Dynamically adding abstract methods to a class, or attempting to modify the abstraction status of a method or class once it is created, are not supported.
You must either define the abstract base class completely up front, or redefine the entire class hierarchy later if you want to add new abstract methods to the base class.
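To make the distinction concrete, here is a small sketch with toy classes (the method bodies are made up): monkey-patching a regular, non-abstract method onto the existing ship class is still visible from subclasses defined before the patch, because the method is looked up at call time.

from abc import ABC, abstractmethod

class ship(ABC):
    @abstractmethod
    def do_stuff(self, stuff, things):
        pass

class carrier(ship):
    def do_stuff(self, stuff, things):
        return "doing stuff"

# patch a plain (non-abstract) method onto the existing class object
def do_more_stuff(self):
    return "doing more stuff"

ship.do_more_stuff = do_more_stuff

print(carrier().do_more_stuff())  # existing subclasses pick up the new method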

inheriting python method with a call to super inside the method

I am developing a system which has a series of single multilevel inheritance hierarchies. One of the methods (applicable to all the classes) has to perform the same thing for most of the classes, which is to pass a list to its parent class.
I know that if one doesn't define a method in one of the inherited classes, its parent's method is used. But when we use the super method, we need to mention the name of the class being called.
One way I know to achieve this is to redefine the method in every class, with the class name as an argument. Is there a more elegant way where I can define it once at the topmost parent, and then override it only when necessary?
The implementation right now looks like this:
class a(object):
    def __init__(self):
        self.myL = list()
        print 'hello'

class b(a):
    def __init__(self):
        super(b, self).__init__()

    def resolve(self, passVal):
        print passVal
        self.myL.append(passVal)
        super(b, self).resolve(passVal+1)

class c(b):
    def __init__(self):
        super(c, self).__init__()

    def resolve(self, passVal):
        print passVal
        self.myL.append(passVal)
        super(c, self).resolve(passVal+1)
Instead, if I could define resolve in class a, all the other classes would inherit the method from it. I understand a will never be able to use it, but redefining the method in every class seems like a lot of unnecessary extra work.
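For comparison, a sketch of the same chain in Python 3 syntax, where zero-argument super() at least removes the repeated class name from each call. Each class that participates in the chain still defines its own resolve, and here resolve is also defined in a to give the chain a clear end; that last part is an illustrative assumption, not the questioner's code.

class a:
    def __init__(self):
        self.myL = list()

    def resolve(self, passVal):
        # end of the chain: a has no parent that defines resolve
        print(passVal)
        self.myL.append(passVal)

class b(a):
    def resolve(self, passVal):
        print(passVal)
        self.myL.append(passVal)
        super().resolve(passVal + 1)

class c(b):
    def resolve(self, passVal):
        print(passVal)
        self.myL.append(passVal)
        super().resolve(passVal + 1)

c().resolve(1)  # prints 1, 2, 3 and leaves myL == [1, 2, 3]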
