I'm trying to provide a framework which allows people to write their own plugins. These plugins are basically derived classes. My base class needs some variables to initialize; how can I initialize my base class without having to let my derived class feed the variables into the base class initialization?
#!/bin/python
class BaseClass():
    def __init__(self, config):
        self.config = config

    def showConfig(self):
        print "I am using %s" % self.config

class UserPlugin(BaseClass):
    def __init__(self, config):
        BaseClass.__init__(self, config)

    def doSomething(self):
        print "Something"

fubar = UserPlugin('/tmp/config.cfg')
fubar.showConfig()
My goal is to avoid the need to define the config parameter in the UserPlugin class, since this is something I don't want the user who writes a plugin to be bothered with.
You can use argument lists to pass any remaining arguments to the base class:
class UserPlugin(BaseClass):
    def __init__(self, *args, **kwargs):
        BaseClass.__init__(self, *args, **kwargs)
Based on your Pastebin code, how about this? This avoids using a separate global, instead using a class attribute, which is accessible as a member to all derived classes and their instances.
#!/bin/python
class BaseClass():
    config = '/tmp/config.cfg'

    def __init__(self):
        pass

    def showConfig(self):
        print "I am using %s" % self.config

class UserPlugin(BaseClass):
    def __init__(self):
        BaseClass.__init__(self)

    def doSomething(self):
        print "Something"

fubar = UserPlugin()
fubar.showConfig()
This was the other way to do it that I mentioned before. Keep in mind that if you want to change the value of BaseClass.config itself, you should access it through the class (i.e. BaseClass.config = '/foo/path'); otherwise, assigning through an instance creates a separate config attribute on that UserPlugin instance and leaves BaseClass.config unchanged.
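To illustrate the difference, here is a minimal sketch reusing BaseClass and UserPlugin from above (the paths are just placeholders):

fubar = UserPlugin()

# Rebinding the attribute on the class changes it for every instance
# that has not shadowed it with its own attribute:
BaseClass.config = '/foo/path'
fubar.showConfig()        # "I am using /foo/path"

# Assigning through an instance creates an attribute on that instance
# only; BaseClass.config is left untouched:
fubar.config = '/bar/other.cfg'
fubar.showConfig()        # "I am using /bar/other.cfg"
print BaseClass.config    # still '/foo/path'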
Currently I am writing a Python program with a plugin system. To develop a new plugin, a new class must be created that inherits from a base plugin class. It should now be possible to add optional functionality via mixins. Some mixins provide new functions; others access the built-in attributes of the base class and can act on them or change them.
Below is a simplified version of the structure:
import abc
import threading

class Base:
    def __init__(self):
        self.config = dict()
        if hasattr(self, "edit_config"):
            self.edit_config()

    def start(self):
        """Starts the Plugin"""
        if hasattr(self, "loop"):
            self._loop()

class AMixin:
    def edit_config(self):
        self.config["foo"] = 123

class BMixin(abc.ABC):
    def _loop(self):
        thread = threading.Thread(target=self.loop, daemon=True)
        thread.start()

    @abc.abstractmethod
    def loop(self):
        """Override this method with a while True loop to establish an ongoing loop."""
        pass

class NewPlugin(Base, AMixin, BMixin):
    def loop(self):
        while True:
            print("Hello")

plugin = NewPlugin()
plugin.start()
What is the best way to tackle this problem?
EDIT: I need to make my question more specific. The question is whether the above is the Pythonic way, and whether it is possible to ensure that the mixins are only ever inherited in conjunction with the Base class. Additionally, it would be good to get IDE support (e.g. autocomplete in VSCode) when accessing built-in attributes of the Base class from a mixin like AMixin, without inheriting from it, of course.
If you want to allow but not require subclasses to define some behaviour in a method called by the base class, the simplest way is to declare the method in the base class, have an empty implementation, and just call the method unconditionally. This way you don't have to check whether the method exists before calling it.
class Base:
    def __init__(self):
        self.config = dict()
        self.edit_config()

    def start(self):
        self.loop()

    def edit_config(self):
        pass

    def loop(self):
        pass

class AMixin:
    def edit_config(self):
        self.config["foo"] = 123

class NewPlugin(AMixin, Base):
    def loop(self):
        for i in range(10):
            print("Hello")
Note that you have to write AMixin before Base in the list of superclasses, so that its edit_config method overrides the one from Base, and not the other way around. You can avoid this by writing class AMixin(Base): so that AMixin.edit_config always overrides Base.edit_config in the method resolution order.
If you want to require subclasses to implement one of the methods, then you can raise TypeError() instead of pass in the base class's method.
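For illustration, here is a minimal sketch combining both points (the error message is just a placeholder):

class Base:
    def __init__(self):
        self.config = dict()
        self.edit_config()

    def start(self):
        self.loop()

    def edit_config(self):
        pass  # optional hook: subclasses may override, but don't have to

    def loop(self):
        # required hook: fail loudly if a subclass forgets to provide it
        raise TypeError("plugins must implement loop()")

class AMixin(Base):  # inheriting from Base pins the MRO order
    def edit_config(self):
        self.config["foo"] = 123

class NewPlugin(AMixin):  # no need to list Base explicitly any more
    def loop(self):
        for i in range(10):
            print("Hello")

NewPlugin().start()  # prints "Hello" ten times; self.config contains {"foo": 123}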
I would move the calls to the methods provided by the mix-ins to __init__ methods defined by those classes.
import abc
import threading

class Base:
    def __init__(self, **kwargs):
        self.config = dict()  # set up state before the cooperative chain continues
        super().__init__(**kwargs)

class AMixin:
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.edit_config()

    def edit_config(self):
        self.config["foo"] = 123

class BMixin(abc.ABC):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self._loop()

    def _loop(self):
        thread = threading.Thread(target=self.loop, daemon=True)
        thread.start()

    @abc.abstractmethod
    def loop(self):
        """Override this method with a while True loop to establish an ongoing loop."""
        pass

class NewPlugin(Base, AMixin, BMixin):
    pass
When you instantiate a concrete subclass of NewPlugin, Base.__init__, AMixin.__init__, and BMixin.__init__ will be called in that order.
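For example, with a hypothetical concrete plugin (HelloPlugin is not part of the original code):

class HelloPlugin(NewPlugin):
    def loop(self):
        while True:
            print("Hello")

plugin = HelloPlugin()   # Base, AMixin and BMixin each run their __init__
print(plugin.config)     # {'foo': 123}, filled in by AMixin.edit_config
# the daemon thread started by BMixin._loop keeps printing "Hello"
# in the background until the program exits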
I wrote a Python module with several classes that inherit from a single class called MasterBlock.
I want to import this module in a script, create several instances of these classes, and then get a list of all the existing instances of all the children of this MasterBlock class. I found some solutions with vars()['Blocks.MasterBlock'].__subclasses__(), but since the instances I have are of a child of a child of MasterBlock, it doesn't work.
Here is some example code:
Module:
class MasterBlock:
    def main(self):
        pass

class RandomA(MasterBlock):
    def __init__(self):
        pass
    # inherit the main function

class AnotherRandom(MasterBlock):
    def __init__(self):
        pass
    # inherit the main function
Script:
import module
a=module.RandomA()
b=module.AnotherRandom()
c=module.AnotherRandom()
# here I need to get list_of_instances=[a,b,c]
The ultimate goal is to be able to do:
for instance in list_of_instances:
    instance.main()
If you add a __new__() method as shown below to your base class which keeps track of all instances created in a class variable, you could make the process more-or-less automatic and not have to remember to call something in the __init__() of each subclass.
class MasterBlock(object):
    instances = []

    def __new__(cls, *args, **kwargs):
        instance = super(MasterBlock, cls).__new__(cls, *args, **kwargs)
        instance.instances.append(instance)
        return instance

    def main(self):
        print('in main of', self.__class__.__name__)  # for testing purposes

class RandomA(MasterBlock):
    def __init__(self):
        pass
    # inherit the main function

class AnotherRandom(RandomA):  # works for sub-subclasses, too
    def __init__(self):
        pass
    # inherit the main function

a = RandomA()
b = AnotherRandom()
c = AnotherRandom()

for instance in MasterBlock.instances:
    instance.main()
Output:
in main of RandomA
in main of AnotherRandom
in main of AnotherRandom
What about adding a class variable that contains all the instances of MasterBlock? You can record them with:
class MasterBlock(object):
    all_instances = []  # All instances of MasterBlock

    def __init__(self, …):
        …
        self.all_instances.append(self)  # Not added if an exception is raised before
You get all the instances of MasterBlock with MasterBlock.all_instances (or instance.all_instances).
This works as long as all the subclasses call the __init__ of the master class (either implicitly through inheritance or explicitly through the usual super() call).
Here's a way of doing that using a class variable:
class MasterBlock(object):
    instances = []

    def __init__(self):
        self.instances.append(self)

    def main(self):
        print "I am", self

class RandomA(MasterBlock):
    def __init__(self):
        super(RandomA, self).__init__()
        # other init...

class AnotherRandom(MasterBlock):
    def __init__(self):
        super(AnotherRandom, self).__init__()
        # other init...

a = RandomA()
b = AnotherRandom()
c = AnotherRandom()

# here I need to get list_of_instances=[a,b,c]
for instance in MasterBlock.instances:
    instance.main()
(you can make it simpler if you don't need __init__ in the subclasses)
output:
I am <__main__.RandomA object at 0x7faa46683610>
I am <__main__.AnotherRandom object at 0x7faa46683650>
I am <__main__.AnotherRandom object at 0x7faa46683690>
I'm trying to do the following:
class A:
    @classmethod
    def test_function(cls, message):
        cls.__get_the_function()

class B(A):
    @classmethod
    def __get_the_function(cls):
        return print("BBBB")

class C(A):
    @classmethod
    def __get_the_function(cls):
        return print("CCCC")
however when I call:
B.test_function("Test")
I get the following:
AttributeError: type object 'B' has no attribute '_A__get_the_function'
I want class A to call __get_the_function from the subclass (either class B or C, depending on which one I use), but it looks like it is trying to look for it in itself.
NOTE: I'm using Python 3.8.2
__-prefixed names are handled specially during class creation: every reference to such a name inside the class body is replaced with a mangled name when the class is compiled, as if you had defined the function as
@classmethod
def test_function(cls, message):
    cls._A__get_the_function()
in the first place.
This is done to explicitly provide a way to hide a name from a subclass. Since you want to override the name, __get_the_function isn't an appropriate name; use an ordinary _-prefixed name if you want to mark it as private:
class A:
    @classmethod
    def test_function(cls, message):
        cls._get_the_function()

    # Define *something*, since test_function assumes it
    # will exist. It doesn't have to *do* anything, though,
    # until you override it.
    @classmethod
    def _get_the_function(cls):
        pass
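With the single-underscore name, the subclass override is found as expected; a quick check (reusing B from the question):

class B(A):
    @classmethod
    def _get_the_function(cls):
        print("BBBB")

B.test_function("Test")   # prints BBBB
A.test_function("Test")   # uses the no-op default defined on A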
I'm trying to write a tracker class where the instances of the tracker class track the sub-classes of another class that are in the scope of the tracker instance.
More concretely, the following is an example of what I am trying to achieve:
class Foo(object): pass

class FooTracker(object):
    def __init__(self):
        # use Foo.__subclasses__() or a metaclass to track subclasses
        # - but how do I filter this to only get the ones in scope?
        self.inscope = <something magic goes here>

ft1 = FooTracker()
assert ft1.inscope == []

class Bar(Foo): pass

ft2 = FooTracker()
assert ft2.inscope == [<class '__main__.Bar'>]

def afunction():
    class Baz(Foo): pass  # the global definition of Bar is now hidden
    class Bar(Foo): pass

    ft3 = FooTracker()
    assert (set(ft3.inscope) == set([<class '__main__.afunction.<locals>.Baz'>,
                                     <class '__main__.afunction.<locals>.Bar'>]))

ft4 = FooTracker()  # afunction.Baz and afunction.Bar are no longer in scope
assert ft4.inscope == [<class '__main__.Bar'>]
So I want the instances of FooTracker to track the sub-classes of Foo that were in scope at the time the FooTracker object was created.
I've tried a few different things, such as parsing the qualified names of the Foo sub-classes and using exec() to do the name resolution but the fundamental problem is that it always works out the sub-classes relative to the scope within FooTracker.__init__() and not where it was called.
My only other thought was to try something with inspect.currentframe(), but even if this were possible it would probably be too much of a hack and would make the code too brittle (e.g., there is a comment in the docs that not all Python implementations will have frame support in the interpreter).
There's no easy way to do exactly what you're asking for. But you might be able to use some Python features to get something with a roughly similar API, without as much hassle.
One option would be to require each subclass to be decorated with a method of your Tracker class. This would make it really easy to keep track of them, since you'd just append each caller of the method to a list:
class Tracker:
    def __init__(self):
        self.subclasses = []

    def register(self, cls):
        self.subclasses.append(cls)
        return cls

class Foo(): pass

foo_tracker = Tracker()

@foo_tracker.register
class FooSubclass1(Foo): pass

@foo_tracker.register
class FooSubclass2(Foo): pass

print(foo_tracker.subclasses)
This doesn't actually require that the classes being tracked are subclasses of Foo, all classes (and even non-class objects) can be tracked if you pass them to the register method. Decorator syntax makes it a little nicer than just appending each class to a list after you define it, but not by a whole lot (you still repeat yourself a fair amount, which may be annoying unless you make the tracker and method names very short).
A slightly trickier version might get passed the base class, so that it would detect subclasses automatically (via Foo.__subclasses__). To limit the subclasses it detects (rather than getting all subclasses of the base that have ever existed), you could make it behave as a context manager, and only track new subclasses defined within a with block:
class Tracker:
    def __init__(self, base):
        self.base = base
        self._exclude = set()
        self.subclasses = set()

    def __enter__(self):
        self._exclude = set(self.base.__subclasses__())
        return self

    def __exit__(self, *args):
        self.subclasses = set(self.base.__subclasses__()) - self._exclude
        return False

class Foo(): pass

class UntrackedSubclass1(Foo): pass

with Tracker(Foo) as foo_tracker:
    class TrackedSubclass1(Foo): pass
    class TrackedSubclass2(Foo): pass

class UntrackedSubclass2(Foo): pass

print(foo_tracker.subclasses)
If you're using Python 3.6 or later, you can do the tracking a different way by injecting an __init_subclass__ class method into the tracked base class, rather than relying upon __subclasses__. If you don't need to support class hierarchies that are already using __init_subclass__ for their own purposes (and you don't need to support nested trackers), it can be quite elegant:
class Tracker:
    def __init__(self, base):
        self.base = base
        self.subclasses = []

    def __enter__(self):
        @classmethod
        def __init_subclass__(cls, **kwargs):
            self.subclasses.append(cls)
        self.base.__init_subclass__ = __init_subclass__
        return self

    def __exit__(self, *args):
        del self.base.__init_subclass__
        return False

class Foo(): pass

class UntrackedSubclass1(Foo): pass

with Tracker(Foo) as foo_tracker:
    class TrackedSubclass1(Foo): pass
    class TrackedSubclass2(Foo): pass

class UntrackedSubclass2(Foo): pass

print(foo_tracker.subclasses)
One nice feature of this version is that it automatically tracks deeper inheritance hierarchies. If a subclass of a subclass is created within the with block, that "grandchild" class will still be tracked. We could make the previous __subclasses__ based version work this way too, if you wanted, by adding another function to recursively expand out the subclasses of each class we find.
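For instance, a small recursive helper could stand in for the plain __subclasses__() calls in the earlier Tracker's __enter__ and __exit__ (all_subclasses is a name introduced here for illustration, not part of the original answer):

def all_subclasses(cls):
    """Collect direct and indirect subclasses of cls."""
    found = set()
    for sub in cls.__subclasses__():
        found.add(sub)
        found |= all_subclasses(sub)
    return found

# In the __subclasses__-based Tracker, use the helper instead:
#     self._exclude = all_subclasses(self.base)                      # in __enter__
#     self.subclasses = all_subclasses(self.base) - self._exclude    # in __exit__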
If you do want to play nice with existing __init_subclass__ methods, or want to be able to nest trackers, you need to make the code a bit more complicated. Injecting a well-behaved classmethod in a reversible way is tricky, since you need to handle both the case where the base class has its own method and the case where it is inheriting a version from its parents.
class Tracker:
    def __init__(self, base):
        self.base = base
        self.subclasses = []

    def __enter__(self):
        if '__init_subclass__' in self.base.__dict__:
            self.old_init_subclass = self.base.__dict__['__init_subclass__']
        else:
            self.old_init_subclass = None

        @classmethod
        def __init_subclass__(cls, **kwargs):
            if self.old_init_subclass is not None:
                self.old_init_subclass.__get__(None, cls)(**kwargs)
            else:
                super(self.base, cls).__init_subclass__(**kwargs)
            self.subclasses.append(cls)

        self.base.__init_subclass__ = __init_subclass__
        return self

    def __exit__(self, *args):
        if self.old_init_subclass is not None:
            self.base.__init_subclass__ = self.old_init_subclass
        else:
            del self.base.__init_subclass__
        return False

class Foo:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        print("Foo!")

class Bar(Foo): pass  # every class definition from here on prints "Foo!" when it runs

with Tracker(Bar) as tracker1:
    class Baz(Bar): pass
    with Tracker(Foo) as tracker2:
        class Quux(Foo): pass
        with Tracker(Bar) as tracker3:
            class Plop(Bar): pass

# four "Foo!" lines will have been printed by now by Foo.__init_subclass__
print(tracker1.subclasses)  # will describe Baz and Plop, but not Quux
print(tracker2.subclasses)  # will describe Quux and Plop
print(tracker3.subclasses)  # will describe only Plop
I am working on a code base that uses OOP and I am relatively new to it. My question, specifically, is: why does NewMenuItem not inherit from File?
code bunk to play with code: https://codebunk.com/b/350127244/
"""Build class hierarchy and get values from decendants"""
import inspect
def selftest():
class Menu(object):
def __init__(self):
super(Menu, self).__init__()
self.value = "Menu"
class MenuBar(Menu):
#having object in there makes it a new style object, which allows us to use super
def __init__(self):
super(MenuBar, self).__init__()
self.value = "MenuBar"
class File(MenuBar):
def __init__(self):
Menu.__init__()
super(File, self).__init__()
self.value = "File"
self.FileValue = "File here!"
class New(Menu):
def __init__(self):
Menu.__init__()
pass
class NewMenuItem(Menu):
def __init__(self):
"""
Q 1- Why do I need self here?
Menu.__init__(self)
"""
Menu.__init__(self)
pass
def show_vals(self):
print(self.value)
"""
Q 2 -why wont this work?
def show_vals2(self):
print(self.FileValue)
"""
example = File.New.NewMenuItem()
example.show_vals()
"""
Q 3 - Why do I get this error with this line?
inspect.getmro(example)
AttributeError: 'ManageProduct' object has no attribute '__bases__'
"""
I'm trying to understand what is happening line by line, but what I don't get is why NewMenuItem doesn't inherit from File.
I tried hard-coding the instantiation of File, like so:
File.__init__()
but then I get an error unless I pass the File object:
File.__init__(File())
I guess what I am struggling with is:
-inheritance trees
-super classes
-why we need to hard-code instantiations in this case
Keep in mind that this is the code I have come across. I am not sure why this is the way it is.
Inheritance and scope are two completely different things. NewMenuItem is defined inside the scope of the class New, which is itself inside the scope of the class File, but it inherits from Menu, which inherits from object. So while NewMenuItem is only reachable through the class File and then through New, it inherits its methods from Menu, and super refers to Menu.
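A stripped-down, hypothetical illustration of that point:

class Menu(object):
    def whoami(self):
        return "Menu"

class File(Menu):
    class New(object):
        class NewMenuItem(Menu):  # nested purely for namespacing
            pass

item = File.New.NewMenuItem()         # reached through the enclosing classes...
print(item.whoami())                  # ...but its behaviour comes from Menu
print(File.New.NewMenuItem.__mro__)   # (NewMenuItem, Menu, object); File and New are not in it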