Python class inheriting from all submodules

I'm currently writing a Python wrapper for a large number of custom company tools.
I'm basically going to split each tool into its own .py file, with a class whose methods wrap the calls to that tool.
These will all be contained in a package.
Then there'll be a master class that imports everything from the package and inherits from each and every class, so that the whole thing appears as one cohesive class.
masterClass.py
pyPackage
- __init__.py
- module1.py
-- class Module1
--- method tool1
- module2.py
-- class Module2
--- method tool2
etc.
Right now I'm autogenerating the master class file to inherit from the package's modules, but I was wondering whether there is a more elegant way to do it, i.e. something like:

from package import *

class MasterClass(package.all):
    pass

I am not really sure what your reasons are for creating a larger master class that inherits from all of your smaller classes, but the first thing that comes to mind is that the design might be backwards.
What you might want to do instead is have a base class that all your tool modules subclass. You can then use packages as intelligent namespaces for collecting similar functionality, such as "network" or "compression".
class ToolBase(object):
    # common functionality here:
    # class attributes
    # base constructor
    # methods: static, class, instance
    pass

class Module1(ToolBase):
    def __init__(self):
        super(Module1, self).__init__()

class Module2(ToolBase):
    def __init__(self):
        super(Module2, self).__init__()
In this base-class example, every subclass can rely on the functionality of ToolBase being there, including any setup done in the constructor, such as a database connection, sockets, or other shared resources.
And maybe a project structure like this:
pyPackage/
    __init__.py
        # class ToolBase
        # __all__ = ['network', 'compress']
    network/
        __init__.py
        module1.py
            # class Module1
    compress/
        __init__.py
        module2.py
            # class Module2
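A minimal sketch of what that top-level __init__.py might contain, following the comments in the layout above (the ToolBase body is only a placeholder):

# pyPackage/__init__.py
__all__ = ['network', 'compress']

class ToolBase(object):
    """Common functionality shared by every tool class."""

    def __init__(self):
        # shared setup (connections, sockets, ...) would go here
        pass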
Update
As a way to use the base class and share a single "port" object, you could make it a class-level attribute that is only initialized once:
class ToolBase(object):
    _PORT = None

    def __init__(self):
        if ToolBase._PORT is None:
            # assign on the class (not the instance) so it is shared
            ToolBase._PORT = "PORT"

    @property
    def port(self):
        return self._PORT

class Foo(ToolBase):
    def go(self):
        print(self.port)
Now your classes are useful on their own, and will share the port object.
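A quick usage sketch (the "PORT" string above is just a stand-in for whatever shared resource you would open once):

f1 = Foo()
f2 = Foo()
f1.go()                    # PORT
f2.go()                    # PORT
print(f1.port is f2.port)  # True: both instances see the same object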

Related

Dynamic importing of registered subclasses stored in various modules

Consider the following script structure, in which derived classes are registered by a metaclass:
import abc

class BaseMeta(abc.ABCMeta):
    __registry__ = {}

    def __init__(cls, name, bases, namespace):
        if bases:
            BaseMeta.__registry__.update({cls.__name__: cls})
        # other modifications of namespaces
        super().__init__(name, bases, namespace)

class Base(metaclass=BaseMeta):
    pass  # Various abstract properties and methods

class Derived0(Base): pass
class Derived1(Base): pass
# ...
class DerivedN(Base): pass
The code above was adapted from this answer; I cannot use __subclasses__() because I need to track indirect as well as direct subclasses. While I can circumvent this requirement for now, I would like to keep it for forward compatibility.
I would like to dynamically dispatch these subclasses via __getattr__(), using Base.__registry__ (a sketch of that dispatch follows the layout below). Because these subclasses can grow to several hundred SLOC and can depend on dozens of files, primarily SQL schemas, I would like to organize them into packages, e.g.:
__main__.py
gui/            # Irrelevant package
db/             # Irrelevant package
# [...] Other irrelevant packages
relevant/
    __init__.py         # Contains Base
    derived0/
        __init__.py     # Contains Derived0 and related machinery
    derived1/
        __init__.py     # Contains Derived1 and related machinery; imports module-xxx.py
        module-xxx.py
        schema_main.sql
        copy-query-.xxx.sql
        # [...]
    # [...]
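For concreteness, a minimal sketch of the dispatch half, assuming the registry from the snippet above: hooking __getattr__ on the metaclass makes unknown class-attribute lookups fall back to the registry, so Base.Derived0 resolves to the registered class.

import abc

class BaseMeta(abc.ABCMeta):
    __registry__ = {}

    def __init__(cls, name, bases, namespace):
        if bases:
            BaseMeta.__registry__[cls.__name__] = cls
        super().__init__(name, bases, namespace)

    def __getattr__(cls, item):
        # Unknown attribute lookups on Base (or any subclass) fall back to
        # the registry, so Base.Derived0 returns the registered class object.
        try:
            return BaseMeta.__registry__[item]
        except KeyError:
            raise AttributeError(item) from None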
The problem is then that we would need to import dynamically as well. The only solution that I am aware of is to use pkgutil to walk the packages and exec(compile()) the buffers. This, however, messes up the namespaces.
I believe this problem to be somewhat analogous to this one, which seems about as idiomatic as code for such problems can get, but I do not know whether it results in exactly the same behavior as a simple import from a parent package. Does anyone know how to solve this?
All help is more than welcome, as is any other critique of the above code. Thank you very much.
I managed to solve the problem using pkgutil, importlib, and pathlib like so:
import abc
import importlib
import pathlib
import pkgutil

class BaseMeta(abc.ABCMeta):
    __registry__ = {}

    def __init__(cls, name, bases, namespace):
        if bases:
            BaseMeta.__registry__.update({cls.__name__: cls})
        # other modifications of namespaces
        super().__init__(name, bases, namespace)

class Base(metaclass=BaseMeta):
    pass  # Various abstract properties and methods

# Subclasses have now been moved to their respective subpackages
for parent_FF, name, _ in pkgutil.walk_packages(__path__):
    importlib.machinery.SourceFileLoader(
        name,
        (pathlib.Path(parent_FF.path) / name / "__init__.py").as_posix()
    ).load_module()
Because the way Python handles paths is highly non-standardized, I am unaware of any more elegant solutions.
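For what it's worth, SourceFileLoader.load_module() is deprecated; a shorter variant of the same idea, assuming the loop lives in the relevant/__init__.py from the layout above, is to let importlib resolve the subpackages by name:

import importlib
import pkgutil

# Import every direct subpackage of this package so that their Base
# subclasses pass through BaseMeta and land in __registry__.
for _, name, is_pkg in pkgutil.iter_modules(__path__):
    if is_pkg:
        importlib.import_module(f".{name}", __package__)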

get abstract class info declared in another module in base class module

UPDATE: I am having trouble getting, from within the base class's module, the names of derived classes that are declared in other modules.
Use case: I would like to create a common interface to call the subclasses from the base class.
Example:
# basemodule.py
class BaseClass(object):
    @abstractmethod
    def method(self, inputs=None):
        pass

def get_subclasses():
    # I want derived class information here
    for cls in list(BaseClass.__subclasses__):
        # call subclasses here

# module1.py
import BaseClass

class derivedClass1(BaseClass):
    def __init__(self):
        super().__init__()

    def method(self, inputs):
        # method implemented in derivedClass1

# module2.py
import BaseClass

class derivedClass2(BaseClass):
    def __init__(self):
        super().__init__()

    def method(self, inputs):
        # method implemented in derivedClass2
Any suggestions are appreciated! Thanks :)
Your question is a little bit vague and lacks important details, like the error you're getting and a bit more context regarding your actual goals here. Having said that, I've noticed a couple of things that might be the cause of the problems you're having:
BaseClass.__subclasses__ is a method, so you need to call it rather than access it like a class property or attribute; use BaseClass.__subclasses__() instead.
For your get_subclasses() function to work, you first need to import the subclasses. Otherwise Python won't know which classes inherit from BaseClass.
Corrected code
Here's the corrected implementation of the get_subclasses() function, as mentioned above:

from basemodule import BaseClass

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.
Example
Without importing module1 and module2
If I don't import the modules that host the subclasses, get_subclasses() prints nothing, because no subclasses of BaseClass have been registered yet.
Importing module1 and module2
When I import both modules that host the subclasses, I get the output I think you're expecting: derivedClass1 and derivedClass2.
Full code
Here's the full code of the examples:
# my_pckg/basemodule.py
from abc import ABCMeta, abstractmethod

class BaseClass(object):
    @abstractmethod
    def method(self, inputs=None):
        pass

# ================================================

# my_pckg/module1.py
from my_pckg.basemodule import BaseClass

class derivedClass1(BaseClass):
    def __init__(self):
        super().__init__()

    def method(self, inputs):
        # method implemented in derivedClass1
        pass

# ================================================

# my_pckg/module2.py
from my_pckg.basemodule import BaseClass

class derivedClass2(BaseClass):
    def __init__(self):
        super().__init__()

    def method(self, inputs):
        # method implemented in derivedClass2
        pass

# ================================================

# my_pckg/test.ipynb
from basemodule import BaseClass
from module1 import *
from module2 import *

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.

get_subclasses()
# Prints:
# derivedClass1
# derivedClass2
Important notes
The imports shown in the examples above won't work if you're trying to use them from outside the parent package. Here's the complete tree view of the package structure used in these examples:
my_pckg
|____ __init__.py   # <-- Needed to make my_pckg submodules "importable".
|____ basemodule.py # <-- Hosts the BaseClass class.
|____ module1.py    # <-- Hosts the derivedClass1 subclass.
|____ module2.py    # <-- Hosts the derivedClass2 subclass.
|____ Test.ipynb    # <-- Where the tests above were run.
If you want to import these modules from outside the package, you have two options:
Create a setup for your package and pip install it (use the -e flag to install it in development mode); a minimal packaging sketch is included after the next example.
Import sys and add the my_pckg path to the known paths, as shown below.
import sys
sys.path.insert(0, './my_pckg')

from basemodule import BaseClass
from module1 import *
from module2 import *

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.

get_subclasses()
# Prints:
# derivedClass1
# derivedClass2
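For option 1, a minimal packaging sketch (assuming setuptools; the name and version are placeholders):

# setup.py (placeholder metadata)
from setuptools import setup, find_packages

setup(
    name="my_pckg",
    version="0.1.0",
    packages=find_packages(),
)

With this in place, pip install -e . from the project root makes my_pckg importable from anywhere in that environment.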
Circular Imports
Do NOT import module1 and module2 inside basemodule, as this leads to a circular import. When you import basemodule, Python sees that the module needs to import module1 and module2 and goes to those modules; there it finds that both require basemodule themselves, so it goes back to basemodule. This becomes a circular dependency in which no module can finish importing. To overcome it, place the get_subclasses() function in a separate module, alongside all your necessary imports, as in the examples above (a minimal sketch of such a module follows).
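A minimal sketch of such a separate module; the file name registry.py is only an illustration, and any module outside basemodule will do:

# my_pckg/registry.py (hypothetical helper module)
from my_pckg.basemodule import BaseClass

# Importing the submodules is what actually registers the subclasses.
from my_pckg import module1, module2

def get_subclasses():
    """Return all currently known direct subclasses of BaseClass."""
    return list(BaseClass.__subclasses__())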

sphinx-apidoc generate documentation for class re-exported in __init__.py

In mypkg/_core.py I have
class SomeClass:
    """Here is my class."""

    def __init__(self, x: int):
        self.x = x
In __init__.py I have
from mypkg._core import SomeClass
I figure there's a default configuration that avoids documenting everything that is merely imported, which is generally fine. In my case, though, _core.py is a private module that I don't want documented directly; I want its documentation attached to the top-level mypkg module instead.
However, SomeClass does not appear in the top-level mypkg documentation.
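One direction that is commonly suggested, assuming sphinx.ext.autodoc renders the generated stubs, is to mark the re-export as public via __all__ in the package __init__.py and to enable autodoc's :imported-members: option (or list SomeClass explicitly) in the automodule stub for mypkg. A minimal sketch of the Python side:

# mypkg/__init__.py
from mypkg._core import SomeClass

# Declaring the re-export signals that SomeClass is part of the public
# top-level API rather than a private implementation detail.
__all__ = ["SomeClass"]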

Class with only class methods

I have a class that contains only class methods. Is this a Pythonic way of namespacing? If not, what is the best way to group similar kinds of methods?
class OnlyClassMethods(object):
    @classmethod
    def method_1(cls):
        pass

    @classmethod
    def method_2(cls):
        pass
A class is meant to have instances, not to serve as a namespace. If your class is never instantiated, it isn't serving the purpose classes exist for in Python.
If you want to namespace a group of related functions, create a new module (that is, another .py file) and import it.
Example
Here we create a module named helpers containing some related functions, which can then be imported in our main file.
helpers.py

def method_1():
    ...

def method_2():
    ...

main.py

import helpers

helpers.method_1()
helpers.method_2()

Pycharm does not autocomplete methods for class inheriting with aliased import

I have two Python files that define classes; one of them imports a class under an alias and uses it as the parent of another class. My issue is that PyCharm does not provide any information on the inherited properties or methods when I use the import alias.
class_a.py

class A(object):
    def do_something(self, arg):
        print(arg)

class_b.py

from class_a import A as BaseA

class B(BaseA):
    def do_some  # I expect PyCharm to autocomplete here because def do_something(...) exists on the parent class
So as I'm typing methods or properties in class B, it doesn't show any autocompletion, as if it has no idea that BaseA is actually A. Is this a bug in PyCharm?
