get abstract class info declared in another module in base class module - python

UPDATE: I am having trouble getting the names of derived classes declared in other modules from within the base class's module.
Use case: I would like to create a common interface to call the subclasses from the base class.
Example:
# basemodule.py
class BaseClass(object):
    @abstractmethod
    def method(self, inputs=None):
        pass

def get_subclasses():
    # I want derived class information here
    for cls in list(BaseClass.__subclasses__):
        # call subclasses here
        pass
# module1.py
import BaseClass

class derivedClass1(BaseClass):
    def __init__(self):
        super().__init__()

    def method(self, inputs):
        # method implemented in derivedClass1
        pass

# module2.py
import BaseClass

class derivedClass2(BaseClass):
    def __init__(self):
        super().__init__()

    def method(self, inputs):
        # method implemented in derivedClass2
        pass
Any suggestions are appreciated! Thanks :)

Your question is a little vague and lacks some important details, such as the exact error you're getting and more context about your actual goal. That said, I've noticed a couple of things that might be causing the problems you're having:
BaseClass.__subclasses__ is a method, so you need to call it rather than access it like a class property or attribute. Use BaseClass.__subclasses__() instead.
For your get_subclasses() function to work, you first need to import the subclasses. Otherwise Python won't know which classes inherit from BaseClass.
Corrected code
Here's the correct implementation of the get_subclasses() function, as mentioned above:
from basemodule import BaseClass

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.
Example
Without importing module1 and module2
When I don't import the modules that host the subclasses, get_subclasses() prints nothing: no subclass of BaseClass has been registered yet.
Importing module1 and module2
When I import both modules that host the subclasses, I get the output I think you're expecting: derivedClass1 and derivedClass2.
Full-code
Here's the full code of the examples:
# my_pckg/basemodule.py
from abc import ABCMeta, abstractmethod

class BaseClass(object, metaclass=ABCMeta):
    @abstractmethod
    def method(self, inputs=None):
        pass
# ================================================
# my_pckg/module1.py
from my_pckg.basemodule import BaseClass

class derivedClass1(BaseClass):
    def __init__(self):
        super().__init__()

    def method(self, inputs):
        # method implemented in derivedClass1
        pass
# ================================================
# my_pckg/module2.py
from my_pckg.basemodule import BaseClass

class derivedClass2(BaseClass):
    def __init__(self):
        super().__init__()

    def method(self, inputs):
        # method implemented in derivedClass2
        pass
# ================================================
# my_pckg/test.ipynb
from basemodule import BaseClass
from module1 import *
from module2 import *

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.

get_subclasses()
# Prints:
# derivedClass1
# derivedClass2
Important notes
The imports shown in the example above won't work if you're trying to use them from outside the parent package. Here's the complete tree view of the package structure used in the example:
my_pckg
|____ __init__.py   # <-- Needed to make my_pckg submodules "importable".
|____ basemodule.py # <-- Hosts the BaseClass class.
|____ module1.py    # <-- Hosts the derivedClass1 subclass.
|____ module2.py    # <-- Hosts the derivedClass2 subclass.
|____ Test.ipynb    # <-- Where the test above took place.
If you want to import these modules from outside the package you have two options:
Create a setup script for your package and pip install it (use the -e flag to install it in development mode).
Import sys and add the my_pckg path to the known paths:
import sys
sys.path.insert(0, './my_pckg')

from basemodule import BaseClass
from module1 import *
from module2 import *

def get_subclasses():
    """Get subclasses from `basemodule.BaseClass`."""
    for cls in list(BaseClass.__subclasses__()):
        # call subclasses here
        print(cls.__name__)  # Added print statement to test the solution.

get_subclasses()
# Prints:
# derivedClass1
# derivedClass2
Circular Imports
Do NOT import module1 and module2 inside basemodule, as this leads to a circular import. When Python imports basemodule, it sees that the module needs to import module1 and module2, so it goes to those modules. There it finds that both of them in turn require basemodule, so it goes back to basemodule. This becomes a cycle in which no module is able to finish importing. To avoid it, place the get_subclasses() function in a separate module, alongside all the necessary imports, as in the example above.
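A minimal runnable sketch of that separation. The pieces are collapsed into one snippet for brevity; in the real package, basemodule.py, module1.py, module2.py, and the module holding get_subclasses() would each be separate files, with only the last one importing the others:

```python
# In basemodule.py: the base class knows nothing about its subclasses.
class BaseClass:
    def method(self, inputs=None):
        pass

# In module1.py / module2.py: the subclasses import and extend BaseClass.
class derivedClass1(BaseClass):
    def method(self, inputs=None):
        return "derivedClass1"

class derivedClass2(BaseClass):
    def method(self, inputs=None):
        return "derivedClass2"

# In a separate registry module: import the subclass modules, then query.
def get_subclasses():
    """Return the currently known subclasses of BaseClass."""
    return BaseClass.__subclasses__()

for cls in get_subclasses():
    print(cls.__name__)  # derivedClass1, then derivedClass2
```

Because only the registry module imports the subclass modules, basemodule never participates in a cycle.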


Python type hints, how to avoid cross module hell? [duplicate]

I'm trying to split my huge class into two; well, basically into the "main" class and a mixin with additional functions, like so:
main.py file:
import mymixin.py

class Main(object, MyMixin):
    def func1(self, xxx):
        ...

mymixin.py file:
class MyMixin(object):
    def func2(self: Main, xxx):  # <--- note the type hint
        ...
Now, while this works just fine, the type hint in MyMixin.func2 of course can't work. I can't import main.py, because I'd get a cyclic import and without the hint, my editor (PyCharm) can't tell what self is.
I'm using Python 3.4, but I'm willing to move to 3.5 if a solution is available there.
Is there any way I can split my class into two files and keep all the "connections" so that my IDE still offers me auto-completion and all the other goodies that come from it knowing the types?
There isn't a hugely elegant way to handle import cycles in general, I'm afraid. Your choices are to either redesign your code to remove the cyclic dependency, or if it isn't feasible, do something like this:
# some_file.py
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from main import Main

class MyObject(object):
    def func2(self, some_param: 'Main'):
        ...
The TYPE_CHECKING constant is always False at runtime, so the import won't be evaluated, but mypy (and other type-checking tools) will evaluate the contents of that block.
We also need to make the Main type annotation into a string, effectively forward declaring it since the Main symbol isn't available at runtime.
If you are using Python 3.7+, we can at least skip having to provide an explicit string annotation by taking advantage of PEP 563:
# some_file.py
from __future__ import annotations
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from main import Main

class MyObject(object):
    # Hooray, cleaner annotations!
    def func2(self, some_param: Main):
        ...
The from __future__ import annotations import will make all type hints be strings and skip evaluating them. This can help make our code here mildly more ergonomic.
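A quick way to see the effect of the `__future__` import: annotations are kept as plain strings and never evaluated, so even a name that is undefined at runtime is accepted (func2 and Main mirror the names from the example above):

```python
from __future__ import annotations

def func2(some_param: Main) -> None:  # Main is undefined here, yet this is fine
    ...

# The annotation survives only as a string:
print(func2.__annotations__)  # {'some_param': 'Main', 'return': 'None'}
```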
All that said, using mixins with mypy will likely require a bit more structure than you currently have. Mypy recommends an approach that's basically what deceze is describing: create an ABC that both your Main and MyMixin classes inherit. I wouldn't be surprised if you ended up needing to do something similar in order to make Pycharm's checker happy.
For people struggling with cyclic imports when importing class only for Type checking: you will likely want to use a Forward Reference (PEP 484 - Type Hints):
When a type hint contains names that have not been defined yet, that definition may be expressed as a string literal, to be resolved later.
So instead of:
class Tree:
    def __init__(self, left: Tree, right: Tree):
        self.left = left
        self.right = right
you do:
class Tree:
    def __init__(self, left: 'Tree', right: 'Tree'):
        self.left = left
        self.right = right
The bigger issue is that your types aren't sane to begin with. MyMixin makes a hardcoded assumption that it will be mixed into Main, whereas it could be mixed into any number of other classes, in which case it would probably break. If your mixin is hardcoded to be mixed into one specific class, you may as well write the methods directly into that class instead of separating them out.
To properly do this with sane typing, MyMixin should be coded against an interface, or abstract class in Python parlance:
import abc

class MixinDependencyInterface(abc.ABC):
    @abc.abstractmethod
    def foo(self):
        pass

class MyMixin:
    def func2(self: MixinDependencyInterface, xxx):
        self.foo()  # ← mixin only depends on the interface

class Main(MixinDependencyInterface, MyMixin):
    def foo(self):
        print('bar')
Since Python 3.5, breaking your classes up into separate files is easy.
It's actually possible to use import statements inside of a class ClassName: block in order to import methods into a class. For instance,
class_def.py:
class C:
    from _methods1 import a
    from _methods2 import b

    def x(self):
        return self.a() + " " + self.b()
In my example,
C.a() will be a method which returns the string hello
C.b() will be a method which returns hello goodbye
C.x() will thus return hello hello goodbye.
To implement a and b, do the following:
_methods1.py:
from __future__ import annotations
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from class_def import C
def a(self: C):
return "hello"
Explanation: TYPE_CHECKING is True when the type checker is reading the code. Since the type checker doesn't need to execute the code, circular imports are fine when they occur within the if TYPE_CHECKING: block. The __future__ import enables postponed annotations. This is optional; without it you must quote the type annotations (i.e. def a(self: "C"):).
We define _methods2.py similarly:
from __future__ import annotations
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from class_def import C

def b(self: C):
    return self.a() + " goodbye"
In VS Code, hovering over self.a() shows the type inferred from the annotation. And everything runs as expected:
>>> from class_def import C
>>> c = C()
>>> c.x()
'hello hello goodbye'
Notes on older Python versions
For Python versions ≤3.4, TYPE_CHECKING is not defined, so this solution won't work.
For Python versions ≤3.6, postponed annotations are not defined. As a workaround, omit from __future__ import annotations and quote the type declarations as mentioned above.
Turns out my original attempt was quite close to the solution as well. This is what I'm currently using:
# main.py
import mymixin.py

class Main(object, MyMixin):
    def func1(self, xxx):
        ...

# mymixin.py
if False:
    from main import Main

class MyMixin(object):
    def func2(self: 'Main', xxx):  # <--- note the type hint
        ...
Note the import inside the if False statement, which never actually runs (but the IDE knows about it anyway), and the use of the Main class name as a string, because it's not known at runtime.
Rather than forcing oneself to engage in typing.TYPE_CHECKING shenanigans, there is a simple way to avoid circular type-hints: don't use from imports, and use either from __future__ import annotations or string annotations.
# foo.py
from __future__ import annotations
import bar

class Foo:
    bar: bar.Bar

# bar.py
import foo

class Bar:
    foo: "foo.Foo"
This style of import is lazily evaluated, whereas from foo import Foo forces Python to run the entire foo module to get the final value of Foo right at the import line. It's also quite useful if you need the class at runtime, e.g. if foo.Foo or bar.Bar is used inside a function or method, since those functions and methods are only called once foo.Foo and bar.Bar actually exist.
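The runtime case can be demonstrated directly. Here the two mutually importing files from the sketch above are recreated in a temporary directory purely to make the example self-contained; the point is that bar.Bar is only looked up at call time, when both modules have finished initializing:

```python
import pathlib
import sys
import tempfile
import textwrap

# Recreate foo.py and bar.py (from the sketch above) in a temp directory.
d = pathlib.Path(tempfile.mkdtemp())
(d / "foo.py").write_text(textwrap.dedent("""\
    import bar

    class Foo:
        def make_bar(self):
            # bar.Bar is only looked up here, at call time,
            # when both modules are fully initialized.
            return bar.Bar()
"""))
(d / "bar.py").write_text(textwrap.dedent("""\
    import foo

    class Bar:
        pass
"""))
sys.path.insert(0, str(d))

import foo  # the circular foo <-> bar imports resolve without error

print(type(foo.Foo().make_bar()).__name__)  # Bar
```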
I would advise refactoring your code, as others have suggested.
Here is a circular import error I recently faced:
BEFORE:
# person.py
from spell import Heal, Lightning

class Person:
    def __init__(self):
        self.life = 100

class Jedi(Person):
    def heal(self, other: Person):
        Heal(self, other)

class Sith(Person):
    def lightning(self, other: Person):
        Lightning(self, other)

# spell.py
from person import Person, Jedi, Sith

class Spell:
    def __init__(self, caster: Person, target: Person):
        self.caster: Person = caster
        self.target: Person = target

class Heal(Spell):
    def __init__(self, caster: Jedi, target: Person):
        super().__init__(caster, target)
        target.life += 10

class Lightning(Spell):
    def __init__(self, caster: Sith, target: Person):
        super().__init__(caster, target)
        target.life -= 10
# main.py
from person import Jedi, Sith
Step by step:
# main starts to import person
from person import Jedi, Sith
# main did not reach end of person but ...
# person starts to import spell
from spell import Heal, Lightning
# Remember: main is still importing person
# spell starts to import person
from person import Person, Jedi, Sith
console:
ImportError: cannot import name 'Person' from partially initialized module
'person' (most likely due to a circular import)
A module's body is executed only once per process; every later import reuses the cached (possibly still partially initialized) entry in sys.modules. That is why spell sees a person module that doesn't yet contain Person.
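The "executed only once, then cached" behavior is easy to observe with any stdlib module (json is used here only as a convenient example):

```python
import sys
import json            # first import: the json module body actually runs
import json as again   # second import: just a lookup in the sys.modules cache

# Both names refer to the very same module object.
assert again is json
assert sys.modules["json"] is json
print("the module body ran once; later imports reuse the cache")
```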
AFTER:
# person.py
class Person:
    def __init__(self):
        self.life = 100

# spell.py
from person import Person

class Spell:
    def __init__(self, caster: Person, target: Person):
        self.caster: Person = caster
        self.target: Person = target

# jedi.py
from person import Person
from spell import Spell

class Jedi(Person):
    def heal(self, other: Person):
        Heal(self, other)

class Heal(Spell):
    def __init__(self, caster: Jedi, target: Person):
        super().__init__(caster, target)
        target.life += 10

# sith.py
from person import Person
from spell import Spell

class Sith(Person):
    def lightning(self, other: Person):
        Lightning(self, other)

class Lightning(Spell):
    def __init__(self, caster: Sith, target: Person):
        super().__init__(caster, target)
        target.life -= 10

# main.py
from jedi import Jedi
from sith import Sith

jedi = Jedi()
print(jedi.life)
Sith().lightning(jedi)
print(jedi.life)
order of executed lines:
from jedi import Jedi # start read of jedi.py
from person import Person # start AND finish read of person.py
from spell import Spell # start read of spell.py
from person import Person # start AND finish read of person.py
# finish read of spell.py
# idem for sith.py
console:
100
90
File composition is key
Hope it will help :D
I think the cleanest way is to import all the classes and dependencies in one file (such as __init__.py) and then from __init__ import * in all the other files.
In this case you:
avoid multiple references to those files and classes,
only have to add one line in each of the other files, and
let PyCharm know about all of the classes you might use.
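A minimal sketch of that layout, built in a temporary directory so it can run as-is (the mypkg package, module, and class names are all made up for illustration):

```python
import pathlib
import sys
import tempfile
import textwrap

# Build the suggested layout: a package whose __init__.py imports everything.
pkg = pathlib.Path(tempfile.mkdtemp()) / "mypkg"
pkg.mkdir()
(pkg / "mod_a.py").write_text("class ClassA:\n    pass\n")
(pkg / "mod_b.py").write_text("class ClassB:\n    pass\n")
# __init__.py gathers every class in one place ...
(pkg / "__init__.py").write_text(textwrap.dedent("""\
    from mypkg.mod_a import ClassA
    from mypkg.mod_b import ClassB
"""))
sys.path.insert(0, str(pkg.parent))

# ... so every other file needs only a single import line:
from mypkg import ClassA, ClassB

print(ClassA.__name__, ClassB.__name__)  # ClassA ClassB
```

In practice client code imports from the package itself (from mypkg import ...) rather than from __init__ directly.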

Class with only class methods

I have a class with only class methods. Is this a Pythonic way of namespacing? If not, what is the best way to group similar kinds of methods?
class OnlyClassMethods(object):
    @classmethod
    def method_1(cls):
        pass

    @classmethod
    def method_2(cls):
        pass
A class is meant to have instances, not to serve as a namespace. If your class is never instantiated, it does not serve the intended purpose of a Python class.
If you want to namespace a group of methods which are related, create a new module, that is another .py file, and import it.
Example
Here we create a module named helpers which contains some related methods. This module can then be imported in our main file.
helpers.py
def method_1():
    ...

def method_2():
    ...
main.py
import helpers
helpers.method_1()
helpers.method_2()

How to override a method used by a 3rd party library

This would be the layout
some_function.py
def some_function():
    print("some_function")
some_library.py
from some_function import some_function

class A:
    def xxx(self):
        some_function()
main.py
from some_library import A
from some_function import some_function

def new_some_function():
    print("new_some_function")

if __name__ == '__main__':
    some_function = new_some_function
    a = A()
    a.xxx()
In the class A, the method xxx, calls some_function, so is it possible to override it with something else, without re-implementing the entire class?
I think you are looking for monkey patching (changing classes or modules dynamically at runtime). This way you don't need to overwrite class A and use inheritance as suggested in other comments (you said you don't want that), so try this solution:
import some_library  # import the module like this, or it will not work (because of namespaces)

def new_some_function():
    print("new_some_function")

if __name__ == '__main__':
    # Import and use it like this; other ways of importing will not work,
    # because they bring the name into *your* namespace and change it there,
    # while you need to change it in the namespace where it is looked up.
    some_library.some_function = new_some_function
That replaces the original function, so even other classes will then use the replacement. Be careful: if the original is an instance or class method, you need to create the new function with the proper parameters, like this:
def new_some_function(self):
    # for instance methods; you may add other args, but 'self' is important
    ...

def new_some_function(cls):
    # for class methods; you may add other args, but 'cls' is important
    ...
You provide very little information about your use case here. As one of the comments points out, this might be a case for inheritance. If you are in a testing context, you may not want to use inheritance though, but you might rather want to use a mock-object.
Here is the inheritance version:
from some_library import A

def new_some_function():
    print("new_some_function")

class B(A):
    def xxx(self):
        new_some_function()

if __name__ == '__main__':
    a = B()
    a.xxx()
Note how class B derives from class A through the class B(A) statement. This way class B inherits all functionality from A, and the definition of class B consists only of the parts where B differs from A. In your example, that is the fact that the xxx method should call new_some_function instead of some_function.
Here is the mock version:
from unittest import mock
from some_library import A

def new_some_function():
    print("new_some_function")

if __name__ == '__main__':
    with mock.patch('some_library.some_function') as mock_some_function:
        mock_some_function.side_effect = new_some_function
        a = A()
        a.xxx()
As mentioned above this approach is mostly useful if you are in a testing context and if some_function does something costly and/or unpredictable. In order to test code that involves a call to some_function, you may temporarily want to replace some_function by something else, that is cheap to call and behaves in a predictable way. In fact, for this scenario, replacing some_function by new_some_function might even be more than what is actually needed. Maybe, you just want an empty hull that can be called and that always returns the same value (instead of the side_effect line, you can specify a constant .return_value in the above code example). One of the key functionalities of mock objects is that you can later check if that function has been called. If testing is your use case, I would very much recommend looking at the documentation of the python mock module.
Note that the example uses the mock.patch context manager. This means that within the managed context (i.e. the block inside the with-statement) some_library.some_function is replaced by a mock object, but once you leave the managed context, the original functionality is put back in place.
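Both points (the patch being confined to the with-block, and a constant .return_value sufficing when no custom behavior is needed) can be demonstrated with a toy patch of os.getcwd, standing in here for some_library.some_function:

```python
import os
from unittest import mock

real_cwd = os.getcwd()

with mock.patch("os.getcwd") as mock_getcwd:
    mock_getcwd.return_value = "/fake/dir"  # constant result, no side_effect needed
    assert os.getcwd() == "/fake/dir"
    assert mock_getcwd.called               # we can verify the call happened

# Outside the managed context the original function is restored.
assert os.getcwd() == real_cwd
print("patch was active only inside the with-block")
```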
You may just create another class and override the method you need.
Just as an example:
class myInt(int):
    def __pow__(self, x):
        return 0

a = myInt(10)
a + 10  # 20
a ** 2  # 0
In this case a is an int and has access to all the method of the int class, but will use the __pow__ method I've defined.
What you need is inheritance. You can subclass a class, and via super() the child inherits all of the parent class's functionality. If you want to override a parent class method, you just provide a different implementation under the same name in the child class.
from some_library import A
from some_function import some_function

def new_some_function():
    print("new_some_function")

class B(A):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

    def xxx(self):
        new_some_function()

if __name__ == '__main__':
    a = B()
    a.xxx()
Output:
new_some_function
Your syntax may differ depending on the Python version.
In python3
class B(A):
    def __init__(self):
        super().__init__()
In Python 2,
class B(A):
    def __init__(self):
        super(B, self).__init__()

Python mockito - Mocking a class which is being instantiated from the testable function

I am a bit lost while writing the test case for the UserCompanyRateLimitValidation class. I am having difficulty mocking the class that is instantiated from inside this class.
class UserCompanyRateLimitValidation:
    def __init__(self, user_public_key):
        self.adapter = UserAdapter(user_public_key)
        container = self.adapter.get_user_company_rate_limit()
        super(UserCompanyRateLimitValidation, self).__init__(
            container, UserCompanyRateLimitValidation.TYPE)
I have to test this class. I have written test case something like this. I have tried to mock the UserAdapter class but I am not able to do so completely.
def test_case_1(self):
    self.user_public_key = 'TEST_USER_PUBLIC_KEY_XXXXXX1234567890XXXXX'
    UserAdapter_mock = mock(UserAdapter)
    when(UserAdapter_mock).get_user_company_rate_limit().\
        thenReturn(get_fake_container_object())
    self.test_obj = UserCompanyRateLimitValidation(self.user_public_key)
As you can see, I have mocked the get_user_company_rate_limit() call from the function under test, container = self.adapter.get_user_company_rate_limit(), but I am still not able to figure out how to mock this call:
self.adapter = UserAdapter(user_public_key)
It is quite simple if you know the trick.
Creating an object in Python is very much like a function call to the class object. UserCompanyRateLimitValidation is 'invoking' UserAdapter(user_public_key). You want to stub the return value of that 'call' to return UserAdapter_mock.
You can stub this like you would stub a function in a module. The line you're missing is:
when(module_declaring_UserAdapter)\
    .UserAdapter(self.user_public_key)\
    .thenReturn(UserAdapter_mock)
After that, calling module_declaring_UserAdapter.UserAdapter(self.user_public_key) will return UserAdapter_mock.
Here's the link to the section in the manual: https://code.google.com/p/mockito-python/wiki/Stubbing#Modules
You have to be careful to choose the right module_declaring_UserAdapter, due to the way the from ... import ... statement works. From your code, I'd say you have to pick the module in which UserCompanyRateLimitValidation is declared.
Here is another way of looking at it. Say I have this code in which I would like to mock MyClass:
from some.module import MyClass

class AnotherClass:
    def __init__(self):
        self.my_class = MyClass()
One would typically write the imports as shown above. With a slight modification of the import, we can get it into a state where MyClass can be mocked using mockito:
from some import module

class AnotherClass:
    def __init__(self):
        self.my_class = module.MyClass()
Then the mocking would work like so:
from some import module

when(module).MyClass().thenReturn(mock())

Python Class inherit from all submodules

I'm currently writing a wrapper in python for a lot of custom company tools.
I'm basically going to break each tool into its own py file with a class containing the call to the tool as a method.
These will all be contained in a package.
Then there'll be a master class that will import all from the package, then inherit from each and every class, so as to appear as one cohesive class.
masterClass.py
pyPackage
- __init__.py
- module1.py
-- class Module1
--- method tool1
- module2.py
-- class Module2
--- method tool2
etc
Right now, I'm autogenerating the master class file to inherit from the package's modules, but I was wondering if there was a more elegant way to do it?
ie
from package import *

class MasterClass(package.all):
    pass
I am not really sure what your reasons are for trying to create a larger master class with inheritance from all of your other smaller classes, but the first thing that comes to mind is that the design might be backwards.
What you might want to do instead is have a base class, for which all your command modules can subclass. You can use packages as intelligent namespaces for collecting similar functionality such as "network" or "compression", etc.
class ToolBase(object):
    # common functionality here
    # class attributes
    # base constructor
    # methods: static, class, instance
    pass

class Module1(ToolBase):
    def __init__(self):
        super(Module1, self).__init__()

class Module2(ToolBase):
    def __init__(self):
        super(Module2, self).__init__()
In this base class example, every subclass can expect the functionality of ToolBase to be there, including any setup from the constructor, such as a database connection, sockets, or other resources.
And maybe a project structure like this:
pyPackage
    __init__.py
        # class ToolBase
        # __all__ = ['network', 'compress']
    network/
        __init__.py
        module1.py
            # class Module1
    compress/
        __init__.py
        module2.py
            # class Module2
Update
As a way to use the base class and have a shared "port" object, you could make it a class level attribute that is only initialized once:
class ToolBase(object):
    _PORT = None

    def __init__(self):
        if self._PORT is None:
            ToolBase._PORT = "PORT"  # assign on the class so all instances share it

    @property
    def port(self):
        return self._PORT

class Foo(ToolBase):
    def go(self):
        print(self.port)
Now your classes are useful on their own, and will share the port object.
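Note that assigning self._PORT = ... inside __init__ would create a per-instance attribute instead of updating the shared class attribute; assigning on the class keeps the value genuinely shared across all instances and subclasses. A self-contained check:

```python
class ToolBase(object):
    _PORT = None

    def __init__(self):
        if ToolBase._PORT is None:
            ToolBase._PORT = "PORT"  # assign on the class, not the instance

    @property
    def port(self):
        return self._PORT

class Foo(ToolBase):
    pass

class Bar(ToolBase):
    pass

a, b = Foo(), Bar()
assert a.port is b.port  # one shared object across all subclasses
print(a.port, b.port)  # PORT PORT
```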
