Dynamically load module with Inheritance - python

I know that there are several posts on this topic, but for whatever reason I can't get my head around it, or at least implement it. Below is some sample code of what I am trying to do.
Base class:
class Animal(object):
    def __init__(self, age):
        self._age = age

    def getAge(self):
        return self._age

    def speak(self):
        raise NotImplementedError()

    def speak_twice(self):
        self.speak()
        self.speak()
Subclass:
from Animal import Animal

class Dog(Animal):
    def speak(self):
        print "woff!"
Test code:
mod = __import__("Dog")
spot = mod(5)
After running the test code I get this error:
Traceback (most recent call last):
  File "C:~test.py", line 2, in <module>
    spot = mod(5)
TypeError: 'module' object is not callable
So basically my question is how do I load modules dynamically and initialize them correctly?
EDIT:
I will not know the subclass until runtime

You have to import the module itself, then get its class member. You can't just import the class. Assuming your subclass is in a file accessible from the pythonpath as 'animal':
mod = __import__('animal')
spot = mod.Dog(5)
When you import a module, the interpreter first looks to see if a module with that name already exists in sys.modules. If it fails to find it there, it searches the pythonpath for a package or module matching the given name. If and when it finds one, it parses the code therein, builds a module object out of it, places it in sys.modules, and returns the module object to the calling scope, where it is bound to the name it was imported with in the given namespace. All the items (classes, variables, functions) defined at module scope -- that is, not nested inside something else in the code -- are then available as members of that module object.
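The lookup sequence described above can be exercised directly with importlib, the modern spelling of __import__; the stdlib json module is used here purely as a stand-in for the animal module:

```python
import importlib
import sys

# Importing a module caches it in sys.modules; a second import
# returns the same module object without re-executing the file.
mod = importlib.import_module("json")
assert "json" in sys.modules
assert importlib.import_module("json") is mod

# Classes defined at module scope are plain attributes of the
# module object, so they can be fetched dynamically with getattr.
decoder_cls = getattr(mod, "JSONDecoder")
assert decoder_cls().decode('{"age": 5}') == {"age": 5}
```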
Edit:
In response to your comment, the real problem is that you are trying to look up an attribute of the module dynamically, not that you are trying to import anything dynamically. The most direct way to do that would be:
import sub_animal
getattr(sub_animal, 'Dog')
However, if you are trying to dynamically determine the class to initialize based upon some conditions, you probably want to read up on the factory pattern, and possibly decorators or even metaclasses, so that you can dynamically add subclasses automatically to the factory.
class AnimalFactory(type):
    animal_classes = {}

    def __new__(cls, name, bases, attrs):
        new_class = super(AnimalFactory, cls).__new__(cls, name, bases, attrs)
        AnimalFactory.animal_classes[name] = new_class
        return new_class

    @classmethod
    def build(cls, name, *args, **kwargs):
        try:
            klass = cls.animal_classes[name]
        except KeyError:
            raise ValueError('No known animal %s' % name)
        return klass(*args, **kwargs)
class Animal(object):
    __metaclass__ = AnimalFactory

    def __init__(self, age):
        self.age = age

    def speak(self):
        raise NotImplementedError()

# As long as the file it is implemented in is imported at some point,
# the following can be anywhere
class Dog(Animal):
    def speak(self):
        return 'woof'

# And then to use, again, anywhere
new_animal = AnimalFactory.build('Dog', 5)
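For Python 3, where the __metaclass__ attribute is ignored, the same registry idea can be sketched with __init_subclass__ (available since 3.6) instead of a metaclass; the class names below just mirror the example above:

```python
class Animal:
    registry = {}

    def __init_subclass__(cls, **kwargs):
        # Called automatically whenever Animal is subclassed,
        # so every subclass registers itself by name.
        super().__init_subclass__(**kwargs)
        Animal.registry[cls.__name__] = cls

    def __init__(self, age):
        self.age = age

    @classmethod
    def build(cls, name, *args, **kwargs):
        try:
            klass = cls.registry[name]
        except KeyError:
            raise ValueError('No known animal %s' % name)
        return klass(*args, **kwargs)

class Dog(Animal):
    def speak(self):
        return 'woof'

print(Animal.build('Dog', 5).speak())  # woof
```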

Related

What is the correct way to call super in dynamically added methods?

I defined a metaclass which adds a method named "test" to the created classes:
class FooMeta(type):
    def __new__(mcls, name, bases, attrs):
        def test(self):
            return super().test()
        attrs["test"] = test
        cls = type.__new__(mcls, name, bases, attrs)
        return cls
Then I create two classes using this Metaclass
class A(metaclass=FooMeta):
    pass

class B(A):
    pass
When I run
a = A()
a.test()
a TypeError is raised at super().test():
super(type, obj): obj must be an instance or subtype of type
Which means super() cannot infer the parent class correctly. If I change the super call into
def __new__(mcls, name, bases, attrs):
    def test(self):
        return super(cls, self).test()
    attrs["test"] = test
    cls = type.__new__(mcls, name, bases, attrs)
    return cls
then the raised error becomes:
AttributeError: 'super' object has no attribute 'test'
which is expected, as the parent of A does not implement a test method.
So my question is: what is the correct way to call super() in a dynamically added method? Should I always write super(cls, self) in this case? If so, it is too ugly (for Python 3)!
Parameterless super() is very special in Python because it triggers some behavior at code-compilation time itself: Python creates an invisible __class__ variable which is a reference to the "physical" class statement body where the super() call is embedded (the same happens if one makes direct use of the __class__ variable inside a class method).
In this case, the "physical" class where super() is called is the metaclass FooMeta itself, not the class it is creating.
The workaround for that is to use the version of super which takes 2 positional arguments: the class in which it will search the immediate superclass, and the instance itself.
In Python 2, and on other occasions where one prefers the parameterized form of super, it is normal to use the class name itself as the first parameter: at runtime, this name will be available as a global variable in the current module. That is, if class A were statically coded in the source file, with a def test(...): method, you would use super(A, self).test(...) inside its body.
However, the class name won't be available as a variable in the module defining the metaclass, yet you still need to pass a reference to the class as the first argument to super. Since the test method receives self as a reference to the instance, its class is given by either self.__class__ or type(self).
TL;DR: just change the super call in your dynamic method to read:
class FooMeta(type):
    def __new__(mcls, name, bases, attrs):
        def test(self):
            return super(type(self), self).test()
        attrs["test"] = test
        cls = type.__new__(mcls, name, bases, attrs)
        return cls
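A minimal demonstration of the fixed metaclass (the Base class and its return value are illustrative). One caveat: super(type(self), self) is only safe while the generated method is not inherited further, because on a subclass instance type(self) would restart the MRO search from the subclass and recurse:

```python
class FooMeta(type):
    def __new__(mcls, name, bases, attrs):
        def test(self):
            # type(self) stands in for the __class__ cell that a
            # statically defined method would have captured
            return super(type(self), self).test()
        attrs["test"] = test
        cls = type.__new__(mcls, name, bases, attrs)
        return cls

class Base:
    def test(self):
        return "base"

class A(Base, metaclass=FooMeta):
    pass

print(A().test())  # base
```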

werkzeug's LocalProxy.__local, where is it initialized?

I am curious about how LocalProxy from the werkzeug package works. Specifically, where is the __local field initialized?
@implements_bool
class LocalProxy(object):
    __slots__ = ("__local", "__dict__", "__name__", "__wrapped__")

    def __init__(self, local, name=None):
        object.__setattr__(self, "_LocalProxy__local", local)
        object.__setattr__(self, "__name__", name)
        if callable(local) and not hasattr(local, "__release_local__"):
            object.__setattr__(self, "__wrapped__", local)

    def _get_current_object(self):
        if not hasattr(self.__local, "__release_local__"):
            return self.__local()
        try:
            return getattr(self.__local, self.__name__)
        except AttributeError:
            raise RuntimeError("no object bound to %s" % self.__name__)
    ...
There are no other places in LocalProxy class definition that reference self.__local, and it seems to me that self.__local is not initialized anywhere. Is it somehow magically aliased to self._LocalProxy__local?
What you call aliasing is called name mangling in Python.
Given this example:
class Customer:
    def __init__(self, name):
        self.__name = name

The attribute is then available on instances as _Customer__name: inside a class body, the compiler rewrites any __name identifier to _ClassName__name.
While this is rarely needed, the intention is to make access, especially accidental access from subclasses, a bit harder.
The other interesting part of your code example is __slots__.
This is a way to declare the complete set of instance attributes up front. It reduces memory usage and prevents new attributes from being defined dynamically.
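A short sketch tying both points back to the question (the Customer class is illustrative): identifiers and __slots__ entries are mangled by the compiler, but plain strings are not, which is why LocalProxy must spell out "_LocalProxy__local" when calling object.__setattr__:

```python
class Customer:
    # names in __slots__ are mangled too, so this slot is really
    # "_Customer__name" -- exactly like "__local" in LocalProxy
    __slots__ = ("__name",)

    def __init__(self, name):
        self.__name = name  # compiled as self._Customer__name

c = Customer("Alice")
assert c._Customer__name == "Alice"

# String arguments are NOT mangled, which is why LocalProxy.__init__
# must write out the full mangled name when calling object.__setattr__.
setattr(c, "_Customer__name", "Bob")
assert c._Customer__name == "Bob"
```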

Get decorated class from its name in the decorator?

I decorated some methods with @bot_thinking, which stores some information about the decorated method in the functions dictionary.
One piece of information is 'class_name', but my program needs the class itself as a variable, e.g. RandomBot. I would like to get this class.
Here is some sample code:
class DepthPrunedMinimaxAgent(Agent):
    @bot_thinking(associated_name="minimax profondeur")
    def select_move(self, game_state: GameState):
        ...
Above is the decorated part of the code.
The decorator:
functions = {}

def bot_thinking(associated_name, active=True):
    def _(func):
        if active:
            class_name = func.__qualname__.rsplit('.')[-2]
            import sys
            # class_name_2 = getattr(sys.modules[__name__], class_name)
            # module = importlib.import_module('sources.agent')
            functions[associated_name] = (associated_name, class_name,
                                          globals()[class_name], func)
        else:
            functions.pop(associated_name)
    return _
bot_thinking isn't a real decorator; it's a decorator factory.
From the func function I get the class_name, but I can't use the accepted answer by @m.kocikowski to find the correct class, because that class is decorated, so it already imports the annotation module. Importing the annotated module from the annotation module would therefore create a cyclic import, which Python does not permit.
Do you see a method to get the class from its name?
PS: to be clearer, the annotation part of the code needs to import the annotated classes (to retrieve the class from its name), while those classes also need to import the annotation module (for the annotation to work).
You can do what you want if you use a descriptor class, rather than a function, as the decorator, at least if you're using Python 3.6 or newer. That's because there's a new method added to the descriptor protocol, __set_name__. It gets called when the descriptor object is saved as a class variable. While most descriptors will use it to record the name they're being saved as, you can use it to get the class you're in.
You do need to make your decorator object wrap the real function (implementing the calling and descriptor-lookup methods), rather than returning the unmodified function you were decorating. Here's my attempt at a quick and dirty implementation. I don't really understand what you're doing with functions, so I may not have put the right data in it, but it should be close enough to get the idea across (owner is the class the method is stored in).
functions = {}

def bot_thinking(associated_name, active=True):
    class decorator:
        def __init__(self, func):
            self.func = func

        def __set_name__(self, owner, name):
            if active:
                functions[associated_name] = (associated_name, owner.__name__,
                                              owner, self.func)
            else:
                functions.pop(associated_name)

        def __get__(self, obj, owner):
            return self.func.__get__(obj, owner)

        def __call__(self, *args, **kwargs):
            return self.func(*args, **kwargs)
    return decorator
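For completeness, here is a condensed, self-contained sketch of the descriptor-based approach in use; the "moved" return value exists only to make the behavior observable:

```python
functions = {}

def bot_thinking(associated_name, active=True):
    class decorator:
        def __init__(self, func):
            self.func = func

        def __set_name__(self, owner, name):
            # owner is the class being created; no import of the
            # class's own module is needed to get at it
            if active:
                functions[associated_name] = (associated_name,
                                              owner.__name__, owner, self.func)

        def __get__(self, obj, owner):
            return self.func.__get__(obj, owner)
    return decorator

class Agent:
    pass

class DepthPrunedMinimaxAgent(Agent):
    @bot_thinking(associated_name="minimax profondeur")
    def select_move(self, game_state):
        return "moved"

# __set_name__ ran at class-creation time, so the class object itself
# is already in the registry:
name, cls_name, cls, func = functions["minimax profondeur"]
assert cls is DepthPrunedMinimaxAgent
assert DepthPrunedMinimaxAgent().select_move(None) == "moved"
```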
The problem is the class hasn't been defined yet when the bot_thinking() decorator factory (and decorator itself) are executing. The only workaround I can think of would be to patch things up after the class is defined, as illustrated below:
from pprint import pprint, pformat

functions = {}

def bot_thinking(associated_name, active=True):
    def _(func):
        if active:
            class_name = func.__qualname__.split(".")[-2]
            # Store the class *name* for now; it is patched to the real
            # class object after the class statement finishes executing.
            functions[associated_name] = (associated_name, class_name, class_name, func)
        else:
            functions.pop(associated_name, None)
        return func  # Decorators must return a callable.
    return _

class Agent: pass
class GameState: pass

class DepthPrunedMinimaxAgent(Agent):
    @bot_thinking(associated_name="minimax profondeur")
    def select_move(self, game_state: GameState):
        pass

# After the class is defined, update the data in the functions dictionary.
for associated_name, info in functions.items():
    functions[associated_name] = (info[0], info[1], globals()[info[2]], info[3])

pprint(functions)
Output:
{'minimax profondeur': ('minimax profondeur',
                        'DepthPrunedMinimaxAgent',
                        <class '__main__.DepthPrunedMinimaxAgent'>,
                        <function DepthPrunedMinimaxAgent.select_move at 0x00F158A0>)}

The best way to register all derived Python classes implementing one

The following code tries to solve the question asked, but the pattern it presents is not very clean.
class Command(object):
    __COMMANDS = {}

    class __metaclass__(type):
        def __init__(cls, name, parents, dct):
            for parent in parents:
                if hasattr(parent, '_Command__COMMANDS'):
                    getattr(parent, '_Command__COMMANDS')[cls.NAME] = cls
            type.__init__(cls, name, parents, dct)

    @classmethod
    def find(cls, command_name):
        """Returns the Command implementation for a specific command name."""
        return cls.__COMMANDS[command_name]

class Foo(Command):
    NAME = 'foo'
Because derived classes also use the same __metaclass__ as the parent class, this pattern can be used to register all derived classes by testing whether the parent class has the attribute _Command__COMMANDS.
This pattern may draw a few objections, such as:
1) Command itself also uses the metaclass, but because its parent is type, which has no _Command__COMMANDS attribute, it works fine.
2) Testing for the attribute leaves dirty code. Functions like type or isinstance cannot be used here, although they would be clearer than the pattern used.
Does somebody have a good recommendation to improve this?
You can simply ask a class for a list of all its subclasses, by using the class.__subclasses__() method:
>>> class Command(object):
...     pass
...
>>> class Foo(Command):
...     NAME = 'foo'
...
>>> Command.__subclasses__()
[<class '__main__.Foo'>]
>>> Command.__subclasses__()[0].NAME
'foo'
You can use this method to implement your find() class method:
@classmethod
def find(cls, command_name):
    """Returns the Command implementation for a specific command name."""
    try:
        return next(c for c in cls.__subclasses__() if c.NAME == command_name)
    except StopIteration:
        raise KeyError(command_name)
or, if you expect to only call find() after all command subclasses have been imported, you can cache the results in a weakref.WeakValueDictionary() object (to avoid circular reference issues):
from weakref import WeakValueDictionary

class Command(object):
    @classmethod
    def find(cls, command_name):
        """Returns the Command implementation for a specific command name."""
        try:
            mapping = cls.__COMMANDS
        except AttributeError:
            mapping = cls.__COMMANDS = WeakValueDictionary({
                c.NAME: c for c in cls.__subclasses__()})
        return mapping[command_name]
You can always clear the cache again by explicitly deleting the Command._Command__COMMANDS class attribute.
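One caveat worth noting: __subclasses__() returns only direct subclasses, so if command classes may subclass each other, the lookup needs to recurse. A minimal sketch (the Foo and Bar classes are illustrative):

```python
class Command(object):
    pass

class Foo(Command):
    NAME = 'foo'

class Bar(Foo):  # grandchild of Command
    NAME = 'bar'

def all_subclasses(cls):
    # __subclasses__() lists only direct children, so walk recursively
    for sub in cls.__subclasses__():
        yield sub
        yield from all_subclasses(sub)

names = {c.NAME for c in all_subclasses(Command)}
assert names == {'foo', 'bar'}
```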

Import modules in each other class in python using metaclass

I need to create a business query model that involves a circular dependency. I am using a design that looks like Django models to implement it:
# Module a.py
import b

class A:
    b_obj = B()
    a_property_1 = ObjectAttribute(b_obj.b_property_1)  # a_property_1 depends on b_property_1
    a_property_2 = ObjectAttribute(b_obj.b_property_2)

# Module b.py
import a

class B:
    a_obj = A()
    b_property_1 = ObjectAttribute(a_obj.a_property_1)
    b_property_2 = ObjectAttribute(a_obj.a_property_2)
When I execute the above program, it throws an error, name 'B' is not defined, on executing a.py, and vice versa.
After that, I did a bit of research and found out that Django models already implement something like this via ForeignKey:
https://docs.djangoproject.com/en/dev/ref/models/fields/#foreignkey
All I need is to implement my own ForeignKey; can someone please help me understand the logic and write the code in the format below?
# Module a.py
import b

class A:
    b_obj = MyForeignKey('B')
    a_property_1 = ObjectAttribute(b_obj.b_property_1)  # a_property_1 depends on b_property_1
    a_property_2 = ObjectAttribute(b_obj.b_property_2)

# Module b.py
import a

class B:
    a_obj = MyForeignKey('A')
    b_property_1 = ObjectAttribute(a_obj.a_property_1)
    b_property_2 = ObjectAttribute(a_obj.a_property_2)
There are some ways to do that. One would be to make your foreign key a proxy class for the actual class: on instantiation it just records the target model's name; on the first subsequent attribute access it instantiates the proxied-to class and keeps its reference; and further attribute accesses are simply redirected to the underlying object.
One mechanism that allows such hooks to be executed on attribute fetch (remembering that in Python a class "method" is just a callable attribute, so this works for methods as well) is to implement the __getattribute__ method.
Let's suppose you have a "models" module (or some other kind of registry) where all your models are referenced after creation -- your code could look more or less like this:
import models

class MyForeignKey(object):
    def __init__(self, model_name, *args, **kw):
        self._model_name = model_name
        self._args = args
        self._kw = kw

    def _instantiate(self):
        self._object = getattr(models, self._model_name)(*self._args, **self._kw)

    def __getattribute__(self, attr):
        if attr in ("_model_name", "_args", "_kw", "_object", "_instantiate"):
            return object.__getattribute__(self, attr)
        if not hasattr(self, "_object"):
            self._instantiate()
        return getattr(self._object, attr)

    def __setattr__(self, attr, value):
        if attr in ("_model_name", "_args", "_kw", "_object"):
            return object.__setattr__(self, attr, value)
        if not hasattr(self, "_object"):
            self._instantiate()
        return setattr(self._object, attr, value)
Note that (a) your models have to inherit from "object", like I commented in the question, and (b) this is not complete if you implement "dunder" methods (Python double-underscore methods) to override behavior on any of the models; in that case, you have to set the appropriate dunder methods to do the proxying as well.
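Here is a self-contained sketch of the same lazy-proxy idea, with a types.SimpleNamespace standing in for the hypothetical models registry module (all class names are illustrative):

```python
import types

# Stand-in for the "models" registry module from the answer above.
models = types.SimpleNamespace()

class MyForeignKey(object):
    def __init__(self, model_name, *args, **kw):
        self._model_name = model_name
        self._args = args
        self._kw = kw

    def _instantiate(self):
        # Resolve the name in the registry only when first needed.
        self._object = getattr(models, self._model_name)(*self._args, **self._kw)

    def __getattribute__(self, attr):
        if attr in ("_model_name", "_args", "_kw", "_object", "_instantiate"):
            return object.__getattribute__(self, attr)
        if not hasattr(self, "_object"):
            self._instantiate()
        return getattr(self._object, attr)

class A(object):
    b_obj = MyForeignKey("B")  # B need not exist yet at this point

class B(object):
    greeting = "hello from B"

models.B = B  # register after both classes exist

# The first attribute access instantiates B lazily and delegates:
assert A.b_obj.greeting == "hello from B"
```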
