I need to create a business query model in which two classes depend on each other circularly. I am using a design that looks like Django models to implement it:
# Module a.py
import b

class A:
    b_obj = B()
    a_property_1 = ObjectAttribute(b_obj.b_property_1)  # a_property_1 is dependent on b_property_1
    a_property_2 = ObjectAttribute(b_obj.b_property_2)
# Module b.py
import a

class B:
    a_obj = A()
    b_property_1 = ObjectAttribute(a_obj.a_property_1)
    b_property_2 = ObjectAttribute(a_obj.a_property_2)
When I execute the above program, it throws the error name 'B' is not defined when executing a.py, and vice versa.
After that, I did a bit of research to figure this out and found that Django models already implement something like this via ForeignKey:
https://docs.djangoproject.com/en/dev/ref/models/fields/#foreignkey
All I need is to implement my own ForeignKey module. Can someone please help me understand the logic and write the code in the format below?
# Module a.py
import b

class A:
    b_obj = MyForeignKey('B')
    a_property_1 = ObjectAttribute(b_obj.b_property_1)  # a_property_1 is dependent on b_property_1
    a_property_2 = ObjectAttribute(b_obj.b_property_2)
# Module b.py
import a

class B:
    a_obj = MyForeignKey('A')
    b_property_1 = ObjectAttribute(a_obj.a_property_1)
    b_property_2 = ObjectAttribute(a_obj.a_property_2)
There are some ways to do that. One would be for your foreign key to act as a proxy class to the actual class: on instantiation it just records the target model's name, on the first subsequent attribute access it instantiates the proxied-to class and keeps a reference to it, and subsequent attribute accesses are redirected to the underlying object.
One mechanism that allows such hooks to be executed on attribute fetch (remembering that in Python a class "method" is just a callable attribute, so it works for methods as well) is to implement the __getattribute__ method.
Let's suppose you have a "models" module (or other kind of registry) where all your models are registered after creation -- your code could look more or less like this:
import models

class MyForeignKey(object):
    def __init__(self, model_name, *args, **kw):
        self._model_name = model_name
        self._args = args
        self._kw = kw

    def _instantiate(self):
        self._object = getattr(models, self._model_name)(*self._args, **self._kw)

    def __getattribute__(self, attr):
        if attr in ("_model_name", "_args", "_kw", "_object", "_instantiate"):
            return object.__getattribute__(self, attr)
        if not hasattr(self, "_object"):
            self._instantiate()
        return getattr(self._object, attr)

    def __setattr__(self, attr, value):
        if attr in ("_model_name", "_args", "_kw", "_object"):
            return object.__setattr__(self, attr, value)
        if not hasattr(self, "_object"):
            self._instantiate()
        return setattr(self._object, attr, value)
Note that (a) your models have to inherit from "object", as I commented in the question, and (b) this is not complete if you implement "dunder" methods (Python double-underscore methods) to override behavior on any of the models -- in that case, you have to set the appropriate dunder methods on the proxy to do the proxying as well.
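For illustration, here is a minimal, self-contained sketch of the lazy-proxy idea; the SimpleNamespace registry and the model B below are hypothetical stand-ins for your real "models" module, and __setattr__ is omitted for brevity:

```python
import types

# Stand-in registry; in the real code this would be your "models" module.
models = types.SimpleNamespace()

class MyForeignKey(object):
    """Lazy proxy: stores the model name, instantiates on first access."""
    def __init__(self, model_name, *args, **kw):
        self._model_name = model_name
        self._args = args
        self._kw = kw

    def _instantiate(self):
        self._object = getattr(models, self._model_name)(*self._args, **self._kw)

    def __getattribute__(self, attr):
        # Internal attributes are served directly, everything else is proxied.
        if attr in ("_model_name", "_args", "_kw", "_object", "_instantiate"):
            return object.__getattribute__(self, attr)
        if not hasattr(self, "_object"):
            self._instantiate()
        return getattr(self._object, attr)

class B(object):
    b_property_1 = 42

models.B = B  # register the model once it exists

key = MyForeignKey("B")   # no B instance is created yet
print(key.b_property_1)   # first attribute access instantiates B; prints 42
```

Because the proxy only resolves the name at first access, both modules can declare their keys before the other class exists.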
I read that it is considered bad practice to create a variable in the class namespace and then change its value in the class constructor.
(One of my sources: SoftwareEngineering SE: Is it a good practice to declare instance variables as None in a class in Python.)
Consider the following code:
# lib.py
class mixin:
    def __init_subclass__(cls, **kwargs):
        cls.check_mixin_subclass_validity(cls)
        super().__init_subclass__(**kwargs)

    def check_mixin_subclass_validity(subclass):
        assert hasattr(subclass, 'necessary_var'), \
            'Missing necessary_var'

    def method_used_by_subclass(self):
        return self.necessary_var * 3.14

# app.py
class my_subclass(mixin):
    necessary_var = None

    def __init__(self, some_value):
        self.necessary_var = some_value

    def run(self):
        # DO SOME STUFF
        self.necessary_var = self.method_used_by_subclass()
        # DO OTHER STUFF
To force its subclasses to declare the variable necessary_var, the mixin class validates them in the __init_subclass__ hook.
And the only way I know to make this work on the app.py side is to initialize necessary_var as a class variable.
Am I missing something, or is this the only way to do so?
Short answer
You should check that attributes and methods exist at instantiation of a class, not before. This is what the abc module does and it has good reasons to work like this.
Long answer
First, I would like to point out that it seems what you want to check is that an instance attribute exists.
Due to Python's dynamic nature, it is not possible to do so before an instance is created, that is, after the call to __init__. We could define Mixin.__init__, but we would then have to rely on the users of your API to have perfect hygiene and always call super().__init__.
One option is thus to create a metaclass and add a check in its __call__ method.
class MetaMixin(type):
    def __call__(self, *args, **kwargs):
        instance = super().__call__(*args, **kwargs)
        assert hasattr(instance, 'necessary_var')
        return instance

class Mixin(metaclass=MetaMixin):
    pass

class Foo(Mixin):
    def __init__(self):
        self.necessary_var = ...

Foo()  # Works fine

class Bar(Mixin):
    pass

Bar()  # AssertionError
To convince yourself that it is good practice to do this at instantiation, we can look toward the abc module which uses this behaviour.
from abc import abstractmethod, ABC

class AbstractMixin(ABC):
    @abstractmethod
    def foo(self):
        ...

class Foo(AbstractMixin):
    pass

# Right now, everything is still all good
Foo()  # TypeError: Can't instantiate abstract class Foo with abstract methods foo
As you can see, the TypeError was raised at instantiation of Foo() and not at class creation.
But why does it behave like this?
The reason for that is that not every class will be instantiated. Consider the example where we want to inherit from Mixin to create a new mixin which checks for some more attributes.
class Mixin:
    def __init_subclass__(cls, **kwargs):
        assert hasattr(cls, 'necessary_var')
        super().__init_subclass__(**kwargs)

class MoreMixin(Mixin):
    def __init_subclass__(cls, **kwargs):
        assert hasattr(cls, 'other_necessary_var')
        super().__init_subclass__(**kwargs)

# AssertionError was raised at that point

class Foo(MoreMixin):
    necessary_var = ...
    other_necessary_var = ...
As you see, the AssertionError was raised at the creation of the MoreMixin class. This is clearly not the desired behaviour, since the Foo class is actually correctly built, and that is what our mixin was supposed to check.
In conclusion, checking for the existence of some attribute or method should be done at instantiation. Otherwise, you are preventing a whole lot of helpful inheritance techniques. This is why the abc module does it like that, and this is why we should too.
The following code tries to solve the question asked, but the pattern presented does so in a not very clean way.
class Command(object):
    __COMMANDS = {}

    class __metaclass__(type):
        def __init__(cls, name, parents, dct):
            for parent in parents:
                if hasattr(parent, '_Command__COMMANDS'):
                    getattr(parent, '_Command__COMMANDS')[cls.NAME] = cls
            type.__init__(cls, name, parents, dct)

    @classmethod
    def find(cls, command_name):
        """Returns the Command implementation for a specific command name."""
        return cls.__COMMANDS[command_name]

class Foo(Command):
    NAME = 'foo'
Because derived classes also use the same __metaclass__ as the parent class, this pattern can be used to register all derived classes by testing whether the parent class has the attribute _Command__COMMANDS.
This pattern could draw a few objections from other people, such as:
1) The Command class itself also uses the metaclass, but because its parent (object) does not have the _Command__COMMANDS attribute, it works fine.
2) Testing for the attribute leaves dirty code. Checks using functions like type or isinstance are not possible here, but they would be clearer than the pattern used.
Does somebody have a good recommendation to improve this?
You can simply ask a class for a list of all its subclasses, by using the class.__subclasses__() method:
>>> class Command(object):
... pass
...
>>> class Foo(Command):
... NAME = 'foo'
...
>>> Command.__subclasses__()
[<class '__main__.Foo'>]
>>> Command.__subclasses__()[0].NAME
'foo'
You can use this method to implement your find() class method:
@classmethod
def find(cls, command_name):
    """Returns the Command implementation for a specific command name."""
    try:
        return next(c for c in cls.__subclasses__() if c.NAME == command_name)
    except StopIteration:
        raise KeyError(command_name)
or, if you expect to only call find() after all command subclasses have been imported, you can cache the results in a weakref.WeakValueDictionary() object (to avoid circular reference issues):
from weakref import WeakValueDictionary

class Command(object):
    @classmethod
    def find(cls, command_name):
        """Returns the Command implementation for a specific command name."""
        try:
            mapping = cls.__COMMANDS
        except AttributeError:
            mapping = cls.__COMMANDS = WeakValueDictionary({
                c.NAME: c for c in cls.__subclasses__()})
        return mapping[command_name]
You can always clear the cache again by explicitly deleting the Command._Command__COMMANDS class attribute.
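A short self-contained sketch of why WeakValueDictionary suits a class cache: once the last strong reference to a class goes away, its cache entry disappears instead of the cache keeping the class alive (the names below are illustrative; CPython needs a gc.collect() because classes sit in reference cycles):

```python
import gc
from weakref import WeakValueDictionary

class Command(object):
    pass

def make_mapping():
    # Temp's only strong references are local to this function.
    class Temp(Command):
        NAME = 'temp'
    return WeakValueDictionary({Temp.NAME: Temp})

mapping = make_mapping()
gc.collect()          # classes live in reference cycles, so force a collection
print(list(mapping))  # the 'temp' entry has vanished along with the class
```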
I'm using Flask-Classy to write a Flask app using class based views.
My base class is called SlugView. It catches URLs like example.com/124/catchy-article-name:
class SlugView(FlaskView):
    @route('/<id>')
    @route('/<id>/<slug>')
    def get(self, id, slug=None):
        raise NotImplementedError
My second class is called ArticleView:
class ArticleView(SlugView):
    def get(self, id, slug=None):
        return render_template('article.html', article=get_article_by_id(id))
What decorator magic can I use to have the subclassed function inherit the same decorators as the parent class?
Magic? Yes. Decorator magic? No. Do you object to metaclass magic?
class InheritableRoutesMeta(type):
    def __new__(cls, cls_name, bases, attributes):
        for name, value in attributes.items():
            if not callable(value):
                continue
            for base in bases:
                # use a default so bases lacking the attribute don't raise
                super_method = getattr(base, name, None)
                if super_method and hasattr(super_method, "_rule_cache"):
                    value._rule_cache = super_method._rule_cache
                    break
        return super(InheritableRoutesMeta, cls).__new__(cls, cls_name,
                                                         bases, attributes)
Then you should be able to do something like this:
class ArticleView(SlugView, metaclass=InheritableRoutesMeta):
    # Use the keyword argument metaclass for Python 3
    # For Python 2, remove the argument and uncomment the below
    # __metaclass__ = InheritableRoutesMeta

    def get(self, id, slug=None):
        return render_template('article.html', article=get_article_by_id(id))
Warning: This is based on an internal property. If Flask-Classy chooses to change how it stores these decorators the above code will break (assuming that it works in the first place). If you really need this, it is worth filing an issue with the creator(s) to either make this property part of the public API or to provide another way of doing what you are doing. They may choose not to do either, but at least then they are aware of the use case.
In my Django project, I want all my model fields to have an additional argument called documentation. (It would be similar to verbose_name or help_text, but instead for internal documentation.)
This seems straightforward: just subclass and override the field's __init__:
def __init__(self, verbose_name=None, name=None, documentation=None, **kwargs):
    self.documentation = documentation
    super(..., self).__init__(verbose_name, name, **kwargs)
The question is how do I make this apply to all of the 20-something field classes in django.db.models (BooleanField, CharField, PositiveIntegerField, etc.)?
The only way I see is to use metaprogramming with the inspect module:
import inspect
import sys
from django.db.models import *

current_module = sys.modules[__name__]
all_field_classes = [Cls for (_, Cls) in inspect.getmembers(current_module,
                         lambda m: inspect.isclass(m) and issubclass(m, Field))]

for Cls in all_field_classes:
    Cls.__init__ = <???>
I am not used to seeing code like this, and don't even know if it will work. I wish I could just add the attribute to the base Field class and have it inherited by all the child classes, but I don't see how that could be done.
Any ideas?
Indeed - you are on the right track.
In Python, introspection is a normal thing, and you don't even need to use the inspect module just because "I am using introspection and metaprogramming, so I must need inspect" :-)
One thing that is not considered that much of a good practice, though, is monkey patching - that is, changing the classes in django.db.models itself, so that other modules import the modified classes from there and use the modified versions. (Note that in this case: not recommended != will not work.) So you would be better off creating all the new model classes in your own module and importing them from your own module, instead of from django.db.models.
So, something along:
from django.db import models

# A decorator to implement the behavior you want for the
# __init__ method
def new_init(func):
    def __init__(self, *args, **kw):
        self.documentation = kw.pop("documentation", None)
        return func(self, *args, **kw)
    return __init__

for name, obj in models.__dict__.items():
    # check if obj is a class:
    if not isinstance(obj, type):
        continue
    # create a new __init__, retrieving the original one -
    # taking care not to pick it up as an unbound method -
    # check: http://pastebin.com/t1SAusPS
    new_init_method = new_init(obj.__dict__.get("__init__", lambda s: None))
    # dynamically create a new subclass of obj, overriding just the __init__ method:
    new_class = type(name, (obj,), {"__init__": new_init_method})
    # bind the new class to this module's scope:
    globals()[name] = new_class
Or, if you prefer monkey patching, as it is easier :-p
from django.db import models

def new_init(func):
    def __init__(self, *args, **kw):
        self.documentation = kw.pop("documentation", None)
        return func(self, *args, **kw)
    return __init__

for name, obj in models.__dict__.items():
    # check if obj is a class:
    if not isinstance(obj, type):
        continue
    obj.__init__ = new_init(obj.__dict__["__init__"])
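To see the decorator in action without pulling in Django, here is the same new_init applied to a stand-in class; the Field class below is hypothetical, standing in for django.db.models.Field:

```python
def new_init(func):
    def __init__(self, *args, **kw):
        # pop our extra kwarg before delegating to the original __init__
        self.documentation = kw.pop("documentation", None)
        return func(self, *args, **kw)
    return __init__

# hypothetical stand-in for a Django field class
class Field(object):
    def __init__(self, verbose_name=None):
        self.verbose_name = verbose_name

# patch the class in place, wrapping its original __init__
Field.__init__ = new_init(Field.__dict__["__init__"])

f = Field(verbose_name="age", documentation="Age in years, internal use only")
print(f.documentation)  # Age in years, internal use only
print(f.verbose_name)   # age
```

Because the wrapper pops "documentation" before delegating, the original __init__ never sees the extra keyword and keeps working unchanged.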
I know that there are several posts on this topic; however, for whatever reason, I can't get my head around it, or at least implement it. Below is some sample code of what I am trying to do.
Base Class:
class Animal(object):
    def __init__(self, age):
        self._age = age

    def getAge(self):
        return self._age

    def speak(self):
        raise NotImplementedError()

    def speak_twice(self):
        self.speak()
        self.speak()
Sub Class
from Animal import Animal

class Dog(Animal):
    def speak(self):
        print "woff!"
Test Code
mod = __import__("Dog")
spot = mod(5)
After running test Code I get this error:
Traceback (most recent call last):
  File "C:~test.py", line 2, in <module>
    spot = mod(5)
TypeError: 'module' object is not callable
So basically my question is how do I load modules dynamically and initialize them correctly?
EDIT:
I will not know the subclass until runtime
You have to import the module itself, then get its class member. You can't just import the class. Assuming your subclass is in a file accessible from the pythonpath as 'animal':
mod = __import__('animal')
spot = mod.Dog(5)
When you import a module, the interpreter first looks to see if a module with that name exists in sys.modules; if it fails to find it there, it searches the Python path looking for a package or module matching the given name. If and when it finds one, it parses the code therein, builds a module object out of it, places it in sys.modules, and returns the module object to the calling scope, to be bound to the name it was imported with in the given namespace. All the items in the module (classes, variables, functions) at module scope (not nested inside something else in the code) are then available as attributes of that module object.
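The same lookup can be written with importlib, the idiomatic front end to this machinery; the in-memory "animal" module below is a stand-in so the sketch runs without a separate file:

```python
import importlib
import sys
import types

# fake an importable "animal" module in memory (stand-in for a real file)
animal = types.ModuleType("animal")
exec(
    "class Dog(object):\n"
    "    def __init__(self, age):\n"
    "        self.age = age\n",
    animal.__dict__,
)
sys.modules["animal"] = animal  # import will find it here first

mod = importlib.import_module("animal")  # import the module, not the class
Dog = getattr(mod, "Dog")                # then fetch the class from it
spot = Dog(5)
print(spot.age)  # 5
```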
Edit:
In response to your comment, the real problem is that you are trying to look up an attribute of the module dynamically, not that you are trying to import anything dynamically. The most direct way to do that would be:
import sub_animal
getattr(sub_animal, 'Dog')
However, if you are trying to dynamically determine the class to initialize based upon some conditions, you probably want to read up on the factory pattern, and possibly decorators or even metaclasses, so that you can dynamically add subclasses automatically to the factory.
class AnimalFactory(type):
    animal_classes = {}

    def __new__(cls, name, bases, attrs):
        new_class = super(AnimalFactory, cls).__new__(cls, name, bases, attrs)
        AnimalFactory.animal_classes[name] = new_class
        return new_class

    @classmethod
    def build(cls, name, *args, **kwargs):
        try:
            klass = cls.animal_classes[name]
        except KeyError:
            raise ValueError('No known animal %s' % name)
        return klass(*args, **kwargs)

class Animal(object):
    __metaclass__ = AnimalFactory

    def __init__(self, age):
        self.age = age

    def speak(self):
        raise NotImplementedError()

# As long as the file it is implemented in is imported at some point,
# the following can be anywhere
class Dog(Animal):
    def speak(self):
        return 'woof'

# And then to use, again, anywhere
new_animal = AnimalFactory.build('Dog', 5)