Should I test methods in Django models?

Is there any reason to test model methods, or should I assume that Django works properly and leave them untested?
Here is my model and a couple of methods from it:
class SlackTeam(models.Model):
    ...

    def get_user(self, user_id):
        return self.users.filter(user_id=user_id).first()

    def deactivate(self):
        self.active = False
        self.initiator.access_token = ''
        self.initiator.save()
        self.initiator = None
        self.deactivated_at = timezone.now()
        self.save()
        if hasattr(self, 'slackbot'):
            self.slackbot.delete()

    def set_initiator(self, user):
        self.initiator = user
        self.save(update_fields=['initiator'])

    @classmethod
    def initialize(cls, team_id, name):
        return cls.objects.update_or_create(
            team_id=team_id, defaults={'name': name})[0]

    @classmethod
    def get_by_team_id(cls, team_id):
        return cls.objects.filter(team_id=team_id).first()

You can safely assume that methods defined on the base models.Model class work. Your own methods - either custom ones or overridden ones - have to be tested, of course.
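For concreteness, a minimal sketch of what such tests could look like (hypothetical test cases, assuming the fields omitted from the model above are nullable or have defaults):
from django.test import TestCase

class SlackTeamTests(TestCase):
    def test_get_by_team_id_returns_none_for_unknown_team(self):
        # get_by_team_id uses .first(), so a miss returns None rather than raising.
        self.assertIsNone(SlackTeam.get_by_team_id('T123'))

    def test_initialize_updates_existing_name(self):
        # update_or_create should update the name of an existing team in place.
        SlackTeam.initialize('T123', 'Old name')
        team = SlackTeam.initialize('T123', 'New name')
        self.assertEqual(team.name, 'New name')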
As a side note: the convention with Django models is to define methods that work at the table level on the manager, not on the model itself, so at least your get_by_team_id (and possibly initialize) should be defined on a custom manager, as sketched below.
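A minimal sketch of that convention, reusing the names from the model above (the manager class itself is not from the original post):
class SlackTeamManager(models.Manager):
    def get_by_team_id(self, team_id):
        return self.filter(team_id=team_id).first()

    def initialize(self, team_id, name):
        return self.update_or_create(
            team_id=team_id, defaults={'name': name})[0]

class SlackTeam(models.Model):
    ...
    objects = SlackTeamManager()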

Related

Django inheritance and polymorphism with proxy models

I'm working on a Django project that I did not start, and I am facing an inheritance problem.
I have a big model (simplified in the example) called MyModel that is supposed to represent different kinds of items.
All the instance objects of MyModel should have the same fields, but the behaviour of the methods varies a lot depending on the item type.
Up to this moment this has been designed using a single MyModel field called item_type.
Then methods defined in MyModel check this field and perform different logic using multiple ifs:
def example_method(self):
    if self.item_type == TYPE_A:
        do_this()
    if self.item_type == TYPE_B1:
        do_that()
Even more, some of the sub-types have many things in common, so let's say the subtypes B and C represent a first level of inheritance.
Then these types have sub-types, for example B1, B2, C1, C2 (better explained in the example code below).
I would say that's not the best approach to implementing polymorphism.
Now I want to change these models to use real inheritance.
Since all submodels have the same fields, I think multi-table inheritance is not necessary. I was thinking of using proxy models because only their behaviour should change depending on their types.
This is a pseudo-solution I came up with:
ITEM_TYPE_CHOICES = (
    (TYPE_A, _('Type A')),
    (TYPE_B1, _('Type B1')),
    (TYPE_B2, _('Type B2')),
    (TYPE_C1, _('Type C1')),
    (TYPE_C2, _('Type C2')))

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)

    def common_thing(self):
        pass

    def do_something(self):
        pass

class ModelA(MyModel):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_A

    def do_something(self):
        return 'Hola'

class ModelB(MyModel):
    class Meta:
        proxy = True

    def common_thing(self):
        pass

class ModelB1(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B1

    def do_something(self):
        pass

class ModelB2(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B2

    def do_something(self):
        pass
This might work if we already know the type of the object we are working on.
Let's say we want to instantiate a MyModel object of type C1; then we could simply instantiate ModelC1 and the item_type would be set up correctly.
The problem is: how do we get the correct proxy model from generic MyModel instances?
The most common case is when we get a queryset result: MyModel.objects.all(); all these objects are instances of MyModel and don't know anything about the proxies.
I've seen different solutions around, like django-polymorphic, but as far as I understand it relies on multi-table inheritance, doesn't it?
Several SO answers and custom solutions I've seen:
https://stackoverflow.com/a/7526676/1191416
Polymorphism in Django
http://anthony-tresontani.github.io/Python/2012/09/11/django-polymorphism/
https://github.com/craigds/django-typed-models
Creating instances of Django proxy models from their base class
but none of them convinced me 100%.
Considering this might be a common scenario, has anyone come up with a better solution?
When you use django-polymorphic in your base model, you get this casting behavior for free:
class MyModel(PolymorphicModel):
    pass
Each model that extends it (proxy model or concrete model) will be cast back to that model when you do MyModel.objects.all().
I have little experience with model proxies, so I can't tell whether this would work properly (without breaking anything, I mean) nor how complicated it might be, but you could use an item_type:ProxyClass mapping and override your model's queryset (or provide a second manager with a custom queryset, etc.) that looks up this mapping and instantiates the correct proxy model.
BTW you may want to look at django.db.models.base.Model.from_db, which (from a very quick glance at the source code) seems to be the method the queryset machinery calls to instantiate models. Just overriding this method might possibly be enough to solve the problem - but here again it might also break something...
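To illustrate the from_db route, a hedged sketch (Model.from_db exists since Django 1.8; the PROXY_CLASS_MAP dict is an assumption here, see the next answer for one way to build such a mapping):
PROXY_CLASS_MAP = {}  # assumed: filled with {item_type: ProxyClass} entries

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)

    @classmethod
    def from_db(cls, db, field_names, values):
        # Build the instance normally, then re-class it based on item_type.
        instance = super().from_db(db, field_names, values)
        proxy_class = PROXY_CLASS_MAP.get(instance.item_type)
        if proxy_class is not None:
            instance.__class__ = proxy_class
        return instance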
I came up with a custom solution inspired by this SO answer and this blog post:
from django.db import models
from django.dispatch.dispatcher import receiver

ITEM_TYPE_CHOICES = (
    (TYPE_A, _('type_a')),
    (TYPE_B1, _('type_b1')),
    (TYPE_B2, _('type_b2')),
    (TYPE_C1, _('type_c1')),
    (TYPE_C2, _('type_c2')),
)

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)
    description = models.TextField(blank=True, null=True)

    def common_thing(self):
        pass

    def do_something(self):
        pass

    # ****************
    # Hacking Django *
    # ****************
    PROXY_CLASS_MAP = {}  # We don't know this yet

    @classmethod
    def register_proxy_class(cls, item_type):
        """Class decorator for registering subclasses."""
        def decorate(subclass):
            cls.PROXY_CLASS_MAP[item_type] = subclass
            return subclass
        return decorate

    def get_proxy_class(self):
        return self.PROXY_CLASS_MAP.get(self.item_type, MyModel)

# REGISTER SUBCLASSES

@MyModel.register_proxy_class(TYPE_A)
class ModelA(MyModel):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_A

    def do_something(self):
        pass

# No need to register this, it's never instantiated directly
class ModelB(MyModel):
    class Meta:
        proxy = True

    def common_thing(self):
        pass

@MyModel.register_proxy_class(TYPE_B1)
class ModelB1(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B1

    def do_something(self):
        pass

@MyModel.register_proxy_class(TYPE_B2)
class ModelB2(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B2

    def do_something(self):
        pass

# USING SIGNAL TO CHANGE `__class__` at runtime

@receiver(models.signals.post_init, sender=MyModel)
def update_proxy_object(sender, **kwargs):
    instance = kwargs['instance']
    if hasattr(instance, "get_proxy_class") and not instance._meta.proxy:
        proxy_class = instance.get_proxy_class()
        if proxy_class is not None:
            instance.__class__ = proxy_class
I'm using the decorator register_proxy_class to register each subclass after MyModel has been declared; otherwise I would have needed to explicitly declare a map of {type: subclass} inside MyModel.
This would have been bad:
because at declaration we can't reference any of the proxy subclasses from MyModel (we could solve this with string names)
the parent would be aware of its subclasses, which breaks OOP principles.
How it works:
Using the @register_proxy_class(type) decorator, each subclass registers itself, creating an entry in the MyModel.PROXY_CLASS_MAP dict when the module is loaded.
Then update_proxy_object is executed whenever MyModel dispatches a post_init signal. It changes the __class__ of MyModel instances at runtime to select the right proxy subclass.
So basically:
# a1: MyModel dispatches a post_init signal -> `update_proxy_object` sets the proper instance __class__ = ModelA
# It does NOT call ModelA.__init__
a1 = MyModel(item_type=TYPE_A)
isinstance(a1, MyModel)  # True
isinstance(a1, ModelA)  # True

# a2: calls ModelA.__init__, which calls the parent MyModel.__init__ and then sets up the item_type for us
a2 = ModelA()  # <- no need to pass item_type
isinstance(a2, MyModel)  # True
isinstance(a2, ModelA)  # True

# Using a custom manager of MyModel that returns all objects having item_type == TYPE_B1
b1 = MyModel.objects.b1()[0]  # get the first one
isinstance(b1, ModelB1)  # True
isinstance(b1, ModelB)  # True
isinstance(b1, MyModel)  # True
isinstance(b1, ModelA)  # False
It seems to work so far, but I will experiment a bit more for possible problems I haven't thought about.
Cool!

Dynamically limiting queryset of related field

Using Django REST Framework, I want to limit which values can be used in a related field on creation.
For example, consider this example (based on the filtering example on https://web.archive.org/web/20140515203013/http://www.django-rest-framework.org/api-guide/filtering.html, but changed to ListCreateAPIView):
class PurchaseList(generics.ListCreateAPIView):
    model = Purchase
    serializer_class = PurchaseSerializer

    def get_queryset(self):
        user = self.request.user
        return Purchase.objects.filter(purchaser=user)
In this example, how do I ensure that on creation the purchaser may only be equal to self.request.user, and that this is the only value populated in the dropdown in the form in the browsable API renderer?
I ended up doing something similar to what Khamaileon suggested here. Basically I modified my serializer to peek into the request, which kind of smells wrong, but it gets the job done... Here's how it looks (exemplified with the purchase example):
class PurchaseSerializer(serializers.HyperlinkedModelSerializer):
    def get_fields(self, *args, **kwargs):
        fields = super(PurchaseSerializer, self).get_fields(*args, **kwargs)
        fields['purchaser'].queryset = permitted_objects(
            self.context['view'].request.user, fields['purchaser'].queryset)
        return fields

    class Meta:
        model = Purchase
permitted_objects is a function which takes a user and a query, and returns a filtered query which only contains objects that the user has permission to link to. This seems to work both for validation and for the browsable API dropdown fields.
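A minimal sketch of what permitted_objects might look like for this purchase example (the helper is the answerer's own; restricting the choices to just the requesting user is an assumption based on the question):
def permitted_objects(user, queryset):
    # Only allow linking the purchase to the requesting user.
    return queryset.filter(pk=user.pk)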
Here's how I do it:
class PurchaseList(viewsets.ModelViewSet):
    ...

    def get_serializer(self, *args, **kwargs):
        serializer_class = self.get_serializer_class()
        context = self.get_serializer_context()
        return serializer_class(*args, request_user=self.request.user, context=context, **kwargs)

class PurchaseSerializer(serializers.ModelSerializer):
    ...

    def __init__(self, *args, request_user=None, **kwargs):
        super(PurchaseSerializer, self).__init__(*args, **kwargs)
        self.fields['user'].queryset = User._default_manager.filter(pk=request_user.pk)
The example link does not seem to be available anymore, but by reading other comments, I assume that you are trying to filter the user relationship to purchases.
If I am correct, then I can say that there is now an official way to do this. Tested with Django REST Framework 3.10.1.
class UserPKField(serializers.PrimaryKeyRelatedField):
    def get_queryset(self):
        user = self.context['request'].user
        queryset = User.objects.filter(...)
        return queryset

class PurchaseSerializer(serializers.ModelSerializer):
    users = UserPKField(many=True)

    class Meta:
        model = Purchase
        fields = ('id', 'users')
This works as well with the browsable API.
Sources:
https://github.com/encode/django-rest-framework/issues/1985#issuecomment-328366412
https://medium.com/django-rest-framework/limit-related-data-choices-with-django-rest-framework-c54e96f5815e
I disliked the style of having to override the __init__ method for every place where I need access to user data or the instance at runtime to limit the queryset. So I opted for this solution.
Here is the code inline.
from django.db.models import Manager, QuerySet
from rest_framework import serializers

class LimitQuerySetSerializerFieldMixin:
    """
    Serializer mixin with a special `get_queryset()` method that lets you pass
    a callable for the queryset kwarg. This enables you to limit the queryset
    based on data or context available on the serializer at runtime.
    """
    def get_queryset(self):
        """
        Return the queryset for a related field. If the queryset is a callable,
        it will be called with one argument which is the field instance, and
        should return a queryset or model manager.
        """
        # noinspection PyUnresolvedReferences
        queryset = self.queryset
        if callable(queryset):
            queryset = queryset(self)
        if isinstance(queryset, (QuerySet, Manager)):
            # Ensure queryset is re-evaluated whenever used.
            # Note that actually a `Manager` class may also be used as the
            # queryset argument. This occurs on ModelSerializer fields,
            # as it allows us to generate a more expressive 'repr' output
            # for the field.
            # Eg: 'MyRelationship(queryset=ExampleModel.objects.all())'
            queryset = queryset.all()
        return queryset

class DynamicQuerysetPrimaryKeyRelatedField(LimitQuerySetSerializerFieldMixin, serializers.PrimaryKeyRelatedField):
    """Evaluates callable queryset at runtime."""
    pass

class MyModelSerializer(serializers.ModelSerializer):
    """
    MyModel serializer with a primary key related field to 'MyRelatedModel'.
    """
    def get_my_limited_queryset(self):
        root = self.root
        if root.instance is None:
            return MyRelatedModel.objects.none()
        return root.instance.related_set.all()

    my_related_model = DynamicQuerysetPrimaryKeyRelatedField(queryset=get_my_limited_queryset)

    class Meta:
        model = MyModel
The only drawback with this is that you would need to explicitly set the related serializer field instead of using the automatic field discovery provided by ModelSerializer. I would, however, expect something like this to be in rest_framework by default.
In Django REST Framework 3.0 the get_fields method was removed. But in a similar way you can do this in the __init__ method of the serializer:
class PurchaseSerializer(serializers.HyperlinkedModelSerializer):
    class Meta:
        model = Purchase

    def __init__(self, *args, **kwargs):
        super(PurchaseSerializer, self).__init__(*args, **kwargs)
        if 'request' in self.context:
            self.fields['purchaser'].queryset = permitted_objects(
                self.context['view'].request.user, self.fields['purchaser'].queryset)
I added the if check because if you use PurchaseSerializer as a field in another serializer on GET methods, the request will not be passed to the context.
1) First, to make sure you only allow self.request.user when you have an incoming HTTP POST/PUT (this assumes the property on your serializer and model is named "user" literally):
def validate_user(self, attrs, source):
    posted_user = attrs.get(source, None)
    if posted_user:
        raise serializers.ValidationError("invalid post data")
    else:
        user = self.context['request']._request.user
        if not user:
            raise serializers.ValidationError("invalid post data")
        attrs[source] = user
    return attrs
By adding the above to your model serializer you ensure that ONLY the request.user is inserted into your database.
2) About your filter above (purchaser=user): I would actually recommend using a custom global filter (to ensure this is filtered globally). I do something similar for a software-as-a-service app of my own, and it helps to ensure each HTTP request is filtered down (including an HTTP 404 when someone tries to look up an "object" they don't have access to see in the first place).
I recently patched this in the master branch so both list and singular views will filter this
https://github.com/tomchristie/django-rest-framework/commit/1a8f07def8094a1e34a656d83fc7bdba0efff184
3) About the API renderer: are you having your customers use it directly? If not, I would say avoid it. If you do need it, it might be possible to add a custom serializer that would help to limit the input on the front end.
Upon request of @gabn88: as you may know by now, with DRF 3.0 and above there is no easy solution.
Even if you do manage to figure out a solution, it won't be pretty and will most likely fail on subsequent versions of DRF, as it will override a bunch of DRF source which will have changed by then.
I forget the exact implementation I used, but the idea is to create 2 fields on the serializer: one your normal serializer field (let's say a PrimaryKeyRelatedField etc.), and the other a serializer method field, whose results will be swapped under certain conditions (such as based on the request, the request user, or whatever). This would be done in the serializer's constructor (i.e. __init__).
Your serializer method field will return the custom query that you want.
You will pop and/or swap these fields' results, so that the results of your serializer method field are assigned to the normal/default serializer field (PrimaryKeyRelatedField etc.) accordingly. That way you always deal with that one key (your default field) while the other key remains transparent within your application.
Along with this info, all you really need is to modify this: http://www.django-rest-framework.org/api-guide/serializers/#dynamically-modifying-fields
I wrote a custom CustomQueryHyperlinkedRelatedField class to generalize this behavior:
from collections import OrderedDict

from django.utils import six  # on Django 3.0+ use the standalone `six` package

class CustomQueryHyperlinkedRelatedField(serializers.HyperlinkedRelatedField):
    def __init__(self, view_name=None, **kwargs):
        self.custom_query = kwargs.pop('custom_query', None)
        super(CustomQueryHyperlinkedRelatedField, self).__init__(view_name, **kwargs)

    def get_queryset(self):
        if self.custom_query and callable(self.custom_query):
            qry = self.custom_query()(self)
        else:
            qry = super(CustomQueryHyperlinkedRelatedField, self).get_queryset()
        return qry

    @property
    def choices(self):
        qry = self.get_queryset()
        return OrderedDict([
            (
                six.text_type(self.to_representation(item)),
                six.text_type(item)
            )
            for item in qry
        ])
Usage:
class MySerializer(serializers.HyperlinkedModelSerializer):
    ....
    somefield = CustomQueryHyperlinkedRelatedField(
        view_name='someview-detail',
        queryset=SomeModel.objects.none(),
        custom_query=lambda: MySerializer.some_custom_query)

    @staticmethod
    def some_custom_query(field):
        return SomeModel.objects.filter(somefield=field.context['request'].user.email)
    ...
I did the following:
class MyModelSerializer(serializers.ModelSerializer):
    # Declare the related field; its queryset is replaced per-instance
    # in get_fields() below.
    myForeignKeyFieldName = serializers.PrimaryKeyRelatedField(
        queryset=MyForeignModel.objects.all())

    def get_fields(self, *args, **kwargs):
        fields = super(MyModelSerializer, self).get_fields()
        qs = MyModel.objects.filter(room=self.instance.id)
        fields['myForeignKeyFieldName'].queryset = qs
        return fields
I looked for a solution where I can set the queryset upon creation of the field and don't have to add a separate field class. This is what I came up with:
class PurchaseSerializer(serializers.HyperlinkedModelSerializer):
    class Meta:
        model = Purchase
        fields = ["purchaser"]

    def get_purchaser_queryset(self):
        user = self.context["request"].user
        return Purchase.objects.filter(purchaser=user)

    def get_extra_kwargs(self):
        kwargs = super().get_extra_kwargs()
        kwargs["purchaser"] = {"queryset": self.get_purchaser_queryset()}
        return kwargs
The main issue for tracking suggestions regarding this seems to be drf#1985.
Here's a re-usable generic serializer field that can be used instead of defining a custom field for every use case.
class DynamicPrimaryKeyRelatedField(serializers.PrimaryKeyRelatedField):
    """A PrimaryKeyRelatedField with ability to set queryset at runtime.

    Pass a function in the `queryset_fn` kwarg. It will be passed the serializer `context`.
    The function should return a queryset.
    """
    def __init__(self, queryset_fn=None, **kwargs):
        assert queryset_fn is not None, "The `queryset_fn` argument is required."
        self.queryset_fn = queryset_fn
        super().__init__(**kwargs)

    def get_queryset(self):
        return self.queryset_fn(context=self.context)
Usage:
class MySerializer(serializers.ModelSerializer):
    my_models = DynamicPrimaryKeyRelatedField(
        queryset_fn=lambda context: MyModel.objects.visible_to_user(context["request"].user)
    )
    # ...
Same works for serializers.SlugRelatedField.

understanding python class variable

Why do we have template_name = None as a class variable here? (from the Django source code)
Is it so that if self.template_name is None, an error is raised?
(self.template_name looks for an instance variable and, if it's not there, returns the class variable.)
If so, wouldn't it be better to have def __init__(self): self.template_name = None?
class TemplateResponseMixin(object):
    """
    A mixin that can be used to render a template.
    """
    template_name = None
    response_class = TemplateResponse

    def render_to_response(self, context, **response_kwargs):
        """
        Returns a response with a template rendered with the given context.
        """
        return self.response_class(
            request=self.request,
            template=self.get_template_names(),
            context=context,
            **response_kwargs
        )

    def get_template_names(self):
        """
        Returns a list of template names to be used for the request. Must return
        a list. May not be called if render_to_response is overridden.
        """
        if self.template_name is None:
            raise ImproperlyConfigured(
                "TemplateResponseMixin requires either a definition of "
                "'template_name' or an implementation of 'get_template_names()'")
        else:
            return [self.template_name]
TemplateResponseMixin is a mixin that does not use an __init__, which makes it easier to use with multiple inheritance. It does not need its own state, so it does not need a constructor. This also makes inheritance easier because you don't need to call a constructor on it in your subclass.
template_name is set as a class attribute precisely because there is no constructor. The implication is that it should be overridden in the subclass. Also, changing its value on the class will affect all future instances of that mixin.
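For illustration, this is how the class attribute is typically overridden in practice (a hypothetical view; TemplateView mixes in TemplateResponseMixin):
from django.views.generic import TemplateView

class AboutView(TemplateView):
    template_name = 'about.html'  # shadows the class-level None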

How to use Django model inheritance with signals?

I have a few model inheritance levels in Django:
class WorkAttachment(models.Model):
    """ Abstract class that holds all fields that are required in each attachment """
    work = models.ForeignKey(Work)
    added = models.DateTimeField(default=datetime.datetime.now)
    views = models.IntegerField(default=0)

    class Meta:
        abstract = True

class WorkAttachmentFileBased(WorkAttachment):
    """ Another base class, but for file based attachments """
    description = models.CharField(max_length=500, blank=True)
    size = models.IntegerField(verbose_name=_('size in bytes'))

    class Meta:
        abstract = True

class WorkAttachmentPicture(WorkAttachmentFileBased):
    """ Picture attached to work """
    image = models.ImageField(upload_to='works/images', width_field='width', height_field='height')
    width = models.IntegerField()
    height = models.IntegerField()
There are many different models inherited from WorkAttachmentFileBased and WorkAttachment. I want to create a signal which would update an attachment_count field on the parent work when an attachment is created. It would be logical to think that a signal made for the parent sender (WorkAttachment) would run for all inherited models too, but it does not. Here is my code:
@receiver(post_save, sender=WorkAttachment, dispatch_uid="att_post_save")
def update_attachment_count_on_save(sender, instance, **kwargs):
    """ Update file count for work when attachment was saved."""
    instance.work.attachment_count += 1
    instance.work.save()
Is there a way to make this signal work for all models inherited from WorkAttachment?
Python 2.7, Django 1.4 pre-alpha
P.S. I've tried one of the solutions I found on the net, but it did not work for me.
You could register the connection handler without specifying the sender, and filter for the needed models inside it.
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save)
def my_handler(sender, **kwargs):
    # issubclass returns False if 'sender' is NOT a subclass of AbstractModel
    if not issubclass(sender, AbstractModel):
        return
    ...
Ref: https://groups.google.com/d/msg/django-users/E_u9pHIkiI0/YgzA1p8XaSMJ
The simplest solution is to not restrict on the sender, but to check in the signal handler whether the respective instance is a subclass:
@receiver(post_save)
def update_attachment_count_on_save(sender, instance, **kwargs):
    if isinstance(instance, WorkAttachment):
        ...
However, this may incur a significant performance overhead as every time any model is saved, the above function is called.
I think I've found the most Django-way of doing this: Recent versions of Django suggest to connect signal handlers in a file called signals.py. Here's the necessary wiring code:
your_app/__init__.py:
default_app_config = 'your_app.apps.YourAppConfig'
your_app/apps.py:
import django.apps

class YourAppConfig(django.apps.AppConfig):
    name = 'your_app'

    def ready(self):
        import your_app.signals
your_app/signals.py:
from django.db.models.signals import post_save

from .models import WorkAttachment  # import added here; adjust to your app layout

def get_subclasses(cls):
    result = [cls]
    classes_to_inspect = [cls]
    while classes_to_inspect:
        class_to_inspect = classes_to_inspect.pop()
        for subclass in class_to_inspect.__subclasses__():
            if subclass not in result:
                result.append(subclass)
                classes_to_inspect.append(subclass)
    return result

def update_attachment_count_on_save(sender, instance, **kwargs):
    instance.work.attachment_count += 1
    instance.work.save()

for subclass in get_subclasses(WorkAttachment):
    post_save.connect(update_attachment_count_on_save, subclass)
I think this works for all subclasses, because they will all be loaded by the time YourAppConfig.ready is called (and thus signals is imported).
You could try something like:
model_classes = [WorkAttachment, WorkAttachmentFileBased, WorkAttachmentPicture, ...]

def update_attachment_count_on_save(sender, instance, **kwargs):
    instance.work.attachment_count += 1
    instance.work.save()

for model_class in model_classes:
    post_save.connect(update_attachment_count_on_save,
                      sender=model_class,
                      dispatch_uid="att_post_save_" + model_class.__name__)
(Disclaimer: I have not tested the above)
I just did this using python's (relatively) new __init_subclass__ method:
from django.db import models

def perform_on_save(*args, **kw):
    print("Doing something important after saving.")

class ParentClass(models.Model):
    class Meta:
        abstract = True

    @classmethod
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        models.signals.post_save.connect(perform_on_save, sender=cls)

class MySubclass(ParentClass):
    pass  # signal automatically gets connected.
This requires Django 2.1 and Python 3.6 or better. Note that the @classmethod line seems to be required when working with the Django model and associated metaclass, even though it's not required according to the official Python docs.
post_save.connect(my_handler, ParentClass)

# connect all subclasses of the base content item too
for subclass in ParentClass.__subclasses__():
    post_save.connect(my_handler, subclass)
have a nice day!
Michael Herrmann's solution is definitely the most Django-way of doing this.
And yes, it works for all subclasses, as they are loaded at the ready() call.
I would like to contribute the documentation references:
In practice, signal handlers are usually defined in a signals submodule of the application they relate to. Signal receivers are connected in the ready() method of your application configuration class. If you’re using the receiver() decorator, simply import the signals submodule inside ready().
https://docs.djangoproject.com/en/dev/topics/signals/#connecting-receiver-functions
And add a warning:
The ready() method may be executed more than once during testing, so you may want to guard your signals from duplication, especially if you’re planning to send them within tests.
https://docs.djangoproject.com/en/dev/topics/signals/#connecting-receiver-functions
So you might want to prevent duplicate signals with a dispatch_uid parameter on the connect function.
post_save.connect(my_callback, dispatch_uid="my_unique_identifier")
In this context I'll do:
for subclass in get_subclasses(WorkAttachment):
    post_save.connect(update_attachment_count_on_save, subclass, dispatch_uid=subclass.__name__)
https://docs.djangoproject.com/en/dev/topics/signals/#preventing-duplicate-signals
This solution resolves the problem when not all modules are imported into memory.
def inherited_receiver(signal, sender, **kwargs):
    """
    Decorator that connects a receiver to a signal for the sender class
    and all of its subclasses.

    @inherited_receiver(post_save, sender=MyModel)
    def signal_receiver(sender, **kwargs):
        ...
    """
    parent_cls = sender

    def wrapper(func):
        def childs_receiver(sender, **kw):
            """
            The receiver ensures that func executes for child
            (and the parent) classes only.
            """
            child_cls = sender
            if issubclass(child_cls, parent_cls):
                func(sender=child_cls, **kw)
        signal.connect(childs_receiver, **kwargs)
        return childs_receiver
    return wrapper
It's also possible to use content types to discover subclasses - assuming you have the base class and subclasses packaged in the same app. Something like this would work:
from django.contrib.contenttypes.models import ContentType

content_types = ContentType.objects.filter(app_label="your_app")

for content_type in content_types:
    model = content_type.model_class()
    post_save.connect(update_attachment_count_on_save, sender=model)
In addition to @clwainwright's answer, I adapted it to work for the m2m_changed signal instead. I had to post it as an answer for the code formatting to make sense:
@classmethod
def __init_subclass__(cls, **kwargs):
    super().__init_subclass__(**kwargs)
    for m2m_field in cls._meta.many_to_many:
        if hasattr(cls, m2m_field.attname) and hasattr(getattr(cls, m2m_field.attname), 'through'):
            models.signals.m2m_changed.connect(
                m2m_changed_receiver, weak=False,
                sender=getattr(cls, m2m_field.attname).through)
It does a couple of checks to ensure it doesn't break if anything changes in future Django versions.

Django polymorphism hack

I'm trying to bake out a sort of "single table inheritance" a.k.a. "table per hierarchy" model in Django.
Here's what I'd like to do:
from django.contrib.contenttypes.models import ContentType

class PolymorphicModel(models.Model):
    content_type = models.ForeignKey(ContentType)

    class Meta:
        abstract = True

    def __init__(self, *args, **kwargs):
        super(PolymorphicModel, self).__init__(*args, **kwargs)
        # Dynamically switch the class to the actual one
        self.__class__ = self.content_type.model_class()

    def save(self, *args, **kwargs):
        if not self.content_type:
            # Save the actual class name for the future.
            self.content_type = ContentType.objects.get_for_model(self.__class__)
        super(PolymorphicModel, self).save(*args, **kwargs)
And then the actual hierarchy:
class Base(PolymorphicModel):
    a = models.IntegerField()
    b = models.IntegerField()

    @abstractmethod
    def something(self): pass

class DerivedA(Base):
    def something(self):
        return self.a

class DerivedB(Base):
    def something(self):
        return self.b
Unfortunately I get a DoesNotExist error when constructing DerivedA(). It complains about content_type not existing.
EDIT:
Concerning my questions:
Why do I get the exception, and how do I fix it?
See my answer below: content_type is apparently not a viable name.
Is the thing that I'm trying to achieve doable this way?
Yes it is! And it works beautifully. Using class names instead of content types is also possible. This has the added value of handling proxy = True appropriately.
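A hedged sketch of the class-name variant mentioned above (field and method names here are illustrative, not from the published snippet):
class PolymorphicModel(models.Model):
    class_name = models.CharField(max_length=100, editable=False)

    class Meta:
        abstract = True

    def save(self, *args, **kwargs):
        if not self.class_name:
            # Store the concrete (or proxy) class name on first save.
            self.class_name = self.__class__.__name__
        super(PolymorphicModel, self).save(*args, **kwargs)

    def as_leaf_class(self):
        # Re-class the instance using the stored name; note __subclasses__()
        # only covers direct subclasses, so deeper trees need a recursive walk.
        for cls in self.__class__.__subclasses__():
            if cls.__name__ == self.class_name:
                self.__class__ = cls
                break
        return self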
Oops, well apparently content_type is a reserved name. I changed the property name to ct and it works now.
I've published my solution here:
http://djangosnippets.org/snippets/2408/
