I am using an approach similar to T. Stone's answer on this question. However, I have added an abstract base class, so my models.py looks like this:
class CustomQuerySetManager(models.Manager):
    """A re-usable Manager to access a custom QuerySet"""
    def __getattr__(self, attr, *args):
        try:
            return getattr(self.__class__, attr, *args)
        except AttributeError:
            return getattr(self.get_query_set(), attr, *args)

    def get_query_set(self):
        return self.model.QuerySet(self.model)
class MyModel(models.Model):
    class Meta:
        abstract = True

    class QuerySet(QuerySet):
        def user(self, pub, *args, **kwargs):
            return self.filter(publisher=pub, *args, **kwargs)
        # ...some more methods here

class Book(MyModel):
    title = models.CharField(max_length=100)
    authors = models.ManyToManyField(Author, related_name='book_author')
    publisher = models.ForeignKey(Publisher)
    publication_date = models.DateField()

    objects = models.Manager()
    obj = CustomQuerySetManager()  # for testing purposes only; this will override objects later
This allows me to get all of the books for a given publisher like so:

p = Publisher.objects.get(pk=1)
Book.obj.user(p).all()
I would like to extend this so I can define a custom query in the Book model then pass a Q object to the QuerySet class, so the query "publisher=pub" can be different for different models. I still want to be able to call this like Book.obj.user(p).all(). Somewhere in the Book model I need:
pubQ = Q(publisher=pub)
Where can I put this and how do I pass it to QuerySet defined in the Abstract Base Class, while keeping the code as DRY as possible?
That answer is clever, but it breaks the Python principle of "explicit is better than implicit". My first reaction to your code was to tell you that you can't declare a custom queryset inside your model, but I decided to check the mentioned SO answer to see where you got that idea. Again, it's clever -- not discounting that -- but well-written code is self-documenting and should be something any random Django developer can pick up and run with. That's where peer code reviews come in handy -- had you had one, you'd have instantly gotten a WTF for that.
The Django core team does it the following way:
class MyQuerySet(models.query.QuerySet):
    def some_method(self, an_arg, another_arg, a_kwarg='some_value'):
        # do something
        return a_queryset

class MyManager(models.Manager):
    def get_query_set(self):
        return MyQuerySet(self.model)

    def some_method(self, *args, **kwargs):
        return self.get_query_set().some_method(*args, **kwargs)
It's DRY in the sense that you don't repeat the actual method definition in the manager. But it's also explicit -- you know exactly what's going on. It's not as DRY as the method you're referencing, but "explicit is better than implicit". Besides, if it's done that way in the actual Django codebase, you can be reasonably assured that it's good practice in your own code. And it has the side effect of making it much easier to extend and override in subclasses.
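The delegation in the snippet above can be sketched in plain Python (stand-in classes with illustrative names, not Django's actual Manager/QuerySet):

```python
# Stand-ins for Manager/QuerySet; names are illustrative only.
class MyQuerySet:
    def __init__(self, items):
        self.items = items

    def some_method(self, threshold):
        # A filtering method, analogous to a custom QuerySet method
        return MyQuerySet([i for i in self.items if i >= threshold])

class MyManager:
    def __init__(self, items):
        self.items = items

    def get_query_set(self):
        return MyQuerySet(self.items)

    # The manager repeats only the signature; the body delegates.
    def some_method(self, *args, **kwargs):
        return self.get_query_set().some_method(*args, **kwargs)

m = MyManager([1, 5, 10])
print(m.some_method(5).items)  # [5, 10]
```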
Related
I'm using Python 3.6 with Django 1.11.9 and rest_framework 3.6.2.
I want to inherit from serializers.Serializer to make a SharingSerializer class that is abstract, because I then want to subclass it to implement ArticleSharingSerializer, ImageSharingSerializer, and so on.
What I've tried so far:
from abc import ABCMeta, abstractmethod
from rest_framework import serializers
...

class SharingSerializer(serializers.Serializer, metaclass=ABCMeta):
    course = serializers.PrimaryKeyRelatedField(queryset=Course.objects.all())
    students = serializers.PrimaryKeyRelatedField(queryset=User.objects.all(), many=True)

    @abstractmethod
    def validate(self, data):
        # Doing validation stuff with "course" and "students" fields
        ...
        return data

class ArticleSharingSerializer(SharingSerializer):
    articles = serializers.PrimaryKeyRelatedField(queryset=Article.objects.all(), many=True)

    def validate(self, data):
        data = super().validate(data)
        # Doing validation stuff with "articles" and self.context["request"].user
        ...
        return data
But when trying to "runserver", I get the following error:
File ".../school/serializers.py", line 11, in <module>
    class SharingSerializer(serializers.Serializer, metaclass=ABCMeta):
TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
Do you know how I can successfully achieve what I'm trying to achieve?
UPDATE: I want to take advantage of the @abstractmethod "enforcement" on instantiation that ABC provides.
UPDATE 2: TL;DR: the best answer, given by Ahmed Hosny (see below), is this link.
Short Answer
To keep using ABCMeta, you can also do:

class SharingSerializer(serializers.Serializer):
    __metaclass__ = ABCMeta

    course = serializers.PrimaryKeyRelatedField(queryset=Course.objects.all())
    students = serializers.PrimaryKeyRelatedField(queryset=User.objects.all(), many=True)

    @abstractmethod
    def validate(self, data):
        # Doing validation stuff with "course" and "students" fields
        ...
        return data
Or make an intermediate class; check this OP.
Long Answer:
The problem occurs when you try to have a class inherit from two different classes with different metaclasses; a conflict occurs.

In your case, you inherited from serializers.Serializer, which has its own metaclass (check this), and you also requested a different metaclass via metaclass=ABCMeta. That is why the conflict occurs.

Check also this reference, and this one.
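The conflict is easy to reproduce without DRF at all; here OtherMeta is a stand-in (an illustrative name) for the metaclass a base class like Serializer already carries:

```python
from abc import ABCMeta

class OtherMeta(type):
    """Stand-in for the metaclass a base class already has."""
    pass

class Base(metaclass=OtherMeta):
    pass

try:
    # Requesting ABCMeta while Base already uses OtherMeta: neither
    # metaclass is a subclass of the other, so Python refuses.
    class Child(Base, metaclass=ABCMeta):
        pass
except TypeError as exc:
    print(exc)  # metaclass conflict: ...
```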
[UPDATE]
Let's make some points clear:

Doing

class Meta:
    abstract = True

will not make your class abstract the way you know from Java and other compiled languages. What it actually does is just mark the class with an extra attribute (without going into more detail).

People may think abstract = True is Django-specific stuff because Django does the extra checks that give you the abstract flavor. Check the Django source code here: https://github.com/django/django/blob/master/django/db/models/base.py#L62

So abstract = True on its own, without that machinery, will not do anything extra.

ABC implements a similar idea, with an extra and neat way of getting the abstract flavor. Check the source code of ABC here: https://github.com/python/cpython/blob/master/Lib/abc.py
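One standard fix the answer doesn't spell out is an intermediate metaclass that subclasses both sides of the conflict. Sketched here with a stand-in for DRF's SerializerMetaclass (the real one lives in rest_framework.serializers; all names below are illustrative):

```python
from abc import ABCMeta, abstractmethod

class SerializerMeta(type):
    """Stand-in for rest_framework's SerializerMetaclass."""
    pass

class Serializer(metaclass=SerializerMeta):
    pass

# A metaclass deriving from both sides resolves the conflict.
class AbstractSerializerMeta(SerializerMeta, ABCMeta):
    pass

class SharingSerializer(Serializer, metaclass=AbstractSerializerMeta):
    @abstractmethod
    def validate(self, data):
        ...

class ArticleSharingSerializer(SharingSerializer):
    def validate(self, data):
        return data

try:
    SharingSerializer()  # ABC enforcement still works
except TypeError as exc:
    print("cannot instantiate:", exc)

print(ArticleSharingSerializer().validate({"ok": True}))  # {'ok': True}
```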
I'm implementing a bare-bones history tracking mechanism for my Django app, in which the models I care to track override the save() and delete() methods. In each method, I create my history objects as necessary:
def save(self, *args, **kwargs):
    super(MyModel, self).save(*args, **kwargs)
    # Create the historical model based on what we were given
    h = Historical_MyModel(**{field.attname: getattr(self, field.attname) for field in self._meta.fields})
    # Set some other fields as necessary...
    h.save()
Since the code for each save() and delete() method is similar, I figured a good way to prevent typing the same code is to create an abstract base class to have the similar code in one place. One thing I'm struggling with, however, is how to handle creating the Historical_{Model} instance for each child class (each Historical_{Model} is essentially a copy of the original model, with additional info like who made the change, when the change occurred, etc.).
In my base class, the method would look something like this, I think:
class HistoryTrackedModel(models.Model):
    def save(self, *args, **kwargs):
        super(self.model, self).save(*args, **kwargs)
        # Create the historical model based on what we were given
        h = SOME_HISTORICAL_MODEL_INSTANCE(**{field.attname: getattr(self, field.attname) for field in self._meta.fields})
        # Other fields get set ...
        h.save()

    class Meta:
        abstract = True
The SOME_HISTORICAL_MODEL_INSTANCE bit above is the piece I'm stuck on. How can I get the associated historical model for a specific model I'm tracking? Is there an easy way to store a reference to it in each child class? I'd like to prevent code duplication, and I thought this was the right avenue, but I'm stuck on this one point. Any help would be appreciated.
I think the most straightforward way would be to store the value as a class attribute:
class HistoricalFoo(models.Model):
    ...

class Foo(HistoryTrackedModel):
    history_model = HistoricalFoo
    ...

class HistoryTrackedModel(models.Model):
    def save(self):
        ...
        h = self.history_model(...)
An alternative would be to generate the historical model names programmatically:
class HistoryTrackedModel(models.Model):
    def save(self):
        ...
        history_model = globals()["Historical" + self.__class__.__name__]
        h = history_model(...)
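Stripped of Django, the class-attribute approach looks like this (all class names are illustrative stand-ins):

```python
class HistoricalFoo:
    """Stand-in historical model: just records the fields it is given."""
    def __init__(self, **fields):
        self.fields = fields

class HistoryTrackedModel:
    history_model = None  # each concrete subclass points at its historical twin

    def save(self):
        # ... the real save work would happen here ...
        h = self.history_model(**self.__dict__)
        return h

class Foo(HistoryTrackedModel):
    history_model = HistoricalFoo

    def __init__(self, title):
        self.title = title

h = Foo("example").save()
print(type(h).__name__, h.fields["title"])  # HistoricalFoo example
```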
What I'm trying to do is create a dynamic ModelForm that generates extra fields based on one of its class-attributes to use in a ModelAdmin. Something like:
class MyModelForm(forms.ModelForm):
    config_fields = ('book_type', 'is_featured', 'current_price__is_sale')

class MyModelAdmin(admin.ModelAdmin):
    form = MyModelForm
In this case, MyModelForm would generate fields based on the config_fields attribute by performing some introspection. My approach so far looks something like this (based on this answer https://stackoverflow.com/a/6581949/677985):
class ConfigForm(type):
    def __new__(cls, name, bases, attrs):
        if 'config_fields' in attrs:
            for config_field in attrs['config_fields']:
                # ... (removed for clarity)
                attrs.update(fields)
        return type(name, bases, attrs)

class MyModelForm(forms.ModelForm):
    __metaclass__ = ConfigForm
    config_fields = ('book_type', 'is_featured', 'current_price__is_sale')
This approach works well enough, but I'm not quite happy with it for several reasons:
The validation doesn't seem to work, but this is a minor concern for now.
I'm not quite sure why the "if 'config_fields' in attrs:" condition is needed, but it is.
I would prefer for MyModelForm to inherit this behavior instead of setting the __metaclass__ attribute; the base class could then be easily reused, and it would let me easily override the clean and __init__ methods.
I tried implementing the third item, the result being that the extra fields did not show up in the admin form. I'd be grateful if someone could help me figure this out, or at least point me in the right direction.

I am aware that using a metaclass for this is probably overkill, and I would guess that part of the problem is that ModelForm already has one or two metaclasses in its inheritance chain. So if anyone has an alternate solution that accomplishes the same, that would make me just as happy.
I believe the ModelForm already has a metaclass, but you're overwriting it by setting your own. That's why you're not getting validation or any of the other built-in goodness of model forms.

Instead, you should be able to use type directly to create your ModelForm. This describes the type you want while still causing the ModelForm's metaclass to do its thing.
Example:
config_fields = ('book_type', 'is_featured', 'current_price__is_sale')

# the below is an example; you need more work to construct the proper attrs
attrs = dict((f, forms.SomeField) for f in config_fields)

ConfigModelForm = type('DynamicModelForm', (forms.ModelForm,), attrs)

class MyModelAdmin(admin.ModelAdmin):
    form = ConfigModelForm
You can wrap the first part up in a function if need be, and invoke it for your form attribute in your ModelAdmin.
See my answer here for links and discussion on using type.
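The point is easy to demonstrate without Django. Here Meta is an illustrative stand-in for ModelForm's metaclass; calling plain type() with a base whose metaclass is Meta still routes class creation through Meta:

```python
class Meta(type):
    """Stand-in for ModelForm's metaclass (illustrative)."""
    def __new__(mcls, name, bases, attrs):
        attrs["stamped"] = True  # the metaclass marks every class it builds
        return super().__new__(mcls, name, bases, attrs)

class Base(metaclass=Meta):
    pass

# Build a subclass dynamically. Even though we call plain type(),
# CPython hands creation over to the most derived metaclass (Meta),
# so its __new__ still runs.
extra = {"config_fields": ("a", "b")}
Dynamic = type("Dynamic", (Base,), extra)

print(Dynamic.stamped)        # True -- the metaclass ran
print(type(Dynamic) is Meta)  # True
print(Dynamic.config_fields)  # ('a', 'b')
```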
How about this:

Basically, any form that extends your StepForm will also have the metaclass you wanted -- in the case below it's StepFormMetaclass. Note that if the form is defined in some forms.py file, you will need to import it in the package's __init__.py so that it is executed during Django's startup sequence.
from django.forms.forms import DeclarativeFieldsMetaclass

class StepFormMetaclass(DeclarativeFieldsMetaclass):
    .......

    def __new__(meta_class, name, bases, attributes):
        .....
        return DeclarativeFieldsMetaclass.__new__(meta_class, name, bases, attributes)

class StepForm(six.with_metaclass(StepFormMetaclass, forms.Form, StepFormMixin)):
    def __init__(self, *args, **kwargs):
        super(StepForm, self).__init__(*args, **kwargs)

    def as_p(self):
        return ......
I want my model to get a GUID as key_name automatically and I'm using the code below. Is that a good approach to solve it? Does it have any drawbacks?
class SyncModel(polymodel.PolyModel):
    def __init__(self, key_name=None, key=None, **kwargs):
        super(SyncModel, self).__init__(key_name=str(uuid.uuid1()) if not key else None, key=key, **kwargs)
Overriding __init__ on a Model subclass is dangerous, because the constructor is used by the framework to reconstruct instances from the datastore, in addition to being used by user code. Unless you know exactly how the constructor is used to reconstruct existing entities - something which is an internal detail and may change in future - you should avoid overriding it.
Instead, define a factory method, like this:
class MyModel(db.Model):
    @classmethod
    def new(cls, **kwargs):
        return cls(key_name=str(uuid.uuid4()), **kwargs)
There is an article by Nick about pre- and post-put hooks, which can be used to set the key_name. I don't know if your current method is valid or not, but at least you should be aware of other options.
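A minimal, runnable sketch of the factory-method idea outside App Engine (plain Python with illustrative names; the real class would extend db.Model):

```python
import uuid

class MyModel:
    """Stand-in for a datastore model."""
    def __init__(self, key_name=None, **kwargs):
        # The constructor stays framework-friendly: no side effects here,
        # so reconstruction of existing entities is unaffected.
        self.key_name = key_name
        self.fields = kwargs

    @classmethod
    def new(cls, **kwargs):
        # The GUID is generated only on this explicit creation path.
        return cls(key_name=str(uuid.uuid4()), **kwargs)

m = MyModel.new(title="example")
print(len(m.key_name))  # 36 -- canonical UUID string length
```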
I am trying to write a base CRUD controller class that does the following:
class BaseCrudController:
    model = ""
    field_validation = {}
    template_dir = ""

    @expose(self.template_dir)
    def new(self, *args, **kwargs):
        ....

    @validate(self.field_validation, error_handler=new)
    @expose()
    def post(self, *args, **kwargs):
        ...
My intent is to have my controllers extend this base class, set the model, field_validation, and template locations, and be ready to go. Unfortunately, decorators (to my understanding) are interpreted when the function is defined, so they won't have access to the instance's values. Is there a way to pass in dynamic data or values from the sub class?
For example:
class AddressController(BaseCrudController):
    model = Address
    template_dir = "addressbook.templates.addresses"
When I try to load AddressController, it says "self is not defined". I am assuming that the base class is evaluating the decorator before the sub class is initialized.
Thanks,
Steve
Perhaps using a factory to create the class would be better than subclassing:
def CrudControllerFactory(model, field_validation, template_dir):
    class BaseCrudController:
        @expose(template_dir)
        def new(self, *args, **kwargs):
            ....

        @validate(field_validation, error_handler=new)
        @expose()
        def post(self, *args, **kwargs):
            ....
    return BaseCrudController
Unfortunately, decorators (to my understanding), are interpreted when the function is defined. Hence it won't have access to instance's value. Is there a way to pass in dynamic data or values from the sub class?
The decorator needs to be called with the name of the relevant attribute; the wrapper can then get that attribute's value dynamically. For example:
import functools

def expose(attname=None):
    if attname:
        def makewrapper(f):
            @functools.wraps(f)
            def wrapper(self, *a, **k):
                attvalue = getattr(self, attname, None)
                ...use attvalue as needed...
            return wrapper
        return makewrapper
    else:
        ...same but without the getattr...
Note that the complication is only because, judging from the code snippets in your Q, you want to allow the expose decorator to be used both with and without an argument (you could move the if attname guard to live within wrapper, but then you'd uselessly repeat the check at each call -- the code within wrapper may also need to be pretty different in the two cases, I imagine -- so, shoehorning two different control flows into one wrapper may be even more complicated). BTW, this is a dubious design decision, IMHO. But, it's quite separate from your actual Q about "dynamic data".
The point is, by using the attribute name as the argument, you empower your decorator to fetch the value dynamically "just in time" when it's needed. Think of it as "an extra level of indirection", that well-known panacea for all difficulties in programming!-)
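Putting the pieces together, here is a runnable sketch of the attribute-name approach (the decorator body and return string are illustrative; the real expose would render a template rather than return a string):

```python
import functools

def expose(attname):
    def makewrapper(f):
        @functools.wraps(f)
        def wrapper(self, *args, **kwargs):
            # Looked up per call, so each subclass's value is seen.
            attvalue = getattr(self, attname, None)
            return f(self, attvalue, *args, **kwargs)
        return wrapper
    return makewrapper

class BaseCrudController:
    template_dir = ""

    @expose("template_dir")
    def new(self, template_dir):
        return "rendering from %s" % template_dir

class AddressController(BaseCrudController):
    template_dir = "addressbook.templates.addresses"

print(AddressController().new())
# rendering from addressbook.templates.addresses
```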