In a Django model, I want to avoid duplicates, so I wrote this:
class Points(TimeStampedModel):
    # ...
    def save(self, *args, **kwargs):
        if self.pk:
            super(Points, self).save(*args, **kwargs)
        else:  # save() is a creation here, not an update
            if Points.objects.filter(benef_card=self.benef_card,
                                     spendable_at=self.spendable_at).exists():
                pass
            else:
                super(Points, self).save(*args, **kwargs)
I was very surprised to find duplicate rows in my database anyway.
I suppose there is something wrong with my code, but I'd like to know how these duplicates could exist in spite of the protection I wrote in my save() method.
I think what you want instead is:
class Points(TimeStampedModel):
    # ...
    class Meta:
        unique_together = ('benef_card', 'spendable_at')
Then you don't need to override save() at all; uniqueness is handled by a database constraint, which is generally the way to go. It is also more robust, because save() is not always called (bulk operations, for example), so a check there can give you different behaviour across your app.
You might also want to check out update_or_create, which returns the object with the attributes you need (plus a flag saying whether it was created), creating it if it doesn't exist.
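For example, a rough sketch with the fields from the question (some_card, some_date and the amount field in defaults are made up for illustration):

    obj, created = Points.objects.update_or_create(
        benef_card=some_card,
        spendable_at=some_date,
        defaults={'amount': 10},  # hypothetical non-key fields to set
    )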
You could use Django signals instead to check before save.
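A rough sketch of what that could look like; note that it has the same race-condition caveat as overriding save(), since the existence check and the insert are separate queries:

    from django.core.exceptions import ValidationError
    from django.db.models.signals import pre_save
    from django.dispatch import receiver

    @receiver(pre_save, sender=Points)
    def check_for_duplicates(sender, instance, **kwargs):
        # Only check new rows; updates already have a pk.
        if instance.pk is None and Points.objects.filter(
                benef_card=instance.benef_card,
                spendable_at=instance.spendable_at).exists():
            raise ValidationError('Points for this card and date already exist.')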
Related
The model that I'm using has a lot of fields. I want to be able to set all the fields to be read only except for one i.e. I want to allow only one particular field to be writable. Is there a shortcut to do this?
I'm only aware of using read_only_fields = ('x', 'y'), and I really don't want to type out all the fields, especially if I'm going to make changes to the models later. exclude also doesn't apply in this case.
Try overriding the serializer's __init__ method:
def __init__(self, *args, **kwargs):
    super(UserSerializer, self).__init__(*args, **kwargs)
    for field in self.fields:
        if field != 'some_required_field':
            self.fields[field].read_only = True
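For context, this is roughly how it would sit in a full serializer (the model, the fields list and the choice of email as the one writable field are placeholders):

    from django.contrib.auth.models import User
    from rest_framework import serializers

    class UserSerializer(serializers.ModelSerializer):
        class Meta:
            model = User
            fields = ('id', 'username', 'email')

        def __init__(self, *args, **kwargs):
            super(UserSerializer, self).__init__(*args, **kwargs)
            for field in self.fields:
                if field != 'email':  # the one field left writable
                    self.fields[field].read_only = True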
Here is a snippet from the Django documentation:
from django.db import models

class Blog(models.Model):
    name = models.CharField(max_length=100)
    tagline = models.TextField()

    def save(self, *args, **kwargs):
        do_something()
        super(Blog, self).save(*args, **kwargs)  # Call the "real" save() method.
        do_something_else()
My hesitation is about the save method.
Why does the author separate do_something from do_something_else?
Given the comment 'Call the "real" save() method', what is the meaning of do_something(), which seems to be "fake"? Couldn't I just as well write it like this:
def save(self, *args, **kwargs):
    super(Blog, self).save(*args, **kwargs)  # Call the "real" save() method.
    do_something()
    do_something_else()
Am I right?
Yes, you're right.
There is no deep meaning.
The example just means "you can run whatever code you want before super().save() or after super().save()".
However, I slightly doubt you know super() in detail; it is a core piece of Python syntax and worth reading up on.
do_something() and do_something_else() aren't real functions; they aren't defined anywhere. They are just placeholders: hints that you can run your own code there, defining and calling your own functions if you need to.
Usually you need some field to automatically update or derive a value when the model is saved. One way to do this is by overriding the model's save() method, and usually you do it before calling super().save().
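For example, a common (made-up) case is deriving a slug from another field before the row is written; the slug logic here plays the role of do_something():

    from django.db import models
    from django.utils.text import slugify

    class Article(models.Model):
        title = models.CharField(max_length=100)
        slug = models.SlugField(blank=True)

        def save(self, *args, **kwargs):
            # "do_something()": compute the slug before hitting the database
            if not self.slug:
                self.slug = slugify(self.title)
            super(Article, self).save(*args, **kwargs)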
Well, it is just showing that you can call functions (or write code) before or after calling a parent or sibling method with super.
But first of all, you must know what super is...
In your example, you're calling save() with super(), which runs Django's own logic in models.Model.save() and does a lot of other work.
You are doing this to follow the standard Django save procedure.
But you may want to make some checks or do some pre-save work, so you can call a method that does whatever should happen before the model instance is saved. Maybe you want to log the record time to a file; you would just write a method and call it to log the timestamp before you call super().save().
The same is also valid for post-save actions.
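As a rough illustration of pre- and post-save work around the super() call (the logging is just an example, not something from the docs):

    import logging

    from django.db import models

    logger = logging.getLogger(__name__)

    class Blog(models.Model):
        name = models.CharField(max_length=100)

        def save(self, *args, **kwargs):
            logger.info('Saving blog %s', self.name)                   # pre-save work
            super(Blog, self).save(*args, **kwargs)                    # standard Django save
            logger.info('Saved blog %s (pk=%s)', self.name, self.pk)   # post-save work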
In django, creating a User has a different and unique flow from the usual Model instance creation. You need to call create_user() which is a method of BaseUserManager.
Since django REST framework's flow is to do restore_object() and then save_object(), it's not possible to simply create Users using a ModelSerializer in a generic create API endpoint without hacking your way through.
What would be a clean way to solve this? or at least get it working using django's built-in piping?
Edit:
It's important to note that what specifically fails is authentication: django.contrib.auth.authenticate fails for the created user instance if it was created with User.objects.create() rather than .create_user().
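To make the failure mode concrete (usernames and passwords here are placeholders): create() stores the raw password string, so the hash comparison in authenticate() never matches, while create_user() hashes the password first.

    from django.contrib.auth import authenticate
    from django.contrib.auth.models import User

    User.objects.create(username='alice', password='secret')
    authenticate(username='alice', password='secret')    # None: password was stored unhashed

    User.objects.create_user(username='bob', password='secret')
    authenticate(username='bob', password='secret')       # returns the User instance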
Eventually I've overridden the serializer's restore_object method and made sure that the password being sent is processed using instance.set_password(password), like so:
def restore_object(self, attrs, instance=None):
    if not instance:
        instance = super(RegisterationSerializer, self).restore_object(attrs, instance)
        instance.set_password(attrs.get('password'))
    return instance
Thanks everyone for help!
Another way to fix this is to override the pre_save(self, obj) method in your extension of viewsets.GenericViewSet, like so:
def pre_save(self, obj):
    """We have to encode the password in the user object that will be
    saved, before saving it.
    """
    viewsets.GenericViewSet.pre_save(self, obj)
    # Password is raw right now, so set it properly (the encoded password
    # will overwrite the raw one then).
    obj.user.set_password(obj.user.password)
Edit:
Note that the obj in the code above contains the instance of User class. If you use Django's user model class directly, replace obj.user with obj in the code (the last line in 2 places).
I'm working with DRF. And here is how I create users:
I have a Serializer with an overridden save method:
def save(self, **kwargs):
    try:
        user = create_new_user(self.init_data)
    except UserDataValidationError as e:
        raise FormValidationFailed(e.form)
    self.object = user.user_profile
    return self.object
create_new_user is just my function for user creation and in the view, I just have:
def post(self, request, *args, **kwargs):
    return self.create(request, *args, **kwargs)
It seems like you should be overriding restore_object() in your serializer, not save(). This will allow you to create your object correctly.
However, it looks like you are trying to abuse the framework -- you are trying to make a single create() create two objects (the user and the profile). I am no DRF expert, but I suspect this may cause some problems.
You would probably do better by using a custom user model (which would also include the profile in the same object).
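A minimal sketch of that approach (the extra profile fields and the app label are made up; AUTH_USER_MODEL then has to point at this class in settings):

    from django.contrib.auth.models import AbstractUser
    from django.db import models

    class User(AbstractUser):
        # Hypothetical profile fields folded directly into the user model
        birth_date = models.DateField(null=True, blank=True)
        bio = models.TextField(blank=True)

    # settings.py
    # AUTH_USER_MODEL = 'accounts.User'  # 'accounts' is a placeholder app label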
I want my model to get a GUID as key_name automatically and I'm using the code below. Is that a good approach to solve it? Does it have any drawbacks?
class SyncModel(polymodel.PolyModel):
    def __init__(self, key_name=None, key=None, **kwargs):
        super(SyncModel, self).__init__(
            key_name=str(uuid.uuid1()) if not key else None, key=key, **kwargs)
Overriding __init__ on a Model subclass is dangerous, because the constructor is used by the framework to reconstruct instances from the datastore, in addition to being used by user code. Unless you know exactly how the constructor is used to reconstruct existing entities - something which is an internal detail and may change in future - you should avoid overriding it.
Instead, define a factory method, like this:
import uuid
from google.appengine.ext import db

class MyModel(db.Model):
    @classmethod
    def new(cls, **kwargs):
        return cls(key_name=str(uuid.uuid4()), **kwargs)
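Usage would then look roughly like this (it assumes MyModel defines a name property):

    entity = MyModel.new(name='example')
    entity.put()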
There is an article by Nick about pre- and post-put hooks which can be used to set the key_name. I don't know whether your current method is valid or not, but at least you should be aware of other options.
I'm just curious if anyone knows if there's good reason why django's orm doesn't call 'full_clean' on a model unless it is being saved as part of a model form.
Note that full_clean() will not be called automatically when you call your model’s save() method. You’ll need to call it manually when you want to run one-step model validation for your own manually created models.
django's full clean doc
(NOTE: quote updated for Django 1.6... previous django docs had a caveat about ModelForms as well.)
Are there good reasons why people wouldn't want this behavior? I'd think if you took the time to add validation to a model, you'd want that validation run every time the model is saved.
I know how to get everything to work properly, I'm just looking for an explanation.
AFAIK, this is because of backwards compatibility. There are also problems with ModelForms with excluded fields, models with default values, pre_save() signals, etc.
Sources you might be interested in:
http://code.djangoproject.com/ticket/13100
http://groups.google.com/group/django-developers/browse_frm/thread/b888734b05878f87
For compatibility reasons, automatic clean-on-save is not enabled in Django's core.
If you are starting a new project and want the default save() method on Model to clean automatically, you can use the following signal to run a clean before every model is saved.
from django.dispatch import receiver
from django.db.models.signals import pre_save, post_save

@receiver(pre_save)
def pre_save_handler(sender, instance, *args, **kwargs):
    instance.full_clean()
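Note that the module defining the receiver has to be imported at startup for the connection to take effect; if it lives in, say, a signals.py (an assumed layout), the usual place to import it is the app config's ready():

    # yourapp/apps.py
    from django.apps import AppConfig

    class YourAppConfig(AppConfig):
        name = 'yourapp'

        def ready(self):
            from . import signals  # noqa: F401  -- connects the pre_save handler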
The simplest way to call the full_clean method is just to override the save method in your model:
class YourModel(models.Model):
    # ...
    def save(self, *args, **kwargs):
        self.full_clean()
        return super(YourModel, self).save(*args, **kwargs)
Commenting on @Alfred Huang's answer and the comments on it: one might lock the pre_save hook down to an app by defining a list of classes in the current module (models.py) and checking against it in the pre_save hook:
import inspect
import sys

from django.db.models.signals import pre_save
from django.dispatch import receiver

CUSTOM_CLASSES = [obj for name, obj in
                  inspect.getmembers(sys.modules[__name__])
                  if inspect.isclass(obj)]

@receiver(pre_save)
def pre_save_handler(sender, instance, **kwargs):
    if type(instance) in CUSTOM_CLASSES:
        instance.full_clean()
If you have a model that you want to ensure has at least one FK relationship, and you don't want to use null=False because that requires setting a default FK (which would be garbage data), the best way I've come up with is to add custom .clean() and .save() methods. .clean() raises the validation error, and .save() calls the clean. This way the integrity is enforced both from forms and from other calling code, the command line, and tests. Without this, there is (AFAICT) no way to write a test that ensures that a model has a FK relation to a specifically chosen (not default) other model.
from django.core.exceptions import ValidationError
from django.db import models

class Payer(models.Model):
    name = models.CharField(blank=True, max_length=100)

    # Nullable, but the FK is enforced in clean()/save():
    payer_group = models.ForeignKey(PayerGroup, null=True, blank=True)

    def clean(self):
        # Ensure every Payer is in a PayerGroup (clean() alone only runs via
        # forms, hence the save() override below).
        if not self.payer_group:
            raise ValidationError(
                {'payer_group': 'Each Payer must belong to a PayerGroup.'})

    def save(self, *args, **kwargs):
        self.full_clean()
        return super().save(*args, **kwargs)

    def __str__(self):
        return self.name
Instead of inserting a piece of code that declares a receiver, you can install an app and add it to the INSTALLED_APPS section in settings.py:
INSTALLED_APPS = [
    # ...
    'django_fullclean',
    # your apps here,
]
Before that, you may need to install django-fullclean using PyPI:
pip install django-fullclean
A global pre_save signal can work well if you want to always ensure model validation. However, it will run into issues with Django's auth in current versions (3.1.x) and could cause issues with models from other apps you are using.
Elaborating on @Peter Shannon's answer, this version will only validate models inside the module you execute it in, skips validation on "raw" saves, and adds a dispatch_uid to avoid duplicate signals.
from django.db.models.signals import pre_save
import inspect
import sys

MODELS = [obj for name, obj in
          inspect.getmembers(sys.modules[__name__], inspect.isclass)]

def validate_model(sender, instance, **kwargs):
    if 'raw' in kwargs and not kwargs['raw']:
        if type(instance) in MODELS:
            instance.full_clean()

pre_save.connect(validate_model, dispatch_uid='validate_models')