I have an IPv4Manage model with a vlanedipv4network field:
class IPv4Manage(models.Model):
    ...
    vlanedipv4network = models.ForeignKey(
        to=VlanedIPv4Network, related_name="ipv4s", on_delete=models.xxx, null=True)
As we know, for the on_delete parameter we generally pass one of the built-in handlers, such as models.CASCADE.
Is it possible to pass a custom function there instead? I want to run some additional logic on deletion.
The choices for on_delete can be found in django/db/models/deletion.py
For example, models.SET_NULL is implemented as:
def SET_NULL(collector, field, sub_objs, using):
    collector.add_field_update(field, None, sub_objs)
And models.CASCADE (which is slightly more complicated) is implemented as:
def CASCADE(collector, field, sub_objs, using):
    collector.collect(sub_objs, source=field.remote_field.model,
                      source_attr=field.name, nullable=field.null)
    if field.null and not connections[using].features.can_defer_constraint_checks:
        collector.add_field_update(field, None, sub_objs)
So, if you figure out what those arguments are, you should be able to define your own function to pass to the on_delete argument for model fields. collector is most likely an instance of Collector (defined in the same file; I'm not sure exactly what it's for), field is most likely the model field being deleted, sub_objs is likely the set of instances that relate to the object through that field, and using denotes the database being used.
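For illustration, here is a sketch of what a custom handler could look like — the name SET_NULL_AND_LOG is made up, and it leans on the same (private, version-dependent) Collector API as the built-ins above:

def SET_NULL_AND_LOG(collector, field, sub_objs, using):
    # log which related objects are about to have their FK cleared (illustrative)
    for obj in sub_objs:
        print('Clearing {field} on {obj}'.format(field=field.name, obj=obj))
    # then fall back to the same behaviour as models.SET_NULL
    collector.add_field_update(field, None, sub_objs)

class IPv4Manage(models.Model):
    vlanedipv4network = models.ForeignKey(
        to=VlanedIPv4Network, related_name="ipv4s",
        on_delete=SET_NULL_AND_LOG, null=True)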
There are alternatives for custom deletion logic too, in case overriding on_delete is overkill for you.
The post_delete and pre_delete signals allow you to define custom logic to run after or before an instance is deleted.
from django.db.models.signals import post_delete

def delete_ipv4manage(sender, instance, using, **kwargs):
    print('{instance} was deleted'.format(instance=str(instance)))

post_delete.connect(delete_ipv4manage, sender=IPv4Manage)
And lastly, you can override the delete() method of the Model/QuerySet; however, be aware of caveats with bulk deletes when using this method:
Overridden model methods are not called on bulk operations
Note that the delete() method for an object is not necessarily called when deleting objects in bulk using a QuerySet or as a result of a cascading delete. To ensure customized delete logic gets executed, you can use pre_delete and/or post_delete signals.
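For completeness, a minimal sketch of such an override (keeping the bulk-delete caveat above in mind — QuerySet.delete() will not call this):

class IPv4Manage(models.Model):
    # ... fields ...

    def delete(self, *args, **kwargs):
        # run custom logic before the instance is removed (illustrative)
        print('{instance} is about to be deleted'.format(instance=str(self)))
        super(IPv4Manage, self).delete(*args, **kwargs)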
Another useful solution is to use models.SET(), where you can pass a function (deleted_guest in the example below):
guest = models.ForeignKey('Guest', on_delete=models.SET(deleted_guest))
and the function deleted_guest is
DELETED_GUEST_EMAIL = 'deleted-guest@introtravel.com'

def deleted_guest():
    """ used for setting the guest field of a booking when guest is deleted """
    from intro.models import Guest
    from django.conf import settings
    deleted_guest, created = Guest.objects.get_or_create(
        first_name='Deleted',
        last_name='Guest',
        country=settings.COUNTRIES_FIRST[0],
        email=DELETED_GUEST_EMAIL,
        gender='M')
    return deleted_guest
You can't send any parameters and you have to be careful with circular imports. In my case I am just setting a filler record, so the parent model has a predefined guest to represent one that has been deleted. With the new GDPR rules we gotta be able to delete guest information.
CASCADE and PROTECT etc are in fact functions, so you should be able to inject your own logic there. However, it will take a certain amount of inspection of the code to figure out exactly how to get the effect you're looking for.
Depending what you want to do it might be relatively easy, for example the PROTECT function just raises an exception:
def PROTECT(collector, field, sub_objs, using):
    raise ProtectedError(
        "Cannot delete some instances of model '%s' because they are "
        "referenced through a protected foreign key: '%s.%s'" % (
            field.remote_field.model.__name__, sub_objs[0].__class__.__name__, field.name
        ),
        sub_objs
    )
However if you want something more complex you'd have to understand what the collector is doing, which is certainly discoverable.
See the source for django.db.models.deletion to get started.
There is nothing stopping you from adding your own logic. However, you need to consider multiple factors, including compatibility with the database that you are using.
For most use cases, the out-of-the-box logic is good enough if your database design is correct. Check out the available options here: https://docs.djangoproject.com/en/2.0/ref/models/fields/#django.db.models.ForeignKey.on_delete
Related
I'm currently working in django-cms and utilizing a PlaceholderField in several of my models. As such, I'd like to generalize this process to avoid having to override every model's delete, and add a specialized manager for each type of object just to handle deletions.
a little back story:
After working up the design of my application a little bit and using the (honestly impressive) PlaceholderFields, I noticed that if I deleted a model that contained one of these fields, it would leave behind its plugins/placeholder after deletion of the model instance that spawned it. This surprised me, so I contacted the developers, and according to django-cms's development team:
By design, the django CMS PlaceholderField does not handle deletion of the plugins for you.
If you would like to clear the placeholder content and remove the placeholder itself when the object that references it is deleted, you can do so by calling the clear() method on the placeholder instance and then the delete() method
So, since this is expected to happen prior to deletion of the model, my first thought was to use the pre_delete signal provided by Django. So I set up the following:
my problem
models.py
class SimplifiedCase(models.Model):
    # ... fields/methods ...
    my_placeholder_instance = PlaceholderField('reading_content')  # ****** the placeholder

    # define a local method for clearing placeholderfields for this object
    def cleanup_placeholders(self):
        # remove any child plugins of this placeholder
        self.my_placeholder_instance.clear()
        # remove the placeholder itself
        self.my_placeholder_instance.delete()

# link the receiver to the signal
signals.pre_delete.connect(clear_placeholderfields, sender=SimplifiedCase)
signals.py
# create a generalized receiver
# (expecting multiple models to contain placeholders, so generalizing the process)
def clear_placeholderfields(sender, instance, **kwargs):
    instance.cleanup_placeholders()  # calls the newly defined cleanup method on the model
I expected this to work without any issues, but I'm getting some odd behavior when calling the [placeholder].delete() method from within the method invoked by the pre_delete receiver.
For some reason, calling the placeholder's delete() method in my cleanup_placeholders method fires the parent's pre_delete signal again, resulting in a recursion loop.
I'm relatively new to django/django-cms, so possibly I'm overlooking something or fundamentally misunderstanding what's causing this loop. Is there a way to achieve what I'm trying to do here using pre_delete signals, or am I going about this poorly?
Any suggestions would be greatly appreciated.
After several days of fighting this, I believe I've found a method of deleting Placeholders along with the 3rd party app models automatically.
Failed attempts:
- Signals failed to be useful due to the recursion mentioned in my question, which is caused by all related models of a placeholder triggering a pre_delete event during the handling of the connected model's pre_delete event.
Additionally, I needed to handle child FK objects that also contained their own placeholders. After much trial and error, the best course of action (that I could find) to ensure deletion of placeholders for child objects is as follows:
Define a queryset which performs an iterative deletion. (Non-ideal, but it's the only way to ensure the execution of the following steps.)
class IterativeDeletion_Manager(models.Manager):
    def get_queryset(self):
        return IterativeDeletion_QuerySet(self.model, using=self._db)

    def delete(self, *args, **kwargs):
        return self.get_queryset().delete(*args, **kwargs)

class IterativeDeletion_QuerySet(models.QuerySet):
    def delete(self, *args, **kwargs):
        with transaction.atomic():
            # attempting to prevent 'bulk delete', which ignores an overridden delete
            for obj in self:
                obj.delete()
Set the model containing the PlaceholderField to use the newly defined manager.
Override the delete() method of any model that contains a placeholder field to handle the deletion of the placeholder AFTER deletion of the connected model (i.e. an unofficial post_delete event).
class ModelWithPlaceholder(models.Model):
    objects = IterativeDeletion_Manager()

    # the placeholder
    placeholder_content = PlaceholderField('slotname_for_placeholder')

    def delete(self, *args, **kwargs):
        # ideally there would be a method to get all fields of 'placeholderfield' type to populate this
        placeholders = [self.placeholder_content]

        # if there are any FK relations to this model, set the child's
        # queryset to use iterative deletion as well, and manually
        # call queryset delete in the parent (as follows)
        # self.child_models_related_name.delete()

        # delete the object
        super(ModelWithPlaceholder, self).delete(*args, **kwargs)

        # clear, and delete the placeholders for this object
        # (must be after the object's deletion)
        for ph in placeholders:
            ph.clear()
            ph.delete()
Using this method I've verified that the child PlaceholderField for each object is deleted along with the object, whether via the admin interface, queryset deletions, or direct deletions (at least in my usage cases).
Note: This seems unintuitive to me, but the deletion of placeholders needs to happen after deletion of the model itself, at least in the case where the object being deleted has child relations.
This is because calling placeholder.delete() triggers a deletion of all models related to the deleted model containing the PlaceholderField.
I didn't expect that at all. I don't know if this is the intended behaviour of deleting a placeholder, but it does happen. Since the placeholder is still contained in the database prior to deleting the object (by design), there shouldn't be an issue with handling its deletion after calling super(...).delete().
If ANYONE has a better solution to this problem, feel free to comment and let me know my folly.
This is the best I could come up with after many hours running through debuggers and tracing through the deletion process.
I'm trying to implement geofencing for a fleet of trucks. I have to associate a list of boundaries with a vehicle. On top of that, one of the requirements is to keep everything even once it is deleted, for audit purposes, so we have to implement soft delete on everything. This is where the problem lies: my many-to-many field does not respect the soft delete manager, and includes both the active and the inactive records in the lookup dataset.
class Vehicle(SoftDeleteModel):
    routes = models.ManyToManyField('RouteBoundary', through='VehicleBoundaryMap',
                                    verbose_name=_('routes'),
                                    limit_choices_to={'active': True})

class VehicleBoundaryMap(SoftDeleteModel):
    vehicle = models.ForeignKey(Vehicle, verbose_name="vehicle")
    route_boundary = models.ForeignKey(RouteBoundary, verbose_name="route boundary")
    # ... more stuff here
    alive = SoftDeleteManager()
class SoftDeleteManager(models.Manager):
    use_for_related_fields = True

    def get_queryset(self):
        return SoftDeleteQuerySet(self.model).filter(active=True)
As you can see above, I tried to make sure the default manager is a soft delete manager (i.e. it filters for active records only) and also tried to use limit_choices_to, but that turned out to filter the foreign model only, not the "through" model I wanted. If you have any suggestions or recommendations, I would love to hear from you.
Thanks!
First problem: your use of limit_choices_to won't work because as the documentation says:
limit_choices_to has no effect when used on a ManyToManyField with a custom intermediate table specified using the through parameter.
You are using through so limit_choices_to has no effect.
Second problem: your use of use_for_related_fields = True is also ineffective. The documentation says about this attribute:
If this attribute is set on the default manager for a model (only the default manager is considered in these situations), Django will use that class whenever it needs to automatically create a manager for the class.
Your custom manager is assigned to the alive attribute of VehicleBoundaryMap rather than objects so it is ignored.
The one way I can see that may work would be:
Create a proxy model for VehicleBoundaryMap. Let's call it VehicleBoundaryMapProxy. Set it so that its default manager is SoftDeleteManager(). Something like:
class VehicleBoundaryMapProxy(VehicleBoundaryMap):
    objects = SoftDeleteManager()

    class Meta:
        proxy = True
Then have through='VehicleBoundaryMapProxy' on your ManyToManyField:
class Vehicle(SoftDeleteModel):
    routes = models.ManyToManyField('RouteBoundary',
                                    through='VehicleBoundaryMapProxy',
                                    verbose_name=_('routes'))
What about if you just do:
class Vehicle(SoftDeleteModel):
    # you can even remove that field
    # routes = models.ManyToManyField('RouteBoundary', through='VehicleBoundaryMap',
    #                                 verbose_name=_('routes'),
    #                                 limit_choices_to={'active': True})

    @property
    def routes(self):
        return RouteBoundary.objects.filter(
            vehicleboundarymap__active=True,
            vehicleboundarymap__vehicle=self,
        )
And now, instead of vehicle.routes.clear(), use vehicle.vehicleboundarymap_set.all().delete(). You will only lose the reverse relation (RouteBoundary.vehicles), but you can implement it back in the same fashion.
The rest of the M2M field features are disabled anyway because of the intermediate model.
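A quick usage sketch of that approach (assuming the property above and Django's default related name for the FK):

# illustrative usage only
vehicle = Vehicle.objects.first()
active_routes = vehicle.routes  # RouteBoundary queryset limited to active through-rows
# instead of vehicle.routes.clear():
vehicle.vehicleboundarymap_set.all().delete()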
I have two models which I want to relate: User and Group.
Each user belongs to a group. I've tried to create a default group by using get_or_create():
group = models.ForeignKey(Group.objects.get_or_create(name="Free")[0])
But it raises the following error:
(fields.E300) Field defines a relation with model 'Group', which is either not installed, or is abstract.
What can I do to fix this issue?
Each user must have a non-null group value, so I read about this get_or_create() method. But I've also seen that it can return more than one object, and I don't want that to happen. I thought about making the name parameter unique, but is there a better solution for this?
Can you help me, please? I appreciate your help.
A more comprehensive answer can be found here: How to set a Django model field's default value to a function call / callable (e.g., a date relative to the time of model object creation)
You need to specify the related model and set the default.
def default_group():
    # module-level function, so Django can call it with no arguments
    return Group.objects.get_or_create(name="Free")[0]

class User(models.Model):
    group = models.ForeignKey('app_name.Group', default=default_group)
Your default value would otherwise be evaluated at model definition time, but Django allows you to provide a callable as the default, which is called for each instance creation.
To explain the error: code that is not inside a function, such as the line in your question, is executed as soon as your models.py file is loaded by Python. This happens early in the start-up of your Django process, when Django looks for a models.py file in each of the INSTALLED_APPS and imports it. The problem is that you don't know which other models have been imported yet. The error here occurs because the Group model (from django.contrib.auth.models) has not been imported yet, so it is as if it doesn't exist (yet).
Others have suggested putting Group.objects.get_or_create(name="Free")[0] in a function so that it is not executed immediately; Django will instead call the function only when it needs to know the value. By that point all the models in your project, and Django's own models, will have been imported, and it will work.
Regarding the second part of your question: yes, any time you use the get or get_or_create methods you need to query on a unique field, otherwise you may get a MultipleObjectsReturned exception.
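For example, making the lookup field unique at the model level guards against that (a sketch; your actual Group model may differ):

class Group(models.Model):
    # unique=True makes name a safe lookup field for get()/get_or_create()
    name = models.CharField(max_length=64, unique=True)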
In fact I think you should not use get_or_create for what you are trying to do here. Instead you should use an initial data fixture:
https://docs.djangoproject.com/en/1.9/howto/initial-data/
...to ensure that the default group already exists (and with a known primary key value) before you run your site.
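A data migration is one way to guarantee that row exists (a sketch, shown as an alternative to a fixture file; the app label and migration names here are illustrative):

from django.db import migrations

def create_default_group(apps, schema_editor):
    # use the historical model from the app registry, not a direct import
    Group = apps.get_model('yourapp', 'Group')
    Group.objects.get_or_create(pk=1, defaults={'name': 'Free'})

class Migration(migrations.Migration):
    dependencies = [('yourapp', '0001_initial')]
    operations = [migrations.RunPython(create_default_group)]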
That way you will know the unique pk of the default Group and you can do a get query:
def default_group():
    return Group.objects.get(pk=1)

class YourModel(models.Model):
    group = models.ForeignKey(Group, default=default_group)
As I understand it, when one creates a Django application, data is validated by the form before it's inserted into a model instance, which is then written to the database. But if I want to create an additional layer of protection at the data model layer, is what I've done below the current "best practice"? I'm trying to ensure that a reviewer's name cannot be omitted nor left blank. Should I be putting any custom validation in the clean method, as I've done here, and then have save call full_clean, which calls clean? If not, what's the preferred method? Thanks.
class Reviewer(models.Model):
    name = models.CharField(max_length=128, default=None)

    def clean(self, *args, **kwargs):
        if self.name == '':
            raise ValidationError('Reviewer name cannot be blank')
        super(Reviewer, self).clean(*args, **kwargs)

    def full_clean(self, *args, **kwargs):
        return self.clean(*args, **kwargs)

    def save(self, *args, **kwargs):
        self.full_clean()
        super(Reviewer, self).save(*args, **kwargs)
Firstly, you shouldn't override full_clean as you have done. From the django docs on full_clean:
Model.full_clean(exclude=None)
This method calls Model.clean_fields(), Model.clean(), and Model.validate_unique(), in that order and raises a ValidationError that has a message_dict attribute containing errors from all three stages.
So the full_clean method already calls clean, but by overriding it, you've prevented it calling the other two methods.
Secondly, calling full_clean in the save method is a trade off. Note that full_clean is already called when model forms are validated, e.g. in the Django admin. So if you call full_clean in the save method, then the method will run twice.
It's not usually expected for the save method to raise a validation error, somebody might call save and not catch the resulting error. However, I like that you call full_clean rather than doing the check in the save method itself - this approach allows model forms to catch the problem first.
Finally, your clean method would work, but you can actually handle your example case in the model field itself. Define your CharField as
name = models.CharField(max_length=128)
The blank option will default to False. If the field is blank, a ValidationError will be raised when you run full_clean. Putting default=None in your CharField doesn't do any harm, but it is a bit confusing when you don't actually allow None as a value.
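To see what that gives you, a quick sketch:

from django.core.exceptions import ValidationError

reviewer = Reviewer(name='')
try:
    reviewer.full_clean()  # blank defaults to False, so this raises
except ValidationError as e:
    print(e.message_dict)  # {'name': ['This field cannot be blank.']}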
After thinking about Alasdair's answer and doing additional reading, my sense now is that Django's models weren't designed to be validated on a model-only basis as I'm attempting to do. Such validation can be done, but at a cost, and it entails using validation methods in ways they weren't intended for.
Instead, I now believe that any constraints other than those that can be entered directly into the model field declarations (e.g. "unique=True") are supposed to be performed as a part of Form or ModelForm validation. If one wants to guard against entering invalid data into a project's database via any other means (e.g. via the ORM while working within the Python interpreter), then the validation should take place within the database itself. Thus, validation could be implemented on three levels: 1) First, implement all constraints and triggers via DDL in the database; 2) Implement any constraints available to your model fields (e.g. "unique=True"); and 3) Implement all other constraints and validations that mirror your database-level constraints and triggers within your Forms and ModelForms. With this approach, any form validation errors can be re-displayed to the user. And if the programmer is interacting directly with the database via the ORM, he/she would see the database exceptions directly.
Thoughts anyone?
Capturing the pre_save signal on my models ensured clean will be called automatically.
from django.db.models.signals import pre_save

def validate_model(sender, **kwargs):
    if 'raw' in kwargs and not kwargs['raw']:
        kwargs['instance'].full_clean()

pre_save.connect(validate_model, dispatch_uid='validate_models')
Thanks @Kevin Parker for your answer, quite helpful!
It is common to have models in your app outside of the ones you define, so here is a modified version you can use to scope this behavior only to your own models or a specific app/module as desired.
from django.db.models.signals import pre_save
import inspect
import sys

MODELS = [obj for name, obj in
          inspect.getmembers(sys.modules[__name__], inspect.isclass)]

def validate_model(sender, instance, **kwargs):
    if 'raw' in kwargs and not kwargs['raw']:
        if type(instance) in MODELS:
            instance.full_clean()

pre_save.connect(validate_model, dispatch_uid='validate_models')
This code will run against any models defined inside the module where it is executed but you can adapt this to scope more strictly or to be a set of modules / apps if desired.
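For instance, to scope it to a single app you could build the model set from the app registry instead (a sketch; the app label 'yourapp' is made up):

from django.apps import apps
from django.db.models.signals import pre_save

# assumption: your app is registered under the label 'yourapp'
APP_MODELS = set(apps.get_app_config('yourapp').get_models())

def validate_app_models(sender, instance, **kwargs):
    if not kwargs.get('raw', False) and type(instance) in APP_MODELS:
        instance.full_clean()

pre_save.connect(validate_app_models, dispatch_uid='validate_app_models')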
This is what I had before (but realized that you obviously can't do it in this order):
class MasterAdmin(models.Model):
    """
    A permanent admin (one per Account) that shouldn't be deleted.
    """
    admin = models.OneToOneField(AccountAdmin)

class Account(models.Model):
    """
    A top-level account in the system.
    """
    masteradmin = models.OneToOneField(MasterAdmin)

class AccountAdmin(models.Model):
    """
    An Account admin that can be deleted. This includes limited permissions.
    """
    account = models.ForeignKey(Account)
I think you can see what I want to do from the example. I want to have a MasterAdmin which shares the attributes of AccountAdmin. The purpose is that I want to give people the ability to delete an AccountAdmin, but not a MasterAdmin. I didn't want to just have an attribute on AccountAdmin called master = models.BooleanField().
Obviously this example won't work because MasterAdmin references AccountAdmin before its creation, but I wanted to show what I'm trying to achieve. Am I thinking of this all wrong?
Why not just make is_master a property of AccountAdmin and then override the delete() method to ensure is_master is not true?
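A minimal sketch of that suggestion (the field name and the choice of exception are illustrative):

from django.core.exceptions import PermissionDenied

class AccountAdmin(models.Model):
    account = models.ForeignKey(Account)
    is_master = models.BooleanField(default=False)

    def delete(self, *args, **kwargs):
        # refuse to delete the one protected admin per account
        if self.is_master:
            raise PermissionDenied('The master admin cannot be deleted.')
        super(AccountAdmin, self).delete(*args, **kwargs)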
When you have forward references, use quotes:
admin = models.OneToOneField('AccountAdmin')
See the docs.
If you need to create a relationship on a model that has not yet been defined, you can use the name of the model, rather than the model object itself...
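Applied to the models above, quoting the forward reference is enough to break the cycle:

class MasterAdmin(models.Model):
    # string reference: resolved once all models are loaded
    admin = models.OneToOneField('AccountAdmin')

class Account(models.Model):
    masteradmin = models.OneToOneField(MasterAdmin)

class AccountAdmin(models.Model):
    account = models.ForeignKey(Account)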