How to pass an object between views in Django - python

I have the following model for my students to upload their tasks to an application that I am creating, but I have a problem: I need to pass an instance of the model between views, but since it is not serializable, I cannot save it in a session attribute. Keep in mind that in one view I create the object without saving it to the database, and in the other I perform operations with the object and finally save it. Any idea how I can do this?
from django.db import models
from gdstorage.storage import GoogleDriveStorage

gd_storage = GoogleDriveStorage()

class Homework(models.Model):
    code = models.AutoField(primary_key=True)
    student = models.ForeignKey('Student', on_delete=models.PROTECT)
    title = models.CharField(unique=True, max_length=100)
    attached_file = models.FileField(upload_to='files/homeworks/',
                                     validators=[validate_file_size],
                                     storage=gd_storage)

As @dirkgroten says, you can add an additional field to your model called status and assign it the default value temporary. In addition to this you can review the package code.
Finally, deleting a file when using Google Drive as the storage backend is very simple. Use the following:
gd_storage.delete(name_file)
So, changing @dirkgroten's code:
from django.db.models.signals import post_delete
from django.dispatch import receiver

@receiver(post_delete, sender=Homework)
def remove_file(sender, instance, **kwargs):
    if instance.attached_file is not None:
        gd_storage.delete(instance.attached_file.name)

The only way to keep "state" between views is to save to the database (or other permanent storage). That's what the session does for you.
If you can't serialise to save in the session, then you have no alternative but to save a temporary object to the database. You could mark it as temporary and add a timestamp. And in the next view mark it as committed. And if needed clean up once in a while, removing old temporary objects.
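A minimal sketch of that idea (the status and created_at fields and the cleanup function are assumptions, not something your current model already has):

from datetime import timedelta
from django.utils import timezone
from myapp.models import Homework  # adjust to your app

# Fields assumed to have been added to Homework:
#     status = models.CharField(max_length=10, default='temporary')
#     created_at = models.DateTimeField(auto_now_add=True)

def cleanup_temporary_homework():
    """Delete temporary Homework objects older than one day.

    Deleting through the ORM sends post_delete, so a handler like the one
    below will also remove the attached file from storage.
    """
    cutoff = timezone.now() - timedelta(days=1)
    Homework.objects.filter(status='temporary', created_at__lt=cutoff).delete()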
To remove the associated file with old temporary objects, you can add a signal handler for the post_delete signal:
from django.core.files.storage import default_storage
from django.db.models.signals import post_delete
from django.dispatch import receiver

@receiver(post_delete, sender=Homework)
def remove_file(sender, instance, **kwargs):
    path = instance.attached_file.name
    if path:
        default_storage.delete(path)

Related

Django: How to call the same model class inside itself?

I have a function make_fields_permissions that I need to use inside the model class in order to parse the fields and make permissions for each field, like [('can_view_first_name_field', 'can view first name'), ...].
Goal: I need to call and override the Person class inside itself.
I tried
def __init__(self, *args, **kwargs):
    self.Meta.permissions = make_fields_permissions(self.model)
    super().__init__(*args, **kwargs)
My code looks like this:
from django.db import models

class Person(models.Model):
    first_name = models.CharField(max_length=30)
    last_name = models.CharField(max_length=30)

    def __init__(self, *args, **kwargs):
        # kwargs['user']['permissions'] = make_fields_permissions(Profile)
        # x = self['age']
        super().__init__(*args, **kwargs)

    class Meta:
        permissions = make_fields_permissions(Person)  # <- I can't use the same model inside Meta
Your goal is as follows:
Goal X (Real goal): Create permissions dynamically according to the model fields
Goal Y (Perceived goal that will achieve X): Call the model class while creating it.
Note: See What is the XY problem?
Let us first discuss Goal Y and why it is too complex and somewhat unfeasible. When one wants to customize how a class is created, one uses metaclasses, and at first sight this would appear to be a perfect solution for your needs (in fact, if you created one properly, it would be). But the problem here is that Model already has a metaclass, ModelBase, which already does a lot of work and is a little complicated. If we wanted a custom metaclass we would need to inherit from it and work very carefully around its implementation to do what we want. Furthermore, writing it would not be the end of the story, because we would then need to maintain it, since it could easily be broken by updates to Django. Hence Goal Y is not feasible in practice.
Moving on to the actual Goal X: to achieve it, one can Programmatically create permissions [Django docs]. A good place to do this is in the app config's ready() method. Every app created with startapp has an apps.py file containing an app config class inheriting from AppConfig; when the models are loaded, its ready() method is called. Hence this method is used for tasks like attaching signals and other setup work. Modify the app config of your app to create the permissions programmatically like so:
from django.apps import AppConfig

class YourAppConfig(AppConfig):
    default_auto_field = 'django.db.models.AutoField'  # Don't modify, keep it as it is in your code
    name = 'your_app'  # Don't modify, keep it as it is in your code

    def ready(self):
        # import models here, not outside, as models may not be loaded yet
        from django.contrib.auth.models import Permission
        from django.contrib.contenttypes.models import ContentType
        from path.to import make_fields_permissions
        from .models import Person

        content_type = ContentType.objects.get_for_model(Person)
        for codename, name in make_fields_permissions(Person):
            Permission.objects.get_or_create(
                codename=codename,  # 'can_view_first_name_field'
                name=name,  # 'can view first name'
                content_type=content_type,
            )
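For completeness, make_fields_permissions itself is not shown in the question; purely as an illustration, such a helper might look something like this (assuming it should emit one permission per concrete field):

def make_fields_permissions(model):
    """Build one (codename, name) pair per concrete field of the model."""
    return [
        (f'can_view_{field.name}_field', f'can view {field.name.replace("_", " ")}')
        for field in model._meta.get_fields()
        if getattr(field, 'concrete', False)
    ]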

Extend django-import-export's import form to specify fixed value for each imported row

I am using django-import-export 1.0.1 with admin integration in Django 2.1.1. I have two models
from django.db import models

class Sector(models.Model):
    code = models.CharField(max_length=30, primary_key=True)

class Location(models.Model):
    code = models.CharField(max_length=30, primary_key=True)
    sector = models.ForeignKey(Sector, on_delete=models.CASCADE, related_name='locations')
and they can be imported/exported just fine using model resources
from import_export import resources
from import_export.fields import Field
from import_export.widgets import ForeignKeyWidget

class SectorResource(resources.ModelResource):
    code = Field(attribute='code', column_name='Sector')

    class Meta:
        model = Sector
        import_id_fields = ('code',)

class LocationResource(resources.ModelResource):
    code = Field(attribute='code', column_name='Location')
    sector = Field(attribute='sector', column_name='Sector',
                   widget=ForeignKeyWidget(Sector, 'code'))

    class Meta:
        model = Location
        import_id_fields = ('code',)
and import/export actions can be integrated into the admin by
from django.contrib import admin
from import_export.admin import ImportExportModelAdmin

class SectorAdmin(ImportExportModelAdmin):
    resource_class = SectorResource

class LocationAdmin(ImportExportModelAdmin):
    resource_class = LocationResource

admin.site.register(Sector, SectorAdmin)
admin.site.register(Location, LocationAdmin)
For Reasons™, I would like to change this set-up so that a spreadsheet of Locations which does not contain a Sector column can be imported; the value of sector (for each imported row) should be taken from an extra field on the ImportForm in the admin.
Such a field can indeed be added by overriding import_action on the ModelAdmin as described in Extending the admin import form for django import_export. The next step, to use this value for all imported rows, is missing there, and I have not been able to figure out how to do it.
EDIT(2): Solved through the use of sessions. Having a get_confirm_import_form hook would still really help here, but even better would be having the existing ConfirmImportForm carry across all the submitted fields & values from the initial import form.
EDIT: I'm sorry, I thought I had this nailed, but my own code wasn't working as well as I thought it was. This doesn't solve the problem of passing along the sector form field in the ConfirmImportForm, which is necessary for the import to complete. Currently looking for a solution which doesn't involve pasting the whole of import_action() into an ImportMixin subclass. Having a get_confirm_import_form() hook would help a lot here.
Still working on a solution for myself, and when I have one I'll update this too.
Don't override import_action. It's a big complicated method that you don't want to replicate. More importantly, as I discovered today: there are easier ways of doing this.
First (as you mentioned), make a custom import form for Location that allows the user to choose a Sector:
from django import forms
from import_export.forms import ImportForm

class LocationImportForm(ImportForm):
    sector = forms.ModelChoiceField(required=True, queryset=Sector.objects.all())
In the Resource API, there's a before_import_row() hook that is called once per row. So, implement that in your LocationResource class, and use it to add the Sector column:
def before_import_row(self, row, **kwargs):
    sector = self.request.POST.get('sector', None)
    if sector:
        self.request.session['import_context_sector'] = sector
    else:
        # If this raises a KeyError, we want to know about it.
        # It means that we got to a point of importing data without
        # sector context, and we don't want to continue.
        try:
            sector = self.request.session['import_context_sector']
        except KeyError as e:
            raise Exception("Sector context failure on row import, " +
                            f"check resources.py for more info: {e}")
    row['sector'] = sector
(Note: This code uses Django sessions to carry the sector value from the import form to the import confirmation screen. If you're not using sessions, you'll need to find another way to do it.)
This is all you need to get the extra data in, and it works for both the dry-run preview and the actual import.
Note that self.request doesn't exist in the default ModelResource - we have to install it by giving LocationResource a custom constructor:
def __init__(self, request=None):
    super().__init__()
    self.request = request
(Don't worry about self.request sticking around. Each LocationResource instance doesn't persist beyond a single request.)
The request isn't usually passed to the ModelResource constructor, so we need to add it to the kwargs dict for that call. Fortunately, Django Import/Export has a dedicated hook for that. Override ImportExportModelAdmin's get_resource_kwargs method in LocationAdmin:
def get_resource_kwargs(self, request, *args, **kwargs):
    rk = super().get_resource_kwargs(request, *args, **kwargs)
    rk['request'] = request
    return rk
And that's all you need.

Django: how to save model instance after deleting a ForeignKey-related instance?

I am using Django 2.1.1.
I have a model Analysis that, among other fields, contains a ForeignKey to a MyFile model (a model I wrote to handle files):
from polymorphic.models import PolymorphicModel
from django.db.models import (Model, CharField, DateTimeField, FileField,
                              ForeignKey, SET_NULL)
from django.db.models.signals import pre_delete
from django.dispatch import receiver

class MyFile(Model):
    file = FileField(upload_to='./', null=False, blank=False)
    description = CharField(max_length=255, null=True, blank=True)
    date_added = DateTimeField(auto_now_add=True)

@receiver(pre_delete, sender=MyFile)
def mymodel_delete(sender, instance, **kwargs):
    """
    To delete the file connected to the `sender` class: receive the pre_delete signal
    and delete the file associated with the model instance.
    """
    instance.file.delete(False)

class Analysis(PolymorphicModel):
    # ... other fields ...
    file_results = ForeignKey(MyFile, on_delete=SET_NULL,
                              related_name='file_results',
                              null=True, blank=True)
Analysis is a PolymorphicModel for reasons related to the bigger project.
In Analysis.file_results I set on_delete=SET_NULL because I want to allow an Analysis instance to exist even without a file_result, which can be populated later.
Let's suppose I have added a few files (the MyFile table has a few rows) and a few Analysis instances. Now, if I want to delete the file related to one of the instances of Analysis I do:
a = Analysis.objects.get(pk=0)
a.file_results.delete()
a.save()
but I get the following error:
File "/Users/mtazzari/djangos/views.py" in update_job_refs
377. a.save()
File "/Users/mtazzari/anaconda/envs/djangos/lib/python3.6/site-packages/polymorphic/models.py" in save
83. return super(PolymorphicModel, self).save(*args, **kwargs)
File "/Users/mtazzari/anaconda/envs/djangos/lib/python3.6/site-packages/django/db/models/base.py" in save
670. "unsaved related object '%s'." % field.name
ValueError: save() prohibited to prevent data loss due to unsaved
related object 'file_results'.
The mymodel_delete function that is called on pre_delete signal works correctly as the file gets actually deleted from the file system.
However, I really don't understand how to solve the ValueError.
Interestingly, I notice that the following lines work fine, i.e. do not raise any ValueError, get the file deleted from the file system, and the FK in a.file_results set to Null:
a = Analysis.objects.get(pk=0)
tmp = a.file_results
a.file_results = None
tmp.delete()
a.save()
But, is this a proper way of doing this? What is the best practice for deleting a related object?
Thanks!
First, note that you don't need to save() just because of the delete(). The delete() will update the database as required.
That said, it's reasonable to want to continue using the instance to do other operations, leading to a save(). The reason you're getting the error is that the a.file_results Python object still exists, and references a database row that is now missing. The documentation for delete() mentions this:
This only deletes the object in the database; the Python instance will still exist and will still have data in its fields.
So if you want to continue to work with the instance object, just set the attribute to None yourself. Similar to your code above, except you don't need the temp object.
a = Analysis.objects.get(pk=0)
a.file_results.delete()
a.file_results = None
# ... more operations on a
a.save() # no error

Django model field to upload file from one app but save with another

We are developing an intranet with Django which has to have consistent and centralized file management. We implemented a filemanager app which should handle all the uploads and downloads, do mimetype checks, permission checks etc.
The upload is achieved through a Django form:
class UploadFileForm(forms.ModelForm):
    class Meta:
        model = PhysicalFile
        fields = ('path', 'directory')

    def save(self, commit=True):
        """
        Override save method of ModelForm to create Physical File object of
        uploaded file and to process meta data
        """
        # Proceed with default behaviour but DO NOT commit!
        # pfile contains the PhysicalFile object which is not yet written to DB
        pfile = super(UploadFileForm, self).save(commit=False)

        # set pfile's meta data according to:
        # https://docs.djangoproject.com/en/1.11/ref/files/uploads/
        pfile.name = self.cleaned_data['path'].name
        pfile.size = self.cleaned_data['path'].size
        pfile.mimetype = self.cleaned_data['path'].content_type

        # NOW save to database, ignoring commit parameter
        pfile.save()
        return pfile
Now we want to perform uploads in ANOTHER app (say, a members app with profile picture upload) using the same form as above. However, it needs to be included in an app-specific form, e.g. a form with name, address etc. Basically we would only need to save the corresponding ForeignKey of the file into the member model and process the upload with the filemanager's form, roughly as sketched below.
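The composition we have in mind looks roughly like this (only a sketch; Member, its fields, and how the target directory is chosen are placeholders):

from django import forms
from filemanager.forms import UploadFileForm  # the form shown above
from members.models import Member             # placeholder: has a ForeignKey to PhysicalFile

class MemberForm(forms.ModelForm):
    # the uploaded file is handled by the filemanager app, not by this form's model
    profile_picture = forms.FileField(required=False)

    class Meta:
        model = Member
        fields = ('name', 'address')  # placeholder fields

    def save(self, commit=True):
        member = super(MemberForm, self).save(commit=False)
        if self.cleaned_data.get('profile_picture'):
            # delegate the upload processing (meta data, mimetype, ...) to the filemanager form
            upload_form = UploadFileForm(
                data={'directory': None},  # placeholder: however the target directory is chosen
                files={'path': self.cleaned_data['profile_picture']},
            )
            if upload_form.is_valid():
                member.picture_file = upload_form.save()  # ForeignKey to the new PhysicalFile
        if commit:
            member.save()
        return member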
Because of this we thought of a custom field. But this is not working out at all..
class FilemanagerUploadField(models.ForeignKey):
    def __init__(self, upload_to=None, *args, **kwargs):
        # Will be used later to bind specific apps to specific directories
        self.upload_to = upload_to
        # Bind PhysicalFile as default model
        super(FilemanagerUploadField, self).__init__('filemanager.PhysicalFile')

    def formfield(self, **kwargs):
        """Taken from Django's FileField but does NOT WORK"""
        defaults = {'form_class': forms.FileField, 'max_length': self.max_length}
        if 'initial' in kwargs:
            defaults['required'] = False
        defaults.update(kwargs)
        return super(FilemanagerUploadField, self).formfield(**defaults)

    def save(self):
        # somehow run the form from here with the uploaded data and return the ForeignKey
        ...
I am not really able to get a grip on those custom model fields... We need it to perform like a FileField (widget validation and stuff) but be saved like a ForeignKey (to the actual PhysicalFile Model in another app)...
If there is another way to achieve what we are looking for, please tell me.
tl;dr: Upload files in app A but let app B process them, save the file path, meta data etc., and return the ForeignKey of the processed object to A to save in the database. Custom model field?

Django Admin import_export module used for custom IMPORT

I was trying to follow the official docs of import-export:
https://django-import-export.readthedocs.org/en/latest/import_workflow.html#import-data-method-workflow
But I still do not know how to glue it to my admin, given that:
I want only a subset of fields (I created a Resource model with the listed fields, but it crashes while importing anyway with a KeyError; full stack below).
Where - in which method - in my admin class (inheriting of course from ImportExportModelAdmin and using the defined resource_class) should I place the code responsible for some custom actions I want to happen after validating that the import data are correct, but before inserting them into the database?
I am not very advanced in Django and will be thankful for some hints.
An example of a working implementation will be appreciated, so if you know of something similar on GitHub, please share.
You can override it as follows.
To create a new instance:
def get_instance(self, instance_loader, row):
    return False
Your custom save:
def save_instance(self, instance, real_dry_run):
    if not real_dry_run:
        try:
            obj = YourModel.objects.get(some_val=instance.some_val)
            # extra logic if the object already exists
        except YourModel.DoesNotExist:
            # create a new object
            obj = YourModel(some_val=instance.some_val)
            obj.save()

def before_import(self, dataset, dry_run):
    if dataset.headers:
        dataset.headers = [str(header).lower().strip() for header in dataset.headers]
    # if the id column is not in the headers of your file
    if 'id' not in dataset.headers:
        dataset.headers.append('id')
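Put together, these overrides live on your resource class; roughly like this (YourModel, some_val and the field list are placeholders, and Meta.fields limits the import to the subset of columns you actually want, which also addresses where to put custom actions that run after validation but before the database insert):

from import_export import resources
from myapp.models import YourModel  # placeholder import

class YourModelResource(resources.ModelResource):
    class Meta:
        model = YourModel
        fields = ('id', 'some_val')  # import/export only this subset of fields
        import_id_fields = ('id',)

    def get_instance(self, instance_loader, row):
        # always treat each row as a new instance
        return False

    def before_import(self, dataset, dry_run):
        # normalise headers and make sure an 'id' column exists
        if dataset.headers:
            dataset.headers = [str(header).lower().strip() for header in dataset.headers]
        if 'id' not in dataset.headers:
            dataset.headers.append('id')

    def save_instance(self, instance, real_dry_run):
        # custom actions after validation but before hitting the database go here
        if not real_dry_run:
            instance.save()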
