I'm not an expert, but I think it's a good idea to use a model class to define choices and to prepopulate the database with them. That makes it easier to change the choices later.
So in my models.py I have:
class City(models.Model):
    name = models.CharField(max_length=32)
    distance = models.SmallIntegerField(blank=True, null=True)
    # etc.

class OtherClass(models.Model):
    name = models.CharField(max_length=32)
    # etc.

class UserProfile(models.Model):
    name = models.CharField(max_length=32)
    city = models.ForeignKey(City, on_delete=models.CASCADE)  # on_delete is required since Django 2.0
    otherfield = models.ForeignKey(OtherClass, on_delete=models.CASCADE)
    # etc.
UserProfile is what the users fill in; City and OtherClass are where the programmer puts the options.
After the migration I have to create some City and OtherClass objects: they will be the options (and yes, they have to be fixed).
I just found out about fixtures. Until now I was using a script:
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'sitopossedimenti.settings')

import django
django.setup()

from core.models import *

def populate():
    namecity1 = add_city('city1', None)
    namecity2 = add_city('city2', None)
    # etc.
    nameotherclass1 = add_otherclass('name1')  # etc.
    # etc. some thousands more

def add_city(name, distance):
    # get_or_create already saves a newly created row, so no extra save() is needed
    c = City.objects.get_or_create(name=name, distance=distance)[0]
    return c

def add_otherclass(name):
    # etc.
    pass

if __name__ == '__main__':
    print("Starting myapp population script...")
    populate()
For now the script (more or less) works, and I'm afraid to change it... but what do you think? Are fixtures better? Why? Are there differences?
As the saying goes: if it ain't broke, don't fix it. Fixtures are the more usual method, but there's no harm in using your own script. If you were writing a new test case you might want to use fixtures, but if I were you I would just let this be.
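For reference, a fixture is just a serialized dump of model data that manage.py loaddata installs. A minimal JSON fixture for the City model above might look like this (the file path core/fixtures/cities.json and the app label are assumptions):

[
  {"model": "core.city", "pk": 1, "fields": {"name": "city1", "distance": null}},
  {"model": "core.city", "pk": 2, "fields": {"name": "city2", "distance": 10}}
]

You would then load it with python manage.py loaddata cities.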
If you want a fully automated way of achieving the result, consider migrations.RunPython. The linked documentation contains a full example that shows data being loaded. This then happens during ./manage.py migrate, without the need for an additional step.
The advantage of using migrations.RunPython is that if you share your app with a colleague or install it on a different server, the required data is loaded into the production database automatically, and the tests will also have full access to it in the test database.
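A minimal sketch of such a data migration, assuming the City model from the question and an app named core (the app label and file name are assumptions):

# core/migrations/0002_populate_cities.py (hypothetical)
from django.db import migrations

def load_cities(apps, schema_editor):
    # Use the historical model, not a direct import from core.models
    City = apps.get_model('core', 'City')
    City.objects.get_or_create(name='city1', distance=None)
    City.objects.get_or_create(name='city2', distance=10)

def unload_cities(apps, schema_editor):
    City = apps.get_model('core', 'City')
    City.objects.filter(name__in=['city1', 'city2']).delete()

class Migration(migrations.Migration):
    dependencies = [('core', '0001_initial')]
    operations = [migrations.RunPython(load_cities, unload_cities)]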
I have a function make_fields_permissions that I need to use inside the model class in order to parse the fields and create a permission for each one, like [('can_view_first_name_field', 'can view first name'), ...].
Goal: I need to reference the Person class from inside its own definition.
I tried
def __init__(self, *args, **kwargs):
    self.Meta.permissions = make_fields_permissions(self.model)
    super().__init__(*args, **kwargs)
My code looks like this:
from django.db import models

class Person(models.Model):
    first_name = models.CharField(max_length=30)
    last_name = models.CharField(max_length=30)

    def __init__(self, *args, **kwargs):
        # kwargs['user']['permissions'] = make_fields_permissions(Profile)
        # x = self['age']
        super().__init__(*args, **kwargs)

    class Meta:
        permissions = make_fields_permissions(Person)  # <- I can't use the model itself inside Meta
Your goal is as follows:
Goal X (Real goal): Create permissions dynamically according to the model fields
Goal Y (Perceived goal that will achieve X): Call the model class while creating it.
Note: See What is the XY problem?
Let us first discuss Goal Y and why it is too complex and somewhat unfeasible. When one wants to customize how a class is created, one uses metaclasses, and at first sight that looks like a perfect solution here (in fact, a properly written metaclass would work). The problem is that Model already has a metaclass, ModelBase, which does a lot of work and is rather complicated. A custom metaclass would need to inherit from it and work very carefully around its implementation. Even then we would not be done, because we would have to maintain it: updates to Django could easily break it. Hence Goal Y is not feasible.
Moving on to the actual Goal X: one can programmatically create permissions [Django docs]. A good place to do this is the app config's ready() method. Every app created with startapp has an apps.py file containing an app config that inherits from AppConfig; its ready() method is called once the models are loaded, which is why it is used for tasks like connecting signals and other setup work. Modify your app's config to create the permissions programmatically, like so:
from django.apps import AppConfig

class YourAppConfig(AppConfig):
    default_auto_field = 'django.db.models.AutoField'  # Don't modify, keep it as it is in your code
    name = 'your_app'  # Don't modify, keep it as it is in your code

    def ready(self):
        from django.contrib.auth.models import Permission
        from django.contrib.contenttypes.models import ContentType
        from path.to import make_fields_permissions
        from .models import Person
        # Import models here, not at module level, as they may not be loaded yet

        content_type = ContentType.objects.get_for_model(Person)
        for codename, name in make_fields_permissions(Person):
            Permission.objects.get_or_create(
                codename=codename,  # 'can_view_first_name_field'
                name=name,  # 'can view first name'
                content_type=content_type,
            )
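The question never shows make_fields_permissions itself; a minimal sketch of what it might look like, assuming it should emit one view permission per concrete field, is:

def make_fields_permissions(model):
    # Builds [('can_view_first_name_field', 'can view first name'), ...]
    return [
        ('can_view_%s_field' % field.name, 'can view %s' % field.name.replace('_', ' '))
        for field in model._meta.get_fields()
        if getattr(field, 'concrete', False)  # skip reverse relations
    ]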
When using Peewee, I follow the advice from the answer to Create "query methods" in Peewee Models Python:
from peewee import Model, CharField, IntegerField

class Person(Model):
    name = CharField()
    age = IntegerField()

    @classmethod
    def adults(cls):
        return cls.select().where(cls.age > 18)
I create class methods for all my queries to keep my models "fat" and everything else "thin". Now I've introduced a foreign key and I'm struggling with this approach, because Peewee requires me to use the model class directly in the query:
class Metric(Model):
    person = ForeignKeyField(Person, backref='metrics')
    name = CharField()
    value = IntegerField()
Other file:
class Person(Model):
    name = CharField()
    age = IntegerField()

    @classmethod
    def popular(cls, min_likes):
        return cls.select(cls, Metric).join(Metric).where(Metric.name == 'likes', Metric.value > min_likes)
This won't work, as the Metric definition depends on Person and vice versa, causing a circular import. The documentation has a section on circular foreign key dependencies, where the solution to similar situations is DeferredForeignKey. I don't like it, though: it adds overhead in code (the foreign key constraints need to be created manually everywhere), and my app uses SQLite, for which the docs explicitly state the following:
Because SQLite has limited support for altering tables, foreign-key constraints cannot be added to a table after it has been created.
If I understand correctly, that effectively means I'd lose the FK constraint completely. I want the constraint: the app relies on the fact that records with missing counterparts raise exceptions.
Is there a different workaround I'm overlooking? Is having fat models like this a recommended practice with Peewee after all? I like it, but it has led me into a dead end in my model design. The docs even say:
My personal opinion is that circular foreign keys are a code smell and should be refactored (by adding an intermediary table, for instance).
Update: I updated the question because I originally omitted the main detail: I'm dealing with circular imports, not just dependencies between the classes. If I put both classes in one file it works, because Python resolves the names in the classmethods only when they're called, but that's not what I'm solving; I'd like to keep the classes in separate modules.
I don't think you understand Python scoping. There's nothing wrong with referencing a related model inside a method body, e.g.:
# Move Metric below Person.
class Person(Model):
    name = CharField()
    age = IntegerField()

    @classmethod
    def popular(cls, min_likes):
        return cls.select(cls, Metric).join(Metric).where(Metric.name == 'likes', Metric.value > min_likes)

class Metric(Model):
    person = ForeignKeyField(Person, backref='metrics')
    name = CharField()
    value = IntegerField()
Alternatively you can use DeferredForeignKey which is built exactly for this purpose.
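For completeness, a minimal sketch of the DeferredForeignKey variant, using the question's field names:

from peewee import CharField, DeferredForeignKey, IntegerField, Model

class Metric(Model):
    # The string name is resolved once a model class named 'Person' is declared
    person = DeferredForeignKey('Person', backref='metrics')
    name = CharField()
    value = IntegerField()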
I came up with an ugly workaround. For the sake of completeness I'm posting it as an answer to my own question, but I don't like the solution, so I won't accept it.
Given the package structure looks like this:
models/
__init__.py
person.py
metric.py
And given the __init__.py looks something like this:
from .person import Person
from .metric import Metric
This allows for simplified importing: from models import Person instead of from models.person import Person.
Then the workaround could be an ugly third file with just the properties of one of the models. For example, person_attrs.py:
models/
__init__.py
person.py
person_attrs.py
metric.py
The files would have the following contents. Person:
from peewee import Model, CharField, IntegerField

class Person(Model):
    name = CharField()
    age = IntegerField()
Metric:
from peewee import Model, ForeignKeyField, CharField, IntegerField

from .person import Person

class Metric(Model):
    person = ForeignKeyField(Person, backref='metrics')
    name = CharField()
    value = IntegerField()
Person attributes:
from .metric import Metric

__all__ = ['popular']

@classmethod
def popular(cls, min_likes):
    return cls.select(cls, Metric).join(Metric).where(Metric.name == 'likes', Metric.value > min_likes)
Then the __init__.py works as the glue:
from .person import Person
from .metric import Metric
from . import person_attrs  # isort:skip

for attr_name in person_attrs.__all__:
    setattr(Person, attr_name, getattr(person_attrs, attr_name))
The workaround abuses __all__ (though I think it's still better to be explicit than to look attributes up with some implicit algorithm), and the order of imports in __init__.py becomes significant. The person_attrs.py file allows for defining both @classmethod and @property methods, which can then use the other model, Metric, freely, but at the price of being expelled into a separate file as top-level objects, only to be reunited with the Person model in the models package root.
Obviously not nice, not straightforward, kinda ugly, but I couldn't come up with anything else yet apart from putting the models together into one file.
Because you don't actually have a reference loop and your __init__.py file is just making imports easier, use absolute imports in the modules themselves (this avoids worrying about import order in __init__.py). Your shorthand imports will still be useful everywhere else in your codebase. Then simply defer the import of Metric into the classmethod itself.
Given the layout you specified:
models/
__init__.py
person.py
metric.py
Your __init__.py should look like this:
from .person import Person
from .metric import Metric
__all__ = ("Person", "Metric", )
Then person.py should look like this:
from peewee import Model, CharField, IntegerField

class Person(Model):
    name = CharField()
    age = IntegerField()

    @classmethod
    def popular(cls, min_likes):
        from models.metric import Metric  # Import deferred until first call
        return cls.select(cls, Metric).join(Metric).where(Metric.name == 'likes', Metric.value > min_likes)
And metric.py simply like this:
from peewee import Model, ForeignKeyField, CharField, IntegerField

from models.person import Person

class Metric(Model):
    person = ForeignKeyField(Person, backref='metrics')
    name = CharField()
    value = IntegerField()
Now, if you want to get a bit fancier and avoid invoking the import machinery on every call, you could do:
from peewee import Model, CharField, IntegerField

class Person(Model):
    name = CharField()
    age = IntegerField()

    @classmethod
    def popular(cls, min_likes):
        try:
            Metric = cls.MetricKlass
        except AttributeError:
            # fromlist makes __import__ return the submodule rather than the top-level package
            Metric = getattr(__import__("models.metric", fromlist=["Metric"]), "Metric")
            cls.MetricKlass = Metric
        return cls.select(cls, Metric).join(Metric).where(Metric.name == 'likes', Metric.value > min_likes)
... which will stash the import on the class so you can just retrieve it there. I doubt that's really worth it though.
I have the following model:
class Class(models.Model):
    title = models.CharField(max_length=60)
    video = models.FileField(
        upload_to=class_files_custom_upload,
        validators=[
            FileExtensionValidator(['mp4', 'webm', 'mpg', 'mpeg', 'ogv']),
        ]
    )
    section = models.ForeignKey(Section, on_delete=models.CASCADE)
    created = models.DateTimeField(auto_now_add=True)

    class Meta:
        verbose_name = 'clase'
        verbose_name_plural = 'clases'
        ordering = ['created']

    def __str__(self):
        return self.title
I create instances of this model, but if I update the video field of any instance with another file, the previously saved file is orphaned and keeps taking up space; I want to avoid that by deleting the file.
To do this I customize the file upload, putting a callable in upload_to:
def class_files_custom_upload(instance, filename):
    try:
        old_instance = Class.objects.get(id=instance.id)
        old_instance.video.delete()
    except Class.DoesNotExist:
        pass
    return os.path.join('courses/videos', generate_secure_filename(filename))
This achieves my goal. But I have several models that save multimedia files, and for each one I have to customize the file upload, writing a function almost identical to class_files_custom_upload; the code repeats itself, which is not optimal at all.
I tried to write a reusable function that meets the goal of class_files_custom_upload for various fields like ImageField and FileField, but I couldn't, since the callable receives only two parameters, instance and filename, which is too little information.
The only way I managed to create something reusable that meets the goal was to write a validator:
def delete_orphaned_media_file(value):
    old_instance = value.instance.__class__.objects.get(pk=value.instance.pk)
    media_file_field = getattr(old_instance, value.field.name)
    if media_file_field.name != value.name:
        media_file_field.delete()
And it works, but at the end of the day it is a "validator", and a validator is supposed to validate a field, not do "that". My question is: is it good practice to do this?
Is there a better alternative to my solution, one that still meets the objective of being reusable?
Any suggestion helps my learning, thanks.
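For reference, the validator above attaches like any other validator; a sketch reusing the field from the model above:

video = models.FileField(
    upload_to=class_files_custom_upload,
    validators=[delete_orphaned_media_file],
)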
One of the problems is that two or more FileFields can refer to the same file. In the database a FileField stores the location of the file, so two or more columns can point to the same file; therefore just removing the old one is not (completely) safe.
You can for example make use of django-unused-media. You install this with:
$ pip install django-unused-media
Next you add this to the installed apps:
# settings.py
INSTALLED_APPS = [
# …,
'django_unused_media',
# …
]
Next you can run:
python3 manage.py cleanup_unused_media
This will look for files that are no longer referenced and clean them up interactively.
You can also make a scheduled task (for example with cron), that runs with the --no-input flag:
python3 manage.py cleanup_unused_media --no-input
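A sketch of such a crontab entry (the schedule and project path are assumptions):

# Every night at 03:00, from the project directory
0 3 * * * cd /path/to/project && python3 manage.py cleanup_unused_media --no-input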
I am using django-import-export 1.0.1 with admin integration in Django 2.1.1. I have two models
from django.db import models

class Sector(models.Model):
    code = models.CharField(max_length=30, primary_key=True)

class Location(models.Model):
    code = models.CharField(max_length=30, primary_key=True)
    sector = models.ForeignKey(Sector, on_delete=models.CASCADE, related_name='locations')
and they can be imported/exported just fine using model resources
from import_export import resources
from import_export.fields import Field
from import_export.widgets import ForeignKeyWidget

class SectorResource(resources.ModelResource):
    code = Field(attribute='code', column_name='Sector')

    class Meta:
        model = Sector
        import_id_fields = ('code',)

class LocationResource(resources.ModelResource):
    code = Field(attribute='code', column_name='Location')
    sector = Field(attribute='sector', column_name='Sector',
                   widget=ForeignKeyWidget(Sector, 'code'))

    class Meta:
        model = Location
        import_id_fields = ('code',)
and import/export actions can be integrated into the admin by
from django.contrib import admin
from import_export.admin import ImportExportModelAdmin

class SectorAdmin(ImportExportModelAdmin):
    resource_class = SectorResource

class LocationAdmin(ImportExportModelAdmin):
    resource_class = LocationResource

admin.site.register(Sector, SectorAdmin)
admin.site.register(Location, LocationAdmin)
For Reasons™, I would like to change this set-up so that a spreadsheet of Locations which does not contain a Sector column can be imported; the value of sector (for each imported row) should be taken from an extra field on the ImportForm in the admin.
Such a field can indeed be added by overriding import_action on the ModelAdmin, as described in Extending the admin import form for django import_export. The next step, using this value for all imported rows, is missing there, and I have not been able to figure out how to do it.
EDIT(2): Solved through the use of sessions. Having a get_confirm_import_form hook would still really help here, but even better would be having the existing ConfirmImportForm carry across all the submitted fields & values from the initial import form.
EDIT: I'm sorry, I thought I had this nailed, but my own code wasn't working as well as I thought it was. This doesn't solve the problem of passing along the sector form field in the ConfirmImportForm, which is necessary for the import to complete. Currently looking for a solution which doesn't involve pasting the whole of import_action() into an ImportMixin subclass. Having a get_confirm_import_form() hook would help a lot here.
Still working on a solution for myself, and when I have one I'll update this too.
Don't override import_action. It's a big complicated method that you don't want to replicate. More importantly, as I discovered today: there are easier ways of doing this.
First (as you mentioned), make a custom import form for Location that allows the user to choose a Sector:
from django import forms
from import_export.forms import ImportForm

class LocationImportForm(ImportForm):
    sector = forms.ModelChoiceField(required=True, queryset=Sector.objects.all())
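Then point the admin at the new form; in this version of django-import-export the import form can be swapped by overriding get_import_form (a sketch, reusing the LocationAdmin from the question):

class LocationAdmin(ImportExportModelAdmin):
    resource_class = LocationResource

    def get_import_form(self):
        return LocationImportForm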
In the Resource API, there's a before_import_row() hook that is called once per row. So, implement that in your LocationResource class, and use it to add the Sector column:
def before_import_row(self, row, **kwargs):
    sector = self.request.POST.get('sector', None)
    if sector:
        self.request.session['import_context_sector'] = sector
    else:
        # If this raises a KeyError, we want to know about it:
        # it means we got to the point of importing data without
        # a sector context, and we don't want to continue.
        try:
            sector = self.request.session['import_context_sector']
        except KeyError as e:
            raise Exception("Sector context failure on row import, " +
                            f"check resources.py for more info: {e}")
    row['sector'] = sector
(Note: This code uses Django sessions to carry the sector value from the import form to the import confirmation screen. If you're not using sessions, you'll need to find another way to do it.)
This is all you need to get the extra data in, and it works for both the dry-run preview and the actual import.
Note that self.request doesn't exist in the default ModelResource - we have to install it by giving LocationResource a custom constructor:
def __init__(self, request=None):
    super().__init__()
    self.request = request
(Don't worry about self.request sticking around. Each LocationResource instance doesn't persist beyond a single request.)
The request isn't usually passed to the ModelResource constructor, so we need to add it to the kwargs dict for that call. Fortunately, Django Import/Export has a dedicated hook for that. Override ImportExportModelAdmin's get_resource_kwargs method in LocationAdmin:
def get_resource_kwargs(self, request, *args, **kwargs):
    rk = super().get_resource_kwargs(request, *args, **kwargs)
    rk['request'] = request
    return rk
And that's all you need.
I want to make some of my Django global settings configurable through the admin interface.
To that end, I've decided to set them as database fields, rather than in settings.py.
These are the settings I care about:
class ManagementEmail(models.Model):
    librarian_email = models.EmailField()
    intro_text = models.CharField(max_length=1000)
    signoff_text = models.CharField(max_length=1000)
These are one-off global settings, so I only ever want there to be a single librarian_email, intro_text, etc. floating around the system.
Is there a way I can prevent admin users from adding new records here, without preventing them from editing the existing record?
I guess I can do this by writing a custom admin template for this model, but I'd like to know if there's a neater way to configure this.
Could I use something other than a model class, for example?
Thanks!
Please see this question on "keep[ing] settings in database", where the answer seems to be django-dbsettings
Update
Just thought of another option: you can create the following model:
from django.contrib.sites.models import Site

class ManagementEmail(models.Model):
    site = models.OneToOneField(Site, on_delete=models.CASCADE)
    librarian_email = models.EmailField()
    intro_text = models.CharField(max_length=1000)
    signoff_text = models.CharField(max_length=1000)
Because of the OneToOneField, you can only have one ManagementEmail record per site. Then, make sure you're using the sites framework, and you can pull the settings thusly:
from django.contrib.sites.models import Site
managementemail = Site.objects.get_current().managementemail
Note that what everyone else is telling you is true: if your goal is to store settings, adding them one by one as fields to a model is not the best implementation. Adding settings over time will be a headache: you have to add the field to your model, update the database structure, and modify the code that reads the setting.
That's why I'd recommend using the django app I mentioned above, since it does exactly what you want -- provide for user-editable settings -- without making you do any extra, unnecessary work.
I think the easiest way you can do this is using has_add_permissions function of the ModelAdmin:
class ContactUsAdmin(admin.ModelAdmin):
    form = ContactUsForm

    def has_add_permission(self, request):
        return False if self.model.objects.count() > 0 else super().has_add_permission(request)
You can set the count threshold above to any number you like; see the Django docs.
If you need more granularity than that and want to make the class a singleton at the model level, see django-solo. There are also many other singleton implementations I came across.
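A minimal django-solo sketch (the model name and field are assumptions):

from django.db import models
from solo.models import SingletonModel

class SiteConfiguration(SingletonModel):
    librarian_email = models.EmailField(default='librarian@example.com')

    def __str__(self):
        return 'Site Configuration'

# Usage: the single row is created on first access
# config = SiteConfiguration.get_solo()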
For StackedInline, you can use max_num = 1.
Try django-constance.
Here are some useful links:
https://github.com/jezdez/django-constance
http://django-constance.readthedocs.org/en/latest/
I'd take a page out of WordPress's book and create a model that supports settings.
class Settings(models.Model):
    option_name = models.CharField(max_length=1000)
    option_value = models.CharField(max_length=25000)
    option_meta = models.CharField(max_length=1000)
Then you can just pickle (serialize) objects into the fields and you'll be solid.
Build a little API, and you can be as crafty as WordPress and call AdminOptions.get_option(opt_name).
Then you can just load the custom settings into the runtime, keeping the settings.py module separate but equal. A good place to write this would be in an __init__.py file.
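A sketch of such an API on the Settings model above (the get_option/set_option helpers and the hex-encoded pickling are assumptions, not an established scheme):

import pickle

from django.db import models

class Settings(models.Model):
    option_name = models.CharField(max_length=1000, unique=True)
    option_value = models.CharField(max_length=25000)
    option_meta = models.CharField(max_length=1000, blank=True)

    @classmethod
    def get_option(cls, opt_name, default=None):
        # Look up the row and unpickle the stored value
        try:
            row = cls.objects.get(option_name=opt_name)
        except cls.DoesNotExist:
            return default
        return pickle.loads(bytes.fromhex(row.option_value))

    @classmethod
    def set_option(cls, opt_name, value):
        # Hex-encode the pickle so it fits in a text column
        cls.objects.update_or_create(
            option_name=opt_name,
            defaults={'option_value': pickle.dumps(value).hex()},
        )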
Just set up a GlobalSettings app or something with a key field and a value field.
You could easily prevent admin users from changing values by not giving them permission to edit the GlobalSettings app.
class GlobalSettingsManager(models.Manager):
    def get_setting(self, key):
        try:
            setting = GlobalSettings.objects.get(key=key)
        except GlobalSettings.DoesNotExist:
            raise MyExceptionOrWhatever
        return setting

class GlobalSettings(models.Model):
    key = models.CharField(unique=True, max_length=255)
    value = models.CharField(max_length=255)

    objects = GlobalSettingsManager()
>>> APP_SETTING = GlobalSettings.objects.get_setting('APP_SETTING')
There are apps for this but I prefer looking at them and writing my own.
You can prevent users from adding/deleting an object by overriding these methods on your admin class:
ModelAdmin.has_add_permission(self, request)
ModelAdmin.has_delete_permission(self, request, obj=None)
Modification of @radtek's answer to prevent deleting the last remaining entry:
class SendgridEmailQuotaAdmin(admin.ModelAdmin):
    list_display = ('quota', 'used')

    def has_add_permission(self, request):
        return False if self.model.objects.count() > 0 else True

    def has_delete_permission(self, request, obj=None):
        return False if self.model.objects.count() <= 1 else True

    def get_actions(self, request):
        actions = super(SendgridEmailQuotaAdmin, self).get_actions(request)
        if self.model.objects.count() <= 1:
            del actions['delete_selected']
        return actions
I had basically the same problem as the original poster describes, and it's easily fixed by overriding ModelAdmin classes. Something like this in an admin.py file prevents adding a new object but allows editing the current one:
class TitleAdmin(admin.ModelAdmin):
    def has_delete_permission(self, request, obj=None):
        return False

    def has_add_permission(self, request):
        return False

    def has_change_permission(self, request, obj=None):
        return True
This doesn't prevent a user from POSTing a form that edits data, but it keeps things from happening in the admin site. Depending on whether you feel it's necessary for your needs, you can enable deletion and addition of records with a minimum of coding.