Django Custom Migration Not Executing - python

So I added a new "status" field to a Django database table. The field needed a default value, so I defaulted it to "New", and I then added a custom migration that calls the save() method on all of the objects in that table, since save() is overridden to check a different table and pull the correct status from it. However, after running this migration, all of the statuses are still set to "New", so it looks like save() isn't getting executed. I tested this by manually calling save() on all the objects after running the migration, and the statuses were updated as expected.
Here's the table model in models.py:
class SOS(models.Model):
    number = models.CharField(max_length=20, unique=True)
    ...
    # the default="New" portion is missing here because I have a migration to remove it
    # after the custom migration (shown below) that saves the models
    status = models.CharField(max_length=20)

    def save(self, *args, **kwargs):
        self.status = self.history_set.get(version=self.latest_version).status if self.history_set.count() != 0 else "New"
        super(SOS, self).save(*args, **kwargs)
And here is the migration:
# Generated by Django 2.0.5 on 2018-05-23 13:50
from django.db import migrations, models

def set_status(apps, schema_editor):
    SOS = apps.get_model('sos', 'SOS')
    for sos in SOS.objects.all():
        sos.save()

class Migration(migrations.Migration):
    dependencies = [
        ('sos', '0033_auto_20180523_0950'),
    ]
    operations = [
        migrations.RunPython(set_status),
    ]
So it seems pretty clear to me that I'm doing something wrong with the migration, but I matched it exactly to what I see in the Django documentation, and I also compared it to this StackOverflow answer, and I can't see what I'm doing wrong. There are no errors when I run the migrations, but the custom one I wrote runs pretty much instantaneously, which seems strange, as when I do the save manually it takes about 5 seconds to save all 300+ entries.
Any suggestions?
P.S. Please let me know if there are any relevant details I neglected to include.

When you fetch a model with apps.get_model() inside a migration, you get a historical version of it: it has only the fields, without custom managers, a custom save(), or anything like that. To achieve what you want, you should put the logic into the migration itself, like this:
def set_status(apps, schema_editor):
    SOS = apps.get_model('sos', 'SOS')
    for sos in SOS.objects.all():
        if sos.history_set.exists():
            sos.status = sos.history_set.get(version=sos.latest_version).status
        else:
            sos.status = "New"
        sos.save()
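For completeness, the whole migration file around that function might look like the sketch below. The dependency name comes from the question; the filename and the no-op reverse are my additions, the latter so the migration can also be unapplied:

```python
# sos/migrations/0034_set_status.py  (hypothetical filename)
from django.db import migrations

def set_status(apps, schema_editor):
    # Historical model: only fields are available, no custom save()
    SOS = apps.get_model('sos', 'SOS')
    for sos in SOS.objects.all():
        if sos.history_set.exists():
            sos.status = sos.history_set.get(version=sos.latest_version).status
        else:
            sos.status = "New"
        sos.save()

class Migration(migrations.Migration):
    dependencies = [
        ('sos', '0033_auto_20180523_0950'),
    ]
    operations = [
        # RunPython.noop lets `migrate` reverse back past this data migration
        migrations.RunPython(set_status, migrations.RunPython.noop),
    ]
```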

Related

remove orphaned file from a model

I have the following model:
class Class(models.Model):
    title = models.CharField(max_length = 60)
    video = models.FileField(
        upload_to = class_files_custom_upload,
        validators = [
            FileExtensionValidator(['mp4', 'webm', 'mpg', 'mpeg', 'ogv']),
        ]
    )
    section = models.ForeignKey(Section, on_delete = models.CASCADE)
    created = models.DateTimeField(auto_now_add = True)

    class Meta:
        verbose_name = 'clase'
        verbose_name_plural = 'clases'
        ordering = ['created']

    def __str__(self):
        return self.title
I create an instance of this model, but if I update the video field of any instance with another file, the previously saved file is orphaned: it still takes up space, and I want to avoid that by deleting the file.
To do this I customized the file upload, putting a callable in upload_to:
def class_files_custom_upload(instance, filename):
    try:
        old_instance = Class.objects.get(id = instance.id)
        old_instance.video.delete()
    except Class.DoesNotExist:
        pass
    return os.path.join('courses/videos', generate_secure_filename(filename))
This achieves my goal. But I have several models that save media files, and for each one I have to customize the upload with a function almost identical to class_files_custom_upload; the code repeats, which is not optimal at all.
I tried to create a reusable function that meets the goal of class_files_custom_upload for various fields like ImageField and FileField, but I can't, since the function receives only two parameters, instance and filename, which is too little information to achieve it.
The only way I managed to create that "function" that meets the goal and is reusable, was to create a validator:
def delete_orphaned_media_file(value):
    old_instance = value.instance.__class__.objects.get(pk = value.instance.pk)
    media_file_field = getattr(old_instance, value.field.name)
    if media_file_field.name != value.name:
        media_file_field.delete()
And it works, but after all it is a "validator", and a validator is supposed to validate a field, not do "that". My question is: is it good practice to do this?
Is there a better alternative to my solution, one that still meets the objective of being reusable?
Any suggestion helps my learning, thanks.
One of the problems is that two or more FileFields can refer to the same file. In the database a FileField stores the location of the file, so two or more columns can contain the same path; therefore, just removing the old file is not (completely) safe.
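To make the hazard concrete, here is a minimal plain-Python sketch (not Django API; `rows` and the column names are made up) of the reference check a safe delete would need:

```python
def can_delete(path, rows, file_columns):
    """True only if at most one row/column still references `path`."""
    refs = sum(1 for row in rows for col in file_columns if row.get(col) == path)
    return refs <= 1

rows = [
    {"video": "courses/videos/a.mp4", "thumbnail": "thumbs/a.png"},
    {"video": "courses/videos/a.mp4", "thumbnail": "thumbs/b.png"},  # shares a.mp4
]
can_delete("courses/videos/a.mp4", rows, ["video", "thumbnail"])  # False: still shared
can_delete("thumbs/b.png", rows, ["video", "thumbnail"])          # True: single reference
```

A tool that scans all file fields across all models, as described next, performs exactly this kind of global check before deleting.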
You can for example make use of django-unused-media. You install this with:
$ pip install django-unused-media
Next you add this to the installed apps:
# settings.py
INSTALLED_APPS = [
    # …,
    'django_unused_media',
    # …
]
Next you can run:
python3 manage.py cleanup_unused_media
This will look for files that are no longer referenced and clean them up interactively.
You can also make a scheduled task (for example with cron), that runs with the --no-input flag:
python3 manage.py cleanup_unused_media --no-input
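For example, a crontab entry for a nightly cleanup could look like this (the paths are placeholders for your own virtualenv and project):

```shell
# m h dom mon dow  command — run every night at 03:00
0 3 * * * /path/to/venv/bin/python /path/to/project/manage.py cleanup_unused_media --no-input
```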

`wagtailimages` not found in reverse data migration for custom image field

TL;DR: I get LookupError: No installed app with label 'wagtailimages'. once the custom data migration is executed in Wagtail, which causes all tests to fail, as Django can't find the app after running the latest migration.
I needed to add a few custom fields to my image model in my wagtail installation that supports a Vue SPA.
I followed the guidelines in the docs here: http://docs.wagtail.io/en/v2.0/advanced_topics/images/custom_image_model.html
So, I created a custom image model along with custom rendition like this:
class CustomImage(AbstractImage):
    alt_text = models.CharField(max_length=255, blank=True)
    caption = models.CharField(max_length=255, blank=True)

    admin_form_fields = Image.admin_form_fields + (
        "alt_text",
        "caption",
    )

class CustomRendition(AbstractRendition):
    image = models.ForeignKey(
        CustomImage, on_delete=models.CASCADE, related_name="renditions"
    )

    class Meta:
        unique_together = (
            ("image", "filter_spec", "focal_point_key"),
        )
I also changed the WAGTAILIMAGES_IMAGE_MODEL setting to point to my new model:
WAGTAILIMAGES_IMAGE_MODEL = "pages.CustomImage"
I wrote a data migration with the help of this blog post which refers to this StackOverflow discussion:
# Generated by Django 2.1.10 on 2020-01-15 09:03
from django.db import migrations, models

def forwards_func(apps, schema_editor):
    wagtail_image_model = apps.get_model("wagtailimages", "Image")
    custom_image_model = apps.get_model("pages", "CustomImage")
    tagged_item_model = apps.get_model("taggit", "TaggedItem")
    django_content_type = apps.get_model("contenttypes", "contenttype")
    db_alias = schema_editor.connection.alias
    # Get images stored in default wagtail image model
    images = wagtail_image_model.objects.using(db_alias).all()
    new_images = []
    for image in images:
        new_images.append(
            custom_image_model(
                id=image.id,
                title=image.title,
                file=image.file,
                width=image.width,
                height=image.height,
                created_at=image.created_at,
                focal_point_x=image.focal_point_x,
                focal_point_y=image.focal_point_y,
                focal_point_width=image.focal_point_width,
                focal_point_height=image.focal_point_height,
                file_size=image.file_size,
                collection=image.collection,
                uploaded_by_user=image.uploaded_by_user,
            )
        )
    # Create images in new model
    custom_image_model.objects.using(db_alias).bulk_create(new_images)
    # Leave all images in previous model untouched.

    # Move tags from the old image model to the new one. Moving tags is
    # a slightly different case. The lookup table taggit_taggeditem looks like this:
    #   id  object_id  content_type_id  tag_id
    #   1   1          10               1
    #   2   1          10               2
    #   3   1          10               3
    #   4   1          10               4
    # In our case, the object_id will be the same for old and new image model
    # objects, so we only have to change the content_type_id.
    ct_custom_img_model, created = django_content_type.objects.using(
        db_alias
    ).get_or_create(app_label="pages", model="customimage")
    ct_wagtail_model = django_content_type.objects.using(db_alias).get(
        app_label="wagtailimages", model="image"
    )
    tagged_item_model.objects.using(db_alias).filter(
        content_type_id=ct_wagtail_model.id
    ).update(content_type_id=ct_custom_img_model.id)

def reverse_func(apps, schema_editor):
    # We get the model from the versioned app registry
    custom_image_model = apps.get_model("pages", "CustomImage")
    tagged_item_model = apps.get_model("taggit", "TaggedItem")
    django_content_type = apps.get_model("contenttypes", "contenttype")
    db_alias = schema_editor.connection.alias
    # Move tags from the new image model back to the old wagtail model
    ct_extended_model = django_content_type.objects.using(db_alias).get(
        app_label="pages", model="customimage"
    )
    ct_wagtail_model = django_content_type.objects.using(db_alias).get(
        app_label="wagtailimages", model="image"
    )
    tagged_item_model.objects.using(db_alias).filter(
        content_type_id=ct_extended_model.id
    ).update(content_type_id=ct_wagtail_model.id)
    # Delete all images created in the new model
    custom_image_model.objects.using(db_alias).all().delete()

class Migration(migrations.Migration):
    dependencies = [
        ("pages", "0030_auto_20200115_0817"),
    ]
    operations = [
        migrations.RunPython(forwards_func, reverse_func),
    ]
The forward migration works as expected: it migrates all the data and worked well when I tested these changes locally.
I tried to test my backward migration, and it also works fine.
However, if I try to get the old wagtailimages.Image model in my reverse_func function, it throws LookupError: No installed app with label 'wagtailimages'.
I actually wanted to delete the images from the old model just to clean up, but because of this error I decided it wasn't that important and moved on.
Unfortunately, as soon as I pushed the code to CI, all my tests failed: when this 31st migration (the custom data migration) is applied, Django seems unable to find the wagtailimages app at all.
I'm not sure what the issue is. I've been trying to debug it for a while, but all my efforts were futile, and I didn't find anything related on the web that might help.
I've also tried simplifying the migration to do barely anything at all, just fetching the model with Django's apps.get_model. The forward migration works fine, but in the reverse migration the wagtailimages app just vanishes. I'm not sure why django.setup() isn't able to load that app.
Can anyone help in this regard and provide a pointer on where things are going sideways?
I just ran into this myself. Your migration depends on models from other apps, and wagtailimages is among them. You need to list each of those apps, together with a migration name (the latest one at the time you create your data migration), in the migration's dependencies list.
You will need to find a migration name for each of the apps.
dependencies = [
    ("pages", "0030_auto_20200115_0817"),
    ("wagtailimages", "0023_add_choose_permissions"),  # I am using Wagtail 2.16 here.
    ...
]
This exact error is actually explained in the Django docs.
wagtail.wagtailimages is now wagtail.images
Reference (see Old Name/New Name table)
Put wagtail.images in INSTALLED_APPS.
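In other words, since Wagtail 2.0 the app label in INSTALLED_APPS should be the new name:

```python
# settings.py
INSTALLED_APPS = [
    # …
    'wagtail.images',  # was 'wagtail.wagtailimages' before Wagtail 2.0
    # …
]
```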

Django model serialization problem with default fields

Inside of my app model, I use IntegerRangeField fields:
from django.db import models
from django.contrib.postgres.fields import IntegerRangeField
from django.contrib.postgres.validators import RangeMinValueValidator, RangeMaxValueValidator
from psycopg2.extras import NumericRange
class MyModel(models.Model):
    ...
    field = IntegerRangeField(default=NumericRange(400, 600), validators=[
        RangeMinValueValidator(1),
        RangeMaxValueValidator(1000)
    ])
    ...
The "default" attributes are used in the admin panel UI only, and are not needed anywhere else.
If I add them after migration, they work smoothly. However, if I add them before I run makemigrations, I get this message:
ValueError: Cannot serialize: NumericRange(400, 600, '[)')
There are some values Django cannot serialize into migration files.
I don't even want the default values to be saved to my PostgreSQL database, I just want to not have to remove and bring them back every time I run makemigrations.
Any ideas?
(Didn't work: a custom object with "lower" and "higher" attributes, a single integer, a string, a tuple)
Python: 3.6.6, Django: 2.1.2, PostgreSQL: 11.0
Try moving the default value calculation into a separate function:
def get_default_range():
    return NumericRange(400, 600)

class MyModel(models.Model):
    field = IntegerRangeField(default=get_default_range, validators=[
        RangeMinValueValidator(1),
        RangeMaxValueValidator(1000)
    ])
In this case the migration is successfully generated:
operations = [
    migrations.AddField(
        model_name='comment',
        name='field',
        field=django.contrib.postgres.fields.ranges.IntegerRangeField(
            default=play.models.get_default_range,
            validators=[django.contrib.postgres.validators.RangeMinValueValidator(1),
                        django.contrib.postgres.validators.RangeMaxValueValidator(1000)]),
    ),
]
I was able to solve this problem using the string representation of the range:
IntegerRangeField(default='[400, 600]')
django==3.0.5
psycopg2==2.8.5
EDIT: I should point out that the original question was two years old, but at least in Django 3.1 there is a serializer that you must register separately.
You need to register the serializer that is provided by django.
from psycopg2.extras import NumericRange
from django.contrib.postgres.serializers import RangeSerializer
from django.db.migrations.writer import MigrationWriter
MigrationWriter.register_serializer(NumericRange, RangeSerializer)
This piece was not in the documentation, but then you can add your defaults as you'd expect:
class AgeDivision(models.Model):
    name = models.CharField(max_length=50, unique=True)
    age_range = fields.IntegerRangeField(
        unique=True, blank=True, default=NumericRange(None, None))
As for where to put this: it just needs to go alongside any module that is only loaded once. The documentation doesn't specify where to put custom serializers (at least that I could find), but I'd put them in the migrations/__init__.py file of any app that requires the serializer. Here's the documentation on migration serialization: https://docs.djangoproject.com/en/3.1/topics/migrations/#custom-serializers
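Concretely, that placement could look like this (assuming an app named `myapp`; the registration call itself is the documented Django API):

```python
# myapp/migrations/__init__.py
from psycopg2.extras import NumericRange
from django.contrib.postgres.serializers import RangeSerializer
from django.db.migrations.writer import MigrationWriter

# Runs once when the migrations package is first imported,
# so makemigrations can serialize NumericRange defaults.
MigrationWriter.register_serializer(NumericRange, RangeSerializer)
```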

Data migrations for OneToOneField in django

I have a Product model and I want to extend it using a OneToOneField.
For example:
class Product(models.Model):
    name = models.CharField(..)
    price = models.FloatField(...)
I want to do something like this:
class MyProduct(models.Model):
    product = models.OneToOneField(myapp.Product, on_delete=models.CASCADE)
    location = models.CharField(...)
and use a signal:
def create_myproduct(sender, instance, created, **kwargs):
    """Create MyProduct for every new Product"""
    if created:
        MyProduct.objects.create(product=instance)

signals.post_save.connect(create_myproduct, sender=Product, weak=False,
                          dispatch_uid='models.create_myproduct')
This works for newly created Products, so I can do this in a template:
{{ product.myproduct.location }}
But old Products created before this OneToOneField was added have no 'myproduct' relation, so that template code doesn't work for them.
I heard I need a data migration for the old Products using RunPython or manage.py shell. Can you show me how? I read the Django documentation, but I still don't fully understand.
You can add a new migration and apply it.
Something like this:
# -*- coding: utf-8 -*-
# Generated by Django 1.11.2 on 2017-07-22 06:04
from __future__ import unicode_literals
from django.db import migrations, models

def create_myproducts(apps, schema_editor):
    Product = apps.get_model('myapp', 'Product')
    MyProduct = apps.get_model('myapp', 'MyProduct')
    for prod in Product.objects.all():
        MyProduct.objects.create(product=prod)

class Migration(migrations.Migration):
    dependencies = [
        ('myapp', 'your last migration'),
    ]
    operations = [
        migrations.RunPython(create_myproducts)
    ]
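One caveat: if some Products already have a MyProduct (for example, created by the post_save signal before the migration runs), a plain create() would violate the one-to-one constraint. A get_or_create variant keeps the migration idempotent; this is a sketch, not the answer's exact code:

```python
def create_myproducts(apps, schema_editor):
    Product = apps.get_model('myapp', 'Product')
    MyProduct = apps.get_model('myapp', 'MyProduct')
    for prod in Product.objects.all():
        # get_or_create skips Products that already have a MyProduct row
        MyProduct.objects.get_or_create(product=prod)
```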
I just found out.
As Rohit Jain said, product.myproduct is None.
When I tried to access product.myproduct, I got an exception that the object does not exist. It has a relation to myproduct, but the actual object doesn't exist.
What I really wanted was to create a MyProduct object and attach it to each Product.
What I really want was creating MyProduct object and add it to Product class.
So I did it in python manage.py shell
products = Product.objects.all()
for prod in products:
    if not hasattr(prod, 'myproduct'):
        prod.myproduct = MyProduct.objects.create(product=prod)
        prod.save()
I think it works for me now.
Thank you guys
You should just migrate your models in the normal way:
python manage.py makemigrations
python manage.py migrate
During makemigrations you will be asked how to fill the new fields for existing data.
Please note that if you are using Django below version 1.7 you do not have migrations (and syncdb will not do the job for existing tables); consider using a third-party tool like South.

How to create table during Django tests with managed = False?

From the official documentation:
For tests involving models with managed=False, it’s up to you to ensure the correct tables are created as part of the test setup.
I don't know how to create the tables as part of the test setup. I found this question, and the accepted answer doesn't work for me. I think this is because of the migration files: the configuration lives in the migration files, so changing the values "on the fly" has no effect.
What's the way to solve this in Django 1.7+?
I found a way. Modify the migrations and add the SQL to generate the tables:
# 0001_initial.py (or following migrations)
class Migration(migrations.Migration):
    operations = [
        migrations.RunSQL("CREATE TABLE..."),
        ...
    ]
I'm a "migrations newbie" so I don't know if this is the best option. But it works.
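One refinement worth knowing: migrations.RunSQL also accepts reverse SQL, so the migration can be unapplied cleanly (the table definitions are elided here just as in the answer above):

```python
operations = [
    migrations.RunSQL(
        sql="CREATE TABLE...",        # applied on migrate
        reverse_sql="DROP TABLE...",  # applied when migrating backwards
    ),
]
```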
I think it should be similar in Django 1.7+. When you run the tests, you should have Django manage those models (just for testing purposes).
This conversion must happen before the tables are created, and Django lets you plug in your own runner class by setting TEST_RUNNER in your settings.py:
# settings_test.py
TEST_RUNNER = 'utils.test_runner.ManagedModelTestRunner'

# test_runner.py
from django.test.runner import DiscoverRunner

class ManagedModelTestRunner(DiscoverRunner):
    """
    Test runner that automatically makes all unmanaged models in your Django
    project managed for the duration of the test run, so that one doesn't need
    to execute the SQL manually to create them.
    """
    def setup_test_environment(self, *args, **kwargs):
        # On Django >= 1.9 use: from django.apps import apps; apps.get_models()
        from django.db.models.loading import get_models
        super(ManagedModelTestRunner, self).setup_test_environment(*args, **kwargs)
        self.unmanaged_models = [m for m in get_models(only_installed=False)
                                 if not m._meta.managed]
        for m in self.unmanaged_models:
            m._meta.managed = True

    def teardown_test_environment(self, *args, **kwargs):
        super(ManagedModelTestRunner, self).teardown_test_environment(*args, **kwargs)
        # reset unmanaged models
        for m in self.unmanaged_models:
            m._meta.managed = False
