models.py
class Subscription(models.Model):
    # ... many fields ...
    # I added this field when I already had many objects
    uniqueSubscriptionId = models.CharField(default=generateUniqueSubscription, max_length=30)
where generateUniqueSubscription is:
from django.utils.crypto import get_random_string

def generateUniqueSubscription():
    return get_random_string(20)
The problem is that when I run the migration, all of my old objects get the same uniqueSubscriptionId. I want every single old object to get its own unique uniqueSubscriptionId.
How can I do that?
Here's what I did:
models.py
def updateOldSubscriptionObjs(apps, schema_editor):
    old_subscription_model = apps.get_model("app_label", "Subscription")
    for obj in old_subscription_model.objects.all():
        obj.uniqueSubscriptionId = generateUniqueSubscription()
        obj.save()
class Subscription(models.Model):
    # ... many fields ...
    # I added this field when I already had many objects
    uniqueSubscriptionId = models.CharField(default=generateUniqueSubscription, max_length=30)
Then I ran makemigrations:
python manage.py makemigrations
Then edited the latest migration file:
class Migration(migrations.Migration):

    dependencies = [
        # forget this
    ]

    operations = [
        # .... many things ...
        migrations.RunPython(updateOldSubscriptionObjs),
    ]
Then ran migrate:
python manage.py migrate
And voilà: all old objects got updated, and any new object will also get a value, because the default is specified.
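If you also want this data migration to be reversible, RunPython accepts an optional second callable; a no-op reverse is enough here (a small sketch on top of the operation above, not something makemigrations generates for you):

# Optional: a no-op reverse keeps the migration reversible, so rolling back
# to the previous migration does not fail on this operation.
migrations.RunPython(updateOldSubscriptionObjs, reverse_code=migrations.RunPython.noop)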
If you are lazy like me and don't want to do all of that, open the Django shell:
python manage.py shell
and then define and run this function in the shell:
def updateOldSubscriptionObjs():
    for obj in Subscription.objects.all():
        obj.uniqueSubscriptionId = generateUniqueSubscription()
        obj.save()

updateOldSubscriptionObjs()
I wish there were a built-in Django feature for this.
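If the table is large, saving one row at a time can be slow. A hedged alternative (assuming Django 2.2+ for bulk_update, and the same Subscription model and generateUniqueSubscription function as above) is to build the values in Python and write them in batches:

def updateOldSubscriptionObjsFast():
    # Generate a fresh value for each object, then write the changes in
    # batches instead of issuing one UPDATE per row.
    objs = list(Subscription.objects.all())
    for obj in objs:
        obj.uniqueSubscriptionId = generateUniqueSubscription()
    Subscription.objects.bulk_update(objs, ["uniqueSubscriptionId"], batch_size=500)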
Inside my app's models, I use IntegerRangeField fields:
from django.db import models
from django.contrib.postgres.fields import IntegerRangeField
from django.contrib.postgres.validators import RangeMinValueValidator, RangeMaxValueValidator
from psycopg2.extras import NumericRange

class MyModel(models.Model):
    ...
    field = IntegerRangeField(default=NumericRange(400, 600), validators=[
        RangeMinValueValidator(1),
        RangeMaxValueValidator(1000)
    ])
    ...
The "default" attributes are used in the admin panel UI only, and are not needed anywhere else.
If I add the defaults after migrating, they work smoothly. However, if I add them before running makemigrations, I get this message:
ValueError: Cannot serialize: NumericRange(400, 600, '[)')
There are some values Django cannot serialize into migration files.
I don't even want the default values to be saved to my PostgreSQL database; I just want to avoid having to remove and re-add them every time I run makemigrations.
Any ideas?
(Didn't work: a custom object with "lower" and "higher" attributes, a single integer, a string, a tuple)
Python: 3.6.6, Django: 2.1.2, PostgreSQL: 11.0
Try moving the default value calculation into a separate function:
def get_default_range():
    return NumericRange(400, 600)

class MyModel(models.Model):
    field = IntegerRangeField(default=get_default_range, validators=[
        RangeMinValueValidator(1),
        RangeMaxValueValidator(1000)
    ])
In this case the migration is generated successfully:
operations = [
    migrations.AddField(
        model_name='comment',
        name='field',
        field=django.contrib.postgres.fields.ranges.IntegerRangeField(
            default=play.models.get_default_range,
            validators=[django.contrib.postgres.validators.RangeMinValueValidator(1),
                        django.contrib.postgres.validators.RangeMaxValueValidator(1000)]),
    ),
]
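A quick sanity check (a hypothetical shell session, assuming the MyModel above): the migration only stores the dotted path to get_default_range, and the callable is evaluated for each new instance:

# The callable default runs for every new, unsaved instance.
obj = MyModel()
print(obj.field)  # NumericRange(400, 600, '[)'), produced by get_default_range()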
I was able to solve this problem using the string representation of the range:
IntegerRangeField(default='[400, 600]')
django==3.0.5
psycopg2==2.8.5
EDIT: I should point out that the original question is two years old, but at least in Django 3.1 there is a serializer that you must register separately.
You need to register the serializer that is provided by Django.
from psycopg2.extras import NumericRange
from django.contrib.postgres.serializers import RangeSerializer
from django.db.migrations.writer import MigrationWriter
MigrationWriter.register_serializer(NumericRange, RangeSerializer)
This piece was not in the documentation, but then you can add your defaults as you'd expect:
class AgeDivision(models.Model):
    name = models.CharField(max_length=50, unique=True)
    age_range = fields.IntegerRangeField(
        unique=True, blank=True, default=NumericRange(None, None))
As for where to put this, it just needs to live in a module that is only loaded once. The documentation doesn't say where to put custom serializers (at least as far as I could find), but I'd put the registration in the migrations/__init__.py file of any app that requires the serializer. Here's the documentation on migration serialization: https://docs.djangoproject.com/en/3.1/topics/migrations/#custom-serializers
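For example (one possible placement, not something the docs mandate), myapp/migrations/__init__.py could contain just the registration, so it runs whenever migrations are written or loaded:

# myapp/migrations/__init__.py  (placement is an assumption; any module that is
# imported once before migrations are serialized will do)
from psycopg2.extras import NumericRange
from django.contrib.postgres.serializers import RangeSerializer
from django.db.migrations.writer import MigrationWriter

MigrationWriter.register_serializer(NumericRange, RangeSerializer)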
So I added a new "status" field to a Django database table. The field needed a default value, so I defaulted it to "New", and then added a custom migration file that calls the save() method on all of the objects in that table, since I have save() overridden to check a different table and pull the correct status from it. However, after running this migration, all of the statuses are still set to "New", so it looks like the save isn't getting executed. I tested this by manually calling save on all the objects after running the migration, and then the statuses are updated as expected.
Here's the table model in models.py:
class SOS(models.Model):
    number = models.CharField(max_length=20, unique=True)
    ...
    # the default="New" portion is missing here because I have a migration to remove it
    # after the custom migration (shown below) that saves the models
    status = models.CharField(max_length=20)

    def save(self, *args, **kwargs):
        self.status = self.history_set.get(version=self.latest_version).status if self.history_set.count() != 0 else "New"
        super(SOS, self).save(*args, **kwargs)
And here is the migration:
# Generated by Django 2.0.5 on 2018-05-23 13:50
from django.db import migrations, models


def set_status(apps, schema_editor):
    SOS = apps.get_model('sos', 'SOS')
    for sos in SOS.objects.all():
        sos.save()


class Migration(migrations.Migration):

    dependencies = [
        ('sos', '0033_auto_20180523_0950'),
    ]

    operations = [
        migrations.RunPython(set_status),
    ]
So it seems pretty clear to me that I'm doing something wrong with the migration, but I matched it exactly to what I see in the Django documentation, and I also compared it to this Stack Overflow answer, and I can't see what I'm doing wrong. There are no errors when I run the migrations, but the custom one I wrote runs pretty much instantaneously, which seems strange, because when I do the save manually it takes about 5 seconds to save all 300+ entries.
Any suggestions?
P.S. Please let me know if there are any relevant details I neglected to include.
When you run migrations and get a model via apps.get_model, you cannot use custom managers or a custom save(), create(), or anything like that. That historical model only has the fields, nothing else. To achieve what you want, you should put the logic directly into your migration, like this:
def set_status(apps, schema_editor):
    SOS = apps.get_model('sos', 'SOS')
    for sos in SOS.objects.all():
        if sos.history_set.exists():
            sos.status = sos.history_set.get(version=sos.latest_version).status
        else:
            sos.status = "New"
        sos.save()
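For illustration only (a hypothetical check inside such a migration function): the class returned by apps.get_model is a rebuilt "historical" model, which is why the overridden save() simply isn't there:

SOS = apps.get_model('sos', 'SOS')
print(SOS.__module__)       # '__fake__', i.e. a reconstructed historical model
print('save' in vars(SOS))  # False: only the plain models.Model.save() is inherited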
I have a Product model and I want to extend it using a OneToOneField.
For example:
class Product(models.Model):
    name = models.CharField(..)
    price = models.FloatField(...)
I want to do something like this:
class MyProduct(models.Model):
    product = models.OneToOneField('myapp.Product', on_delete=models.CASCADE)
    location = models.CharField(...)
and use a signal:
def create_myproduct(sender, instance, created, **kwargs):
    """Create MyProduct class for every new Product"""
    if created:
        MyProduct.objects.create(product=instance)

signals.post_save.connect(create_myproduct, sender=Product, weak=False,
                          dispatch_uid='models.create_myproduct')
This works for newly created Products, so I can do this in a template:
{{ product.myproduct.location }}
But old Products created before this one-to-one relation was added have no 'myproduct' attribute, so that template code doesn't work for them.
I heard I need a data migration for the old Products, using RunPython or manage.py shell. Can you show me how to do it? I read the Django documentation, but I still don't fully understand.
You can add a new migration and apply it. Something like this:
# -*- coding: utf-8 -*-
# Generated by Django 1.11.2 on 2017-07-22 06:04
from __future__ import unicode_literals

from django.db import migrations, models


def create_myproducts(apps, schema_editor):
    Product = apps.get_model('myapp', 'Product')
    MyProduct = apps.get_model('myapp', 'MyProduct')
    for prod in Product.objects.all():
        MyProduct.objects.create(product=prod)


class Migration(migrations.Migration):

    dependencies = [
        ('myapp', 'your last migration'),
    ]

    operations = [
        migrations.RunPython(create_myproducts),
    ]
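If the migration might run against a database where some Products already have a MyProduct (a hedged variation, not part of the original answer), get_or_create keeps the data migration idempotent:

def create_myproducts(apps, schema_editor):
    Product = apps.get_model('myapp', 'Product')
    MyProduct = apps.get_model('myapp', 'MyProduct')
    for prod in Product.objects.all():
        # Skips products that already have a related MyProduct row.
        MyProduct.objects.get_or_create(product=prod)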
I just found out. As Rohit Jain said, product.myproduct is None: when I tried to access product.myproduct, I got an exception saying the object does not exist. The relation is there, but the related object itself doesn't exist.
What I really wanted was to create a MyProduct object for every existing Product.
So I did it in python manage.py shell:
products = Product.objects.all()
for prod in products:
    if not hasattr(prod, 'myproduct'):
        # create() saves the MyProduct and links it to this product
        prod.myproduct = MyProduct.objects.create(product=prod)
        prod.save()
I think it works for me now.
Thank you guys
You should just migrate your models in the normal way:
python manage.py makemigrations
python manage.py migrate
While making the migrations you will be asked how to fill the new field for existing rows.
Note that if you are using a Django version below 1.7 you do not have built-in migrations (and syncdb will not do the job for existing tables); consider using a third-party tool like South.
Something really annoying has been happening to me since I started using Django migrations (not South) and loading fixtures with loaddata inside them.
Here is a simple way to reproduce my problem:
create a new model Testmodel with 1 field field1 (CharField or whatever)
create an associated migration (let's say 0001) with makemigrations
run the migration
add some data to the new table
dump the data in a fixture testmodel.json
create a migration with call_command('loaddata', 'testmodel.json'): migration 0002
add a new field to the model: field2
create an associated migration (0003)
Now, commit that, and put your db in the state just before the changes: ./manage.py migrate myapp zero. So you are in the same state as a teammate who hasn't pulled your changes yet.
If you try to run ./manage.py migrate again you will get a ProgrammingError at migration 0002 saying that "column field2 does not exist".
It seems that's because loaddata looks at your current model (which already has field2) instead of just applying the fixture to the db.
This can happen in several situations when working in a team, and it also makes the test runner fail.
Did I get something wrong? Is it a bug? What should be done in those cases?
--
I am using Django 1.7
The loaddata command simply calls the serializers. The serializers work on the model state from your models.py file, not from the current migration, but there is a little trick to fool the default serializer.
First, don't invoke the serializer through call_command but use it directly:
from django.core import serializers

def load_fixture(apps, schema_editor):
    fixture_file = '/full/path/to/testmodel.json'
    fixture = open(fixture_file)
    objects = serializers.deserialize('json', fixture, ignorenonexistent=True)
    for obj in objects:
        obj.save()
    fixture.close()
Second, monkey-patch the app registry used by the serializers:
from django.core import serializers

def load_fixture(apps, schema_editor):
    original_apps = serializers.python.apps
    serializers.python.apps = apps

    fixture_file = '/full/path/to/testmodel.json'
    fixture = open(fixture_file)
    objects = serializers.deserialize('json', fixture, ignorenonexistent=True)
    for obj in objects:
        obj.save()
    fixture.close()

    serializers.python.apps = original_apps
Now the serializer will use the model state from apps instead of the default one, and the whole migration process will succeed.
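Wired into a migration file, the patched loader might look like this (a sketch; the app label and dependency are placeholders, and the no-op reverse means a rollback will not delete the fixture rows):

class Migration(migrations.Migration):

    dependencies = [
        ('myapp', '0001_initial'),  # placeholder: your real previous migration
    ]

    operations = [
        migrations.RunPython(load_fixture, reverse_code=migrations.RunPython.noop),
    ]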
To expand on GwynBleidD's answer, and to mix in this issue, since Postgres won't reset the primary key sequences when data is loaded this way (https://stackoverflow.com/a/14589706/401636):
I think I now have a failsafe migration for loading fixture data.
utils.py:
import os
from io import StringIO

import django.apps
from django.conf import settings
from django.core import serializers
from django.core.management import call_command
from django.db import connection

os.environ['DJANGO_COLORS'] = 'nocolor'


def reset_sqlsequence(apps=None, schema_editor=None):
    """Suitable for use in migrations.RunPython"""
    commands = StringIO()
    cursor = connection.cursor()

    patched = False
    if apps:
        # Monkey patch django.apps
        original_apps = django.apps.apps
        django.apps.apps = apps
        patched = True
    else:
        # If not in a migration, use the normal apps registry
        apps = django.apps.apps

    for app in apps.get_app_configs():
        # Generate the sequence reset queries
        label = app.label
        if patched and app.models_module is None:
            # Defeat strange test in the management command
            app.models_module = True
        call_command('sqlsequencereset', label, stdout=commands)
        if patched and app.models_module is True:
            app.models_module = None

    if patched:
        # Cleanup monkey patch
        django.apps.apps = original_apps

    sql = commands.getvalue()
    print(sql)
    if sql:
        # avoid DB error if sql is empty
        cursor.execute(commands.getvalue())


class LoadFixtureData(object):
    def __init__(self, *files):
        self.files = files

    def __call__(self, apps=None, schema_editor=None):
        if apps:
            # If in a migration, monkey patch the app registry
            original_apps = serializers.python.apps
            serializers.python.apps = apps

        for fixture_file in self.files:
            with open(fixture_file) as fixture:
                objects = serializers.deserialize('json', fixture)
                for obj in objects:
                    obj.save()

        if apps:
            # Cleanup monkey patch
            serializers.python.apps = original_apps
And now my data migrations look like:
# -*- coding: utf-8 -*-
# Generated by Django 1.11.1 on foo
from __future__ import unicode_literals

import os

from django.conf import settings
from django.db import migrations

from .utils import LoadFixtureData, reset_sqlsequence


class Migration(migrations.Migration):

    dependencies = [
        ('app_name', '0002_auto_foo'),
    ]

    operations = [
        migrations.RunPython(
            code=LoadFixtureData(*[
                os.path.join(settings.BASE_DIR, 'app_name', 'fixtures', fixture) + ".json"
                for fixture in ('fixture_one', 'fixture_two',)
            ]),
            # Reverse will NOT remove the fixture data
            reverse_code=migrations.RunPython.noop,
        ),
        migrations.RunPython(
            code=reset_sqlsequence,
            reverse_code=migrations.RunPython.noop,
        ),
    ]
When you run python manage.py migrate, it tries to load your testmodel.json from the fixtures folder, but your (updated) model no longer matches the data in testmodel.json. You could try this:
Rename your fixtures directory to _fixtures.
Run python manage.py migrate.
Optionally, you can now rename _fixtures back to fixtures and load your data as before with the migrate command, or load it directly with python manage.py loaddata app/_fixtures/testmodel.json.
I'm facing a really odd problem with OneToOneField. I have a really simple model like:
class Doctor(models.Model):
    user = models.OneToOneField(User)
The problem is with RunPython in my migration. I've written a 0002_addusers migration that depends on 0001_initial, and the code is the following:
class Migration(migrations.Migration):

    def create_users(apps, schema_editor):
        u = User.objects.create_superuser('admin', 'admin@aaa.com', 'admin')
        u.save()
        du = User.objects.create_user(username='doc01', password='doc01')
        du.save()

    def create_doctors(apps, schema_editor):
        Doctor = apps.get_model('custom_user', 'Doctor')
        du = User.objects.get(username='doc01')
        d = Doctor(user=du)
        d.save()

    dependencies = [
        ('custom_user', '0001_initial')
    ]

    operations = [
        migrations.RunPython(create_users),
        migrations.RunPython(create_doctors),
    ]
What is really weird for me is that this really simple code works in views, works in shell, works everywhere except in the migration :)
The traceback is as follows:
line 23, in create_doctors
d = Doctor(user=du)
...
ValueError: Cannot assign "<User: doc01>": "Doctor.user" must be a "User" instance.
Thank you a lot for any support!
EDIT:
I found the solution. I just had to call RunPython as
migrations.RunPython(create_users, create_doctors)
as Avinash suggested, even without moving the functions outside the class.
It seems that the subsequent function has to be passed as an argument to a single RunPython call.
The suggested answer to run migrations.RunPython(create_users, create_doctors) doesn't solve your issue, it just hides it.
The second argument of RunPython is the function that will be called during a rollback, which is why it did not raise any exception when migrating forwards: you never actually called create_doctors.
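To illustrate (a minimal sketch, not the poster's code): the second callable only ever runs on a rollback, so if both data steps must run on a forward migrate, keep them as separate forward operations:

operations = [
    # forward: create_users;  reverse: do nothing
    migrations.RunPython(create_users, migrations.RunPython.noop),
    # forward: create_doctors; reverse: do nothing
    migrations.RunPython(create_doctors, migrations.RunPython.noop),
]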
Your real issue is that du is not an instance of the User model the migration's Doctor expects. This happens in migrations when you don't use apps.get_model to get the model class. You should use the following code instead:
class Migration(migrations.Migration):

    def create_users(apps, schema_editor):
        User = apps.get_model('auth', 'User')  # get the historical User model; good practice in migrations
        u = User.objects.create_superuser('admin', 'admin@aaa.com', 'admin')
        u.save()
        du = User.objects.create_user(username='doc01', password='doc01')
        du.save()

    def create_doctors(apps, schema_editor):
        Doctor = apps.get_model('custom_user', 'Doctor')
        User = apps.get_model('auth', 'User')  # get the historical User model; good practice in migrations
        du = User.objects.get(username='doc01')
        d = Doctor(user=du)
        d.save()

    dependencies = [
        ('custom_user', '0001_initial')
    ]

    operations = [
        migrations.RunPython(create_users),
        migrations.RunPython(create_doctors),
    ]
I think the problem is in your migration code. Define your functions outside the Migration class, then call them from the migration's RunPython operation.
Try the code below in your migration file. This will work.
def create_users(apps, schema_editor):
    u = User.objects.create_superuser('admin', 'admin@aaa.com', 'admin')
    u.save()
    du = User.objects.create_user(username='doc01', password='doc01')
    du.save()


def create_doctors(apps, schema_editor):
    Doctor = apps.get_model('custom_user', 'Doctor')
    du = User.objects.get(username='doc01')
    # We can't import the Doctor model directly, but we can create it. Try this -
    Doctor.objects.create(user=du)


class Migration(migrations.Migration):

    dependencies = [
        ('custom_user', '0001_initial')
    ]

    operations = [
        migrations.RunPython(create_users, create_doctors),
    ]
Importing models the "traditional" way in a migration file will not work. See https://www.spheron1.uk/2016/05/15/valueerror-in-django-migration/ for how to get the User model inside a migration, and the documentation at https://docs.djangoproject.com/en/3.2/topics/migrations/#data-migrations for an example.
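A minimal sketch of the documented pattern (the names here are placeholders): resolve the historical model inside the RunPython function instead of importing it at module level:

from django.db import migrations


def forwards(apps, schema_editor):
    User = apps.get_model('auth', 'User')  # historical User model
    User.objects.get_or_create(username='example-user')  # placeholder data step


class Migration(migrations.Migration):

    dependencies = []  # fill in your previous migration

    operations = [
        migrations.RunPython(forwards, migrations.RunPython.noop),
    ]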