So I've got an API built with Django and Django REST Framework, and I now want to add role-based access control to it. For this I found the django-rest-framework-roles extension. I've got it installed, but I'm not really familiar with the usual authentication system in Django. It says I need to define the groups in the settings as
ROLE_GROUPS = [group.name.lower() for group in Group.objects.all()]
So I need the Group model and of course also the User model. As far as I understand, these are standard models; however, I don't have any tables for them in my DB, so I need a migration, but I'm unsure how to create one.
I guess it should be really easy, but even on the relevant pages of the documentation I don't see any mention of how to add these models to my Django installation.
Could anybody enlighten me on this?
In your settings.py you have something like this:
INSTALLED_APPS = [
    ...
    "django.contrib.auth",
    ...
]
That app provides the User and Group models (it is a built-in Django app), so the first thing to do after configuring the database is to run ./manage.py migrate. After migrating, you can import them like this: from django.contrib.auth.models import User, Group
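Once the auth app's migrations have been applied, the groups your roles need can be created through the ORM. A minimal sketch, run inside python manage.py shell; the role names and the username "alice" are placeholder assumptions:

```python
# Run inside a Django shell (python manage.py shell) after migrating.
# The group names below are placeholders; use whatever roles your app needs.
from django.contrib.auth.models import Group, User

for name in ["admin", "member"]:
    # get_or_create makes this safe to re-run.
    Group.objects.get_or_create(name=name)

# Attach a user to a group (assumes a user "alice" already exists):
user = User.objects.get(username="alice")
user.groups.add(Group.objects.get(name="admin"))
```

After this, the ROLE_GROUPS list from the question would pick up the lowercase group names.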
The standard User model in django has the table name auth_user.
The standard Group model has the table name of auth_group.
The database tables themselves are created once you have run your first migration after starting your project.
This is done from the command line with:
$ python manage.py migrate
yekabathula-macbookair2:roster yekabathula$ python manage.py migrate
Operations to perform:
Synchronize unmigrated apps: staticfiles, messages
Apply all migrations: admin, contenttypes, api, auth, sessions
Synchronizing apps without migrations:
Creating tables...
Running deferred SQL...
Installing custom SQL...
Running migrations:
Rendering model states... DONE
Applying contenttypes.0001_initial... OK
Applying auth.0001_initial... OK
Applying admin.0001_initial... OK
Applying api.0001_initial... OK
Applying contenttypes.0002_remove_content_type_name... OK
Applying auth.0002_alter_permission_name_max_length... OK
Applying auth.0003_alter_user_email_max_length... OK
Applying auth.0004_alter_user_username_opts... OK
Applying auth.0005_alter_user_last_login_null... OK
Applying auth.0006_require_contenttypes_0002... OK
Applying sessions.0001_initial... OK
yekabathula-macbookair2:roster yekabathula$ python manage.py syncdb
/Library/Python/2.7/site-packages/django/core/management/commands/syncdb.py:24: RemovedInDjango19Warning: The syncdb command will be removed in Django 1.9
warnings.warn("The syncdb command will be removed in Django 1.9", RemovedInDjango19Warning)
Operations to perform:
Synchronize unmigrated apps: staticfiles, messages
Apply all migrations: admin, contenttypes, api, auth, sessions
Synchronizing apps without migrations:
Creating tables...
Running deferred SQL...
Installing custom SQL...
Running migrations:
No migrations to apply.
After doing python manage.py migrate, the tables from my models.py are not created in the database, although it is able to create other tables such as django_session. Is there anything else that I need to follow here?
I was facing a similar problem in Django 1.10 and none of the above solutions worked for me.
What eventually worked was running this command:
python manage.py migrate --fake myappname zero
This resets all migrations to the zeroth state.
Following it with:
python manage.py migrate myappname
created the tables for me.
If you do not want to roll back to the initial (zero) state but, say, to migration number 0005 (the last migration that worked), you can instead do this:
python manage.py migrate --fake myappname 0005
And then proceed with the actual migrate:
python manage.py migrate myappname
More details in the docs
In my case the __init__.py file was missing from the APP/migrations/ folder. If you don't have one, all it needs is an empty __init__.py file.
I ran into the same problem. After lots of digging, I found the solution.
I'm using django 1.11.
If you want to start-over,
1) delete all the files in your migrations folder except __init__.py
2) drop the database
3) create the database
4) python manage.py makemigrations
5) python manage.py migrate
If you have django-extensions' reset_db, you can use it instead of the 2nd and 3rd steps:
python manage.py reset_db
I am using MySQL and got into this issue after deleting the 0001_initial.py migration file and dropping all the custom tables from the DB in an attempt to regenerate them all.
I solved this issue simply by deleting the corresponding rows in the django_migrations table.
After that, the $ python manage.py migrate command regenerated all my custom tables again.
I had a similar problem and just figured it out. I have multiple databases. My local one (the one not being updated) is a MySQL database; the others are MS SQL Server and MySQL. I have routers for the other databases, since I do not manage them, and had (in Django 1.6) used the routers to make allow_syncdb() return False. With 1.7, I changed that to allow_migrate(). BUT I DID NOT ADD A ROUTER FOR MY LOCAL DATABASE. The default appears to be allow_migrate() = False if there is no router, so the migrations just failed silently (reference: https://docs.djangoproject.com/en/1.7/topics/db/multi-db/). I added a router for my local DB, with allow_migrate() returning True, and now my migrations actually create my tables.
Change managed to True (if it is set to False) in models.py:
class Meta:
    managed = False
    db_table = 'table_name'
To
class Meta:
    managed = True
    db_table = 'table_name'
I had a similar issue. I did everything stated above but nothing worked. Then I realized I had not registered my model in my app's admin.py file.
Along with everything stated above, you also have to register your model in the admin.py file, like this:
admin.site.register(ModelName)
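For example, an app's admin.py might look like the following sketch; the app and model names (myapp, Book) are placeholders, not from the original answer:

```python
# myapp/admin.py -- register the model so it appears in the Django admin.
# "Book" is a placeholder model name; substitute your own model.
from django.contrib import admin

from .models import Book

admin.site.register(Book)
```

The decorator form, @admin.register(Book) on a ModelAdmin subclass, is equivalent.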
This can be very frustrating, but here is a method that worked for me.
First, if you've deleted the migration file go ahead and create it again
python manage.py makemigrations
Then compare the migration file with the raw SQL, and if it matches, apply the migration
python manage.py sqlmigrate [app_name] 0001
python manage.py migrate
or you can simply pipe it from your command line or terminal
python manage.py sqlmigrate [app_name] 0001 | psql db_name username
BEGIN
CREATE TABLE
ALTER TABLE
COMMIT
NOTE: In my case I'm using PostgreSQL. Though the generated SQL is similar for other databases, you might not get the expected result.
Delete existing tables of the models in MySQL database.
Delete migration folder under app folder.
Delete all related migration records in the django_migrations table in MySQL.
Now you have a clean model and database. Use
python manage.py makemigrations and python manage.py migrate
to create the tables.
Hope this helps.
Try this one:
run
python manage.py makemigrations app_name
The above command will make the migrations. If it succeeds, run the second command below; otherwise check whether you have a typo in INSTALLED_APPS and also check your AppConfig module.
python manage.py migrate app_name
If the above was successful, this will create the table in the db.
This solved the problem for me (I am using MySQL workbench by the way):
Run this sql: SET FOREIGN_KEY_CHECKS = 0;
Select all the tables in your django database (click on the first table, then press and hold shift, then click on the last table). Then right click and choose "Drop n tables" (where n is the number of tables you just selected)
then run python manage.py migrate
Finally restore foreign key check settings by running this sql: SET FOREIGN_KEY_CHECKS = 1;
Note: Before taking this drastic measure, I tried what Paulo Pessoa said in his comment, but still I got "No migrations to apply." messages. However, this solved the issue.
If you want to do it only for your application, not for all apps, then clear your application's migration records from the django_migrations table in your DB, then run:
python manage.py makemigrations
python manage.py migrate
Problem: when you apply migrations in Django for the first time, Django creates the table for that model in the database and marks the migration in its own file (class):
`initial = True`
When you then try to alter the schema of that table, Django first checks whether initial = True.
If an initial class attribute isn't found, a migration will be considered "initial".
In case initial = True, we need to use
python manage.py migrate --fake-initial
For an initial migration, Django checks that all of those tables already exist in the database and fake-applies the migration if so. Similarly, for an initial migration that adds one or more fields, Django checks that all of the respective columns already exist in the database and fake-applies the migration if so.
A fake initial migration considers both CreateModel() and AddField() operations.
Solution:
>> python manage.py makemigrations <AppName>
>> python manage.py migrate --fake-initial
To avoid dropping critical database tables, you may also consider setting the relevant options under class Meta:, as in this model:
class Blog(models.Model):
    category = models.CharField(max_length=100)
    title = models.CharField(max_length=100)
    date_added = models.DateTimeField()
    merits = models.CharField(max_length=300, blank=True)
    demerits = models.CharField(max_length=300, blank=True)

    class Meta:
        managed = True
        db_table = 'weblog'
        verbose_name_plural = "Blog"

    @property
    def content_min(self):
        return truncatechars(self.content, 50)
You can then run makemigrations and migrate and your table will be created.
Check if you have added your created app to you installed apps under settings.py.
The list should seem like this:
INSTALLED_APPS = [
'django.contrib.admin','django.contrib.auth','django.contrib.contenttypes',
'django.contrib.sessions', 'django.contrib.messages','django.contrib.staticfiles',
'YourCreatedApp',
]
In my case, only one app's tables in the project weren't recreated,
and this line of code recreated them successfully:
python manage.py migrate --run-syncdb
However, I couldn't find the root cause of this problem.
Delete database
delete migration folder
run migrate command
run makemigrations command
run migrate command
It will create all tables perfectly
Change from sqlite3 to mysql in settings.py
Make sure you have the correct database name and username/password
Delete existing migrations
Make migration
And then migrate
Make sure each app's migrations folder has a __init__.py file.
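A quick way to create any missing files from the project root; the app names myapp and otherapp are placeholders for your own apps:

```shell
# Create migrations/__init__.py for each app if it is missing.
# "myapp" and "otherapp" are placeholder app names.
for app in myapp otherapp; do
  mkdir -p "$app/migrations"
  touch "$app/migrations/__init__.py"
done
```

touch is harmless if the file already exists, so this is safe to re-run.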
I had a similar issue, the connection with DB was set correctly as all the django admin tables were created on DB side. However, no models appeared in models.py
What worked for me is running in the console:
python manage.py inspectdb > models.py
which wrote everything into a new models.py file, which I then used to replace the one I had in the app folder. Then I could change
managed = False
to
managed = True
here the link to the documentation: https://docs.djangoproject.com/en/3.1/howto/legacy-databases/
If you are a beginner like me, make sure your classes extend models.Model. Then run makemigrations and migrate.
class Cook(models.Model):
    email = models.EmailField()
I solved a similar problem by doing this:
Delete all migration files and __pycache__ folders so that only __init__.py remains
Comment all implementations of that table in models, serializers, admin, urls and views
Do python manage.py makemigrations
Do python manage.py migrate
Then, uncomment all implementations of that table in models, serializers, admin, urls and views
Then, do python manage.py makemigrations
Then, do python manage.py migrate
To elaborate on @Leman-Kirme's answer.
I had a similar problem where the table for the model I added into models.py of one of my apps wouldn't get created when making and applying migrations. I tried different solutions but nothing seemed to work and I really didn't want to flush my databases. So what I did was:
Delete all the files in the migrations folder except for __init__.py
Delete all records in databases' django_migrations tables.
Manually make a migration file (I copied one from another project and changed the content so that the migration would only create the model I was having trouble with). Please note that it seems it should be an initial migration, i.e. 0001_initial.py
Run it.
Voila, here comes the table!
If you want or need to, instead of deleting previous migration files and database records, you can back them up and restore them afterwards. You will need to rename your manually-created migration from 0001_initial to something like <number_of_existing_migrations>_fix
That worked for me (but I ignored step 6), I hope someone finds this useful as well!
Gist: => Delete entry from django_migrations and run the migrate command again.
This answer takes help from other answers, but still writing as there are different cases and my answer might help in your case.
After creating the migration, I think I mistakenly ran the command python manage.py migrate campaign --fake. I am not sure whether it did something wrong, but afterwards running python manage.py migrate campaign was not creating the table. So what solved it for me is:
In the mysql shell, run:
select * from django_migrations;
Note the id of the last migration, the one causing the problem (in my case it was 62), then:
delete from django_migrations where id = 62;
Now I ran the migration again with python manage.py migrate campaign, and it worked this time, i.e. the required table was created as a result of applying the migration.
The problem I discovered working with Postgres is that I had not set the schema. I think the original question is one of those gotchas that may have a number of different answers, so if the other answers are not working for you, just check this.
The "problem" is that Postgres has an additional "namespace" layer which means you can specify the database but there can still be a whole set of tables, still in the same database, but in another schema or namespace.
The tables in this schema may be unreachable because the user being used to connect does not have permissions, or in fact the schema has been specified, but the tables were created in a different schema, or even you haven't specified the schema this time and are looking in the public schema while the tables have been created in another schema.
So the solution is to set the schema correctly, to what you intend, in every location and place that you use it. This boils down to correct database settings in all the settings files you are using, and logging in to the right schema and user when you inspect the database manually.
For example:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'OPTIONS': {'options': '-c search_path=foobar'},
        'NAME': 'foo',
        'USER': 'bar',
        'PASSWORD': 'barpass',
        'HOST': 'baz.example.com',
        'PORT': 5432,
    },
}
The database name here is foo, the username is bar and the schema name is foobar. Adding this line to the OPTIONS parameter defaults the schema for all operations.
For other usage, within Postgres, you can display the current search_path, and alter it for the current session, or "permanently" for the database:
SHOW search_path;
SET search_path=foobar,public;
ALTER DATABASE foo SET search_path TO foobar;
If your problem was that Django was creating your tables, but just in the wrong schema or under different permissions, I hope this has resolved your issue.
This kind-of thing was happening to me when I would try to reset my DB. Nothing else would work to get django to recognize my tables except:
python manage.py migrate --run-syncdb
In my case those solutions didn't work for me.
I had to:
Create a new app: py manage.py startapp new_app_name
Copy all the files into my new app
Run: py manage.py makemigrations
Run: py manage.py migrate
It worked for me; I hope it can be useful for you too.
The commands below worked for me:
python manage.py makemigrations app_name
python manage.py migrate app_name
I had the same issue, and it turned out I had simply forgotten to change my database ENGINE setting to the needed one.
I am trying to create a new model with Django, but I keep running into the error Lookup failed for model referenced by field help.HelpDefinition.org: account.Organization. Organization has been imported. You can see the model below.
models.py
org = models.ForeignKey(Organization, unique=True)
help_type = models.CharField(max_length=255, choices=HELP_CHOICES)
help_content = models.TextField(blank=True)
This model has been successfully migrated previously. I dropped the table via psql in Postgres so that it could be recreated.
It happens when you change the target objects in a relationship. Even if they have the same name and fields, they are not the same objects. I had the same issue, and deleting all previous migrations from the migrations folder solved it.
You can also add as a dependency to the migration the last migration from the object's app. That did the trick for me.
class Migration(migrations.Migration):
    dependencies = [
        (<app>, <last_migration_filename>),
        ...
    ]
My case: moving away from South, I deleted almost all migration files from several apps and ran makemigrations and migrate; later I found some forgotten migrations in one app and tried to repeat the process (delete/makemigrations) only for that one app. But going back one step and recreating the migrations for ALL the apps fixed the issue for me.
I'm trying to extend the User model of a django app, but I keep getting the error:
OperationalError at /admin/auth/user/3/
Exception Value: no such column: subjects_subject.user_id
My Code:
# subjects/models.py
from django.contrib.auth.models import User
from django.db import models

class Subject(models.Model):
    user = models.OneToOneField(User)
    description = models.CharField(max_length=100)

models.signals.post_save
# _admin/admin.py
from django.contrib.auth.models import User, Group
from django.contrib.auth.admin import UserAdmin, GroupAdmin
from .extended_admin import new_admin
from django.contrib import admin
from subjects.models import Subject

class SubjectInline(admin.StackedInline):
    model = Subject
    can_delete = False
    verbose_name_plural = 'subject'

class UserAdmin(UserAdmin):
    inlines = (SubjectInline, )

new_admin.register(User, UserAdmin)
new_admin.register(Group, GroupAdmin)
I have pretty much copied Django's own documentation word for word. Any help would really be appreciated!
EDIT:
I also wanted to say that I have run syncdb and flush
I had the same issue and here are the steps I took to solve it. You did not specify the database you are using but in my case I am using MySQL. From Django docs here https://docs.djangoproject.com/en/1.8/topics/migrations/#mysql it seems sometimes creating the tables fails. I had a table called 'administration' and that kept failing to detect any changes.
Here is what I did:
Remove migration directory from my app. This was so as to create new migration schemas
Rename the directory. I renamed my directory from 'administration' to 'donor'. For some weird reason this seemed to fix the issue. I am not entirely sure why, but it could be that 'administration' is a keyword in either MySQL or Django
If both steps do not work, you might have to manually add the changes as directed in the django docs. Even easier, you can drop the existing table if it's empty and recreate it. (WARNING! Please only do this if the table is empty, otherwise you risk losing all your data)
All in all, there seems to be no outright reason as to why this could be happening at the moment. My trial and error approach seems to have fixed it.
Did you solve this problem?
I had the same issue; after trying different things, I solved it. I'm using sqlite3.
It seems that when you first migrated, something went wrong with the database table (I don't know what caused the problem, hence the error: no such column: subjects_subject.user_id).
Removing the migrations directory and re-migrating will not solve this, because Django keeps track of all the applied migrations in the django_migrations table: the migration was faked because the table already existed (probably with an outdated schema). SO I deleted all the rows in the django_migrations table.
Here is what I did:
1. remove all files in the migrations directory in my django app
2. using python manage.py dbshell, run DELETE FROM django_migrations WHERE app='your-app-name'; (or delete only the row for your first migration)
3. then run python manage.py makemigrations and python manage.py migrate
Successful!
I recently switched from Django 1.6 to 1.7, and I began using migrations (I never used South).
Before 1.7, I used to load initial data with a fixture/initial_data.json file, which was loaded with the python manage.py syncdb command (when creating the database).
Now, I started using migrations, and this behavior is deprecated:
If an application uses migrations, there is no automatic loading of fixtures.
Since migrations will be required for applications in Django 2.0, this behavior is considered deprecated. If you want to load initial data for an app, consider doing it in a data migration.
(https://docs.djangoproject.com/en/1.7/howto/initial-data/#automatically-loading-initial-data-fixtures)
The official documentation does not have a clear example on how to do it, so my question is :
What is the best way to import such initial data using data migrations:
Write Python code with multiple calls to mymodel.create(...),
Use or write a Django function (like calling loaddata) to load data from a JSON fixture file.
I prefer the second option.
I don't want to use South, as Django seems to be able to do it natively now.
Update: See @GwynBleidD's comment below for the problems this solution can cause, and see @Rockallite's answer below for an approach that's more durable to future model changes.
Assuming you have a fixture file in <yourapp>/fixtures/initial_data.json
Create your empty migration:
In Django 1.7:
python manage.py makemigrations --empty <yourapp>
In Django 1.8+, you can provide a name:
python manage.py makemigrations --empty <yourapp> --name load_intial_data
Edit your migration file <yourapp>/migrations/0002_auto_xxx.py
2.1. Custom implementation, inspired by Django's loaddata (initial answer):
import os

from django.core import serializers
from django.db import migrations

fixture_dir = os.path.abspath(os.path.join(os.path.dirname(__file__), '../fixtures'))
fixture_filename = 'initial_data.json'

def load_fixture(apps, schema_editor):
    fixture_file = os.path.join(fixture_dir, fixture_filename)
    fixture = open(fixture_file, 'rb')
    objects = serializers.deserialize('json', fixture, ignorenonexistent=True)
    for obj in objects:
        obj.save()
    fixture.close()

def unload_fixture(apps, schema_editor):
    "Brutally deleting all entries for this model..."
    MyModel = apps.get_model("yourapp", "ModelName")
    MyModel.objects.all().delete()

class Migration(migrations.Migration):
    dependencies = [
        ('yourapp', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(load_fixture, reverse_code=unload_fixture),
    ]
2.2. A simpler solution for load_fixture (per @juliocesar's suggestion):
import os

from django.core.management import call_command

fixture_dir = os.path.abspath(os.path.join(os.path.dirname(__file__), '../fixtures'))
fixture_filename = 'initial_data.json'

def load_fixture(apps, schema_editor):
    fixture_file = os.path.join(fixture_dir, fixture_filename)
    call_command('loaddata', fixture_file)
Useful if you want to use a custom directory.
2.3. Simplest: calling loaddata with app_label will load fixtures from <yourapp>'s fixtures directory automatically:
from django.core.management import call_command

fixture = 'initial_data'

def load_fixture(apps, schema_editor):
    call_command('loaddata', fixture, app_label='yourapp')
If you don't specify app_label, loaddata will try to load fixture filename from all apps fixtures directories (which you probably don't want).
Run it
python manage.py migrate <yourapp>
Short version
You should NOT use loaddata management command directly in a data migration.
# Bad example for a data migration
from django.db import migrations
from django.core.management import call_command

def load_fixture(apps, schema_editor):
    # No, it's wrong. DON'T DO THIS!
    call_command('loaddata', 'your_data.json', app_label='yourapp')

class Migration(migrations.Migration):
    dependencies = [
        # Dependencies to other migrations
    ]

    operations = [
        migrations.RunPython(load_fixture),
    ]
Long version
loaddata utilizes django.core.serializers.python.Deserializer which uses the most up-to-date models to deserialize historical data in a migration. That's incorrect behavior.
For example, suppose there is a data migration which utilizes the loaddata management command to load data from a fixture, and it's already applied on your development environment.
Later, you decide to add a new required field to the corresponding model, so you do it and make a new migration against your updated model (and possibly provide a one-off value to the new field when ./manage.py makemigrations prompts you).
You run the next migration, and all is well.
Finally, you're done developing your Django application, and you deploy it on the production server. Now it's time for you to run the whole migrations from scratch on the production environment.
However, the data migration fails. That's because the deserialized model from loaddata command, which represents the current code, can't be saved with empty data for the new required field you added. The original fixture lacks necessary data for it!
But even if you update the fixture with required data for the new field, the data migration still fails. When the data migration is running, the next migration which adds the corresponding column to the database, is not applied yet. You can't save data to a column which does not exist!
Conclusion: in a data migration, the loaddata command introduces potential inconsistency between the model and the database. You should definitely NOT use it directly in a data migration.
The Solution
loaddata command relies on django.core.serializers.python._get_model function to get the corresponding model from a fixture, which will return the most up-to-date version of a model. We need to monkey-patch it so it gets the historical model.
(The following code works for Django 1.8.x)
# Good example for a data migration
from django.db import migrations
from django.core.serializers import base, python
from django.core.management import call_command

def load_fixture(apps, schema_editor):
    # Save the old _get_model() function
    old_get_model = python._get_model

    # Define new _get_model() function here, which utilizes the apps argument to
    # get the historical version of a model. This piece of code is directly stolen
    # from django.core.serializers.python._get_model, unchanged. However, here it
    # has a different context, specifically, the apps variable.
    def _get_model(model_identifier):
        try:
            return apps.get_model(model_identifier)
        except (LookupError, TypeError):
            raise base.DeserializationError("Invalid model identifier: '%s'" % model_identifier)

    # Replace the _get_model() function on the module, so loaddata can utilize it.
    python._get_model = _get_model

    try:
        # Call loaddata command
        call_command('loaddata', 'your_data.json', app_label='yourapp')
    finally:
        # Restore old _get_model() function
        python._get_model = old_get_model

class Migration(migrations.Migration):
    dependencies = [
        # Dependencies to other migrations
    ]

    operations = [
        migrations.RunPython(load_fixture),
    ]
Inspired by some of the comments (namely n__o's) and the fact that I have a lot of initial_data.* files spread out over multiple apps I decided to create a Django app that would facilitate the creation of these data migrations.
Using django-migration-fixture you can simply run the following management command and it will search through all your INSTALLED_APPS for initial_data.* files and turn them into data migrations.
./manage.py create_initial_data_fixtures
Migrations for 'eggs':
0002_auto_20150107_0817.py:
Migrations for 'sausage':
Ignoring 'initial_data.yaml' - migration already exists.
Migrations for 'foo':
Ignoring 'initial_data.yaml' - not migrated.
See django-migration-fixture for install/usage instructions.
In order to give your database some initial data, write a data migration.
In the data migration, use the RunPython function to load your data.
Don't call the loaddata command directly, as that approach is problematic (see the answer above).
Your data migrations will be run only once. Migrations are an ordered sequence: when the 003_xxxx.py migration is run, django records in the database that this app is migrated up to this one (003) and will run only the following migrations.
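As a sketch of what such a data migration can look like: the app label yourapp, the Country model, and the migration names below are placeholder assumptions, not from the original answer:

```python
# yourapp/migrations/0003_load_countries.py -- a minimal data migration.
# "yourapp", "Country", and the migration names are placeholder assumptions.
from django.db import migrations


def load_countries(apps, schema_editor):
    # Always use the historical model, never a direct import from models.py.
    Country = apps.get_model("yourapp", "Country")
    for name in ("France", "Germany"):
        Country.objects.get_or_create(name=name)


def unload_countries(apps, schema_editor):
    # Reverse operation so the migration can be unapplied.
    apps.get_model("yourapp", "Country").objects.filter(
        name__in=("France", "Germany")
    ).delete()


class Migration(migrations.Migration):
    dependencies = [
        ("yourapp", "0002_country"),
    ]

    operations = [
        migrations.RunPython(load_countries, reverse_code=unload_countries),
    ]
```

Providing reverse_code keeps the migration reversible with python manage.py migrate yourapp 0002.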
The solutions presented above didn't work for me unfortunately. I found that every time I change my models I have to update my fixtures. Ideally I would instead write data migrations to modify created data and fixture-loaded data similarly.
To facilitate this I wrote a quick function which will look in the fixtures directory of the current app and load a fixture. Put this function into a migration in the point of the model history that matches the fields in the migration.
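The answer doesn't include the function itself; a sketch of the idea, with names of my own choosing, might look like this:

```python
# Sketch: build a RunPython-compatible loader that reads a fixture from the
# current app's own fixtures/ directory. All names here are illustrative.
import os

from django.core.management import call_command


def app_fixture_loader(fixture_name):
    # Assumes this helper lives in (or is pasted into) a file inside the
    # app's migrations/ directory, so __file__ resolves relative to it.
    fixture_dir = os.path.abspath(
        os.path.join(os.path.dirname(__file__), "..", "fixtures")
    )

    def load_fixture(apps, schema_editor):
        call_command("loaddata", os.path.join(fixture_dir, fixture_name))

    return load_fixture
```

Note that the loaddata caveats discussed in the answer above still apply to this approach.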
On Django 2.1, I wanted to load some models (Like country names for example) with initial data.
But I wanted this to happen automatically right after the execution of initial migrations.
So I thought that it would be great to have a sql/ folder inside each application that required initial data to be loaded.
Then within that sql/ folder I would have .sql files with the required DMLs to load the initial data into the corresponding models, for example:
INSERT INTO appName_modelName(fieldName)
VALUES
("country 1"),
("country 2"),
("country 3"),
("country 4");
To be more descriptive, each app that requires initial data simply gains a sql/ folder next to its other modules.
Also, I found some cases where I needed the sql scripts to be executed in a specific order, so I decided to prefix the file names with a consecutive number.
Then I needed a way to load any SQLs available inside any application folder automatically by doing python manage.py migrate.
So I created another application named initial_data_migrations and added it to the list of INSTALLED_APPS in the settings.py file. Then I created a migrations folder inside it and added a file called run_sql_scripts.py (which actually is a custom migration), shown below:
I created run_sql_scripts.py so that it takes care of running all sql scripts available within each application. This one is then fired when someone runs python manage.py migrate. This custom migration also adds the involved applications as dependencies, that way it attempts to run the sql statements only after the required applications have executed their 0001_initial.py migrations (We don't want to attempt running a SQL statement against a non-existent table).
Here is the source of that script:
import os
import itertools

from django.db import migrations
from YourDjangoProjectName.settings import BASE_DIR, INSTALLED_APPS

SQL_FOLDER = "/sql/"

APP_SQL_FOLDERS = [
    (os.path.join(BASE_DIR, app + SQL_FOLDER), app) for app in INSTALLED_APPS
    if os.path.isdir(os.path.join(BASE_DIR, app + SQL_FOLDER))
]

SQL_FILES = [
    sorted([path + file for file in os.listdir(path) if file.lower().endswith('.sql')])
    for path, app in APP_SQL_FOLDERS
]

def load_file(path):
    with open(path, 'r') as f:
        return f.read()

class Migration(migrations.Migration):
    dependencies = [
        (app, '__first__') for path, app in APP_SQL_FOLDERS
    ]

    operations = [
        migrations.RunSQL(load_file(f)) for f in list(itertools.chain.from_iterable(SQL_FILES))
    ]
I hope someone finds this helpful, it worked just fine for me!. If you have any questions please let me know.
NOTE: This might not be the best solution since I'm just getting started with Django, however I still wanted to share this "How-to" with you all since I didn't find much information while googling about this.
In my opinion fixtures are a bit bad. If your database changes frequently, keeping them up to date will become a nightmare soon. Actually, it's not only my opinion; in the book "Two Scoops of Django" it's explained much better.
Instead I'll write a Python file to provide the initial setup. If you need something more, I'd suggest you look at Factory Boy.
If you need to migrate some data you should use data migrations.
There's also the article "Burn Your Fixtures, Use Model Factories" on moving away from fixtures.
Although @rockallite's answer is excellent, it does not explain how to handle fixtures that rely on natural keys instead of integer pk values.
Simplified version
First, note that @rockallite's solution can be simplified by using unittest.mock.patch as a context manager, and by patching apps instead of _get_model:
...
from unittest.mock import patch
...
def load_fixture(apps, schema_editor):
    with patch('django.core.serializers.python.apps', apps):
        call_command('loaddata', 'your_data.json', ...)
...
This works well, as long as your fixtures do not rely on natural keys.
If they do, you're likely to see a DeserializationError: ... value must be an integer....
The problem with natural keys
Under the hood, loaddata uses django.core.serializers.deserialize() to load your fixture objects.
The deserialization of fixtures based on natural keys relies on two things:
the presence of a get_by_natural_key() method on the model's default manager
the presence of a natural_key() method on the model itself
The get_by_natural_key() method is necessary for the deserializer to know how to interpret the natural key, instead of an integer pk value.
Both methods are necessary for the deserializer to get existing objects from the database by natural key, as also explained here.
However, the apps registry which is available in your migrations uses historical models, and these do not have access to custom managers or custom methods such as natural_key().
Possible solution: step 1
The problem of the missing get_by_natural_key() method from our custom model manager is relatively easy to solve:
Just set use_in_migrations=True on your custom manager, as described in the documentation.
This ensures that your historical models can access the current get_by_natural_key() during migrations, and fixture loading should now succeed.
However, your historical models still don't have a natural_key() method. As a result, your fixtures will be treated as new objects, even if they are already present in the database.
This may lead to a variety of errors if the data-migration is ever re-applied, such as:
unique-constraint violations (if your models have unique-constraints)
duplicate fixture objects (if your models do not have unique-constraints)
"get returned multiple objects" errors (due to duplicate fixture objects created previously)
So, effectively, you're still missing out on a kind of get_or_create-like behavior during deserialization.
To see this in action, just apply a data migration as described above (in a test environment), then roll back that same migration (without removing the data), then re-apply it.
Possible solution: step 2
The problem of the missing natural_key() method from the model itself is a bit more difficult to solve.
One solution would be to assign the natural_key() method from the current model to the historical model, for example:
...
from unittest.mock import patch
from django.apps import apps as current_apps
from django.core.management import call_command
...
def load_fixture(apps, schema_editor):
    def _get_model_patch(app_label):
        """Add the natural_key method from the current model to the historical model."""
        historical_model = apps.get_model(app_label=app_label)
        current_model = current_apps.get_model(app_label=app_label)
        historical_model.natural_key = current_model.natural_key
        return historical_model

    with patch('django.core.serializers.python._get_model', _get_model_patch):
        call_command('loaddata', 'your_data.json', ...)
...
Notes:
For clarity, I omitted things like error handling and attribute checking from the example. You should implement those where necessary.
This solution uses the current model's natural_key method, which may still lead to trouble in certain scenarios, but the same goes for Django's use_in_migrations option for model managers.
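The attribute checking mentioned above can be factored into a small helper. This is a sketch using plain stand-in classes (no Django required; the class names are invented) to show the idea: copy natural_key only when the current model actually has one and the historical model doesn't already define its own:

```python
def copy_natural_key(historical_model, current_model):
    """Copy natural_key from the current model class to the historical
    model class, but only when there is one to copy and the historical
    model does not already define its own."""
    if hasattr(current_model, 'natural_key') and not hasattr(historical_model, 'natural_key'):
        historical_model.natural_key = current_model.natural_key
    return historical_model


# Stand-ins for a historical and a current model class
class HistoricalBook:
    pass


class CurrentBook:
    def natural_key(self):
        return (self.isbn,)


copy_natural_key(HistoricalBook, CurrentBook)
print(hasattr(HistoricalBook, 'natural_key'))  # → True
```

Inside _get_model_patch you would call this helper instead of assigning the method unconditionally.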