Does Django's south (migration tool) work for innodb? - python

$ py manage.py migrate turkey
Running migrations for turkey:
- Migrating forwards to 0001_initial.
> turkey:0001_initial
! Error found during real run of migration! Aborting.
! Since you have a database that does not support running
! schema-altering statements in transactions, we have had
! to leave it in an interim state between migrations.
! You *might* be able to recover with: = DROP TABLE `turkey_demorecs` CASCADE; []
! The South developers regret this has happened, and would
! like to gently persuade you to consider a slightly
! easier-to-deal-with DBMS.
! NOTE: The error which caused the migration to fail is further up.
For some reason I get this error when I run the migration.
My other setups, which use MyISAM, work fine.
Why doesn't it work with InnoDB?

InnoDB has constraints on Foreign Keys which ensure you are not breaking the database model when doing a migration. (see http://dev.mysql.com/doc/refman/5.5/en/innodb-foreign-key-constraints.html)
MyISAM does not have native support for constraints (although it seems you can emulate them if you choose to do so: http://dev.mysql.com/tech-resources/articles/mysql-enforcing-foreign-keys.html).
Because MyISAM is not checking your FK relationships, you do not get the error. InnoDB however is doing a check and it seems that you have a problem with your migration.

See also https://code.djangoproject.com/wiki/AlterModelOnSyncDB
I have had the same kind of error happen to me when working with a MySQL setup whose default storage engine is MyISAM while I wanted InnoDB (using the recipe found in the link above, we used the post_syncdb signal to trigger the conversion code). However, when South created new tables, they were first created with the MyISAM engine and only converted later. I mistakenly believed the InnoDB tables weren't doing what they were supposed to, when those tables were actually MyISAM; and because the tables were only converted by the signal, any failed migration could not be cleanly unapplied :-/
If you need to use or create InnoDB tables where the default is MyISAM, this can be solved with:
# add at the beginning of your migration
if db.backend_name == 'mysql':
    db.execute('SET storage_engine=INNODB')
or if you do not mind the performance hit:
# add this to settings.py
DATABASE_OPTIONS = {
    "init_command": "SET storage_engine=INNODB",  # XXX: performance hit...
}
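For completeness, the post_syncdb recipe from the wiki page linked above looks roughly like the sketch below. The handler and the ALTER TABLE call are assumptions based on that recipe, not code from the original answer:
# rough sketch of a post_syncdb handler that converts newly created tables
# to InnoDB (adapt the table filtering to your own apps)
from django.db import connection
from django.db.models import signals

def convert_to_innodb(sender, created_models, **kwargs):
    cursor = connection.cursor()
    for model in created_models:
        table = model._meta.db_table
        # harmless no-op if the table is already InnoDB
        cursor.execute("ALTER TABLE `%s` ENGINE=InnoDB" % table)

signals.post_syncdb.connect(convert_to_innodb)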

Yes, South does support InnoDB. Can you delete the contents of your "migrations" folder, and re-run schemamigration, migrate, and post the results and contents of the 0001_initial file here? PS: Make sure you have your migrations folder backed up or in source control first.
rm -fr app/migrations/*
./manage.py schemamigration app --initial
./manage.py migrate app

You could try adding to your first migration:
if db.backend_name == 'mysql':
    db.execute('SET foreign_key_checks=0')
This will disable the foreign key check constraints.
You don't have to set it back to 1 since it's a session variable.
By the way, setting it to 0 at the beginning and back to 1 at the end of your migration method does not work, because South generates the SQL inside the method but only executes it after the method returns.
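As a sketch, the line belongs at the top of the migration's forwards() method, something like this (the table-creation calls are placeholders):
# sketch of a South migration that disables FK checks before altering tables
from south.db import db
from south.v2 import SchemaMigration

class Migration(SchemaMigration):

    def forwards(self, orm):
        if db.backend_name == 'mysql':
            # session variable, so there is no need to set it back to 1
            db.execute('SET foreign_key_checks=0')
        # ... the usual db.create_table / db.add_column calls go here ...

    def backwards(self, orm):
        pass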

Related

how to retain data in postgres slave after a change in master

I am using the Bitnami PostgreSQL Docker image for creating my master-slave database.
I am making changes on the master and serving my app from the slave.
My script needs to remove a table in order to recreate it with updated data.
But as soon as I delete these tables, they are removed from the slave as well, which can cause my app to break.
Is there a way to retain the tables on the slave until I am done recreating them?
Am I using the right approach and intuition?
Any help is much appreciated.
Thanks
You can suspend replication by running this statement on the standby server:
SELECT pg_wal_replay_pause();
Then, when you are done, you can resume replication with
SELECT pg_wal_replay_resume();
Note that when you resume replication, the tables on the standby will be deleted as well. But replay will happen as fast as possible, so the time when you have no tables on the standby will be no longer than necessary.
Concerning permissions: the documentation says:
This function is restricted to superusers by default, but other users can be granted EXECUTE to run the function.
So you could
GRANT EXECUTE ON FUNCTION pg_wal_replay_pause() TO some_user;
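If you want to drive this from the same script that rebuilds the table, a rough sketch with psycopg2 could look like this (hosts, database, user and table names are placeholders):
# sketch: pause WAL replay on the standby while the master recreates a table
import psycopg2

standby = psycopg2.connect(host="standby-host", dbname="mydb", user="some_user")
standby.autocommit = True

master = psycopg2.connect(host="master-host", dbname="mydb", user="writer")

with standby.cursor() as cur:
    cur.execute("SELECT pg_wal_replay_pause();")

try:
    with master, master.cursor() as cur:
        cur.execute("DROP TABLE IF EXISTS my_table;")
        cur.execute("CREATE TABLE my_table (id serial PRIMARY KEY, data text);")
        # ... reload the updated data here ...
finally:
    with standby.cursor() as cur:
        cur.execute("SELECT pg_wal_replay_resume();")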

Unit tests with an unmanaged external read-only database

I'm working on a project which involves a huge external dataset (~490Gb) loaded in an external database (MS SQL through django-pyodbc-azure). I've generated the Django models marked managed=False in their meta. In my application this works fine, but I can't seem to figure out how to run my unit tests. I can think of two approaches: mocking the data in a test database, and giving the unit tests (and CI) read-only access to the production dataset. Both options are acceptable, but I can't figure out either of them:
Option 1: Mocked data
Because my models are marked managed=False, there are no migrations, and as a result, the test runner fails to create the database.
Option 2: Live data
django-pyodbc-azure will attempt to create a test database, which fails because it has a read-only connection. Also I suspect that even if it were allowed to do so, the resulting database would be missing the required tables.
Q: How can I run my unit tests? Installing additional packages or reconfiguring the database is acceptable. My setup uses Django 1.9 with PostgreSQL for the main DB.
After a day of staring at my screen, I found a solution:
I removed the managed = False from the models and generated migrations. To prevent actual migrations from running against the production database, I used my database router to block them (return False from allow_migrate for the appropriate app and database).
In my settings I detect whether unit tests are being run, and then just don't define the database router or the external database. With the migrations present, the unit tests work: the test runner can now create the tables it needs in the test database.
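A minimal sketch of what that router and settings tweak might look like (the app label, database alias and test-detection check are assumptions, not the poster's exact code):
# routers.py -- block migrations for the external app outside of tests
class ExternalDBRouter(object):
    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == 'externaldata' and db == 'external':
            return False
        return None

# settings.py -- skip the router and the external DB while running tests
import sys

TESTING = 'test' in sys.argv

if not TESTING:
    DATABASE_ROUTERS = ['myproject.routers.ExternalDBRouter']
    DATABASES['external'] = {
        'ENGINE': 'sql_server.pyodbc',  # django-pyodbc-azure backend
        'NAME': 'external_dataset',
        # host, user, password, etc.
    }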

Django App - Getting Initial Rows into Database

I have a Django application and a few rows that I want to exist in the database from the beginning. An example is a System Settings table with some settings - they should be set up with any DB instance that is constructed.
In the past, I handled this by manually writing a migration script that inserted the records. However, when I run my tests and the test database is created and destroyed, these scripts are not run again and the database is empty. The tests assume that the migrations contain only schema migrations, so they don't need to be run again, but that is not the case. This has led me to think that maybe my migrations shouldn't be data migrations and I should rethink the process? I am not sure what to do.
You could use fixtures in your tests to provide the initial data for your models.
https://docs.djangoproject.com/en/1.8/howto/initial-data/
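For instance, a test case that loads a fixture file might look like this (the fixture and model names are placeholders; the fixture itself can be produced with manage.py dumpdata):
# tests.py -- load the seed rows into the test database for every test
from django.test import TestCase

from myapp.models import SystemSetting

class SystemSettingsTests(TestCase):
    # looks for myapp/fixtures/system_settings.json
    fixtures = ['system_settings.json']

    def test_default_settings_present(self):
        self.assertTrue(SystemSetting.objects.exists())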

Inconsistency between database and Django ORM output at breakpoint

While debugging a unit test in Django, I placed the following statement in my code
Student.objects.filter(id__in=[1,2]).delete()
...
import pdb; pdb.set_trace()
At the breakpoint, if I type Student.objects.count(), I can see that it has decreased due to the delete. However, if I open psql (the PostgreSQL command line) and check the test database, I can still see the rows (which have been deleted according to Django). Why do I see this inconsistency between the Django ORM and the database? Does the ORM cache my queries? How can I make it commit to the database at the breakpoint?
Update:
Quick solution for debugging: add the following lines at the breakpoint to see the data in psql. Thanks to @DanielRoseman for the tip.
from django.db import connection
connection.cursor().execute('commit;')
Django requests (and tests) usually run inside a transaction. Your psql session won't see the changes until that transaction is committed, which only happens when the request or test finishes.
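If you need the rows really committed so an external psql session can see them without the manual commit above, one option (a sketch, not part of the original answer) is TransactionTestCase, which does not wrap each test in a single transaction that is rolled back at the end:
# sketch: with TransactionTestCase, ORM writes are committed as they happen,
# so they are visible from psql at a pdb breakpoint
from django.test import TransactionTestCase

from myapp.models import Student

class StudentDeletionTests(TransactionTestCase):
    def test_delete_students(self):
        Student.objects.filter(id__in=[1, 2]).delete()
        import pdb; pdb.set_trace()  # rows are already gone in psql here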

Django w/ MySQL non-transactional changed tables couldn't be rolled back

Keep getting this warning using a MySQL database:
Some non-transactional changed tables couldn't be rolled back
I'm not sure what it means or whether it is even causing a problem, but I was hoping someone could fill me in on what it means.
I am taking a CSV file, reading it line by line and creating Django objects using get_or_create. After I get the message, when I try to reproduce it, I get further into the CSV file before the warning occurs.
I tried reading about this error online but I really don't understand what it means. It would be ideal to figure out what's causing it, but if I can't, I am wondering whether I can suppress the warning, since maybe it isn't affecting my database negatively.
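For context, the loading loop described above probably looks roughly like this (a sketch with placeholder model and column names); whether a failure can be cleanly rolled back depends on the table engines discussed in the answers below:
# sketch: read a CSV line by line and create objects with get_or_create
import csv

from django.db import transaction

from myapp.models import Record

def load_csv(path):
    with open(path) as fh:
        reader = csv.DictReader(fh)
        with transaction.atomic():  # only rolls back transactional (InnoDB) tables
            for row in reader:
                Record.objects.get_or_create(
                    code=row['code'],
                    defaults={'name': row['name']},
                )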
This happens when you mix transactional and non-transactional tables. Changes to non-transactional tables are not affected by a ROLLBACK statement.
As for reasons why this may have happened to you, we can turn to the docs:
if you were not deliberately mixing transactional and nontransactional tables within the transaction, the most likely cause for this message is that a table you thought was transactional actually is not. This can happen if you try to create a table using a transactional storage engine that is not supported by your mysqld server (or that was disabled with a startup option). If mysqld does not support a storage engine, it instead creates the table as a MyISAM table, which is nontransactional.
This will affect things negatively if, say, you have an HTTP request that kicks off a transaction, you make some changes, and you need to roll back. The transactional tables will roll back but the others will not. If a transactional storage engine is a requirement for your software, you should consider taking steps to migrate all the relevant tables to the InnoDB engine.
For me this error happened after I imported tables from another Django application. The origin DB had all the table engines set to MyISAM and the destination app had all the engines set to InnoDB. When I imported the existing tables, the engine was changed from InnoDB to MyISAM to match the source. I resolved this using MySQL on the command line like so:
$ mysql -uroot -pPASSWORD
> use MY_DB;
> show table status;
> alter table TABLE_WITH_MYISAM engine=innodb;
> quit;
I had imported 5 tables so I had to do the alter command for each table. The show command above will print out the table names and engine settings for all tables in MY_DB.
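If there are many tables, a small sketch like the following can list (and optionally convert) the ones that are still MyISAM from within Django, instead of checking show table status by hand:
# sketch: find tables in the current database that still use the MyISAM engine
from django.db import connection

with connection.cursor() as cursor:
    cursor.execute(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = DATABASE() AND engine = 'MyISAM'"
    )
    for (table_name,) in cursor.fetchall():
        print(table_name)
        # to convert in place, uncomment the next line:
        # cursor.execute("ALTER TABLE `%s` ENGINE=InnoDB" % table_name)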
I hope this helps solve your issue! Cheers!
