Django syncdb question - Python

In Django models, say this model exists in details/models.py:
class OccDetails(models.Model):
    title = models.CharField(max_length=255)
    occ = models.ForeignKey(Occ)
When syncdb is run, the corresponding table and columns get created. Later, two more fields are added to this model and syncdb is run again, but the new columns don't get created. How is this to be solved? Also, what does auto_now=True mean in the fields below?
These are the new fields:
    created_date = models.DateTimeField(auto_now_add=True)
    modified_date = models.DateTimeField(auto_now_add=True, auto_now=True)

syncdb creates the database tables for all apps in INSTALLED_APPS whose tables have not already been created.
syncdb will not alter existing tables.
syncdb will only create tables for models which have not yet been installed. It will never issue ALTER TABLE statements to match changes made to a model class after installation. Changes to model classes and database schemas often involve some form of ambiguity and, in those cases, Django would have to guess at the correct changes to make. There is a risk that critical data would be lost in the process.
To fix this you can either:
issue a manual ALTER TABLE command (a rough sketch follows this list),
DROP TABLE the particular table (you will lose its data) and run syncdb again, or
run django-admin sqlclear to get the SQL statements that drop the app's tables and run those commands (this flushes those tables - you'll lose all their existing data).
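As an illustration of the first option, a manual ALTER TABLE run from a Django shell might look roughly like this - assuming the app is named details (so the table is details_occdetails) and a MySQL/SQLite-style datetime column type; this is a sketch, not the exact DDL Django would have generated:
from django.db import connection

cursor = connection.cursor()
# Column types vary per backend (PostgreSQL would use "timestamp with time zone").
# The columns are added as nullable so the statements succeed on a table that
# already contains rows.
cursor.execute("ALTER TABLE details_occdetails ADD COLUMN created_date datetime")
cursor.execute("ALTER TABLE details_occdetails ADD COLUMN modified_date datetime")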
DateField.auto_now: automatically set the field to NOW() every time the object is saved. Useful for "last-modified" timestamps. Note that the current date is always used; it's not just a default value that you can override.
Thus, the modified_date column will be automatically updated every time you call object.save()
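As a rough illustration of the difference between the two options (assuming the OccDetails model above with the two new fields, and some_occ being an existing Occ instance):
# auto_now_add sets the value once, at creation time.
# auto_now overwrites the value on every save().
detail = OccDetails.objects.create(title="example", occ=some_occ)
# created_date and modified_date are both set to the creation time here.
detail.title = "renamed"
detail.save()
# modified_date now reflects the time of this save(); created_date is unchanged.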

This is a common problem with Django. As said by Amarghosh, syncdb cannot modify the schema of existing tables.
South has been created to solve this problem.
I do recommend it.

Related

Best practice when adding a new unique field to an existing Django model

I have an existing model that looks somewhat like the following...
class Resource(models.Model):
    id = models.AutoField(primary_key=True)
We have been using this for some time, and now have ~1M instances of these Resource objects (and associated ForeignKey and other usages) in our database.
I now have a need to track another ID on this model, one that I want to enforce is unique.
other_id = models.IntegerField(unique=True)
This other_id information is currently stored in some external CSVs, and I want to (at some point in the process) load this information in to all existing Resource instances.
After adding the above field, Django's makemigrations works just fine. However when I go to apply said migration against an existing database I get an error indicating that I need to provide a default to use for all existing Resource instances. I'm sure many of you have seen something similar.
What is the best approach to getting around this limitation? Some methods I have thought of...
Remove the unique=True requirement
apply the migration
externally load in the other_id value to all existing models (through some management command, or 1-off script)
add the unique=True back in and apply the migration
Dump all existing data to JSON
flush all tables
apply the migration (with unique=True)
write a script that loads the data back in, adding the correct other_id value
(unsure if this is possible) - Write some custom migration logic to automatically reference these external CSVs to load other_id values when I run manage.py migrate. This could hit issues if (at some point in the future) someone re-runs these migrations and this part fails (cannot find existing resource id in the CSVs to pull out other_id).
All of these feel complicated, but then again I guess what I am trying to do isn't the simplest thing either.
Any ideas? I have to imagine someone has had to work around a similar issue in the past.
Thanks!
Actually, the source of your issue is not the unique constraint by itself but the fact that your field doesn't allow nulls and has no default value - you'd have the very same error with a non-unique field.
The proper solution here is to allow the field to be null (null=True) and default it to None (which will translate to SQL NULL). Since null values are excluded from unique constraints (at least if your db vendor respects the SQL standard), this allows you to apply the schema change while still making sure you cannot have a duplicate for non-null values.
Then you may want a data migration to load the known "other_id" values, and finally a third schema migration to disallow null values for this field - if and only if you know you have filled this field for all records.
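A minimal sketch of that intermediate field definition (field name taken from the question):
# Nullable and defaulting to None (SQL NULL), so the schema change applies to
# existing rows while uniqueness is still enforced for non-null values.
other_id = models.IntegerField(unique=True, null=True, default=None)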
Django has something called Data Migrations, where you create a migration file that modifies/removes/adds data in your database as you apply your migrations.
In this case you would create 3 different migrations:
Create a migration that allows null values with null=True.
Create a data migration that populates the data.
Create a migration that disallows null values by removing the null=True added in step 1.
When you then run python manage.py migrate, it will apply all of the migrations in steps 1-3 in the correct order.
Your data migration would look something like this:
from django.db import migrations

def populate_reference(apps, schema_editor):
    # Use the historical model via apps.get_model(), not a direct import,
    # so the migration keeps working even if the model changes later.
    MyModel = apps.get_model('yourappname', 'MyModel')
    for obj in MyModel.objects.all():
        # Replace random_id_generator() with whatever produces the real
        # other_id value (e.g. a lookup against your CSV data).
        obj.other_id = random_id_generator()
        obj.save()

class Migration(migrations.Migration):
    dependencies = [
        ('yourappname', '0001_initial'),
    ]
    operations = [
        migrations.RunPython(populate_reference),
    ]
You can create an empty migration file using the ./manage.py makemigrations --empty yourappname command.
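For completeness, the step-3 migration (run only after the data migration has filled every row) would be roughly what makemigrations generates when you remove null=True again - something along these lines, with the app, model, and migration names here being assumptions:
from django.db import migrations, models

class Migration(migrations.Migration):
    dependencies = [
        # Assumed name of the data migration created in step 2.
        ('yourappname', '0003_populate_other_id'),
    ]
    operations = [
        migrations.AlterField(
            model_name='resource',
            name='other_id',
            # Back to non-nullable and unique, matching the final model definition.
            field=models.IntegerField(unique=True),
        ),
    ]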

members_data.dob may not be NULL in Django

Using Django 1.4, in my app I defined a model called Member and another called Data. Every member has basics like an ID, and it is related to a Data object that contains additional variables describing the member.
I had initially created the Member model without specifying that the dob field could be NULL. I have since changed this to allow blank or null values, but I am still getting the members_data.dob may not be NULL error.
I thought it was because I needed to run a syncdb command, which I did; however, this did not fix the problem.
dob = models.CharField(max_length=200, blank=True, null=True)
Any ideas? Thanks
ps. If you want to get an overall picture of what I am trying to implement please refer to: Can I use JSON data to add new objects in Django?
Thanks so much.
The syncdb command only creates tables if they do not exist. It does not handle migrations for you. You have a few options:
If there is no important data in the table, drop the table and run syncdb again to recreate it.
Update the column to allow null in a db shell. The correct command depends on which database you are using.
Use a migration tool, like South.
To drop the table in sqlite:
Open a dbshell
./manage.py dbshell
Drop the table
drop table <tablename>

Django postgresql foreign key

I have this models.py
from django.db import models
from django.contrib.auth.models import User

class Service(models.Model):
    user = models.ForeignKey(User)
    name = models.CharField(max_length=800, blank=False)
    service = models.CharField(max_length=2000, blank=False)

    def __unicode__(self):
        return self.name

class ServiceCheck(models.Model):
    service = models.ForeignKey(Service)
    check_status = models.CharField(max_length=80)
    check_time = models.DateTimeField()
When I run syncdb against PostgreSQL it ends up with this error:
~/dev-md> sudo python manage.py syncdb
Creating tables ...
Creating table monitor_service
Creating table monitor_servicecheck
DatabaseError: Hash/Modulo distribution column does not refer to hash/modulo distribution column in referenced table.
The code looks okay at first glance. I suspect it's a database-related issue.
As the max_lengths provided are pretty large, try reducing them to something <= 255, or use a TextField instead of a CharField.
Failing this, try renaming the field service in Service and/or ServiceCheck.
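For example, the suggested change might look like this (a sketch, not a confirmed fix for the distribution error):
class Service(models.Model):
    user = models.ForeignKey(User)
    # Either cap the length at 255 or switch to TextField for long values.
    name = models.CharField(max_length=255, blank=False)
    service = models.TextField(blank=False)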
The issue you're hitting is actually related to how Postgres-XC/StormDB (or now Postgres-XL, where I hit this issue) handles partitioning of tables between different datanodes.
Basically, the issue is that the database engine can't guarantee the constraints on foreign keys or unique constraints. Per an old article on the StormDB site about a previous version of Django and Postgres-XC/StormDB, you can work around this by setting loose_constraints=true on the database. In a modern version of Django (1.6 or later), the equivalent seems to be setting db_constraint=False on a ForeignKey, as per Django's model field docs.
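A minimal sketch of that workaround applied to the model from the question (Django 1.6+); note that db_constraint=False just stops Django from creating the constraint, so referential integrity is no longer enforced by the database:
class ServiceCheck(models.Model):
    # Skip the database-level foreign key constraint that Postgres-XC/XL
    # cannot enforce across distributed tables.
    service = models.ForeignKey(Service, db_constraint=False)
    check_status = models.CharField(max_length=80)
    check_time = models.DateTimeField()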
Another solution, if you're more concerned about availability than performance, is to replicate the data; then you won't have the problem, because all of the datanodes hold the same data.
I don't know of any way to do this directly in Django, but you can modify your table creation to use DISTRIBUTE BY REPLICATION as detailed in the CREATE TABLE docs.

Update Field schema in Django Model

I'm trying to update a field in the model from:
name = models.CharField(max_length=255, unique=True)
to:
name = models.CharField(max_length=255, unique=False)
However, I need to update the MySQL db without losing any of the current data. How would I go about doing this? I tried $ python manage.py syncdb but it doesn't seem to update the key.
That is because syncdb doesn't do that. It only creates new tables; it doesn't change existing ones when your model changes.
On a dev environment, to update your database after changing your model, you can use python manage.py reset appname to empty out the app's tables, and then use syncdb again.
Otherwise, you have to use tools such as South. South was made to handle migrations, which includes changing the database when the model changes.
There is also django-evolution, which does just what you want, but I still recommend South as its migration features are almost always going to be of some use.
You could use PHPMyAdmin and remove the unique index from this field.
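If you'd rather do it by hand than through PHPMyAdmin, the equivalent is to drop the unique index on the column. A rough sketch for MySQL, with appname_modelname standing in for your actual table:
from django.db import connection

cursor = connection.cursor()
# MySQL usually names a single-column unique index after the column, but this
# is an assumption - verify with SHOW INDEX FROM appname_modelname first.
cursor.execute("ALTER TABLE appname_modelname DROP INDEX name")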

"DatabaseError, column does not exist". But shows in my manage.py sql

Regarding Django & Python:
The Error:
Exception Type: DatabaseError
Exception Value:
column objects_thing.name_id does not exist
LINE 1: ...s_thing"."created", "objects_thing"."modified", "objects...
In my manage.py sql objects output:
CREATE TABLE "objects_thing" (
    otherstuff,
    otherstuff,
    "name_id" integer NOT NULL REFERENCES "profiles_name" ("id"),
    otherstuff,
);
So it clearly exists.
I've run syncdb.
Why am I getting this error? And how do I go about fixing it? (I'm a newbie to all of this) Thank you in advance for the help.
EDIT:
Thing Model:
class Thing(models.Model):
    created = models.DateTimeField(auto_now_add=True)
    modified = models.DateTimeField(auto_now=True)
    name = models.ForeignKey(Name)  # The name for this thing
    current_allocation = models.DecimalField(max_digits=13, decimal_places=2, null=True, blank=True)
    target_allocation = models.DecimalField(max_digits=13, decimal_places=2, null=True, blank=True)
    placeholder = models.ForeignKey(Entity, null=True, blank=True, on_delete=models.SET_NULL)
    avatar = models.ImageField(upload_to='avatars/thing/', null=True, blank=True)
syncdb doesn't change existing tables in your database, so if you run that, and then change your model, your model is now out of sync with the table it represents. Running syncdb again will not fix that.
You either need to use something like South to do a migration, delete the table from your DB so that syncdb will recreate it, or manually run an ALTER TABLE on your DB.
EDIT (greater detail)
When you create a subclass of Model in models.py, it acts as a representation of a database table, but doesn't automatically have a database table. You get that by running python manage.py syncdb. Django, then, looks through all your models.py files, generates the SQL required to actually create a table like that and then runs it on your database. The end result is that you end up with actual database tables that are tied to your models.
However, syncdb only creates tables. It does not alter them. So, if you go and change one of your models (add a field, change the name of a field, etc.), nothing has happened at the database level. Running syncdb again will not help, either, because there are no new tables to create. You have to somehow get the table to match the model and vice versa, though, so that's where your options come in:
Use South (link above). South enables you to create migrations, so when you change something on your models you can run:
python manage.py schemamigration --auto yourapp
And it will generate code that will alter the table to match your model. You then need only apply the migration with:
python manage.py migrate yourapp
And you're done. The table now matches your model and all is right in the world again.
You can manually delete the table from your database. You wouldn't want to do this in production because all the data in that table will go along with it, but in development it shouldn't be a problem. After the table is gone, you can then run:
python manage.py syncdb
Because the table no longer exists, Django will create it, but it will create it based on your current model's state. The net result is the same: your model and table match, so you're good to go.
You can manually alter the table. This requires that you figure out what SQL needs to be applied to change the table to match your model. You run this SQL on your database, and then the table is in parity with the model.
The point is that somehow, someway, you must update the table to reflect any changes you make to your models. The model isn't the table, it's just a programmatic representation of it.
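For this specific error, the manual ALTER TABLE route might look roughly like the following, reusing the column definition from the manage.py sql output shown above (a sketch, assuming PostgreSQL):
from django.db import connection

cursor = connection.cursor()
# The original definition is NOT NULL; if the table already contains rows,
# add the column as nullable first (or supply a default), backfill it, and
# only then add the NOT NULL constraint.
cursor.execute(
    'ALTER TABLE objects_thing '
    'ADD COLUMN name_id integer REFERENCES profiles_name (id)'
)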
The column might not necessarily exist. The sql command just shows the SQL that would be used to create the table; it is based on your current model. You could delete the table and re-run syncdb, or manually add the column. https://docs.djangoproject.com/en/dev/ref/django-admin/#sql-appname-appname
I think there is a little confusion regarding the Django model and the actual database table.
The Django model is just some Python code. It is a Python object that is connected to a SQL table. The database backend is specified in settings.py. The database contains the actual table. The error you are encountering indicates that the Python model is not in sync with the actual database table.
