I have this models.py
from django.db import models
from django.contrib.auth.models import User
class Service(models.Model):
    user = models.ForeignKey(User)
    name = models.CharField(max_length=800, blank=False)
    service = models.CharField(max_length=2000, blank=False)

    def __unicode__(self):
        return self.name

class ServiceCheck(models.Model):
    service = models.ForeignKey(Service)
    check_status = models.CharField(max_length=80)
    check_time = models.DateTimeField()
When I run syncdb against PostgreSQL, it fails with this error:
~/dev-md> sudo python manage.py syncdb
Creating tables ...
Creating table monitor_service
Creating table monitor_servicecheck
DatabaseError: Hash/Modulo distribution column does not refer to hash/modulo distribution column in referenced table.
The code looks okay at first glance, so I suspect it's a database-related issue.
Since the max_length values provided are pretty huge, try reducing them to 255 or less, or use a TextField instead of a CharField.
Failing that, try renaming the field service in Service and/or ServiceCheck.
The issue you're hitting is actually related to how Postgres-XC/StormDB (or now Postgres-XL, where I hit this issue) handles partitioning of tables between different datanodes.
Basically, the database engine can't guarantee foreign-key or unique constraints across datanodes. Per an old article on the StormDB site about a previous version of Django and Postgres-XC/StormDB, you can work around this by setting loose_constraints=true on the database. In a modern version of Django (1.6 or later), the equivalent seems to be setting db_constraint=False on the ForeignKey, as per Django's model field docs (which I can't link to directly as I don't have enough rep).
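Applied to the models in the question, a minimal sketch (assuming Django 1.6+, where db_constraint is available on ForeignKey):

```python
from django.db import models

class ServiceCheck(models.Model):
    # db_constraint=False tells Django not to emit a FOREIGN KEY
    # constraint in the generated CREATE TABLE SQL, so the distributed
    # engine never has to enforce it across datanodes.
    service = models.ForeignKey('Service', db_constraint=False)
    check_status = models.CharField(max_length=80)
    check_time = models.DateTimeField()
```

Note that Django itself still resolves the relationship normally; only the database-level constraint is skipped.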
Another solution, if you're more concerned about availability than performance, is to replicate the data. Then you won't have the problem, because all of the datanodes hold the same data.
I don't know of any way to do this directly in Django, but you can modify your table creation to use DISTRIBUTE BY REPLICATION as detailed in the CREATE TABLE docs.
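One hedged sketch of wiring that DDL into a project (assuming Django 1.8+ with migrations; the app label, migration names, and exact Postgres-XC/XL syntax are assumptions and may need adjusting for your version):

```python
from django.db import migrations

class Migration(migrations.Migration):
    # Hypothetical dependency; point this at your app's latest migration.
    dependencies = [('monitor', '0001_initial')]

    operations = [
        # DISTRIBUTE BY REPLICATION is Postgres-XC/XL specific DDL;
        # plain PostgreSQL will reject it.
        migrations.RunSQL(
            "ALTER TABLE monitor_service DISTRIBUTE BY REPLICATION;",
            reverse_sql=migrations.RunSQL.noop,
        ),
    ]
```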
I'm new to Django and I am trying to use a MySQL database created and filled with data by someone else.
I created a model with the same name as the table I want to get data from. My model is as follows:
class Study(models.Model):
    study_name = models.TextField(default='Unknown')
    description = models.TextField(default='Unknown')
    language = models.TextField(default='Unknown')
    number_of_years = models.IntegerField(default='0')
The database connects, but when I go to the admin I don't see the data there.
Please help me with this.
A step by step solution would be:
get the name of the table containing your data; I'll call it study_table
make sure you know how the table was defined so you can match it with the Django model definition. Connect to the database with a MySQL client and run the following query:
DESCRIBE study_table;
based on the table name, column types and column names, define your model to match everything. Django models do a lot of automated naming, so you have to force the naming to make sure your model matches your database. The principles are:
Specify the table name as a meta option.
Create fields with names matching column names and field types matching column types. Taking an example from your code, the field study_name should match a column with the same name in the table study_table.
class Study(models.Model):
    study_name = models.TextField(default='Unknown')
    description = models.TextField(default='Unknown')
    language = models.TextField(default='Unknown')
    number_of_years = models.IntegerField(default=0)

    class Meta:
        db_table = 'study_table'
Side note: your IntegerField has its default as the string '0'; it should be the integer 0.
making sure the app containing your model (I'll call it study_app) is enabled and the database is configured properly in your Django settings, try to access the data from the shell (python manage.py shell):
>>> from study_app.models import Study
>>> Study.objects.first()
This should return an answer, if it does not, your model doesn't match the database data.
to make accessing the data easier, create an admin page as suggested by @iklinac. You can then read and edit your data through your browser.
A few suggestions you could consider:
study_name should probably be a models.CharField(max_length=255) or similar
description should be allowed to be empty models.TextField(blank=True)
language should probably be a models.CharField with a choices option.
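Putting those suggestions together, a hedged sketch of the model (the choice values here are hypothetical; use whatever languages your data actually contains):

```python
from django.db import models

# Hypothetical language codes; replace with your real set.
LANGUAGE_CHOICES = [
    ('en', 'English'),
    ('fr', 'French'),
]

class Study(models.Model):
    study_name = models.CharField(max_length=255, default='Unknown')
    description = models.TextField(blank=True)
    language = models.CharField(max_length=2, choices=LANGUAGE_CHOICES, default='en')
    number_of_years = models.IntegerField(default=0)

    class Meta:
        db_table = 'study_table'
```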
You should create a ModelAdmin subclass for your model
The ModelAdmin class is the representation of a model in the admin interface. Usually, these are stored in a file named admin.py in your application.
from django.contrib import admin
from myproject.myapp.models import Study

class StudyAdmin(admin.ModelAdmin):
    pass

admin.site.register(Study, StudyAdmin)
If you have a MySQL database with tables of data that don't have models created yet, you can use the inspectdb command to automatically generate the models:
https://docs.djangoproject.com/en/3.0/ref/django-admin/#inspectdb
Then, you can register those models in the Django admin. inspectdb output should only be used as a starting point, since the generated models are auto-generated and won't contain many of Django's data integrity features.
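For reference, inspectdb is the command that introspects an existing database and prints model stubs. A hedged sketch of what it might emit for the table above (the actual output depends on the real column types):

```python
# Running:  python manage.py inspectdb study_table > generated_models.py
# might produce something like:
from django.db import models

class StudyTable(models.Model):
    study_name = models.TextField(blank=True, null=True)
    description = models.TextField(blank=True, null=True)
    language = models.TextField(blank=True, null=True)
    number_of_years = models.IntegerField(blank=True, null=True)

    class Meta:
        managed = False       # Django will not create or alter this table
        db_table = 'study_table'
```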
Good luck!
Strangest bug I ever encountered
The User model is the regular one from contrib.auth.User.
Let's say I have the following model:
class Region(models.Model):
    name = models.CharField(max_length=256)
    rakazim = models.ManyToManyField(settings.AUTH_USER_MODEL)
    goal = models.IntegerField(default=0)
and someplace else I have:
from django.contrib.auth import get_user_model

user_model = get_user_model()
rakaz = user_model.objects.create_user(username, email, password)
Then, immediately after the user creation method is called, the rakaz instance has a random region connected:
rakaz.region_set.all() = [<random_region>]
It also sometimes connects to another model that has a similar ManyToManyField to AUTH_USER_MODEL
I debugged with pdb into the user creation method (in django.contrib.auth), and this happens immediately after calling save() inside it.
AFAIK it happens only on the staging server, but until I find the cause I'm afraid to deploy to prod.
Django version 1.8.4, server using MariaDB on RDS.
I don't use signals in my code (or anywhere, really) and can't find any relevant third-party code doing this (and if there were, it would happen on my machine as well).
Any Ideas?
The issue turned out to be:
I seeded the staging server with prod data for Region, and the dumpdata command dumps each region with foreign keys for rakazim. But the users were actually missing (I didn't copy the users from my prod environment). Instead of shouting at me and refusing to loaddata rows with non-existent foreign keys, MariaDB gave me a thumbs-up and attached seemingly random foreign keys each time I created a user (perhaps not random, but according to the imported mapping; I'm not sure).
Lesson learned: use a proper DBMS and not a MySQL variant.
I have looked through the official Django docs, but I can't find anything about an on_update model option in the Related objects reference; there is only on_delete.
Here is an example code:
from django.db import models

class Reporter(models.Model):
    # ...
    pass

class Article(models.Model):
    reporter = models.ForeignKey(Reporter, on_delete=models.CASCADE)
Is there any version of on_update?
I have visited this Cascade on update and delete with django question, but there is no clear answer about on_update.
I am using MySQL. I defined the relationship in the ERD, synced it to the db, and ran python manage.py inspectdb to generate the Django models, but it shows only models.DO_NOTHING.
Is there a better way to achieve this, if any?
It's normally advisable to leave the primary key completely alone when setting up your Django models, as it is used by Django in a number of ways to maintain relationships between objects. Django will set it up and use it automatically.
Instead, create a separate field in your model to keep track of unique data:
class Reporter(models.Model):
    # CharField requires a max_length; 20 here is an arbitrary choice
    emp_id = models.CharField(max_length=20, unique=True)
This way you can obtain the emp_id with reporter_object.emp_id and if you need it, you can still get the pk with reporter_object.id.
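A hedged usage sketch (the values are made up; this assumes the Reporter model above):

```python
# Create a reporter identified by a business key, while Django keeps
# managing the surrogate primary key behind the scenes.
reporter = Reporter.objects.create(emp_id='EMP-0042')

reporter.emp_id   # your unique business identifier
reporter.pk       # the auto-generated primary key Django maintains

# Look up by the business key instead of the pk:
Reporter.objects.get(emp_id='EMP-0042')
```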
You can read about how it works in the Django 1.9 documentation.
Suppose you had a Django model:
from django.db import models

class Person(models.Model):
    first_name = models.CharField(max_length=30)
    last_name = models.CharField(max_length=30)
And you migrate it to:
from django.db import models

class Person(models.Model):
    both_names = models.CharField(max_length=30)
You might write a data migration for this case (or mine, not the above, but similar and not cogent to the question) and you would definitely want to test it.
I used call_command('migrate', app, state) to roll the database back to the old state and then I thought I'd just use a historic ORM of Person to instantiate some Person objects, then do the migration, then make some assertions to check if my data migration went properly.
In South, you would just call up the old ORM by Migration. How does one do this in Django 1.8?
I'm basing my strategy on this documentation: https://micknelson.wordpress.com/2013/03/01/testing-django-migrations/
And I'm hoping to find the implementation for Django's historic-models: https://docs.djangoproject.com/en/1.8/topics/migrations/#historical-models
To restate the question - How would I get the 'old' state of Person, such that I would create a person like so: Person(first_name='bob', last_name='jones') - for testing purposes.
The public API doesn't provide any means to get historical models outside of a migration. However, you can get it using the following code:
from django.db import connection
from django.db.migrations.loader import MigrationLoader
loader = MigrationLoader(connection)
state = loader.project_state(loader.applied_migrations)
apps = state.apps
apps is the same object that is passed to the function in a RunPython operation. You can get a historical model for the currently applied migrations using apps.get_model('<app_label>', '<ModelName>').
This should work in 1.8 and 1.9. It may or may not work in future versions, though I see little reason why this part of the API should change anytime soon.
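Putting it together in a test, a hedged sketch: the app label 'myapp' and the migration names '0001_initial' and '0002_merge_names' are hypothetical, and this assumes a test database you can migrate back and forth.

```python
from django.core.management import call_command
from django.db import connection
from django.db.migrations.loader import MigrationLoader
from django.test import TransactionTestCase

class PersonMigrationTest(TransactionTestCase):
    def _historical_apps(self):
        # Rebuild the loader each time so applied_migrations is current.
        loader = MigrationLoader(connection)
        return loader.project_state(loader.applied_migrations).apps

    def test_merge_names(self):
        # Roll the database back to the old schema.
        call_command('migrate', 'myapp', '0001_initial')

        # Historic model: still has first_name/last_name.
        OldPerson = self._historical_apps().get_model('myapp', 'Person')
        OldPerson.objects.create(first_name='bob', last_name='jones')

        # Apply the data migration under test, then assert on the new schema.
        call_command('migrate', 'myapp', '0002_merge_names')
        NewPerson = self._historical_apps().get_model('myapp', 'Person')
        self.assertEqual(NewPerson.objects.get().both_names, 'bob jones')
```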
For Django 3.0 and 3.1, the solution is mostly the same as the answer from @knbk, except that the argument to project_state() is simply a tuple like ('<APP_LABEL>', '<MIGRATION_NAME>') instead of the entire applied_migrations dictionary.
You can get the state from the initial migration up to a specific applied migration of a given application by specifying the app label and migration name directly, like this:
from django.db import connection
from django.db.migrations.loader import MigrationLoader
loader = MigrationLoader(connection)
state = loader.project_state(('your_app_label', '0023_alter_some_fields'))
apps = state.apps
Note that the code above cannot be used while a migration command is running (e.g. makemigrations, sqlmigrate, migrate); otherwise it causes a recursive call-stack error.
Say this Django model exists in details/models.py:
class OccDetails(models.Model):
    title = models.CharField(max_length=255)
    occ = models.ForeignKey(Occ)
So when syncdb is run, the corresponding table and columns get created.
Later, two more fields are added and syncdb is run again, but the new columns don't get created. How is this to be solved? Also, what does auto_now=True mean in the fields below?
These are the new fields:
created_date = models.DateTimeField(auto_now_add=True)
modified_date = models.DateTimeField(auto_now_add=True, auto_now=True)
syncdb creates the database tables for all apps in INSTALLED_APPS whose tables have not already been created.
Syncdb will not alter existing tables
syncdb will only create tables for models which have not yet been installed. It will never issue ALTER TABLE statements to match changes made to a model class after installation. Changes to model classes and database schemas often involve some form of ambiguity and, in those cases, Django would have to guess at the correct changes to make. There is a risk that critical data would be lost in the process.
You can either:
issue a manual ALTER TABLE command,
DROP TABLE the particular table (you will lose its data) and run syncdb again, or
run django-admin sqlclear to get a list of SQL statements to clear the entire db and run those commands (this will flush the db; you'll lose all existing data).
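A hedged sketch of the manual ALTER TABLE route: the table name details_occdetails is assumed from an app named details, and the column types assume MySQL; adjust both to your project and database.

```python
from django.db import connection

# Add the two new columns by hand; syncdb will not do this for you.
cursor = connection.cursor()
cursor.execute(
    "ALTER TABLE details_occdetails "
    "ADD COLUMN created_date datetime NOT NULL"
)
cursor.execute(
    "ALTER TABLE details_occdetails "
    "ADD COLUMN modified_date datetime NOT NULL"
)
```

You could equally run the same statements directly in your SQL client or via manage.py dbshell.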
DateField.auto_now: automatically set the field to NOW() every time the object is saved. Useful for "last-modified" timestamps. Note that the current date is always used; it's not just a default value that you can override.
Thus, the modified_date column will be automatically updated every time you call object.save()
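In other words, a minimal sketch of the intended difference between the two options:

```python
from django.db import models

class OccDetails(models.Model):
    title = models.CharField(max_length=255)
    # Set once, when the row is first inserted:
    created_date = models.DateTimeField(auto_now_add=True)
    # Refreshed to the current time on every save():
    modified_date = models.DateTimeField(auto_now=True)
```

Note that combining auto_now_add and auto_now on one field, as in the question, is redundant (auto_now already sets the field on creation), and recent Django versions reject the combination outright.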
This is a common problem with Django. As Amarghosh said, syncdb cannot modify the schema of existing tables.
South was created to solve this problem.
I do recommend it.