How do you load a Django fixture so that models referenced via natural keys don't conflict with pre-existing records?
I'm trying to load such a fixture, but I'm getting IntegrityErrors from my MySQL backend, complaining about Django trying to insert duplicate records, which doesn't make any sense.
As I understand Django's natural key feature, in order to fully support dumpdata and loaddata usage, you need to define a natural_key method in the model, and a get_by_natural_key method in the model's manager.
So, for example, I have two models:
class PersonManager(models.Manager):
    def get_by_natural_key(self, name):
        return self.get(name=name)

class Person(models.Model):
    objects = PersonManager()
    name = models.CharField(max_length=255, unique=True)

    def natural_key(self):
        return (self.name,)
class BookManager(models.Manager):
    def get_by_natural_key(self, title, *person_key):
        person = Person.objects.get_by_natural_key(*person_key)
        return self.get(title=title, author=person)

class Book(models.Model):
    objects = BookManager()
    author = models.ForeignKey(Person)
    title = models.CharField(max_length=255)

    def natural_key(self):
        return (self.title,) + self.author.natural_key()
    natural_key.dependencies = ['myapp.Person']
My test database already contains a sample Person and Book record, which I used to generate the fixture:
[
    {
        "pk": null,
        "model": "myapp.person",
        "fields": {
            "name": "bob"
        }
    },
    {
        "pk": null,
        "model": "myapp.book",
        "fields": {
            "author": [
                "bob"
            ],
            "title": "bob's book"
        }
    }
]
I want to be able to take this fixture and load it into any instance of my database to recreate the records, regardless of whether or not they already exist in the database.
However, when I run python manage.py loaddata myfixture.json I get the error:
IntegrityError: (1062, "Duplicate entry '1-1' for key 'myapp_person_name_uniq'")
Why is Django attempting to re-create the Person record instead of reusing the one that's already there?
Turns out the solution requires a very minor patch to Django's loaddata command. Since it's unlikely the Django devs would accept such a patch, I've forked it in my package of various Django admin related enhancements.
The key code change (lines 189-201 of loaddatanaturally.py) simply involves calling get_by_natural_key() to look up any existing pk, inside the loop that iterates over the deserialized objects.
Actually, loaddata is not meant to work with existing data in the database; it is normally used for the initial load of models.
Look at this question for another way of doing it: Import data into Django model with existing data?
I have to make Django models and take a JSON file to feed all the data for a student-and-classes display webapp. The JSON file is what will drive my modeling; it looks like this (truncated to a couple of data points)...
{
    "students": [
        {
            "first": "John",
            "last": "Smith",
            "email": "johnsmith@mailinator.com",
            "studentClasses": [
                {
                    "id": 1,
                    "grade": 4
                },
                {
                    "id": 2,
                    "grade": 3
                }
            ]
        },
        {...}  # truncated; this continues with more students
    ],
    "classes": {
        "1": "Math 101",
        "2": "English 101",
        "3": "Science 101"
        # 8 classes total declared; truncated
    }
}
I have my data models as follows:
class Student(models.Model):
    first = models.CharField(max_length=200)
    last = models.CharField(max_length=200)
    email = models.EmailField()

class Classes(models.Model):
    student = models.ForeignKey(Student)
    class_name = models.CharField(max_length=50)
Here are my questions...
(1) How can I model this so that studentClasses: [{id: 1, grade: 4}]-style relational input from the JSON file populates my database tables? It seems I might have to declare a serializer; why, and how?
(2) I'm confused by the id in the classes data but not in the students data: do I have to explicitly declare a primary key in the Classes model but not in the Student model?
(3) It seems I can load the tables with python manage.py loaddata myjsonfile.json (would that be sufficient)?
(4) Do I need a third model, say StudentClasses, that keeps track of which student has taken which class and the grade for that class?
Thanks in advance for your help.
As for me, it seems better to write a short script (in a separate file) that updates the database:
import json
from app.models import Student, Classes

# json.load() reads from a file object; json.loads() expects a JSON string,
# not a filename
with open('fixtures.json') as f:
    data = json.load(f)

for student in data['students']:
    # parse student; save classes and student objects
    ...

print('done')
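The "parse student" step could be fleshed out along these lines. This is a framework-free sketch that flattens the JSON into tuples ready for get_or_create() calls; the helper name and the returned row shapes are my assumptions:

```python
def parse_enrollments(data):
    """Flatten the students/classes JSON into simple rows:
    one (first, last, email) tuple per student and one
    (email, class_name, grade) tuple per enrollment."""
    class_names = data["classes"]  # e.g. {"1": "Math 101", ...}
    students, enrollments = [], []
    for s in data["students"]:
        students.append((s["first"], s["last"], s["email"]))
        for sc in s.get("studentClasses", []):
            # studentClasses ids are ints; the classes map is keyed by strings
            enrollments.append(
                (s["email"], class_names[str(sc["id"])], sc["grade"])
            )
    return students, enrollments
```

Each returned tuple can then be fed into the ORM, e.g. `Student.objects.get_or_create(...)`, so re-running the script is safe.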
(2) I think you should use a "many to many" relation, not "one to many".
You need to add a field student_classes to Student; you could serialize it using jsonpickle. With that field, I don't think you would need any foreign keys. You also talk about "loading tables", which seems to be another point of confusion. A Django model is just a class, as your modeling code shows, so to work with the models (and the data in them) you import the classes, e.g. "from models import Student, Classes".
Using Django 1.7.
I have a model class Topic that I want to serialize. Topic has an attribute Creator that is a ForeignKey to a class UserProfile. The output from serialization gives me the following string:
'{"fields": {"building": 1, "title": "Foobar", "created": "2015-02-13T16:14:47Z", "creator": 1}, "model": "bb.topic", "pk": 2}'
I want the "creator" key to contain the username associated with the UserProfile (as opposed to now, where it gives me the pk of the UserProfile). The username is held on the django.contrib.auth.models.User instance linked via a OneToOneField.
I tried to implement a UserProfileManager, but either I have done it incorrectly or the following is not an appropriate strategy:
def get_by_natural_key(self, user, picture, company):
    return self.get(user_id=user.id, user_pk=user.pk, username=user.get_username, picture=picture, company=company)
Finally, I looked around SO and found pointers to this: https://code.google.com/p/wadofstuff/wiki/DjangoFullSerializers but it says it was last updated in 2011 so I am assuming there is something else out there.
Thanks. Any help is much appreciated.
It looks like you haven't implemented all the code needed to serialize the UserProfile with a natural key.
Actually, the get_by_natural_key method is called when deserializing an object. If you want the object serialized with a natural key instead of a pk, you should add a natural_key method to the model.
Something like:
class UserProfileManager(models.Manager):
    def get_by_natural_key(self, username, company):
        return self.get(user__username=username, company=company)

class UserProfile(models.Model):
    objects = UserProfileManager()
    user = models.OneToOneField(User)
    company = models.CharField(max_length=255)

    def natural_key(self):
        return (self.user.username, self.company)
Now, if you serialize a Topic instance:
from django.core import serializers

data = serializers.serialize('json', Topic.objects.all(), indent=2, use_natural_foreign_keys=True, use_natural_primary_keys=True)
you should get an output similar to:
{
    "fields": {
        "building": 1,
        "title": "Foobar",
        "created": "2015-02-13T16:14:47Z",
        "creator": [
            "dummy",
            "ACME Inc."
        ]
    },
    "model": "bb.topic",
    "pk": 2
}
supposing a user dummy exists with the company ACME Inc. in its user profile.
Hope this helps.
Essentially, I am creating an application where users can have multiple skills. So I have it setup like this:
class Skill(models.Model):
    name = models.CharField(max_length=30)

class Listing(models.Model):
    ...  # other fields for the model here
    skill = models.ManyToManyField(Skill)
And then I'm going to create a form that looks something like this:
class ListingForm(ModelForm):
    skill = forms.ModelMultipleChoiceField(queryset=Skill.objects.all())

    class Meta:
        model = Listing
The end result being that I want each skill to show up as a checkbox in the form. So there might be 30 skills to choose from, and then the user could just check any of them that they were proficient in. The problem I am facing is that I somehow have to create those 30 skill objects initially. I know how to create objects, but I don't know where to put the code so that the ~30 skills only get created the first time the server starts. Where should I create the initial skill objects? Is there a better way to do this?
You can create a fixtures.json file and use loaddata:
fixtures.json
[
    {
        "pk": 1,
        "model": "appname.skill",
        "fields": {
            "name": "skill name"
        }
    }
]
cmd line:
python manage.py loaddata path/to/fixtures.json
Here are some docs for it: Providing initial data
I am trying to export some data from a MySQL database and import it into a PostgreSQL database. The two models are described below:
class Location(models.Model):
    name = models.CharField(max_length=100)

class Item(models.Model):
    title = models.CharField(max_length=200)
    location = models.ForeignKey(Location)

class Book(Item):
    author = models.CharField(max_length=100)
Notice that the Book model inherits from the Item model. (Also, I do realize that author really should be a separate model - but I need something simple to demonstrate the problem.) I first attempt to export the data from the model using the dumpdata command:
./manage.py dumpdata myapp.location >locations.json
./manage.py dumpdata myapp.item >items.json
./manage.py dumpdata myapp.book >books.json
The JSON in items.json looks something like this:
{
    "fields": {
        "title": "Introduction to Programming",
        "location": 1
    },
    "model": "myapp.item",
    "pk": 1
}
The JSON in books.json looks something like this:
{
    "fields": {
        "author": "Smith, Bob"
    },
    "model": "myapp.book",
    "pk": 1
}
I can import locations.json and items.json without issue but as soon as I attempt to import books.json, I run into the following error:
IntegrityError: Problem installing fixture 'books.json': Could not load
myapp.Book(pk=1): null value in column "location_id" violates not-null
constraint
Edit: the schema for myapp.books (according to PostgreSQL itself) is as follows:
# SELECT * FROM myapp_book;
item_ptr_id | author
-------------+--------
(0 rows)
The books.json file needs to contain all the fields of the parent model as well: in your case the 'title' and 'location' fields (with appropriate values, of course) need to be copied in from items.json. loaddata doesn't follow the database structure (with the intermediate table and all) but checks the actual fields of the model.
To avoid ending up with duplicate entries, you will also have to remove from items.json every entry whose pk appears in the original MySQL myapp_book table.
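That merge-and-prune step could be scripted along these lines. This is a sketch under stated assumptions: `merge_child_fixture` is my name, it works on the already-parsed fixture lists, and it assumes the book and item entries share pks as in the dumps above:

```python
def merge_child_fixture(items, books):
    """Copy the parent-model fields from each item entry into the book
    entry with the same pk, then drop the merged items so loaddata does
    not try to create those parent rows a second time."""
    items_by_pk = {e["pk"]: e for e in items if e["model"] == "myapp.item"}
    merged = []
    for entry in books:
        parent = items_by_pk.pop(entry["pk"], None)
        if parent is not None:
            # Book inherits from Item, so loaddata expects title/location too
            entry["fields"].update(parent["fields"])
        merged.append(entry)
    # items not claimed by any book remain standalone Item records
    return merged, list(items_by_pk.values())
```

You would run this over `json.load()`-ed copies of books.json and items.json and write the two results back out before calling loaddata.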
Another solution would be to use http://pypi.python.org/pypi/py-mysql2pgsql (see also this question)
Seems like for whatever reason the Books schema in your postgres db doesn't match the models - it has a location_id column. You should drop the table and rerun syncdb.
How do you exclude the primary key from the JSON produced by Django's dumpdata when natural keys are enabled?
I've constructed a record that I'd like to "export" so others can use it as a template, by loading it into a separate databases with the same schema without conflicting with other records in the same model.
As I understand Django's support for natural keys, this seems like what NKs were designed to do. My record has a unique name field, which is also used as the natural key.
So when I run:
from django.core import serializers
from myapp.models import MyModel
obj = MyModel.objects.get(id=123)
serializers.serialize('json', [obj], indent=4, use_natural_keys=True)
I would expect an output something like:
[
    {
        "model": "myapp.mymodel",
        "fields": {
            "name": "foo",
            "create_date": "2011-09-22 12:00:00",
            "create_user": [
                "someusername"
            ]
        }
    }
]
which I could then load into another database using loaddata, expecting it to be dynamically assigned a new primary key. Note that my "create_user" field is a FK to Django's auth.User model, which supports natural keys, and it is output as its natural key instead of the integer primary key.
However, what's generated is actually:
[
    {
        "pk": 123,
        "model": "myapp.mymodel",
        "fields": {
            "name": "foo",
            "create_date": "2011-09-22 12:00:00",
            "create_user": [
                "someusername"
            ]
        }
    }
]
which will clearly conflict with and overwrite any existing record with primary key 123.
What's the best way to fix this? I don't want to retroactively change all the auto-generated primary key integer fields to whatever the equivalent natural keys are, since that would cause a performance hit as well as be labor intensive.
Edit: This seems to be a bug that was reported...2 years ago...and has largely been ignored...
Updating the answer for anyone coming across this in 2018 and beyond.
There is a way to omit the primary key through the use of natural keys and unique_together. The following is taken from the Django documentation on serialization:
You can use this command to test it:
python manage.py dumpdata app.model --pks 1,2,3 --indent 4 --natural-primary --natural-foreign > dumpdata.json
Serialization of natural keys
So how do you get Django to emit a natural key when serializing an object? Firstly, you need to add another method – this time to the model itself:
class Person(models.Model):
    objects = PersonManager()
    first_name = models.CharField(max_length=100)
    last_name = models.CharField(max_length=100)
    birthdate = models.DateField()

    def natural_key(self):
        return (self.first_name, self.last_name)

    class Meta:
        unique_together = (('first_name', 'last_name'),)
That method should always return a natural key tuple – in this example, (first name, last name). Then, when you call serializers.serialize(), you provide use_natural_foreign_keys=True or use_natural_primary_keys=True arguments:
serializers.serialize('json', [book1, book2], indent=2, use_natural_foreign_keys=True, use_natural_primary_keys=True)
When use_natural_foreign_keys=True is specified, Django will use the natural_key() method to serialize any foreign key reference to objects of the type that defines the method.
When use_natural_primary_keys=True is specified, Django will not provide the primary key in the serialized data of this object since it can be calculated during deserialization:
{
    "model": "store.person",
    "fields": {
        "first_name": "Douglas",
        "last_name": "Adams",
        "birthdate": "1952-03-11"
    }
}
The problem with JSON is that you can't omit the pk field, since it will be required when loading the fixture data again. If it is missing, loading will fail with:
$ python manage.py loaddata some_data.json
[...]
File ".../django/core/serializers/python.py", line 85, in Deserializer
data = {Model._meta.pk.attname : Model._meta.pk.to_python(d["pk"])}
KeyError: 'pk'
As pointed out in the answer to this question, you can use yaml or xml if you really want to omit the pk attribute OR just replace the primary key value with null.
import re
from django.core import serializers
some_objects = MyClass.objects.all()
s = serializers.serialize('json', some_objects, use_natural_keys=True)
# Replace id values with null - adjust the regex to your needs
s = re.sub('"pk": [0-9]{1,5}', '"pk": null', s)
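If you'd rather not run a regex over the serialized output, the same null-pk result can be had by parsing the JSON and clearing the key. A small sketch; `strip_pks` is my own name for the helper:

```python
import json

def strip_pks(serialized):
    """Parse the serialized fixture JSON and null out every pk,
    instead of rewriting the string with a regex."""
    entries = json.loads(serialized)
    for entry in entries:
        entry["pk"] = None  # becomes null in the JSON output
    return json.dumps(entries, indent=4)
```

This avoids the risk of the regex matching a "pk"-like substring inside a field value.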
Override the Serializer class in a separate module:
from django.core.serializers.json import Serializer as JsonSerializer
from django.utils.encoding import smart_unicode

class Serializer(JsonSerializer):
    def end_object(self, obj):
        self.objects.append({
            "model": smart_unicode(obj._meta),
            "fields": self._current,
            # the original method adds the pk here
        })
        self._current = None
Register it in Django:
serializers.register_serializer("json_no_pk", "path.to.module.with.custom.serializer")
And use it:
serializers.serialize('json_no_pk', [obj], indent=4, use_natural_keys=True)