Suppose I have a table of animals with two attributes, name and type, where type can be 'dog', 'cat', etc. Here are two ways to implement this in Django. One is to make type a ForeignKey to AnimalType:
class Animal(models.Model):
    name = models.CharField(max_length=10)
    type = models.ForeignKey(AnimalType)
The other is to make type a field with predefined choices, defined in an imported module:
class Animal(models.Model):
    name = models.CharField(max_length=10)
    type = models.CharField(
        max_length=10,
        choices=ANIMAL_TYPE_CHOICES
    )
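For reference, ANIMAL_TYPE_CHOICES here would just be a tuple of two-tuples in the imported module; the exact entries below are an assumption, since the question doesn't show them:

# Hypothetical contents of the imported module (not shown in the question).
ANIMAL_TYPE_CHOICES = (
    ('dog', 'Dog'),
    ('cat', 'Cat'),
)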
The latter (predefined choices) seems more efficient to me, since types will never be dynamically updated through user interaction, and if a new type needs to be added, a developer will add it, i.e. the code would be updated rather than the database.
However, I would like to know whether this is a good/acceptable practice, or whether I should spend a separate database table on such a "static" entry and also pay the extra cost of database accesses.
Thanks.
The first way has the advantage that you don't have to touch the code in order to add a new type of animal.
And neither does anyone using your app.
Adding a new animal type is trivial, and you shouldn't have to touch working code deployed on a production server just to add an animal type.
If the problem is that your database is empty when the application first starts, and because of that you have no animal types, try Django fixtures: Providing initial data for models
I prefer the second way.
If you don't need to edit types from the admin panel and will always change them through code changes, you do not need a ForeignKey and a separate table.
With a ForeignKey, you get an additional integrity check at the database level.
That can be useful if, for example, you delete a type and do not want orphaned references left in the DB.
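As a rough illustration of that database-level check (using on_delete=PROTECT, which is my choice for the example, not something from the question):

from django.db import models

# Sketch: with a ForeignKey, the database and Django can stop you from
# deleting a type that is still referenced; the default behaviour is to cascade.
class AnimalType(models.Model):
    name = models.CharField(max_length=10)

class Animal(models.Model):
    name = models.CharField(max_length=10)
    type = models.ForeignKey(AnimalType, on_delete=models.PROTECT)

# Deleting an AnimalType that animals still point at now raises
# django.db.models.ProtectedError instead of silently cascading.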
I prefer field choices for performance reasons. Even if the number of potential choices grows, as long as the functionality is just selecting a choice, there's no reason to create an extra table.
I'm building an app that puts together the hardware of a computer. This is my first time working with Django. Say I have the following models:
class Memory(models.Model):
    partNum = models.CharField()
    capacity = models.CharField()

class Computer(models.Model):
    name = models.CharField()
    memory = models.ManyToManyField(Memory)
    # also has cpus, hard drives, and motherboard, but focus on memory for now
One memory object can belong to many computer objects, and one computer object can have many memory objects, hence many-to-many. However, computers require the exact same memory sticks installed if using more than one.
Yet Django's ManyToManyField (by default?) only allows one instance of a given memory-computer relationship; it must be unique. Any way around this?
If I try, in the admin page, to add the same memory object to a computer several times, it says "Computer-memory relationship with this Computer and Memory already exists". If I try adding the same memory object to a computer object more than once in the manage.py shell, it appears that only one memory object was added. If I try to manually edit the database to have a duplicate entry, I get an error saying that the entry already exists. I see that in the database structure, some sort of "unique together" index is enforcing this. If I altered the table to remove that constraint, would that solve my problem? Probably not, unless the Django manager is more naive than I expect.
What are my options? Write my own intermediary model and use the through construct? But then I won't get to use the cool filter_horizontal widget! Rewrite my Computer model to have a ForeignKey field plus a field for the number of memory objects? But then I won't get the ManyToMany API facilities. Help!
Edit: sorry, I did not read your post well enough, about not wanting to use 'through'.
One way to circumvent the problem would be to use the "through" parameter, with which you can manually specify an intermediate model for the many-to-many relationship. In this way, you should still have (most of) the many-to-many facilities that Django provides.
The intermediate model could then have a count field (which I would find easier to manage than having multiple relations):
class Memory(models.Model):
    partNum = models.CharField()
    capacity = models.CharField()

class Computer(models.Model):
    name = models.CharField()
    memory = models.ManyToManyField(Memory, through='ComputerMemory')

class ComputerMemory(models.Model):
    memory = models.ForeignKey(Memory)
    computer = models.ForeignKey(Computer)
    count = models.IntegerField()
For further information, take a look at the Django documentation: https://docs.djangoproject.com/en/dev/topics/db/models/#intermediary-manytomany
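For illustration, a rough sketch of how the through model above could be used; the part number and names are made up for the example:

# Hypothetical usage of the ComputerMemory through model.
ram = Memory.objects.create(partNum='ABC-123', capacity='8GB')
box = Computer.objects.create(name='workstation-01')

# Instead of adding the same Memory object twice, record how many identical
# sticks are installed on the intermediate model.
ComputerMemory.objects.create(computer=box, memory=ram, count=2)

# The usual many-to-many accessors still work for reading.
box.memory.all()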
No, this is not a compromise. Even if you don't create another table for the through relationship, Django will create one anyway to remember exactly which memory is associated with each computer, so you might as well define it yourself... and this also allows you to add other fields that are required for a specific computer with a specific memory.
I have some Django objects like this:
class Award(Model):
    name = CharField(...)
    date = DateTimeField(...)

class Book(Model):
    name = CharField(...)
    award = ForeignKey(Award, blank=True)
(i.e. each Book may or may not have one award)
I want to make a form for adding/editing Awards. Assume lots of Books have already been created. Initially I created a ModelForm like this:
class AwardForm(ModelForm):
    class Meta:
        model = Award
This created a correct form with all the data; however, there was no way to add books to the award (or mark this award as applicable to the books that were selected).
I was going to manually add a ModelMultipleChoiceField to the AwardForm (with a queryset of the Books), then add logic to is_valid that would check which Books were selected, and add something to save to go through the selected Books and set the foreign key on each of those objects to this object.
However, is there a good 'Django' way to do this automatically, i.e. just add a certain field to the Form that will do all of that itself? If I were to change Book.award to an Award.books ManyToMany field, it would probably do all this automatically, but I like the internal logic of having it as a ForeignKey on Book.
I was just going to suggest using inlines, but on re-reading your question, the hard part is selecting objects that already exist and linking them to the Award you are editing.
The way that you have suggested is the first way that I would do it, and the only other way I can think of would involve hacking away at some of the methods on your AwardAdmin to add in the functionality that you desire.
Using the ModelMultipleChoiceField seems like quite a clean way of doing it, to be honest, especially since you shouldn't need much (if any) editing of the save method, as the field should handle that itself?
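For what it's worth, here is a rough sketch of the approach described in the question. The books field name and the save handling are assumptions, not a built-in Django feature, and it assumes Book.award is nullable so old links can be cleared:

from django import forms
from django.forms import ModelForm

class AwardForm(ModelForm):
    # Manually added field listing all books; not generated by the ModelForm.
    books = forms.ModelMultipleChoiceField(
        queryset=Book.objects.all(), required=False)

    class Meta:
        model = Award

    def __init__(self, *args, **kwargs):
        super(AwardForm, self).__init__(*args, **kwargs)
        # When editing an existing award, pre-select the books that already
        # point at it.
        if self.instance.pk:
            self.fields['books'].initial = Book.objects.filter(
                award=self.instance)

    def save(self, commit=True):
        award = super(AwardForm, self).save(commit=commit)
        if commit:
            # Unlink books no longer selected, then point the chosen ones at
            # this award (assumes award is declared with null=True).
            Book.objects.filter(award=award).update(award=None)
            self.cleaned_data['books'].update(award=award)
        return award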
We are migrating the data in several instances of our Django project to a new schema.
The old schema had:
class Group(models.Model):
    pass
class User(models.Model):
    pass
And the new schema has:
class AccessEntity(models.Model):
    pass
class Group(AccessEntity):
    pass
class User(AccessEntity):
    pass
We are trying to use South to do a data migration for these groups and users. http://south.aeracode.org/docs/tutorial/part3.html
I've gathered that I'll need to use forward rules to specify how to migrate the Users but there are a few issues I've run up against.
The main issue is how to keep the ID of the User/Group the same if I were to create a new User object that extends the AccessEntity class.
Users and Groups are referenced by objects they own or that are assigned to them. If I change their IDs, that information would be lost. Is there a way of keeping the same ID for an object even though I need it to now extend from AccessEntity?
I'm not sure if I understand your question correctly, but the way multi-table model inheritance works is that there is an implicit one-to-one field linking the child model to the parent. So both User and Group would use the ID field of AccessEntity, if AccessEntity has such a field.
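To make that concrete, here is roughly what Django generates for the child model under multi-table inheritance (a sketch of the implicit parent link; you would not normally write this yourself):

from django.db import models

class AccessEntity(models.Model):
    pass

class User(AccessEntity):
    # Django adds this parent link implicitly: the user table's primary key is
    # the accessentity_ptr_id column, which points at the matching
    # AccessEntity row.
    accessentity_ptr = models.OneToOneField(AccessEntity, parent_link=True,
                                            primary_key=True)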
If you create AccessEntity with an ID field you can assign to, you can set it when you write a forward (data) migration. That way you can make sure that each AccessEntity gets the right ID.
I have written a longer multi-table inheritance tutorial, and it looks like you are trying to do something similar.
Furthermore, the answer to this question could also be helpful (note that some things in the original answer will not work in newer versions of Django/South; see my tutorial / the answer at the bottom for changes).
What might be a problem in your case is that if you already have data in both User and Group and the ID field is auto-generated, the IDs will likely not be distinct, e.g. you are likely to have both a User and a Group with ID == 1. This could be a problem if you want to query based on those IDs, and of course the ID could not then be a primary key for AccessEntity.
I'm working on a logging app in Django to record when models in other apps are created, changed, or deleted. All I really need to record is the user who did it, a timestamp, a type of action, and the item that was changed. The user, timestamp, and action type are all easy, but I'm not sure what a good way to store the affected item is beyond storing an id value and a class name so the item can be retrieved later. I imagine that storing the class name will result in a bit of a hacky solution in order to find the actual class, so I'm wondering if there's a better way. Does anyone know of one?
Use generic relations, which do just that (store the instance ID and model class) but are integrated into Django; you also get a shortcut attribute that returns the related instance so you don't have to query it yourself. Example usage.
Check out generic relations.
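To illustrate what both answers suggest, a minimal sketch of a log model using the contenttypes framework; the model and field names are my own, not from the question:

from django.contrib.auth.models import User
from django.contrib.contenttypes import generic
from django.contrib.contenttypes.models import ContentType
from django.db import models

class LogEntry(models.Model):
    user = models.ForeignKey(User)
    timestamp = models.DateTimeField(auto_now_add=True)
    action = models.CharField(max_length=20)

    # Generic relation to the affected item: stores its model (via
    # ContentType) and its primary key, and exposes a shortcut attribute that
    # returns the related instance.
    content_type = models.ForeignKey(ContentType)
    object_id = models.PositiveIntegerField()
    affected_object = generic.GenericForeignKey('content_type', 'object_id')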
We're running Django alongside - and sharing a database with - an existing application, and we want to use an existing "user" table (not Django's own) to store user information.
It looks like it's possible to change the name of the table that Django uses, in the Meta class of the User definition.
But we'd prefer not to change the Django core itself.
So we were thinking that we could subclass the core auth.User class like this:
class OurUser(User):
    objects = UserManager()

    class Meta:
        db_table = u'our_user_table'
Here, the aim is not to add any extra fields to the customized User class, but just to use the alternative table.
However, this fails (likely because the ORM assumes that our_user_table should have a foreign key referring back to the original User table, which it doesn't).
So, is this a sensible way to do what we want? Have I missed some easier way to map classes onto tables? Or, if not, can this be made to work?
Update:
I think I might be able to make the change I want just by "monkey-patching" the _meta of User in a local_settings.py:
User._meta.db_table = 'our_user_table'
Can anyone think of anything bad that could happen if I do this? (Particularly in the context of a fairly typical Django / Pinax application?)
You might find it useful to set up your old table as an alternative authentication source and sidestep all these issues.
Another option is to subclass the user model and have the subclass point to your user table. Override the save method to ensure that everything you need to preserve your old functionality is there.
I haven't done either of these myself but hopefully they are useful pointers.
Update
What I mean by alternative authentication in this case is a small Python script that says "Yes, this is a valid username/password". It then creates an instance of the model in the standard Django table, copies fields across from the legacy table, and returns the new user to the caller.
If you need to keep the two tables in sync, you could decide to have your alternative authentication never create a standard Django user and just say "Yes, this is a valid username and password".
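A rough sketch of such an authentication backend; the LegacyUser model, its fields, and the password check are assumptions about your legacy table, and the backend would still need to be listed in AUTHENTICATION_BACKENDS in settings.py:

from django.contrib.auth.models import User

# LegacyUser is a hypothetical model mapped onto your existing user table.
from myapp.models import LegacyUser

class LegacyTableBackend(object):

    def authenticate(self, username=None, password=None):
        # Validate the credentials against the legacy table.
        try:
            legacy = LegacyUser.objects.get(username=username)
        except LegacyUser.DoesNotExist:
            return None
        if not legacy.check_password(password):  # however your app verifies it
            return None
        # Mirror the record into the standard Django user table on first login.
        user, created = User.objects.get_or_create(
            username=legacy.username,
            defaults={'email': legacy.email},
        )
        return user

    def get_user(self, user_id):
        try:
            return User.objects.get(pk=user_id)
        except User.DoesNotExist:
            return None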