I am beginning to learn Python and Django. If I have a simple class such as "player" with some properties like name, points, and inventory, how would I make the class also write those values to the database when they change? My thinking is that I create Django data models and then call the .save method within my classes. Is this correct?
You are correct that you call the save() method to save models to your db, but you don't have to define the save method within your model classes if you don't want to. It would be extremely helpful to go through the Django tutorial, which explains all of this.
https://docs.djangoproject.com/en/dev/intro/tutorial01/
https://docs.djangoproject.com/en/dev/topics/db/models/
Explains django models
Django uses its own ORM (object-relational mapper).
It does exactly what it sounds like: it maps your Django/Python objects (models) to your backend.
It provides a sleek, intuitive, Pythonic, very easy-to-use interface for creating models (tables in your RDBMS), adding data, and retrieving data.
First you would define your model:
class Player(models.Model):
    points = models.IntegerField()
    name = models.CharField(max_length=255)
Django provides commands for changing this Python object into a table:
python manage.py syncdb
You could also use python manage.py sql <appname> to show the actual SQL that Django generates to turn this object into a table.
Once you have storage for this object, you can create new instances in the same manner you would create Python objects:
new_player = Player(points=100, name='me')
new_player.save()
Calling save() actually writes the object to your backend.
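As a rough sketch of the retrieval side (this part isn't in the original answer; the queries below are just illustrative):

# Fetch rows back as Python objects through the same ORM.
me = Player.objects.get(name='me')
me.points += 10
me.save()  # writes the updated row back to the database

top_players = Player.objects.filter(points__gte=50)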
You're spot on...
Start at https://docs.djangoproject.com/en/dev/intro/tutorial01/
Make sure you have the python bindings for MySQL and work your way through it... Then if you have specific problems, ask again...
I have a few databases in MongoDB that I want to create models for dynamically, since there are many databases and I cannot do it manually. Questions:
What should my models.py look like? (Does inspectdb work with mongodb databases or only SQL based dbs?)
Since the database models are created dynamically, how do I code the serializer class to return the dynamic fields?
Thanks in advance.
Django's object-relational mapper is aimed at traditional relational databases. While there are a number of MongoDB packages for Django, none of them support inspectdb to construct your models. Either way, inspectdb is a kludge designed as a one-off process to help a one-off migration away from a legacy system, i.e. you'd build your models.py file once and never run inspectdb again. This is not what you want, as you seem to want dynamic models that can be added or altered at runtime.
On the bright side, Django MongoDB Engine has some support for arbitrary embedded models within pre-defined models. But even then they don't seem too supportive of it:
As you can see, generic embedded models add a lot of overhead that bloats up your data records. If you want to use them anyway, here’s how you’d do it...
In summary, try to build your models as best you can to actually match your requirements. If you know nothing about your models ahead of production, then perhaps Django isn't the right solution for you.
Look at this Django ORM code:
my_instance = MyModel()
my_instance.some_related_object = OtherModel.objects.using('other_db').get(id)
At this point, in the second line, Django will throw an error:
ValueError: Cannot assign "<OtherModel: ID>": instance is on database "default", value is on database "other_db"
To me, it doesn't make much sense. How can Django tell which database my_instance is on, if I haven't even called:
my_instance.save(using='some_database')
yet?
I guess that during the construction of an object, Django automatically assigns it to the default database. Can I change that? Can I specify the database when creating an object, by passing an argument to its constructor? According to the documentation, the only arguments I can pass when creating an object are the values of its fields. So how can I solve my problem?
In Django 1.8 there is a new method called Model.from_db (https://docs.djangoproject.com/en/1.8/ref/models/instances/), but I'm using an earlier version of Django and can't switch to the newer one now. Looking at the implementation, all it does is set two of the model's attributes:
instance._state.adding = False
instance._state.db = db
So would it be enough to change my code to:
my_instance = MyModel()
my_instance._state.adding = False
my_instance._state.db = 'other_db'
my_instance.some_related_object = OtherModel.objects.using('other_db').get(id)
or is it too late to do it, because those flags are used in the constructor and have to be set in the constructor only?
You might want to look into database routing, which has been supported since Django 1.2. This will let you set up multiple databases (via "routers") for different models.
You can create a custom database router (a class inheriting from the built-in object type), with db_for_read and db_for_write methods that return the name of the database (as defined in the DATABASES setting) that should be used for the model passed into that method. Return None to let Django figure it out.
It's usually used for handling master-slave replication, so you can have a separate read-only database from your writeable one, but the same logic would apply to let you specify that certain models live in certain databases.
You would probably also want to define an allow_syncdb method so that only the models you want to appear in database B will appear there, and everything else will appear in database A.
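As a rough sketch of such a router (the app label 'otherapp', the database alias 'other_db', and the module path below are placeholders, not something from the original question):

class OtherAppRouter(object):
    """Route models of the hypothetical app 'otherapp' to the 'other_db' database."""

    def db_for_read(self, model, **hints):
        if model._meta.app_label == 'otherapp':
            return 'other_db'
        return None  # let Django fall back to its normal behaviour

    def db_for_write(self, model, **hints):
        if model._meta.app_label == 'otherapp':
            return 'other_db'
        return None

    def allow_syncdb(self, db, model):
        # only create 'otherapp' tables in 'other_db', everything else in 'default'
        if model._meta.app_label == 'otherapp':
            return db == 'other_db'
        return db == 'default'

You would then point the DATABASE_ROUTERS setting at this class, e.g. DATABASE_ROUTERS = ['myproject.routers.OtherAppRouter'].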
Django knows which database each object comes from because it notes this in the object's internal properties. A QuerySet also stores this information within itself.
Actually, database routing isn't really needed to achieve what you want here.
Consider the following code fragment:
my_instance = MyModel()
my_instance.some_related_object_id = OtherModel.objects.using('other_db').get(id).id
Note how I assign just the ID, not the object itself.
You will lose the actual object here, but gain the ability to store referential data.
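If the full object is needed later, it can still be fetched explicitly from the other database; a quick follow-up sketch (not part of the original answer):

related = OtherModel.objects.using('other_db').get(pk=my_instance.some_related_object_id)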
AFAIK there's no API to change an object's associated database.
I am currently using Django framework including its Models mechanism to abstract the database schema declaration and general db access, which is working fine for most scenarios.
However, my application also requires tables to be created and accessed dynamically during runtime, which as far as I can see, is not supported by Django out of the box.
These tables usually have an identical structure, and can basically be abstracted by the same Model class, but Django doesn't let you change the underlying db_table of a certain model query, as it is declared on the Model class and not on the Manager.
My solution for this is to do this process whenever I need a new table to be created, populated and accessed:
Create and populate the table using raw sql
Add indexes to the table using raw sql
When I need to access the table (using django queryset api), I declare a new type dynamically and return it as the model for the query, by using this code:
table_name = ...  # name of the table created by raw SQL
model_name = '%d_%s' % (connection.tenant.id, table_name)
try:
    model = apps.get_registered_model('myapp', model_name)
    return model
except LookupError:
    pass
logger.debug("no model exists for model %s, creating one" % model_name)

class Meta:
    db_table = table_name
    managed = False

attrs = {
    'field1': models.CharField(max_length=200),
    'field2': models.CharField(max_length=200),
    'field3': models.CharField(max_length=200),
    '__module__': 'myapp.models',
    'Meta': Meta,
}
model = type(str(model_name), (models.Model,), attrs)
return model
Note that I do check whether the model is already registered in Django, and I use the existing model if it is. The model name is always unique for each table. Since I'm using multiple tenants, the tenant id is also part of the model name to avoid conflicts with similar tables declared on different schemas.
In case it's not clear: the tables created dynamically will and should be persisted permanently for future sessions.
This solution works fine for me so far.
However, the application will need to support a large number of these tables, i.e. 10,000 - 100,000 such tables (and corresponding model classes), with up to a million rows per table.
Assuming the underlying db is fine with this load, my questions are:
Do you see any problem with this solution, with or without regard to the expected scale?
Does anybody have a better solution for this scenario?
Thanks.
There is a wiki page on creating models dynamically, although it has been a while since it was last updated:
DynamicModels Django
There are also a few apps that are designed for this use case, but I don't think any of them is being actively maintained:
Django Packages: Dynamic models
I understand that if you are already committed to Django this isn't very helpful, but this is a use case for which Django isn't really good. It might be more costly to fight against the abstractions provided by Django's model layer than to just use psycopg2 or whatever other adapter is appropriate for your data.
Depending on what sort of operations you are going to perform on your data, it may also be more reasonable to use a single model with an indexed field that lets you distinguish which logical table a row belongs to, and then shard the data by that column.
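A rough sketch of that single-model alternative, with made-up field names (this is my illustration, not code from the question):

from django.db import models

class Record(models.Model):
    # indexed discriminator column: which logical "table" this row belongs to
    bucket = models.CharField(max_length=100, db_index=True)
    field1 = models.CharField(max_length=200)
    field2 = models.CharField(max_length=200)
    field3 = models.CharField(max_length=200)

# all rows of one logical table:
# Record.objects.filter(bucket='42_results')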
If you still need to do this, the general idea would be:
Create a metaclass that extends Django's ModelBase. This metaclass you would use as a factory for your actual models.
Consider stuff mentioned on that wiki page, like circumventing the app_label issue.
Generate and execute the sql for the creation of the model as also shown on the wiki page.
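For step 3, one possible sketch (assuming Django 1.7+, where the schema editor is available; the wiki page also shows older alternatives) is to hand the dynamically built model class to the schema editor:

from django.db import connection

def create_table_for(model):
    # Emits and executes the CREATE TABLE statement for a dynamically
    # built model class, using Django's schema editor.
    with connection.schema_editor() as schema_editor:
        schema_editor.create_model(model)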
How do I make syncdb execute SQL queries (for table creation) defined by me, rather than generating the tables automatically?
I'm looking for this solution as some particular models in my app represent SQL-table-views for a legacy-database table.
So, I've created their SQL-views in my django-DB like this:
CREATE VIEW legacy_series AS SELECT * FROM legacy.series;
I have a reverse engineered model that represents the above view/legacytable. But whenever I run syncdb, I have to create all the views first by running sql scripts, otherwise syncdb simply creates tables for them (if a view is not found).
How do I make syncdb run the above mentioned SQL?
There are 2 possible approaches I know of to adapt your models to a legacy database table (without using views that is):
1) Run python manage.py inspectdb within your project. This will generate models for the existing database tables; you can then continue to work with those.
2) Adjust your models with some specific settings to match the legacy database. First you define the table name in your model by setting the db_table option in your Meta options. Secondly, you define for each field the column name to match your legacy database by setting the db_column option. Note that there are other db_* options you could possibly use to match your legacy database.
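A small illustration of option 2 (all of the names below are made up; point db_table and db_column at whatever your legacy schema actually uses):

from django.db import models

class Series(models.Model):
    title = models.CharField(max_length=200, db_column='SERIES_TITLE')
    year = models.IntegerField(db_column='SERIES_YEAR')

    class Meta:
        db_table = 'series'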
If you really want the views approach, an (ugly) workaround is possible: you can define custom SQL commands per application model. The file lives at "application"/sql/"model".sql. Django runs this SQL after it has created all tables. You can try to put DROP statements for the generated tables, followed by your CREATE VIEW statement, in this file. Note that this will be a bit tricky for tables with foreign keys, as Django guarantees no order of execution of these files (so stuffing all statements into one .sql will probably be the easiest way; I've never tried this before).
You could use unmanaged models for your reverse-engineered models, and initial SQL scripts for creating your views.
EDIT:
A bit more detailed answer. When you use unmanaged models, syncdb will not create your database tables for you, so you have to take care of that yourself. An important point is the table name and how Django maps Model classes to table names; I suggest you read the docs on that point.
Basically, your Series model will look like this:
class Series(models.Model):
    # model fields...
    ...

    class Meta:
        managed = False
        db_table = "legacy_series"
Then, you can put your SQL commands in the yourapp/sql/series.sql file:
-- yourapp/sql/series.sql
CREATE VIEW legacy_series AS SELECT * FROM legacy.series;
You can then syncdb as usual, and start using your legacy models.
We're running django alongside - and sharing a database with - an existing application. And we want to use an existing "user" table (not Django's own) to store user information.
It looks like it's possible to change the name of the table that Django uses, in the Meta class of the User definition.
But we'd prefer not to change the Django core itself.
So we were thinking that we could sub-class the core auth.User class like this :
class OurUser(User):
    objects = UserManager()

    class Meta:
        db_table = u'our_user_table'
Here, the aim is not to add any extra fields to the customized User class. But just to use the alternative table.
However, this fails (likely because the ORM is assuming that the our_user_table should have a foreign key referring back to the original User table, which it doesn't).
So, is this a sensible way to do what we want? Have I missed some easier way to map classes onto tables? Or, if not, can this be made to work?
Update:
I think I might be able to make the change I want just by "monkey-patching" the _meta of User in a local_settings.py
User._meta.db_table = 'our_user_table'
Can anyone think of anything bad that could happen if I do this? (Particularly in the context of a fairly typical Django / Pinax application?)
You might find it useful to set up your old table as an alternative authentication source and sidestep all these issues.
Another option is to subclass User and have the subclass point to your user table. Override the save method to ensure that everything you need to do to preserve your old functionality is there.
I haven't done either of these myself but hopefully they are useful pointers.
Update
What I mean by alternative authentication in this case is a small Python script that says "Yes, this is a valid username / password". It then creates an instance of the model in the standard Django table, copies fields across from the legacy table, and returns the new user to the caller.
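A very rough sketch of that idea as a custom authentication backend (LegacyUser, its fields, and the password check are hypothetical placeholders for however the legacy table actually works):

from django.contrib.auth.models import User

class LegacyBackend(object):
    def authenticate(self, username=None, password=None):
        try:
            legacy = LegacyUser.objects.get(username=username)  # hypothetical model over the legacy table
        except LegacyUser.DoesNotExist:
            return None
        if not legacy.password_matches(password):  # however the legacy app checks passwords
            return None
        # Mirror the legacy record into Django's own user table.
        user, _ = User.objects.get_or_create(username=username)
        user.email = legacy.email
        user.save()
        return user

    def get_user(self, user_id):
        try:
            return User.objects.get(pk=user_id)
        except User.DoesNotExist:
            return None

Such a backend would then be listed in the AUTHENTICATION_BACKENDS setting so Django consults it at login time.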
If you need to keep the two tables in sync, you could decide to have your alternative authentication never create a standard django user and just say "Yes, this is a valid password and username"