I have a typical Django project with one primary database where I keep all the data I need.
Suppose there is another DB somewhere with some additional information. That DB isn't directly related to my Django project, so let's assume I don't even have control over it.
The problem is that I don't know whether I should create and maintain a model for this external DB so I can use Django's ORM, or whether the best solution is to use raw SQL to fetch data from the external DB and then use that info to filter data from the primary DB with the ORM, or directly in views.
Creating a model seems reasonable, but since the DB isn't part of my project I won't be aware of possible schema changes, which looks like bad practice.
So, in the end, if I have external resources such as DBs that are not related to my project but are needed by it, should I:
Try to create Django models for them,
Use raw SQL to get info from the external DB and then use it to filter data from the primary DB with the ORM, as well as using the data directly in views if needed, or
Use raw SQL for both the primary and the external DB where they intersect in the app's logic?
An alternative is to use SQLAlchemy for the external database. It can use reflection to generate the SQLAlchemy equivalent of Django models at runtime.
It still won't be without issues: if your code depends on a certain column, it will still break if that column is removed or changed in an incompatible way. However, it adds a bit more flexibility to your database interactions. For example, a Django model would definitely break if an int column is changed to a varchar column, but with database reflection it will only break if your code depends on the fact that it is an int; if you simply display the data, it will remain fully functional. Keep in mind, though, that there is always a chance a change doesn't break the system but causes unexpected behaviour instead.
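For illustration, here is a minimal reflection sketch (SQLAlchemy 1.4+ style); the connection URL and the routes table name are placeholders, not something from the question:

```python
# Minimal SQLAlchemy reflection sketch; connection URL and table name are placeholders.
from sqlalchemy import MetaData, Table, create_engine, select

engine = create_engine("postgresql://user:password@external-host/external_db")
metadata = MetaData()

# Build the table definition from the live database schema at runtime
routes = Table("routes", metadata, autoload_with=engine)

with engine.connect() as conn:
    # Columns are discovered dynamically, so harmless schema changes
    # (e.g. a column changing type) don't require regenerating models
    rows = conn.execute(select(routes).limit(10)).fetchall()
```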
If, like Benjamin said, the external system has an API, that would be the preferred choice.
I suggest you read about inspectdb and database routers. It's possible to use the Django ORM to manipulate an external DB.
https://docs.djangoproject.com/en/1.7/ref/django-admin/#inspectdb
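As a rough sketch of such a router, assuming an "external" entry in DATABASES and an "external_app" app label for the models generated by inspectdb (both names are made up here):

```python
# settings.py would also need DATABASE_ROUTERS = ["path.to.ExternalRouter"];
# the alias and app label below are placeholders.
class ExternalRouter:
    """Send all queries for the inspectdb-generated models to the external DB."""

    app_label = "external_app"

    def db_for_read(self, model, **hints):
        return "external" if model._meta.app_label == self.app_label else None

    def db_for_write(self, model, **hints):
        return "external" if model._meta.app_label == self.app_label else None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Never migrate the schema we don't control
        if app_label == self.app_label:
            return False
        return None
```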
I would create minimal Django models for the external database, i.e. only for the tables that interact with your code (see the sketch after the list below).
There are several outcomes to this:
If parts of the database you're not interested in change, it won't have an impact on your app.
If the external models you're using change, you probably want to be aware of that as quickly as possible (your app is likely to break in that case too).
All the relational databases queries in your code are handled by the same ORM.
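For instance, a minimal sketch of such a model, with made-up table and field names, and managed = False so Django never tries to create or migrate the external table:

```python
from django.db import models


class ExternalRoute(models.Model):
    # Only the columns the app actually reads; anything else in the table
    # is simply ignored by Django.
    external_id = models.IntegerField(primary_key=True)
    name = models.CharField(max_length=255)
    start_city = models.CharField(max_length=100)

    class Meta:
        managed = False        # Django never creates or migrates this table
        db_table = "routes"    # existing table name in the external database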
Related
I'm a few weeks into Python/Django and encountering an annoying problem. I have some existing databases set up in settings.py; everything looks good, and I've even accessed the databases using connections[].cursor()
But the databases (and data) are not making their way into the models that I want to use, despite running the makemigrations and migrate commands. I was able to use py manage.py inspectdb --database-dbname and copied that class information manually into my models.py, but that didn't work either (typing py manage.py inspectdb on its own does not pull up these databases; I was only able to view them via that --database option). So I'm stumped, as it seems I'm doing all the right steps but am not able to use these existing databases in Django.
Any other hints and steps I can take are welcome!
(Almost) all the tutorials, examples, and third-party apps you'll find on the internet, and most of the Django documentation, assume you use one database for your app. That's because it's fairly tricky and unusual to use multiple databases in one app.
But it's not impossible to use multiple databases and the documentation contains instructions on how to do this and what changes you'll need to make to make it work.
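Concretely, the settings change the docs describe roughly amounts to a second DATABASES entry plus a router. A sketch, with placeholder aliases and credentials:

```python
# settings.py (sketch); "external" is an arbitrary alias for the second DB.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "app_db",
        "USER": "app_user",
        "PASSWORD": "secret",
        "HOST": "localhost",
    },
    "external": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "external_db",
        "USER": "readonly_user",
        "PASSWORD": "secret",
        "HOST": "external-host",
    },
}

# Routers decide which alias each query goes to (see the Django docs)
DATABASE_ROUTERS = ["myproject.routers.ExternalRouter"]
```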
IMO, these are the pre-conditions to use multiple databases in one project:
The databases contain explicitly unrelated information, i.e. you won't have SQL relationships between tables in different databases. One database may contain a table with a column that maps to a column in a table in another database, but they aren't explicit (no ForeignKey or ManyToManyField in your models).
You don't need to mix databases in one query: this basically derives from the previous condition. It just means that if you need to get objects from one database that depend on rows coming from another database, you establish the relationship in Python, e.g. fetching a list of names from one database and using that list to filter a queryset on the other database.
For example, if you have an existing database that contains Strava routes (which are regularly updated via some external mechanism) and your app is a broader app that helps users get to know their neighbourhood, where they can recommend locations and things to do, being able to offer a list of routes with a nearby starting point might be something you'd want to show.
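To make the "relationship in Python" idea concrete with that example, a rough sketch (the Recommendation and Route models and the "strava" database alias are assumptions):

```python
# Hypothetical models: Recommendation lives in the default DB, Route in "strava"
from myapp.models import Recommendation, Route

# Fetch the values you need from the default database first...
city_names = list(
    Recommendation.objects.values_list("city", flat=True).distinct()
)

# ...then use them to filter the queryset on the other database.
# No ForeignKey crosses the database boundary; the "join" happens in Python.
nearby_routes = Route.objects.using("strava").filter(start_city__in=city_names)
```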
Now that you know this, the way to go is described in the doc linked above:
Create a database router so that queries for certain models are automatically routed to the correct database. E.g. Route.objects.filter(start_city=city) would automatically fetch routes from your Strava routes database.
If you need to save information about a route in your app, save it in a model in the default database and use a unique identifier of the route that will map to the strava database. Use separate queries (no relationships) to fetch information about a specific route.
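A sketch of that loose-reference pattern (the RouteBookmark model and field names are hypothetical):

```python
from django.conf import settings
from django.db import models


class RouteBookmark(models.Model):
    # Lives in the default database; no ForeignKey into the strava database,
    # just a plain identifier that maps to a row over there.
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    strava_route_id = models.BigIntegerField()


# Two separate queries instead of a relationship:
# bookmark = RouteBookmark.objects.get(pk=1)
# route = Route.objects.using("strava").get(pk=bookmark.strava_route_id)
```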
That being said, if the Strava database is not regularly updated via third-party channels and its purpose is just to pre-populate your default database, then export the data from the Strava database as JSON and import it into your Django DB using manage.py loaddata or a migration file, the latter being more flexible as to the structure of the JSON file.
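If you go the migration-file route, the import can be a small data migration; in this sketch the file path, app, model and field names are all placeholders:

```python
import json

from django.db import migrations


def load_routes(apps, schema_editor):
    # Use the historical model, as recommended for data migrations
    Route = apps.get_model("myapp", "Route")
    # Placeholder path to the exported JSON file
    with open("fixtures/strava_routes.json") as fh:
        for item in json.load(fh):
            Route.objects.create(name=item["name"], start_city=item["start_city"])


class Migration(migrations.Migration):

    dependencies = [("myapp", "0001_initial")]

    operations = [migrations.RunPython(load_routes, migrations.RunPython.noop)]
```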
I am setting up a web server based on the Django framework with PostgreSQL. I need to update the models in my Django app every hour, and I am wondering whether it is better to update the database from a separate script using Python and psycopg2, or to use a script that connects to the Django models and updates them through the Django model API. This database is used only by the Django web server.
Summarizing: which is the better approach?
update_db.py that connects to the database via psycopg2 and updates the tables directly, or
update_db.py that connects to the Django app models and updates them using the Django model API?
I will run the script every hour using cron, and the Django project is already connected to that specific database and works fine. I am just asking which is better in terms of performance and logic. The data won't be updated very often, but it will involve big JSONs.
Unless you have very compelling reasons to do otherwise, the obvious solution here is a custom management command using the ORM: your code will live in the obvious place (for someone else having to work on the project), can be tested as part of your whole project's test suite, and won't require mentally "translating" from raw SQL to Django ORM code. Not to mention that you'll have a better chance of catching a mismatch between your Django models and your script's code when your schema changes; from experience, raw SQL scripts that aren't under test are usually forgotten when doing a schema migration.
FWIW, optimizing "for performance" doesn't really make sense in your case, unless your updates are long and complex and block access to the site - but even then the overhead of the ORM is certainly not going to be the main bottleneck (just make sure you use it properly).
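A minimal sketch of such a command, to be run hourly from cron as python manage.py update_external_data; the app, model and fetch_payload helper are hypothetical names for illustration:

```python
# myapp/management/commands/update_external_data.py (sketch)
from django.core.management.base import BaseCommand

from myapp.models import MyModel          # hypothetical model
from myapp.services import fetch_payload  # hypothetical helper returning a list of dicts


class Command(BaseCommand):
    help = "Refresh MyModel rows from the external data source"

    def handle(self, *args, **options):
        for item in fetch_payload():
            # Upsert each record keyed on its external identifier
            MyModel.objects.update_or_create(
                external_id=item["id"],
                defaults={"payload": item},
            )
        self.stdout.write(self.style.SUCCESS("Update complete"))
```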
I have a few databases in MongoDB that I want to create models for dynamically, since there are many databases and I cannot do it manually. Questions:
What should my models.py look like? (Does inspectdb work with mongodb databases or only SQL based dbs?)
Since the database models are created dynamically, how do I code the serializer class to return the dynamic fields?
Thanks in advance.
Django supports an object-relational mapper that is aimed at traditional relational databases. While there are a number of MongoDB packages for Django, none of them support inspectdb to construct your models. Either way, inspectdb is a kludge designed as a one-off process to help a one-off migration away from a legacy system, i.e. you'd build your models.py file once and never run inspectdb again. This is not what you want to do, as you seem to want dynamic models that can be added or altered at runtime.
On the bright side, Django MongoDB Engine has some support for arbitrary embedded models within pre-defined models. But even then they don't seem too supportive of it:
As you can see, generic embedded models add a lot of overhead that bloats up your data records. If you want to use them anyway, here’s how you’d do it...
In summary, try to build your models as best you can to actually match your requirements. If you know nothing about your models ahead of production, then perhaps Django isn't the right solution for you.
Recently I've seen an app powered by Django with MongoDB as the backend. The thing is, that app doesn't have a models.py file; all the data is inserted directly in views.py. I just need a little clarification about this particular thing: using Django without models.py with MongoDB.
models.py is the Django ORM's way of describing a fixed relational schema and generating the relevant SQL to initialize (or modify) the database. "ORM" stands for "Object-Relational Mapping".
Mongo is not relational, hence you don't need this type of schema.
(Of course, that can cause a lot of other problems if the needs of your project change later...)
But you don't need a relational schema since you're not using a relational DB.
A short answer
models.py is part of the ORM that comes free with Django.
An ORM maps your SQL schema onto object-oriented (OOP) objects.
You can read more about ORMs here: https://en.wikipedia.org/wiki/Object-relational_mapping
When using a NoSQL database, you can push objects directly into the DB, so you do not really need an ORM (see the sketch below).
That said, whether to use one or not is debatable.
P.S. Even when using SQL, some people prefer other ORMs instead of Django's built-in models.
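For instance, a tiny sketch of writing a document straight from a view with pymongo, with no models.py involved (database and collection names are placeholders):

```python
from django.http import JsonResponse
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["mysite"]


def save_event(request):
    # The dict becomes the stored document as-is; no schema declared anywhere
    doc = {"path": request.path, "method": request.method}
    db.events.insert_one(doc)
    return JsonResponse({"saved": True})
```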
I have some models I'm working with in a new Django installation. Is it possible to change the fields without losing app data?
I tried changing the field and running python manage.py syncdb. There was no output from this command.
Renavigating to admin pages for editing the changed models caused TemplateSyntaxErrors as Django sought to display fields that didn't exist in the db.
I am using SQLite.
I am able to delete the db file, then re-run python manage.py syncdb, but that is kind of a pain. Is there a better way to do it?
Django does not ever alter an existing database column. Syncdb will create tables, but it does not do 'migrations' as found in Rails, for instance. If you need something like that, check out Django South.
See the docs for syncdb:
Syncdb will not alter existing tables
syncdb will only create tables for models which have not yet been installed. It will never issue ALTER TABLE statements to match changes made to a model class after installation. Changes to model classes and database schemas often involve some form of ambiguity and, in those cases, Django would have to guess at the correct changes to make. There is a risk that critical data would be lost in the process.
If you have made changes to a model and wish to alter the database tables to match, use the sql command to display the new SQL structure and compare that to your existing table schema to work out the changes.
You have to change the column names in your DB manually through whatever administration tools sqlite provides. I've done this with MySQL, for instance, and since MySQL lets you change column names without affecting your data, it's no problem.
Of course there is.
Check out South
You'll have to manually update the database schema/layout, if you're only talking about adding/removing columns.
If you're attempting to rename a column, you'll have to find another way.
You can use the python manage.py sql [app name] command (http://docs.djangoproject.com/en/dev/ref/django-admin/#sql-appname-appname) to see what the new SQL should look like, i.e. which columns, of what type/specification, Django would have you add, and then manually run the corresponding ALTER TABLE commands.
There are some apps/projects that enable easier model/DB management, but Django doesn't support this out of the box, on purpose/by design.