I need a way of reading data in Django from different databases using different models, because the models and fields in the database have changed between the projects.
What I am trying to do is this:
from sbo.core import models as sbo_core_models
from sbo_cloud.core import models as cloud_core_models
company_details = sbo_core_models.CompanyDetails.objects.using('sbo').filter(company=sbo_company).order_by("id")[0]
new_company_details = cloud_core_models.CompanyDetails.objects.get(id=int(reply['id']))
The model that actually gets used for new_company_details is sbo_core_models.CompanyDetails and not cloud_core_models.CompanyDetails: the resulting object is missing properties that only exist in the second one.
Any idea why this might happen and what I am doing wrong? From what I've seen, it uses the models that I import first, no matter which model I tell it to use.
I am using Python 2.7 and Django 1.3.3.
All you need is in the Django documentation:
https://docs.djangoproject.com/en/1.3/topics/db/multi-db/
I think that the best solution for your case is a database router.
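For example, a minimal router sketch, assuming the second database is configured under the alias 'sbo' in DATABASES and that the old models carry a hypothetical app_label 'sbo_core' (adjust both to your project):

class SboRouter(object):
    """Route reads and writes for the legacy app to the 'sbo' database."""

    def db_for_read(self, model, **hints):
        if model._meta.app_label == 'sbo_core':
            return 'sbo'
        return None  # fall through to the default database

    def db_for_write(self, model, **hints):
        if model._meta.app_label == 'sbo_core':
            return 'sbo'
        return None

# settings.py (Django 1.3):
# DATABASE_ROUTERS = ['path.to.routers.SboRouter']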
Related
The Stack Overflow question below mentions how to fetch all related objects in Django:
Get all related Django model objects
In short, the solution that best fits my needs would be something like the code below:
from django.contrib.admin.utils import NestedObjects
collector = NestedObjects(using="default") #database name
collector.collect([objective]) #list of objects. single one won't do
print(collector.data)
Basically, I'm trying to figure out how to take the same approach using SQLAlchemy.
Thanks in advance!!
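For what it's worth, one way to approximate this with SQLAlchemy (a rough sketch only, assuming ordinary declarative models; SQLAlchemy has no direct equivalent of NestedObjects) is to walk each mapper's relationships recursively:

from sqlalchemy import inspect

def collect_related(obj, seen=None):
    """Recursively gather objects reachable through mapped relationships."""
    if seen is None:
        seen = set()
    if obj in seen:
        return seen
    seen.add(obj)
    mapper = inspect(type(obj))
    for rel in mapper.relationships:
        value = getattr(obj, rel.key)
        if value is None:
            continue
        # one-to-many / many-to-many give collections, the rest a single object
        for item in (value if rel.uselist else [value]):
            collect_related(item, seen)
    return seen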
I am quite new to Django. Right now I am working on a prototype with ~30 database tables. I have understood that with Django you preferably want to have quite small Apps, with a limited number of Models.
I want to use the same Models (i.e. database tables) in several different Apps. What is the best practice to achieve this? I am using MySQL.
This might be quite late for your question.
You may use the same model from a different app just by referring to it.
First, your apps should be under the same project.
Choose one app as the 'core' and define your models under it (I have read that models are better kept at the app level).
Then whenever you would like to use a model from that app, import that app's model at the beginning of the code.
Let's say we have two apps: app_1 and app_2
and a model defined under app_1/models.py: model_1
In app_2/views.py, import model_1:
from app_1.models import model_1
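For example, a hypothetical view in app_2 could then use the shared model directly (the view and template names below are placeholders):

# app_2/views.py
from django.shortcuts import render
from app_1.models import model_1

def model_1_list(request):
    # query the shared model exactly as if it were defined in app_2
    objects = model_1.objects.all()
    return render(request, "app_2/model_1_list.html", {"objects": objects})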
I like the Django ORM. It's simple, easy to use, and reasonably powerful.
I'm currently developing some internal sites for the VFX company I work for, for which I've used Django. In the meantime, we are developing other python applications and libraries to be used in various contexts in production. There are a number of places in which our core library needs to interact with some databases, and using an ORM like Django would really help things. I'm aware of other options like SQLAlchemy or Peewee, but I'd like to see if Django will work, since I use it on the websites and I like its API better.
Using Django as an ORM in a library is tricky (as I explored in a previous question), because Django expects to be used as a website with "apps". In a library, I might want to define any number of data models, which would exist in appropriate places in the library but not inside any Django app (as we're not using any other parts of the framework). So far so good.
I can create a base class for my models anywhere in the library as follows:
from django.db import models
from django.apps import apps
import django.conf

django.conf.settings.configure(
    DATABASES = ...
)
apps.populate((__name__,))

class LibModel(models.Model):
    class Meta:
        abstract = True
        app_label = __name__
Then anywhere in the library I can create my own models with this base class. Since I'm not relying on the "app" for the database table names, I need to state them explicitly.
class SpecificModel(LibModel):
    # fields go here

    class Meta(LibModel.Meta):
        db_table = "specific_model_table_name"
This gets around my concern of having to simulate the structure of an "app". The app_label set from __name__ in the base class supplies Django with all it needs, and then Django quits whining about not finding an app. The other model files can live wherever they want.
However, there is a glaring use case where this all falls apart. Say that my Django web application wants to use some functionality from the company core python library, which now uses the Django ORM for various things. Since I make a call to django.conf.settings.configure in the library, Django is going to scream about defining the settings more than once when it tries to run the main application.
So basically, a library using the Django ORM is incompatible with Django. Wonderful.
Is there any way around this? I mean, it's a lovely ORM - is it really this impossible to use in a standalone modular way? Is the Django architecture utterly singleton in nature, making this impossible?
*Not a duplicate
I'm trying to have a company python library that uses Django as an ORM. Some of the things that could depend on it might be Django websites themselves. How do I get around Django's singleton insistence on configuring settings only once? Or is it even possible? None of those answers address this!
You can check whether Django has already been configured:
import django
from django.apps import apps
from django.conf import settings

if not apps.ready:
    settings.configure()
    django.setup()
When starting a Django application, the core python library can be configured as a separate app and loaded on startup.
Also, check this answer on dynamic app loading at runtime.
A simple answer is how to initialize Django in a standalone application in a way that stays compatible with Django applications:
import os
import django

if 'DJANGO_SETTINGS_MODULE' not in os.environ:
    os.environ['DJANGO_SETTINGS_MODULE'] = 'mysettings'
# # or, without DJANGO_SETTINGS_MODULE, configure directly:
# from django.conf import settings
# settings.configure(DATABASES=... other...)
django.setup()

# this import shouldn't run before DJANGO_SETTINGS_MODULE is set or settings.configure(...) is called
from myapp.models import MyModel

# this shouldn't be called before django.setup()
queryset = MyModel.objects.filter(...)
This is more compatible with Django than the answer by Oleg Russkin. (There, a cyclic dependency at django.setup() is possible if the code is itself called inside a setup() initiated by similar code, or by a normal Django project started by manage.py, where django.setup() is likewise called internally by execute_from_command_line(sys.argv). The setup initializes all modules related to INSTALLED_APPS, all urls modules, and consequently all views. Many of them are lazy, but still: if any code called by setup() depends on this, then the not apps.ready condition doesn't help, because setup() is not reentrant and the startup fails.)
A much more general answer
An important concept of Django is support for writing reusable parts of code ("applications" in Django terminology) that can be developed and tested independently. There can also be a tree of dependencies, but uncontrolled mutual dependencies should be avoided if possible. Reusable applications are expected to be easy to combine into a whole project (a "project" in Django terminology is the whole, with all settings necessary to run it by Python).
The only unavoidable and useful "singletons" in the Django ORM are the database connections django.db.connections and django.conf.settings, especially INSTALLED_APPS. Only one connection should be used for the same database from the same process or thread.
Django is very configurable. An extreme example: it is possible to write a single-file project where all the code, such as settings, models, URL configs and views, is defined in one file. That extreme is probably useful only for some special tests, very short demos, or as an exercise. It is even possible to define a project in one file with two "reusable" applications :-)
Django also supports "legacy databases", where the database structure is shared with existing non-Django applications and the models can be created by the inspectdb command. Table names in such models are explicit and don't contain the app name. On the other hand, an app-name prefix is useful to prevent possible conflicts between identical table names from independent "applications". An important decision is whether you can treat yours as a "legacy" database or as a normal Django database.
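For illustration, a model in the style inspectdb produces for such a legacy table might look roughly like this (the table and field names are made up):

from django.db import models

class LegacyInvoice(models.Model):
    number = models.CharField(max_length=20)
    issued_on = models.DateField()

    class Meta:
        managed = False        # Django will not create or migrate this table
        db_table = 'invoices'  # explicit table name, no app prefix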
You can decide between the following two solutions, or combine them:
Use e.g. foo_models or bar.models, import all models from them e.g. into app.models, and add only that "app" to INSTALLED_APPS. This can be viable if it is only for one company and never otherwise, and central name assignment is possible (easiest, but a little naive).
Use some coarse separation of namespaces into several apps. You should probably use no more than one app whose tables have simple names without the app-name prefix.
Think ahead about migrations. They will probably become very complicated, and soon impossible, if you later create more projects for the same database with different subsets of database tables, without separating them into more apps and without an app namespace.
There is really no "singleton" in the Django ORM except for django.db.connections itself. If you use more databases, you can direct some tables to a specific database with DATABASE_ROUTERS, even with two different models that use the same table name without a prefix.
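As a sketch of that last point (the database aliases, app labels and router path below are hypothetical):

from django.db import models

class Customer(models.Model):              # app 'shop', lives in the 'default' database
    name = models.CharField(max_length=100)

    class Meta:
        app_label = 'shop'
        db_table = 'customer'              # no app prefix

class OldCustomer(models.Model):           # app 'legacy_shop', lives in the 'legacy' database
    name = models.CharField(max_length=100)

    class Meta:
        app_label = 'legacy_shop'
        db_table = 'customer'              # same table name, different database

class LegacyRouter(object):
    def db_for_read(self, model, **hints):
        return 'legacy' if model._meta.app_label == 'legacy_shop' else None

    db_for_write = db_for_read

# settings.py:
# DATABASE_ROUTERS = ['myproject.routers.LegacyRouter']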
I've enjoyed using Django quite a bit over the years. Right now I'm working for a company that is building some shared internal libraries for accessing information from our databases. At the moment things are terribly sloppy, with lots of inline SQL code. Some colleagues were working on data-access code, and it occurred to me that this is the sort of thing I'm used to Django doing out of the box. I had also heard that Django is fundamentally modular, so that I could just use the ORM system in isolation. I know there are other packages for this like SQLAlchemy, but I also hear that Django makes things easier for the average case, which is likely all we'll need. I also know Django, and don't know SQLAlchemy.
So, I can do a proof of concept, like the following. Here is the directory structure:
+-- example.py
+-- transfer
| +-- __init__.py
| +-- models.py
Here is models.py
import django.conf
import django.db.models

django.conf.settings.configure(
    DATABASES = ...,              # database info
    INSTALLED_APPS = ("transfer",),
    SECRET_KEY = 'not telling',
)
django.setup()

class TransferItem(django.db.models.Model):
    # model info here
example.py
from transfer.models import TransferItem
items = TransferItem.objects.all()
print(items)
This seems to work fine, as far as it goes. But I'm worried about the bootstrap code in a library context. Specifically:
Is there danger in Django thinking of this as an app? In this case, "transfer" is a root module, but in a library context this could be buried deep, for example "mycompany.data.protocols.transfer". Theoretically, we could have these data models defined throughout the codebase. How can this "app list" scale?
The call to setup really worries me. The Django docs specifically say only to call setup once, and yet the nature of a library is that any python application could import whatever type of data model it wants. I can't make any assumptions about which Django "apps" they might want, or what order they want them in. What if one type of model is used, data is returned, and only then does the python application decide it needs to import another type of model (quite possibly in a different "app")?
So, the fundamental question is this:
Is there a good way to use django ORM in a python library?
EDIT: This is not a duplicate of the CLI tool question. I know how to run django outside the web server. I even gave code samples showing this. I want to know if there's a way to run django where I don't have "apps" per se - where model files could be imported by client code and used in any order.
OK, here's my own attempt to answer the question after some research.
I can create a base class for my models anywhere in the library as follows:
from django.db import models
from django.apps import apps
import django.conf

django.conf.settings.configure(
    DATABASES = ...
)
apps.populate((__name__,))

class LibModel(models.Model):
    class Meta:
        abstract = True
        app_label = __name__
Then anywhere in the library I can create my own models with this base class. Since I'm not relying on the "app" for the database table names, I need to state them explicitly.
class SpecificModel(LibModel):
    # fields go here

    class Meta(LibModel.Meta):
        db_table = "specific_model_table_name"
This gets around my concern of having to simulate the structure of an "app". The app_label set from __name__ in the base class supplies Django with all it needs, and then Django quits whining about not finding an app. The other model files can live wherever they want.
However, there is still a big concern I have which might be lethal. Django's initialization itself is still singleton in nature. If my library were itself imported by a Django application, it would all crash and burn. I've asked for solutions to this in this follow up question.
I have a few databases in MongoDB that I want to create models for dynamically, since there are many databases and I cannot do it manually. Questions:
What should my models.py look like? (Does inspectdb work with MongoDB databases, or only with SQL-based ones?)
Since the database models are created dynamically, how do I code the serializer class to return the dynamic fields?
Thanks in advance.
Django ships an object-relational mapper that is aimed at traditional relational databases. While there are a number of MongoDB packages for Django, none of them support inspectdb to construct your models. Either way, inspectdb is a kludge designed as a one-off process to help a one-off migration away from a legacy system, i.e. you'd build your models.py file once and never run inspectdb again. This is not what you want to do, as you seem to want dynamic models that can be added or altered at runtime.
On the bright side, Django MongoDB Engine has some support for arbitrary embedded models within pre-defined models. But even then they don't seem too supportive of it:
As you can see, generic embedded models add a lot of overhead that bloats up your data records. If you want to use them anyway, here’s how you’d do it...
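For reference, a generic embedded model along those lines might look roughly like this (a sketch assuming Django MongoDB Engine with djangotoolbox; the model and field names are made up, and this is not the documentation's own example):

from django.db import models
from djangotoolbox.fields import EmbeddedModelField

class Record(models.Model):
    title = models.CharField(max_length=100)
    # No model class is passed, so any model instance can be embedded;
    # the embedded model's path is stored with every record, which is the
    # overhead the documentation warns about.
    payload = EmbeddedModelField()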
In summary, try to build your models as best you can to actually match your requirements. If you know nothing about your models ahead of production, then perhaps Django isn't the right solution for you.