Route single model to alternate database in django - python

I have two databases set up in my settings folder of my project
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'foo': {
        'NAME': 'bar',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'some.site.com',
        'USER': 'xxxxxx',
        'PASSWORD': 'xxxxxxxx'
    }
}
I also have models set up, one of them was created with
python manage.py inspectdb --database foo > tmp.py
That created models for the tables I already had in foo, so I copied them into my models folder. However, Django is trying to use the existing default database for those models, when I want it to use the foo database instead.
When looking online for how to get this done, posts recommend using 'database routing', but I cannot find documentation or an example that works for me or that I understand.
So please, what is the right way to set up a single model to use an external database?

The easiest way is to select the database manually.
From https://docs.djangoproject.com/en/1.8/topics/db/multi-db/#manually-selecting-a-database
>>> # This will run on the 'default' database.
>>> Author.objects.all()
>>> # So will this.
>>> Author.objects.using('default').all()
>>> # This will run on the 'other' database.
>>> Author.objects.using('other').all()
>>> # save() also accepts a database alias.
>>> my_object.save(using='legacy_users')
The documentation also covers other options; see https://docs.djangoproject.com/en/1.8/topics/db/multi-db/
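Since the question asks specifically about database routing, here is a minimal router sketch. The app label 'external_app' and the module path are assumptions; change them to match where the inspectdb-generated models live in your project:

```python
# routers.py -- a minimal database router sketch.
# Assumption: the inspectdb-generated models live in an app
# labelled 'external_app'; adjust the label to your project.

class FooRouter:
    """Send every model in 'external_app' to the 'foo' database."""

    route_app_labels = {'external_app'}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'foo'
        return None  # no opinion: Django falls through to 'default'

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'foo'
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Only let migrations for the routed app run against 'foo'.
        if app_label in self.route_app_labels:
            return db == 'foo'
        return None
```

Register it in settings.py with DATABASE_ROUTERS = ['myproject.routers.FooRouter'] (the dotted path is an assumption); queries on those models then hit foo without calling using() at every call site.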


How to access remote database's tables in Django?

I am accessing the remote database in my Django project as follows:
settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    },
    'remote_db': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'db_name',
        'USER': 'db_user',
        'PASSWORD': 'db_password',
        'HOST': '192.*.*.*',
        'PORT': '1433',
    }
}
To access the default database's tables, I use the following syntax:
from app_name.models import mymodel
mymodel.objects.all()
My remote database has tables like reports, employee_data, etc. that already exist, and my project has no models defined for these tables.
I need to access these tables and I am unsure how to perform this action.
remote_db.reports.all()
All in all, my main objective is to copy the data from remote_db to default database.
Note:
remote_db gets new data everyday.
I think you also need to define models for reports and employee_data in this project.
Then you can use them like the following:
reports.objects.using('remote_db').all()
You mentioned that there are no models defined for the tables. It would be a good option to write a script that fetches data from remote_db and adds it to default. You will have to use raw SQL for that if you don't create models for those tables. Another option is taking a dump of remote_db and importing it into the default database.
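To sketch the model-based route: after generating unmanaged models with inspectdb, the copy can be done queryset-to-queryset. Everything below is hypothetical (the Reports model name, the app layout); it assumes the generated class was pasted into the app's models with managed = False:

```python
# Hypothetical sketch: assumes `manage.py inspectdb --database remote_db`
# produced a Reports model (managed = False) in app_name/models.py.
from app_name.models import Reports

def sync_reports(batch_size=500):
    """Copy every row from remote_db into the default database."""
    rows = Reports.objects.using('remote_db').all().iterator()
    Reports.objects.using('default').bulk_create(
        rows,
        batch_size=batch_size,
        ignore_conflicts=True,  # tolerate rows copied on an earlier run
    )
```

Since remote_db gets new data every day, this could be run from a daily cron job or a custom management command.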

How to make Django use two different databases based on debug flag

I want to use a simple SQLite database in my local environment and a PostgreSQL database in production. How can I configure the settings file to know which database to use based on the value of DEBUG?
There are several options available:
Below is a very cheap solution. Django always selects the database called 'default'. You can assign it conditionally in settings.py:
DATABASES = {
    'dev': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    },
    'production': {
        'ENGINE': 'django.db.backends.postgresql',
        # ...
    },
}

DATABASES['default'] = DATABASES['dev' if DEBUG else 'production']
You can implement an alternate settings module called settings_dev.py. Configure the database there and use the environment variable DJANGO_SETTINGS_MODULE to point to yourapp.settings_dev.
Implementing a custom database router is also possible, but it is almost certainly overkill for this use case. See the Django documentation on multiple database support.
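The settings_dev.py route mentioned above can be sketched like this (module and project names are assumptions; this is a config fragment, not a verified setup):

```python
# yourapp/settings_dev.py -- hypothetical development settings module.
from yourapp.settings import *  # reuse everything from the base settings

DEBUG = True
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    },
}
```

Then select it per environment, e.g. run `DJANGO_SETTINGS_MODULE=yourapp.settings_dev ./manage.py runserver` locally, while production keeps pointing at the base `yourapp.settings` with its PostgreSQL configuration.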

Can I configure Django for Unit tests with restricted view on unmanaged databases?

We are a small team trying to work with Django against a PostgreSQL database that will eventually be unmanaged (i.e. only views and stored procedures; no access to any tables) for security reasons.
We tried to give (within PostgreSQL) the user external_test the rights to create tables inside its own schema on external, and to use the following settings (settings.py):
...
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'external': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgre_db',
        'USER': 'external_user',
        'PASSWORD': 'password',
        'HOST': 'integration.project.net',
        'PORT': '5432',
        'TEST': {
            'NAME': 'test_db',
            'USER': 'external_test',
            'PASSWORD': 'password',
...
Using simple passing unit tests (project/app/tests/test_views.py):
...
class InternalTest(TestCase):
    databases = ['default']

    def test_something(self):
        pass


class StoredProcedureTest(TestCase):
    databases = ['external']

    def test_one_procedure(self):
        with connections["external"].cursor() as cursor:
            cursor.callproc("some_procedure", [42, ])
...
If we run the first one with ./manage.py test app.tests.test_views.InternalTest:
→ ok
If we run the other one with ./manage.py test app.tests.test_views.StoredProcedureTest:
→ circular dependency issue (ImproperlyConfigured: Circular dependency in TEST[DEPENDENCIES]), probably because it skips the configuration of default
If we run both tests with ./manage.py test app.tests.test_views:
→ permission denied
Creating test database for alias 'default'...
Creating test database for alias 'external'...
Got an error creating the test database: permission denied to create database
(Django tries to create the database test_db as the user external_user.)
We don't really understand what Django is trying to do or how to configure it properly.
If we give external_user the rights to create its own databases:
the database test_db is created by external_user
the schema of default (SQLite) is created in test_db of external (PostgreSQL)
the schema of external (PostgreSQL) is not created in test_db
Questions
Is Django able to handle this?
What are we doing wrong?
What is the point of specifying the user external_test for TEST if, in the end, Django uses the normal user external_user?
Why does it write the schema of default into test_db? Is there a way to create only the models of some apps in it?
Why isn't it able to create the schema of external in test_db?
I hope this is described well enough. Thank you in advance for your responses =)
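Regarding the circular-dependency error: Django computes test-database creation order from TEST['DEPENDENCIES'], and declaring an explicit empty dependency list for each alias is a commonly suggested workaround. A sketch, not verified against this exact setup:

```python
# settings.py (sketch): declare that neither test database depends
# on the other, so Django can order their creation.
DATABASES['default']['TEST'] = {'DEPENDENCIES': []}
DATABASES['external']['TEST'] = {
    'NAME': 'test_db',
    'USER': 'external_test',
    'PASSWORD': 'password',
    'DEPENDENCIES': [],
}
```

Note that the TEST 'USER' entry only takes effect for backends and workflows that support it; otherwise Django connects with the alias's normal user when creating the test database, which matches the permission error observed above.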

Django read and write from multiple databases

Hello guys, I'm developing a web app in Django using a PostgreSQL database. I must also be able to grab some data from another app which uses a SQL Server database. The tables that I'm trying to get have lots of data, so maybe it is not wise to use a direct link. What is the best approach to this issue? Can I use an SQL-ODBC connection to get the data? Also, how can I populate the tables, say by creating a local table and migrating data from SQL Server to PostgreSQL on a schedule? I would like to understand how you dealt with this issue and hear your experiences. Thank you!
In your settings.py, edit this code (for multiple databases):
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'django',
        'USER': 'postgres',
        'PASSWORD': '12345678',
        'HOST': 'localhost',
        'PORT': '5432',
    },
    'connection_other_db': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mi_db',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': 'localhost',
        'PORT': '3306',
    }
}
Apply migrations:
For the default database:
$ ./manage.py migrate
For the other database (connection_other_db):
$ ./manage.py migrate --database=connection_other_db
In your views, use the ORM like this:
Mi_Model.objects.using('connection_other_db').all()  # For the mysql database
Mi_Model.objects.all()  # For the default database (postgresql)
To create an object:
s = Mi_Model()
s.field_1 = 'A value'
s.save(using='connection_other_db')
or
s = Mi_Model.objects.using('connection_other_db').create(
    field_1='A value',
    # ....
)
To use transactions:
with transaction.atomic(using='connection_other_db'):
    # Your code here
To use cursors:
with connections['connection_other_db'].cursor() as cursor:
    cursor.execute('your query')
Django documentation:
https://docs.djangoproject.com/es/2.1/topics/db/multi-db/
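For the scheduled copy the question asks about, the per-database calls above can be wrapped in a custom management command and run from cron. All names here (the command module, Mi_Model) are hypothetical:

```python
# app_name/management/commands/sync_remote.py -- hypothetical command.
from django.core.management.base import BaseCommand

from app_name.models import Mi_Model  # hypothetical model


class Command(BaseCommand):
    help = "Copy rows from connection_other_db into the default database."

    def handle(self, *args, **options):
        # Evaluate the source queryset, then insert into the target alias.
        rows = list(Mi_Model.objects.using('connection_other_db').all())
        Mi_Model.objects.using('default').bulk_create(
            rows, batch_size=500, ignore_conflicts=True,
        )
        self.stdout.write(f"Copied {len(rows)} rows.")
```

A daily cron entry running `./manage.py sync_remote` then keeps the local copy fresh without a live cross-database link on every request.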

Saving django test database in a fixture?

I have a production database that contains a large set of data. I'd like to use some of that data for running unit tests, but taking all of it causes a fairly lengthy period at the start of the testing process to build the database, which I'd like to avoid.
I've created a test database using the manage.py testserver command, then deleted all the data I didn't want to be included through the admin interface. How do I create a fixture of the data that remains in the default test database?
Now it is easier to save the test data into a fixture.
When you run the Django unit tests, a test database is automatically created by prefixing the default database's name with test_. We can make Django aware of this database by adding a new entry in settings.py. For example:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'CONN_MAX_AGE': 3600,
        'NAME': 'mydatabase',
        'USER': 'user',
        'PASSWORD': 'pass',
        'HOST': '127.0.0.1',
        'PORT': '3306',
        'OPTIONS': {'charset': 'utf8mb4'}
    },
    'test': {
        'ENGINE': 'django.db.backends.mysql',
        'CONN_MAX_AGE': 3600,
        'NAME': 'test_mydatabase',
        'USER': 'user',
        'PASSWORD': 'pass',
        'HOST': '127.0.0.1',
        'PORT': '3306',
        'OPTIONS': {'charset': 'utf8mb4'}
    }
}
Then run a unit test with a breakpoint set on the first line, edit the test database as you wish, and run the following command:
./manage.py dumpdata app_name -o filename.json --database test
This will dump all data in the test database into filename.json. Notice that you should edit the test database while the unit test is running (hence the breakpoint); otherwise, even if you preserve the test database, all its data will be erased upon completion of the unit test.
You can use dumpdata to generate a JSON fixture, like this:
./manage.py dumpdata > fixture.json
If you want to save a fixture from your test, just serialize your querysets:
# ... import your Models
from django.core.serializers import serialize

qs1 = Model1.objects.filter(...)
qs2 = Model2.objects.filter(...)
...
fixture = serialize('json', list(qs1) + list(qs2) + list(...))
with open('fixture.json', 'w') as f:
    f.write(fixture)
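Once the fixture file exists, it can be loaded back with `./manage.py loaddata fixture.json`, or declared on a test case so it is loaded into the test database before each test. A sketch with assumed names (Model1, the fixture filename):

```python
# app_name/tests.py -- hypothetical test backed by the saved fixture.
from django.test import TestCase

from app_name.models import Model1  # hypothetical model


class FixtureBackedTest(TestCase):
    # Loaded into the test database before each test method runs.
    fixtures = ['fixture.json']

    def test_data_is_present(self):
        self.assertTrue(Model1.objects.exists())
```

The fixture file must live in a fixtures/ directory of an installed app (or in a path listed in FIXTURE_DIRS) for loaddata to find it.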
