I'd like my Django views to be atomic. That is, if there are two DB writes in the view, I want either zero writes or both.
For example:
def test_view(request):
    ''' A test view from views.py '''
    MyClass.objects.create()
    raise Exception("whatever")
    MyClass.objects.create()
What I found in the documentation seemed promising:
A common way to handle transactions on the web is to wrap each request
in a transaction. Set ATOMIC_REQUESTS to True in the configuration of
each database for which you want to enable this behavior.
It works like this. Before calling a view function, Django starts a
transaction. If the response is produced without problems, Django
commits the transaction. If the view produces an exception, Django
rolls back the transaction.
However, even if I set ATOMIC_REQUESTS = True, when calling test_view(), the first MyClass object is created! What am I missing?
Note: I'm using Django 1.7.
ATOMIC_REQUESTS is an attribute of the database connection settings dict, not a top-level setting. So, for example:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydatabase',
        'USER': 'mydatabaseuser',
        'PASSWORD': 'mypassword',
        'HOST': '127.0.0.1',
        'PORT': '5432',
        'ATOMIC_REQUESTS': True,
    }
}
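If you would rather not turn this on for every view, the same all-or-nothing behavior is available per view with the transaction.atomic decorator (in Django since 1.6, so it works on 1.7); a minimal sketch:

from django.db import transaction

@transaction.atomic
def test_view(request):
    ''' Same view as above: the exception now rolls back the first create. '''
    MyClass.objects.create()
    raise Exception("whatever")
    MyClass.objects.create()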
Related
I am accessing the remote database in my Django project as follows:
settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    },
    'remote_db': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'db_name',
        'USER': 'db_user',
        'PASSWORD': 'db_password',
        'HOST': '192.*.*.*',
        'PORT': '1433',
    }
}
For accessing default database table's data, I use the following syntax:
from app_name.models import mymodel
mymodel.objects.all()
My remote database has tables like reports, employee_data, etc. that already exist, and my project has no models defined for these tables.
I need to access these tables but am unsure how to do it; I'd like something like:
remote_db.reports.all()
All in all, my main objective is to copy the data from remote_db to the default database.
Note:
remote_db gets new data every day.
I think you also need to define models for reports and employee_data in this project.
And then you can use them like the following:
reports.objects.using('remote_db').all()
You mentioned that there are no models defined for the tables. I think a good option would be to write a script that fetches data from remote_db and adds it to default. You will have to use raw SQL to do that if you don't create models for those tables. Another option would be getting a dump of remote_db and importing it into the default database.
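To make both suggestions concrete, here is a hedged sketch: an unmanaged model mapped onto the existing reports table (the field names are assumptions, match them to the real columns), plus a small copy routine you could run daily, since remote_db gets new data every day:

from django.db import models

class Report(models.Model):
    title = models.CharField(max_length=255)    # hypothetical column
    created = models.DateTimeField(null=True)   # hypothetical column

    class Meta:
        managed = False         # syncdb/migrate will never touch this table
        db_table = 'reports'    # the existing table on remote_db

def copy_reports():
    # assumes a matching reports table also exists in the default database
    for report in Report.objects.using('remote_db').iterator():
        Report.objects.using('default').update_or_create(
            pk=report.pk,
            defaults={'title': report.title, 'created': report.created},
        )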
I'm writing a Django app that uses MongoDB as its primary database. I simply need the app to make a query against the database (hosted on Heroku) and display results for each user request.
I'm aware that Python modules such as PyMongo exist for easily connecting to and interacting with MongoDB, but I don't want to establish a database connection each time a user requests the page. I want the database connection to be established when the Django app launches.
Right now in my settings.py file I have:
DATABASES = {
    'default': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'heroku_app33177236',
        # 'USER': 'admin',
        # 'PASSWORD': '',
        'HOST': 'mongodb://admin:######ds041581.mongolab.com:41581/heroku_app33177236',
        # 'PORT': '',
    }
}
And in my views.py:
def index(request):
    context = RequestContext(request)
    # !!!!!! I want to do something like this:
    rooms = db.studybug.find()
    return render_to_response('studybug/index.html', rooms, context)
As you can see above, I simply want to query the database each time a user requests and display the result.
I don't really see the point or need of defining models for this, because the operation is so lightweight.
Is there a way to do something like:
from settings import db
??
Thanks!
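One common pattern (plain PyMongo rather than Django machinery) is to create the client once, at module import time: MongoClient maintains its own connection pool, so every request reuses open sockets instead of reconnecting. A minimal sketch, where mongo.py is a hypothetical module name and the password is left as a placeholder:

# mongo.py
from pymongo import MongoClient

client = MongoClient('mongodb://admin:<password>@ds041581.mongolab.com:41581/heroku_app33177236')
db = client['heroku_app33177236']

# views.py
from django.template import RequestContext
from django.shortcuts import render_to_response
from mongo import db

def index(request):
    context = RequestContext(request)
    rooms = list(db.studybug.find())   # one query per request, no new connection
    return render_to_response('studybug/index.html', {'rooms': rooms}, context)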
I have a production database that contains a large set of data. I'd like to use some of that data for running unit tests, but taking all of it causes a fairly lengthy database build at the start of the testing process, which I'd like to avoid.
I've created a test database using the manage.py testserver command, then deleted all the data I didn't want to be included through the admin interface. How do I create a fixture of the data that remains in the default test database?
Now it is easier to save the test data into a fixture.
When you run the Django unit tests, a test database is automatically created by prefixing test_ to the name of the default database. We can make Django aware of this database by adding a new entry in settings.py. For example:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'CONN_MAX_AGE': 3600,
        'NAME': 'mydatabase',
        'USER': 'user',
        'PASSWORD': 'pass',
        'HOST': '127.0.0.1',
        'PORT': '3306',
        'OPTIONS': {'charset': 'utf8mb4'},
    },
    'test': {
        'ENGINE': 'django.db.backends.mysql',
        'CONN_MAX_AGE': 3600,
        'NAME': 'test_mydatabase',
        'USER': 'user',
        'PASSWORD': 'pass',
        'HOST': '127.0.0.1',
        'PORT': '3306',
        'OPTIONS': {'charset': 'utf8mb4'},
    }
}
Then run a unit test with a breakpoint set on its first line, edit the test database as you wish, and run the following command:
./manage.py dumpdata app_name -o filename.json --database test
This will dump all data in the test database into filename.json. Note that you must edit the test database while the unit test is paused (hence the breakpoint); otherwise, even if you preserve the test database, all its data will be erased when the test run completes.
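For concreteness, the paused test can be as small as this sketch (app_name and the model are placeholders):

from django.test import TestCase
from app_name.models import MyModel

class CaptureFixtureTest(TestCase):
    def test_capture(self):
        # pause here; edit test_mydatabase from another shell or the admin, then run:
        #   ./manage.py dumpdata app_name -o filename.json --database test
        import pdb; pdb.set_trace()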
You can use dumpdata to generate a JSON fixture, like this:
./manage.py dumpdata > fixture.json
If you want to save a fixture from your test, just serialize your querysets:
# ... import your models
from django.core.serializers import serialize

qs1 = Model1.objects.filter(...)
qs2 = Model2.objects.filter(...)
...

fixture = serialize('json', list(qs1) + list(qs2) + list(...))
with open('fixture.json', 'w') as f:
    f.write(fixture)
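Either way, the resulting file can be loaded back later with ./manage.py loaddata fixture.json, or pulled into a test case by listing it in the TestCase's fixtures attribute (fixtures = ['fixture.json']).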
I have a long-running process (a crossbar.io Python component) which uses the Django ORM for saving to a database.
My problem is that once it exceeds the MySQL idle connection timeout, it does not auto-reconnect; it just throws [2006, u'MySQL server has gone away'].
I have tried setting CONN_MAX_AGE in the Django settings to 60 seconds, so that the connection is closed every minute and reopened on demand, but it does not seem to work.
My settings.py looks like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'name',
        'USER': 'user',
        'PASSWORD': 'password',
        'CONN_MAX_AGE': 60,
    }
}
And in the app I just use normal, simple CRUD model operations, without managing connections by hand (which seems to disable Django's auto-reconnection feature, as noted in other similar questions).
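One detail worth knowing: CONN_MAX_AGE is enforced by close_old_connections(), which Django wires to the request_started and request_finished signals, and those never fire in a standalone component. In a long-running process you can call it yourself before each unit of work; a minimal sketch (on_event and MyModel are placeholders):

from django.db import close_old_connections
from myapp.models import MyModel   # hypothetical model

def on_event(payload):
    # discard connections that are past CONN_MAX_AGE or already dead;
    # Django opens a fresh one automatically on the next query
    close_old_connections()
    MyModel.objects.create(data=payload)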
I have two databases defined in my settings: one is the default, on PostgreSQL, and the other is a MySQL database on an external system, used only for read-only access.
I have a separate file for the models on the MySQL database that is not registered in INSTALLED_APPS, and when I want to access that database I use objects.using("otherdb") to get to the data...
The problem is that I recently noticed strange behavior: the MySQL server is now inaccessible, and that crashes syncdb.
Anyway, now I have this in my local settings:
class DumbRouter(object):
    def db_for_read(self, model, **hints):
        return 'default'

    def db_for_write(self, model, **hints):
        return 'default'

    def allow_relation(self, db1, db2, **hints):
        return False

    def allow_syncdb(self, db, model):
        return db in ('default',)
DATABASE_ROUTERS = ['pathtomyawesomeprojectanditssettings.DumbRouter']
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': location + 'sqlite3.db',
        'USER': 'django',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    },
    'otherserver': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'anotherserver',
        'USER': 'someuser',
        'PASSWORD': 'asxasxasxasx',
        'HOST': 'someipthatisnotup',
        'PORT': '13306',
    }
}
The problem is that when I run syncdb I get Can't connect to MySQL server on ...
So my question is: Why is django trying to access that server when my DumbRouter specifically says it should never use anything but default?
And more importantly how can I prevent this behavior and have a DB defined only for objects.using?
Django will use the default database; in your example, that is the sqlite3 database. Whichever database you want commands to connect to should be kept as default.
This link may be helpful:
https://docs.djangoproject.com/en/dev/topics/db/multi-db/
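As for having a database that is only touched through objects.using: Django opens connections lazily, so nothing should connect to otherserver until you actually run a query against that alias. A hedged sketch of raw access through the alias (the table and columns are hypothetical):

from django.db import connections

def fetch_remote_rows():
    # the connection to 'otherserver' is opened only when this executes
    cursor = connections['otherserver'].cursor()
    cursor.execute('SELECT id, name FROM sometable')
    return cursor.fetchall()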