Referring to the example in Django documentation for multiple databases in one application,
https://docs.djangoproject.com/en/dev/topics/db/multi-db/#an-example
" It also doesn’t consider the interaction of transactions with the database utilization strategy. "
How do I handle the interaction stated above?
The scenario is this:
I am using PostgreSQL as my database. I have set up a replica and want all reads of the "auth" tables to go to the replica. Following the documentation I wrote a database router. Now whenever I try to log in to my application, it throws the following error.
DatabaseError: cannot execute UPDATE in a read-only transaction.
This happens when Django tries to save the "last_login" time, because in the same view it first fetches the record from the replica and then tries to update the last_login time. Since this all happens in one transaction, the same database is used, i.e. the replica.
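For context, the router follows the documented pattern and looks roughly like this (a sketch, not my exact code; "replica" is just the alias I use for the standby in DATABASES):

class AuthRouter:
    """Send reads of the auth app to the replica; send all writes to the primary."""

    def db_for_read(self, model, **hints):
        if model._meta.app_label == 'auth':
            return 'replica'    # read-only standby alias in DATABASES (illustrative name)
        return None

    def db_for_write(self, model, **hints):
        return 'default'        # writes, including the last_login update, should land here

    def allow_relation(self, obj1, obj2, **hints):
        return True

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        return db == 'default'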
How do I handle this?
Thoughts?
I have accidentally dropped all databases from my MongoDB instance. Then I tried to insert a document into a new database, and it throws the error "Unable to persist transaction state because the session transaction collection is missing. This indicates that the config.transactions collection has been manually deleted."
My sample code:
from pymongo import MongoClient

doc_client = MongoClient(
    host=host,
    port=port,
    connect=False,  # Connect on first operation to avoid multi-threading related errors
    j=True,         # Requests only return once the write has hit the DB journal
)

print(doc_client.database_names())                    # This works fine
doc_client['test']['docs'].insert_one({'a': 'ss'})    # Throws the error ('docs' is just an example collection name)
I have accidentally dropped all databases from my mongo db
It is likely that you have also dropped the config.transactions collection. This is a collection for internal use that stores records used to support retryable writes for replica sets and sharded clusters. See also Config Databases.
Since MongoDB v3.6, users cannot drop the config database in a replica set from the mongo shell. If you are connecting with a mongo shell older than v3.6 you can still do so, so please make sure to upgrade the shell to match the server version.
"Unable to persist transaction state because the session transaction collection is missing. This indicates that the config.transactions collection has been manually deleted."
You can manually re-create the collection on the primary node:
use config
db.createCollection("transactions");
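If you would rather do this from Python than the mongo shell, a pymongo equivalent might look like this sketch (the connection string is illustrative; it assumes you are connected to the primary):

from pymongo import MongoClient

client = MongoClient('mongodb://primary-host:27017/')    # illustrative address of the primary
config_db = client['config']
if 'transactions' not in config_db.list_collection_names():
    config_db.create_collection('transactions')          # re-create the internal collection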
Alternatively, a replica set election would also automatically re-create it, because the creation of the config.transactions collection is part of a replica set node step-up. See session_catalog_mongod.cpp#L156.
The new config.transactions collection will be replicated to the secondaries after the primary has completed the catch-up phase.
Trying to simply update an existing row in the database (running on web2py), but I always get a database locked error.
Error message:
<class 'sqlite3.OperationalError'> database is locked
My setup
In models/db.py I create the database, and it works when using the database administration interface (I can insert and update rows using the web interface):
db.define_table('mytest', Field('name', 'string'))
I have added one row to mytest using the web interface (so it's not empty).
In controllers/test.py I have simple code that gets the first item and tries to update its value; this is where it fails (I open the page in the browser and it gives an internal error, with a link to the error log):
def index():
    # connect
    db = DAL('sqlite://storage.sqlite', pool_size=10, auto_import=True)
    # get first record
    record = db(db.mytest).select().first()
    # try to update it.. database locked error here
    record.update_record(name="asdfg")
    # just in case needed?
    db.commit()
    db.close()
    return "test"
Software
WinPython 2.7
Running web2py.py (2.14.6) manually from the Spyder IDE
Windows 8
What I've tried so far
Different DAL settings: pool_size, without auto_import, etc.
Closed all web2py admin tools/tabs
Created a new database
Restarted web2py
Restarted the PC
Error log: http://pastebin.com/2WMWypt6
Current workaround:
- Create a new application; the exact same code seems to work there.
Solution (by @GauravVichare):
- Remove this line from the controller (it's already defined in db.py):
db = DAL('sqlite://storage.sqlite', pool_size=10, auto_import=True)
Check whether any other connection to the SQLite db is open on your machine; if a web2py shell is open, close it.
Check whether DAL is defined only once. Define DAL only in models/db.py; there is no need to define it again in the controller.
Every variable defined in models is visible in controllers.
You have defined DAL in models/db.py and you are defining it once again in the controller, so you have two connections open to the SQLite db. That's why you are getting the 'database is locked' error.
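Following that advice, the controller reduces to something like this sketch; db here is the DAL instance already defined in models/db.py, so no second connection is opened:

# controllers/test.py
def index():
    # `db` from models/db.py is already in scope in controllers
    record = db(db.mytest).select().first()
    if record:
        record.update_record(name="asdfg")
        db.commit()
    return "test"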
My suggestion is:
1. First of all, save the code when you make changes.
2. After saving your new code, reload web2py.exe and then run web2py, so that you won't get the 'database locked' error.
3. Don't create the tables in the SQLite database beforehand.
4. Once you start web2py and the server is running, whenever you enter data into forms it automatically creates the tables in the SQLite database.
Try using myRecord instead of record, since record may be a reserved word. I know User has given people issues in web2py; I would just tend to stay away from very generic names like that.
Otherwise, is there anything currently in the db? If it is empty you would receive an error.
It might be better to:
myRecord = db(db.mytest).select().first()
if myRecord:
    myRecord.update_record(name="asdfg")
else:
    pass  # [insert statement here]
I'm working on a project which involves a huge external dataset (~490 GB) loaded in an external database (MS SQL through django-pyodbc-azure). I've generated the Django models and marked them managed=False in their Meta. In my application this works fine, but I can't figure out how to run my unit tests. I can think of two approaches: mocking the data in a test database, or giving the unit tests (and CI) read-only access to the production dataset. Both options are acceptable, but I can't figure out either of them:
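For reference, the generated models look roughly like this (field and table names here are illustrative, not the real schema):

from django.db import models

class ExternalRecord(models.Model):
    some_field = models.CharField(max_length=100)

    class Meta:
        managed = False              # Django will neither create nor migrate this table
        db_table = 'external_record'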
Option 1: Mocked data
Because my models are marked managed=False, there are no migrations, and as a result, the test runner fails to create the database.
Option 2: Live data
django-pyodbc-azure will attempt to create a test database, which fails because it has a read-only connection. Also I suspect that even if it were allowed to do so, the resulting database would be missing the required tables.
Q: How can I run my unit tests? Installing additional packages or reconfiguring the database is acceptable. My setup uses Django 1.9 with PostgreSQL for the main DB.
After a day of staring at my screen, I found a solution:
I removed the managed=False flag from the models (making them managed) and generated migrations. To prevent actual migrations against the production database, I used my database router to block them (return False from allow_migrate for the appropriate app and database).
In my settings I detect whether unit tests are being run, and in that case I simply don't define the database router or the external database. With the migrations present, the test runner can now create the tables and the unit tests run.
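A rough sketch of the two pieces described above (the app label, database alias and module path are illustrative, and checking sys.argv is just one common way to detect a test run):

# routers.py
class ExternalDatabaseRouter:
    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == 'external_data':      # the app holding the formerly unmanaged models
            return db != 'external'           # never migrate this app against the real MS SQL server
        return None                           # no opinion for anything else

# settings.py
import sys

TESTING = 'test' in sys.argv                  # are we running under `manage.py test`?

if not TESTING:
    DATABASE_ROUTERS = ['myproject.routers.ExternalDatabaseRouter']
    # DATABASES['external'] = {...}           # the read-only MS SQL connection (django-pyodbc-azure)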
I am trying to write a Django app that queries a remote database for some data, performs some calculations on a portion of this data and stores the results (in the local database using Django models). It also filters another portion and stores the result separately. My front end then queries my Django database for these processed data and displays them to the user.
My questions are:
How do I write an agent program that runs continuously in the background, downloads data from the remote database, does the calculations/filtering and stores the results in the local Django database? In particular, what are the most important things to keep in mind when writing a program that runs indefinitely?
Is using cron for this purpose a good idea?
The data retrieved from the remote database belongs to multiple users, and each user's data must be kept/stored separately in my local database as well. How do I achieve that? Using row-level/class-instance-level permissions, maybe? Remember that the backend agent does the storing, updating and deleting; the front end only reads data (through HTTP requests).
And finally, I allow creation of new users. If a new user has valid credentials for the remote database, the user should be allowed to use my app. In that case, my backend will download this particular user's data from the remote database, perform calculations/filtering and present the results to the user. How can I handle the dynamic creation of objects/database tables for the new users? And how can I differentiate between users' data when retrieving it?
Would very much appreciate answers from experienced programmers with knowledge of Django. Thank you.
1) The standard go-to solution for timed and background tasks is Celery, which has Django integration. There are others, like Huey (https://github.com/coleifer/huey). A sketch follows after this list.
2) The usual solution is that each row contains a user_id column indicating which user the data belongs to. This maps to the User model using the Django ORM's ForeignKey field (a model sketch follows after this list). Do your users need to query the database directly, or do they have direct database accounts? If not, then this solution should be enough. It sounds like your front end has one database connection and all permission logic is handled by the front end, not the database itself.
3) See 2
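For 1), a periodic sync task with Celery might be sketched like this (the task body, names and schedule are all illustrative):

# myapp/tasks.py
from celery import shared_task

@shared_task
def sync_remote_data():
    """Pull new rows from the remote database, filter/calculate, store via the local models."""
    ...

# settings.py (Celery 4+ with the Django integration)
CELERY_BEAT_SCHEDULE = {
    'sync-remote-data': {
        'task': 'myapp.tasks.sync_remote_data',
        'schedule': 600.0,   # run every 10 minutes
    },
}

And for 2), a minimal sketch of the per-user row pattern (model and field names are made up):

from django.conf import settings
from django.db import models

class ProcessedResult(models.Model):
    owner = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    value = models.FloatField()
    created_at = models.DateTimeField(auto_now_add=True)

# The front end then only ever reads the current user's rows:
# ProcessedResult.objects.filter(owner=request.user)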
I am writing a Python server in Tornado which works with HTML5 WebSockets. My server works by creating a connection with the client browser through JavaScript. Once a connection is created, it stays open until the browser (or the server) closes it. I need to periodically check whether one of my models has changed or the database has been updated.
Here's code example to demonstrate what I mean:
>>> mymodels = MyModel.objects.all()
>>> len(mymodels)
150
>>> # Some stuff happens on the client and the model is changed, one more entry is added
>>> mymodels = MyModel.objects.all()
>>> len(mymodels)
151
This all happens within a server application where the changes to the model occur within one "session" of the server script running. Is there any way I can check for new objects or refresh my Django database?
An example of what I mean if it still isn't clear: Let's say I have a model called MyModel. When the server script is first run, it has 150 different entries or database rows. I establish a WebSocket connection with my server from my client and request that I be updated whenever a new change occurs. Somewhere else in my client, some other user does something that creates a new row in my database for the MyModel class. My server, while still keeping the same connection that it has to the original client already, needs to be able to detect that change without stopping its execution.
Checking periodically isn't the problem; it's actually making sure that the Django database API is aware of the newly added information. Is there any way I can ensure that that happens? The originally posted example code does not actually work: the length of MyModel.objects.all() is still 150 no matter how many items I add to the model. If I restart my Django shell, it updates the count.
Some other things I have tried:
Reloading the models module using the built-in reload() function.
Filtering the model for a certain set of MyModel
Using raw SQL queries to both select everything and filter based on certain conditions
All of these methods keep returning the same number of MyModel objects no matter how many changes I make to the database. Interestingly enough, running the raw SQL in MySQL Workbench produces the expected results.
I FIGURED IT OUT!
The simplest way to force Django to update its database reference is to close the database connection. Django will automatically create a new one as it needs to.
django.db.close_connection()
If you have changes that need to be committed before you close the connection, the following will accomplish the same thing as above but keep the changes you have made (i.e. you will not need to close the connection, as this refreshes the database anyway):
django.db.connection._commit()
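For example, the periodic check might become something like this sketch (the import path for MyModel and the helper name are made up):

from django import db
from myapp.models import MyModel     # illustrative import path for the model in the question

def fresh_mymodel_count():
    db.close_connection()            # drop the stale connection; Django reopens it on demand
    return MyModel.objects.count()   # this query now sees rows committed by other processes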
Thanks for your comments and have a nice day!
If it's all within one "session" of the server-side script running, maybe you've got the whole thing running in one DB transaction (which would mean that nothing else could see it) - although the fact that you can see them incrementally in Workbench suggests not. Having said that, it's worth checking out what you're doing with transactions.
Also, have you read this to make sure that Django is doing what you think it's doing?