I'm trying to automate the concurrent creation of PostgreSQL indexes from Django. However, when I execute SQL like:
from django.db import connections
cursor = connections['mydb'].cursor()
cursor.execute('CREATE INDEX CONCURRENTLY some_index ON some_table (...)')
I get the error:
DatabaseError: CREATE INDEX CONCURRENTLY cannot run inside a transaction block
Even if I use the old @commit_manually decorator or the new @atomic decorator, I still get this error. How do I completely disable transactions in Django?
I am using SQLAlchemy and pg8000 to connect to a Postgres database.
I have checked the pg_stat_activity table, which shows many connections sitting in the 'idle in transaction' state after simple SELECT queries. But the application does far more reads than writes; inserts are few and far between.
I suspect that a new transaction is created for each query, even for simple select statements.
Is it possible to run a read-only query without the need for a transaction? So that it does not need to be committed/rolled back?
Currently, the app runs its queries with the sqlalchemy.engine.Engine.execute method for CRUD operations and uses cursors for calling stored procedures. How should I update these calls to indicate that some of them should not start transactions?
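One approach, sketched against a modern SQLAlchemy (1.4+): set the engine's isolation level to AUTOCOMMIT, so every statement commits immediately and no transaction is left dangling. The in-memory SQLite URL here is only a stand-in; with pg8000 it would look like "postgresql+pg8000://user:pass@host/dbname".

```python
from sqlalchemy import create_engine, text

# AUTOCOMMIT means each statement commits as it executes, so connections
# never linger in the "idle in transaction" state between statements.
engine = create_engine("sqlite://", isolation_level="AUTOCOMMIT")

with engine.connect() as conn:
    conn.execute(text("CREATE TABLE t (x INTEGER)"))
    conn.execute(text("INSERT INTO t VALUES (1)"))
    rows = conn.execute(text("SELECT x FROM t")).fetchall()

print(rows)  # [(1,)]
```

The same option can be applied per connection instead of engine-wide with conn.execution_options(isolation_level="AUTOCOMMIT"), which lets you keep real transactions for the occasional insert.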
I have the following scenario: users can log in and update certain tables, and whenever one of those tables is updated I would like to call a Python script that queries the same database for additional information (joins) and then uses Django's email system to email that information to all users in the same group (role).
My current methodology is to create a user-defined function in SQLite3, and then create a trigger that will call that function (which is linked to a Python file). Is this possible with SQLite?
Django app, SQLite and Python.
import sqlite3

def email_materials_submittal(project_id):
    print(project_id)

con = sqlite3.connect('../db.sqlite3')
con.create_function("FUNC_EMS", 1, email_materials_submittal)
con.commit()
cur = con.cursor()
cur.execute("CREATE TRIGGER TRIG_EMS AFTER INSERT ON core_level BEGIN SELECT FUNC_EMS(NEW.project_id); END;")
con.commit()
con.close()
I want SQL to call an external Python script whenever a new row is inserted into a certain table. Tried the code above, but SQLite throws the error: no such function: FUNC_EMS
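The likely cause of that error: sqlite3.create_function registers FUNC_EMS only on the connection that called it, so any other connection (for example Django's) that fires the trigger cannot see the function. The sketch below is a self-contained variant using an in-memory database and a plain list in place of Django's email system, showing the trigger working when the function and the insert share one connection.

```python
import sqlite3

notified = []

def email_materials_submittal(project_id):
    # In the real app this would query for joins and send email;
    # here we just record the call so the effect is observable.
    notified.append(project_id)
    return None  # SQL functions must return a SQL-compatible value

con = sqlite3.connect(":memory:")
# The function exists only on THIS connection -- every connection that can
# trigger TRIG_EMS must register it, or you get "no such function: FUNC_EMS".
con.create_function("FUNC_EMS", 1, email_materials_submittal)
con.execute("CREATE TABLE core_level (id INTEGER PRIMARY KEY, project_id INTEGER)")
con.execute(
    "CREATE TRIGGER TRIG_EMS AFTER INSERT ON core_level "
    "BEGIN SELECT FUNC_EMS(NEW.project_id); END;"
)
con.execute("INSERT INTO core_level (project_id) VALUES (42)")
con.commit()
print(notified)  # [42]
con.close()
```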
This task is much easier using signals for some notifications and a combination of custom management commands and crontab jobs for others (date reminders). Django is awesome!
Is it possible to run a Python Flask app that pulls from a SQL database, while also running a Python script that updates the SQL database every few seconds?
Yes, databases are designed to handle exactly this kind of concurrent access. At worst, a query that arrives while an update is in progress waits until the update commits before it is answered; many databases (and SQLite in WAL mode) even let readers run concurrently with the writer.
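A small stand-in for the scenario, using only the standard library: an "updater" thread commits writes in a loop while the main thread (playing the Flask request handlers) reads concurrently from the same SQLite file.

```python
import os
import sqlite3
import tempfile
import threading

path = os.path.join(tempfile.mkdtemp(), "demo.db")

init = sqlite3.connect(path)
init.execute("PRAGMA journal_mode=WAL")  # WAL: readers don't block the writer
init.execute("CREATE TABLE metrics (n INTEGER)")
init.execute("INSERT INTO metrics VALUES (0)")
init.commit()
init.close()

def updater():
    # Simulates the script that updates the database every few seconds.
    con = sqlite3.connect(path)
    for _ in range(100):
        con.execute("UPDATE metrics SET n = n + 1")
        con.commit()
    con.close()

t = threading.Thread(target=updater)
t.start()

reader = sqlite3.connect(path)
# A read that interleaves with the writer sees some committed value.
mid = reader.execute("SELECT n FROM metrics").fetchone()[0]

t.join()
final = reader.execute("SELECT n FROM metrics").fetchone()[0]
print(mid, final)  # mid is somewhere between 0 and 100; final is 100
reader.close()
```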
I'd like to make a query to a Postgres database then somehow lock the returned rows so that other SQLAlchemy threads/processes cannot modify those rows. In the same session/transaction of the query, I'd like to update the rows I received from the query and then commit the changes. Anyone know what to do?
I tried implementing the query with the with_for_update(nowait=True) method, but this throws an OperationalError. I could catch this exception and simply retry the query, but I'd rather offload the waiting to the database if possible.
I'm using:
Postgres 9.4.1
SQLAlchemy 1.0.11 (ORM)
Flask-RESTful
Flask-SQLAlchemy
I'm prepared to use plain SQLAlchemy if it's not possible with Flask-SQLAlchemy.
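One direction to try, sketched against a modern SQLAlchemy (1.4+, newer than the 1.0.11 in the question): drop nowait=True. A plain with_for_update() issues SELECT ... FOR UPDATE, which makes PostgreSQL block until competing row locks are released, so the waiting happens in the database rather than in retry loops. The in-memory SQLite engine below is only a stand-in so the sketch runs anywhere; SQLite itself ignores FOR UPDATE.

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Account(Base):
    __tablename__ = "account"
    id = Column(Integer, primary_key=True)
    balance = Column(Integer)

engine = create_engine("sqlite://")  # stand-in; locking is effective on PostgreSQL
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Account(id=1, balance=100))
    session.commit()

    # Without nowait=True, SELECT ... FOR UPDATE waits for competing locks
    # instead of raising OperationalError; other transactions cannot modify
    # the row until this transaction ends.
    acct = session.query(Account).filter_by(id=1).with_for_update().one()
    acct.balance += 50
    session.commit()  # releases the row lock

    final_balance = session.get(Account, 1).balance
    print(final_balance)  # 150
```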
Referring to the example in the Django documentation for using multiple databases in one application:
https://docs.djangoproject.com/en/dev/topics/db/multi-db/#an-example
" It also doesn’t consider the interaction of transactions with the database utilization strategy. "
How do I handle the interaction stated above?
The scenario is this:
I am using PostgreSQL as my database. I have set up a replica and want all reads of the "auth" tables to go to the replica. Following the documentation, I wrote a database router. Now whenever I try to log in to my application, it throws the following error:
DatabaseError: cannot execute UPDATE in a read-only transaction.
This happens when Django tries to save the "last_login" time, because in the same view it first fetches the record from the replica and then tries to update the last_login time. Since both happen in one transaction, the same database is used, i.e. the replica.
How do I handle this?
Thoughts?
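One direction worth trying, sketched as a hypothetical router under the assumption that the aliases are "default" (primary) and "replica": give db_for_write an explicit answer. When every router returns None for a write and the instance was fetched from a database, Django falls back to that instance's database, i.e. the read-only replica, which is exactly what produces the error when last_login is saved.

```python
class PrimaryReplicaRouter:
    """Hypothetical router: auth reads go to the replica, all writes to the
    primary, even for instances whose _state.db says they came from the
    replica."""

    def db_for_read(self, model, **hints):
        if model._meta.app_label == "auth":
            return "replica"
        return None  # let other routers / the default behavior decide

    def db_for_write(self, model, **hints):
        # An explicit answer prevents the fallback to the instance's
        # originating database (the read-only replica).
        return "default"

    def allow_relation(self, obj1, obj2, **hints):
        # Objects from the primary and its replica hold the same data.
        return True
```

The router is then listed in settings via DATABASE_ROUTERS = ["path.to.PrimaryReplicaRouter"]. If you also wrap views in transactions, you may need to keep a whole request on one database once it has written, but the explicit db_for_write is the piece that stops the read-only-transaction error.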