I have a Django project that utilizes multiple databases. https://docs.djangoproject.com/en/dev/topics/db/multi-db/
I perform a lot of raw queries like this:
cursor = connection.cursor()
cursor.execute("select * from my_table")
....
transaction.commit_unless_managed()
How can I specify which database to use?
Refer to the Django docs on executing custom SQL directly. Specify the database in your connection as shown below:
from django.db import connections
cursor = connections['db_alias'].cursor()
cursor.execute("select * from my_table")
And then commit using
from django.db import transaction
transaction.commit_unless_managed(using='db_alias')
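Note that commit_unless_managed() only exists on older Django versions; it was deprecated in 1.6 and removed in 1.8, where autocommit is the default. A minimal sketch for modern Django (the 'db_alias' alias is hypothetical, whatever you defined in DATABASES):
from django.db import connections

# 'db_alias' is the key you defined in the DATABASES setting.
with connections['db_alias'].cursor() as cursor:
    cursor.execute("select * from my_table")
    rows = cursor.fetchall()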
Try this; it should work:
from django.db import connections
cursor = connections['my_db_name'].cursor()
# Your code here...
transaction.commit_unless_managed(using='my_db_name')
I have a queryset qs = MyModel.objects.filter(fieldname='fieldname') and a custom plpgsql function my_function() that I can use in pg_shell like:
SELECT my_function('SELECT * FROM my_model');
How can I do this with the Django ORM? I tried:
with connection.cursor() as cursor:
    query, params = qs.query.sql_with_params()
    query = f"SELECT my_function('{query}')"
    cursor.execute(query, params)
But qs.query.sql_with_params() is not intended to return valid SQL.
Is something like this even possible with Django ORM?
First of all, if you want to call a function/procedure, do it like this:
from django.db import connections
connection = connections['default']
with connection.cursor() as cursor:
    cursor.callproc('name_of_your_function', ('SELECT * FROM some_table WHERE whatever_you_want', ))
    result = cursor.fetchone()
The next thing is how to get the real SQL query out of the Django ORM: not the approximate query that queryset.query or qs.query.sql_with_params() gives you (especially when you work with dates), but the actual query that will be passed to psycopg:
from .models import SomeModel
qs = SomeModel.objects.filter(create_at__gt='2021-04-20')
from django.db import connections
conn = connections['default']
from django.db.models.sql.compiler import SQLCompiler
compiler = SQLCompiler(qs.query, conn, using=qs.db)  # qs.db is the alias this queryset runs against
sql, params = compiler.as_sql()
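Putting the two parts together, here is a minimal sketch that feeds the compiled queryset SQL to the function (it assumes the plpgsql function is called my_function and that the driver is psycopg2, since mogrify is psycopg2-specific):
from django.db import connections
from .models import SomeModel

qs = SomeModel.objects.filter(create_at__gt='2021-04-20')
conn = connections['default']

# Compile the queryset to the SQL string and parameters Django would send.
sql, params = qs.query.get_compiler(using=qs.db).as_sql()

with conn.cursor() as cursor:
    # mogrify (psycopg2) interpolates the params into the inner query text,
    # which can then be handed to the function as a plain string.
    inner_sql = cursor.mogrify(sql, params).decode()
    cursor.callproc('my_function', (inner_sql,))
    result = cursor.fetchone()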
I am a newbie with Python Flask. In my project we create the db object using the code below:
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////tmp/test.db'
db = SQLAlchemy(app)
I want to get a cursor object from db. Can someone please help me with it?
I know we can get a cursor from a connection object, but can we get a cursor from the db object created the above way? Thanks.
Finally got the answer from the documentation: you can get a cursor via the engine's raw DBAPI connection:
from sqlalchemy import create_engine
engine = create_engine('your_connection_string')
connection = engine.raw_connection()
cursor = connection.cursor()
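With Flask-SQLAlchemy you can also get there through the db object itself, since it already holds an engine. A sketch, assuming Flask-SQLAlchemy's db.engine (an application context is needed in recent versions):
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////tmp/test.db'
db = SQLAlchemy(app)

with app.app_context():
    # db.engine is built from SQLALCHEMY_DATABASE_URI; raw_connection()
    # returns the underlying DBAPI connection, which exposes a cursor.
    connection = db.engine.raw_connection()
    cursor = connection.cursor()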
It's also OK this way:
db = SQLAlchemy(app)
session = db.session()
cursor = session.execute(sql).cursor
Create a cursor from a SQLAlchemy ORM session:
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine(url)  # url is your connection string
session = sessionmaker(bind=engine)()
curs = session.connection().connection.cursor()
You don't get a cursor with Flask-SQLAlchemy out of the box because it's an ORM: it provides methods that perform actions on the database for you, unlike lower-level packages such as sqlite3. Check the documentation for more info on these.
To execute SQL queries completely outside the ORM, you can run them in a terminal client for whichever RDBMS you're using (MySQL, PostgreSQL, etc.).
I'm currently working on an app built on Django 1.8 and Postgres. The app is installed in several environments, and in some of them there are old tables in the DB from which I need to delete records.
I wrote a migration with the following SQL query:
IF EXISTS (
SELECT relname FROM pg_class WHERE relname=tablename
) THEN
DELETE FROM tablename END IF;
But Django throws an error for this query:
django.db.utils.ProgrammingError: syntax error at or near "IF"
Can I somehow check, in the migration, that the table exists, and only then execute the query, like DELETE FROM tablename?
The easiest way to check if a table exists is to use django.db.connection.introspection.table_names():
from django.db import connection
...
all_tables = connection.introspection.table_names()
old_tables = {'old_table_1', 'old_table_2'}
existing_old_tables = old_tables.intersection(all_tables)
# clean tables in existing_old_tables with migrations.RunSQL() as suggested above
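Put together, a minimal sketch of that approach inside a migration (the table names are hypothetical):
from django.db import connection, migrations

all_tables = set(connection.introspection.table_names())
old_tables = {'old_table_1', 'old_table_2'}

class Migration(migrations.Migration):

    dependencies = []

    operations = [
        migrations.RunSQL("DELETE FROM {}".format(table))
        for table in old_tables & all_tables
    ]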
Solved it using django.db.connection. Code:
from django.db import migrations
from django.db import connection


class Migration(migrations.Migration):

    db_cursor = connection.cursor()
    check_exists_query = "SELECT relname FROM pg_class WHERE relname=%s;"
    base_query = "DELETE FROM {table} WHERE condition;"
    tables = [tables]
    existing_tables = []

    for table in tables:
        db_cursor.execute(check_exists_query, [table])
        result = db_cursor.fetchone()
        if result:
            existing_tables.append(table)

    operations = [
        migrations.RunSQL(base_query.format(table=existing_table))
        for existing_table in existing_tables
    ]
Hello, thanks for reading. I am doing a quick site in Django and I have a very simple update statement in raw SQL that I am running against my Postgres database. Something in here is causing trouble:
from django.http import HttpResponse
from django.db import connection, transaction

def rsvp_update(request, rsvp_id, status):
    cursor = connection.cursor()
    cursor.execute("UPDATE public.rsvp SET status=%s WHERE rsvp_id = %s", [status, rsvp_id])
    transaction.commit()
    return HttpResponse('okay')
I am getting an error that says "TransactionManagementError at [URL]
This code isn't under transaction management". Any ideas?
You need to use the commit_manually decorator for the code where you manage transactions manually.
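For reference, a sketch of what that looks like (commit_manually is the old pre-1.6 API; it was removed in Django 1.8, where autocommit is the default and this error no longer occurs):
from django.http import HttpResponse
from django.db import connection, transaction

@transaction.commit_manually
def rsvp_update(request, rsvp_id, status):
    cursor = connection.cursor()
    cursor.execute("UPDATE public.rsvp SET status=%s WHERE rsvp_id = %s",
                   [status, rsvp_id])
    # Under commit_manually, nothing is committed until you say so.
    transaction.commit()
    return HttpResponse('okay')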
How do I copy all the rows in a Django table to a new table while retaining the old table's contents?
Execute raw SQL directly:
from django.db import connection, transaction
cursor = connection.cursor()
# commit required
cursor.execute("SELECT * INTO %s FROM %s" % (newtable, oldtable))
transaction.commit_unless_managed()
Working from model objects, the table name is stored in _meta.db_table.
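For example, a minimal sketch pulling the table name from a hypothetical model; note that commit_unless_managed() was removed in Django 1.8, where autocommit is the default:
from django.db import connection
from myapp.models import OldModel  # hypothetical model

oldtable = OldModel._meta.db_table
newtable = oldtable + "_copy"

with connection.cursor() as cursor:
    # Table names cannot be passed as query parameters, hence the string formatting.
    cursor.execute("SELECT * INTO %s FROM %s" % (newtable, oldtable))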