Django migrations. How to check if table exists in migrations? - python

I'm currently working on an app built on Django 1.8 and Postgres. The app is installed in several environments, and in some of them the database contains old tables from which I need to delete records.
I wrote a migration with the following SQL query:
IF EXISTS (
SELECT relname FROM pg_class WHERE relname=tablename
) THEN
DELETE FROM tablename END IF;
But Django throws error at this query:
django.db.utils.ProgrammingError: syntax error at or near "IF"
Can I somehow check, in a migration, that the table exists, and only then execute a query like DELETE FROM tablename?

The easiest way to check whether a table exists is to use django.db.connection.introspection.table_names():
from django.db import connection
...
all_tables = connection.introspection.table_names()
old_tables = {'old_table_1', 'old_table_2'}
existing_old_tables = old_tables.intersection(all_tables)
# clean the tables in existing_old_tables with migrations.RunSQL() as suggested above
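As an aside, the syntax error in the question arises because IF ... THEN is PL/pgSQL, not plain SQL; PostgreSQL only accepts it inside a DO block. A minimal sketch of building such a statement for migrations.RunSQL (the helper name and table name are illustrative, and the table name must be a trusted identifier, never user input):

```python
def conditional_delete_sql(table):
    # Build a PL/pgSQL DO block that deletes rows only if the table exists.
    # 'table' must be a trusted identifier, never user input.
    return (
        "DO $$ BEGIN\n"
        "    IF EXISTS (SELECT 1 FROM pg_class WHERE relname = '{t}') THEN\n"
        "        DELETE FROM {t};\n"
        "    END IF;\n"
        "END $$;"
    ).format(t=table)

print(conditional_delete_sql('old_table_1'))
```

The resulting string can be passed straight to migrations.RunSQL(), so the existence check happens inside Postgres when the migration runs.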

Solved it using django.db.connection. Code:
from django.db import connection, migrations

class Migration(migrations.Migration):
    db_cursor = connection.cursor()
    check_exists_query = "SELECT relname FROM pg_class WHERE relname=%s;"
    base_query = "DELETE FROM {table} WHERE condition;"
    tables = [...]  # the names of the old tables to clean
    existing_tables = []
    for table in tables:
        db_cursor.execute(check_exists_query, [table])
        result = db_cursor.fetchone()
        if result:
            existing_tables.append(table)
    operations = [
        migrations.RunSQL(base_query.format(table=existing_table))
        for existing_table in existing_tables
    ]

Related

Create a copy of all tables with their keys and other constraints from a SQL Server database to another database

I'm trying to copy SQL Server tables with their keys and other constraints through SQLAlchemy in Python:
from sqlalchemy import *

DATABASE_CONN = f'mssql://@{SERVER}/{DATABASE}?driver={DRIVER}'
DATABASE_CONN2 = f'mssql://@{SERVER}/{DB2}?driver={DRIVER}'
engine = create_engine(DATABASE_CONN)
engine2 = create_engine(DATABASE_CONN2)
connection = engine.connect()
connection2 = engine2.connect()
metadata = MetaData()
table = Table('table_name', metadata, autoload=True, autoload_with=engine)
table.create(engine2)
ERROR:
sqlalchemy.exc.NoSuchTableError: table_name
If I put a specific table name instead of 'table_name', that table is created in the new database, but I would have to repeat this for every table. Is there a technique to copy all tables, with their keys and other constraints, in one go?
You can recreate all of the table schema from one database in a second one by reflecting the first into a MetaData object and then creating the tables in the second:
meta = MetaData()
meta.reflect(engine)
meta.create_all(engine2)
It's possible to control precisely which tables get reflected - review the docs for Reflecting Database Objects and metadata.reflect.
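A sketch of the same approach restricted to particular tables, using reflect's only argument (the engines here are in-memory SQLite stand-ins for the two SQL Server engines, and the table names are illustrative):

```python
from sqlalchemy import MetaData, create_engine, text

src = create_engine('sqlite://')   # stand-in for the source mssql engine
dst = create_engine('sqlite://')   # stand-in for the target mssql engine
with src.begin() as conn:
    conn.execute(text('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)'))
    conn.execute(text('CREATE TABLE scratch (x INTEGER)'))

meta = MetaData()
meta.reflect(bind=src, only=['users'])  # reflect only the tables you name
meta.create_all(dst)                    # recreate them, keys included, in the target

check = MetaData()
check.reflect(bind=dst)
print(sorted(check.tables))  # ['users']
```

Leaving out only reflects every table, which matches the "all tables at one go" requirement in the question.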

How to pass qs.query to a custom plpgsql function with Django ORM?

I have some queryset qs = MyModel.objects.filter(fieldname='fieldname') and a custom plpgsql function my_function(), that I can call from psql like:
SELECT my_function('SELECT * FROM my_model');
How to do this with Django ORM? I tried to:
with connection.cursor() as cursor:
    query, params = qs.query.sql_with_params()
    query = f"SELECT my_function('{query}')"
    cursor.execute(query, params)
But qs.query.sql_with_params() is not intended to return valid SQL.
Is something like this even possible with Django ORM?
First of all, if you want to call a function or procedure, do it like this:
from django.db import connections

connection = connections['default']
with connection.cursor() as cursor:
    cursor.callproc('name_of_your_function', ('SELECT * FROM some_table WHERE whatever_you_want',))
    result = cursor.fetchone()
The next thing is getting the real SQL query out of the Django ORM: not the rough query that queryset.query or qs.query.sql_with_params() gives you (especially when you work with dates), but the one that will actually be passed to psycopg:
from django.db import connections
from django.db.models.sql.compiler import SQLCompiler

from .models import SomeModel

qs = SomeModel.objects.filter(create_at__gt='2021-04-20')
conn = connections['default']
compiler = SQLCompiler(qs.query, conn, using=qs.db)
sql, params = compiler.as_sql()
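Putting the two pieces together: since my_function() takes the query as text, the compiled SQL can be passed as an ordinary string parameter once its own placeholders are filled in. A rough sketch, where inline_params is a crude illustrative stand-in for psycopg2's cursor.mogrify():

```python
def inline_params(sql, params):
    # Crude stand-in for psycopg2's cursor.mogrify(): renders every parameter
    # as a quoted SQL string literal. Illustrative only; prefer mogrify().
    quoted = tuple("'%s'" % str(p).replace("'", "''") for p in params)
    return sql % quoted

inner_sql = inline_params('SELECT * FROM "my_model" WHERE "create_at" > %s',
                          ['2021-04-20'])
print(inner_sql)
# The filled-in query can then be passed as a text parameter:
#     cursor.execute("SELECT my_function(%s)", [inner_sql])
```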

Multiple Django Database Connection [duplicate]

I have a Django project that utilizes multiple databases. https://docs.djangoproject.com/en/dev/topics/db/multi-db/
I perform a lot of raw queries like this:
cursor = connection.cursor()
cursor.execute("select * from my_table")
....
transaction.commit_unless_managed()
How can I specify which database to use?
Refer to the Django docs on executing custom SQL directly. Specify the database in your connection as shown below:
from django.db import connections
cursor = connections['db_alias'].cursor()
cursor.execute("select * from my_table")
And then commit using
from django.db import transaction
transaction.commit_unless_managed(using='db_alias')
Try this; it should work:
from django.db import connections

cursor = connections['my_db_name'].cursor()
# Your code here...
transaction.commit_unless_managed(using='my_db_name')

ProgrammingError, Flask with postgres and sqlalchemy

I am trying to get this setup to work. The database is created correctly, but when trying to insert data I get the following error:
On sqlite:
sqlalchemy.exc.OperationalError
OperationalError: (sqlite3.OperationalError) no such column: Author [SQL: u'SELECT count(*) AS count_1 \nFROM (SELECT Author) AS anon_1']
On postgres:
sqlalchemy.exc.ProgrammingError
ProgrammingError: (psycopg2.ProgrammingError) column "author" does not exist
LINE 2: FROM (SELECT Author) AS anon_1
^
[SQL: 'SELECT count(*) AS count_1 \nFROM (SELECT Author) AS anon_1']
edit: Perhaps this has to do with it: I don't understand why it says "anon_1", as I am using credentials clearly?
I have inspected Postgres and SQLite and the tables are created correctly. It seems to be an ORM configuration error, as it only seems to happen on inspecting or creating entries; any suggestion would be welcome!
class Author(CommonColumns):
    __tablename__ = 'author'
    author = Column(String(200))
    author_url = Column(String(2000))
    author_icon = Column(String(2000))
    comment = Column(String(5000))

registerSchema('author')(Author)

SETTINGS = {
    'SQLALCHEMY_TRACK_MODIFICATIONS': True,
    'SQLALCHEMY_DATABASE_URI': 'sqlite:////tmp/test.db',
    # 'SQLALCHEMY_DATABASE_URI': 'postgresql://xxx:xxx@localhost/test',
}

application = Flask(__name__)

# bind SQLAlchemy
db = application.data.driver
Base.metadata.bind = db.engine
db.Model = Base
db.create_all()

if __name__ == "__main__":
    application.run(debug=True)
What is the query you're using to insert data?
I think the error messages may be a bit more opaque than they need to be because you're using Author/author in three very similar contexts:
the Class name
the table name
the column name
For easier debugging, the first thing I'd do is temporarily make each one unique (AuthorClass, author_table, author_column) so you can check which 'Author' is actually being referred to by the error message.
Since you're using the ORM, I suspect the underlying issue is that your insert statement uses Author (the object) when it should actually be using Author.author (the attribute/column name). The SELECT statements are complaining that they can't find the column 'author', but because you use author for both the table and column name, it's unclear what's actually being passed into the SQL statement.
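To see how a bare SELECT Author can come out of the ORM, here is a minimal stand-alone sketch (SQLAlchemy 1.4+ style; the model is trimmed down, and the literal_column call mimics passing the class name through as a plain string):

```python
from sqlalchemy import Column, Integer, String, literal_column, select
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Author(Base):
    __tablename__ = 'author'
    id = Column(Integer, primary_key=True)
    author = Column(String(200))

# Selecting the mapped column yields valid SQL with a FROM clause:
print(select(Author.author))             # SELECT author.author FROM author
# Selecting a bare name yields the broken SQL from the traceback:
print(select(literal_column('Author')))  # SELECT Author
```

The second statement has no FROM clause and no known column, which is exactly what both the SQLite and Postgres errors are complaining about.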

copy contents of one table to another table in django

How do I copy all the rows in a django table to a new table while retaining the old table contents?
Execute raw SQL directly:
from django.db import connection, transaction
cursor = connection.cursor()
# commit required
cursor.execute("SELECT * INTO %s FROM %s" % (newtable, oldtable))
transaction.commit_unless_managed()
If you're working from model objects, the table name is stored in Model._meta.db_table.
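Because table names are identifiers rather than values, they cannot be passed as parameterized %s values to execute(); a hedged sketch of validating them before interpolation (build_copy_sql is a hypothetical helper, using the same SELECT ... INTO statement as above):

```python
import re

def build_copy_sql(newtable, oldtable):
    # Table names are identifiers, not query parameters, so validate them
    # before string interpolation to avoid SQL injection.
    for name in (newtable, oldtable):
        if not re.fullmatch(r'[A-Za-z_][A-Za-z0-9_]*', name):
            raise ValueError('unsafe table name: %r' % name)
    return 'SELECT * INTO %s FROM %s' % (newtable, oldtable)

print(build_copy_sql('my_table_copy', 'my_table'))
# SELECT * INTO my_table_copy FROM my_table
```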
