How to create a table with schema in SQLModel?

I want to create a table with schema "fooschema" and name "footable". Based on this GitHub issue and to some extent the docs, I tried
from sqlmodel import Field, Session, SQLModel, create_engine
from sqlalchemy import MetaData

fooMetadata = MetaData(schema="fooschema")

class Foo(SQLModel, table=True):
    __tablename__ = "footable"
    metadata = fooMetadata
    id_: int = Field(primary_key=True)

engine = create_engine("<url>")
Foo.metadata.create_all(engine)

with Session(engine) as session:
    row = Foo(id_=0)
    session.add(row)
    session.commit()
    session.refresh(row)
and tried replacing metadata = fooMetadata with __table_args__ = {"schema": "fooschema"}, and replacing Foo.metadata.create_all with SQLModel.metadata.create_all, but I always get an error like
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedTable) relation "fooschema.footable" does not exist
or
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.InvalidSchemaName) schema "fooschema" does not exist
Oddly __table_args__ works for reading an existing table, just not creating it.

I solved this by explicitly creating the schema with
from sqlalchemy.schema import CreateSchema

with engine.connect() as connection:
    connection.execute(CreateSchema("fooschema"))
    connection.commit()
Note: this is sort of answered in this answer, but it's not particularly clear, and this question is about SQLModel specifically.
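For anyone landing here, a minimal end-to-end sketch that puts the pieces together (assumptions: PostgreSQL, and a placeholder connection URL you must substitute):

from sqlalchemy import MetaData, text
from sqlmodel import Field, Session, SQLModel, create_engine

fooMetadata = MetaData(schema="fooschema")

class Foo(SQLModel, table=True):
    __tablename__ = "footable"
    metadata = fooMetadata  # attach the schema-qualified MetaData
    id_: int = Field(primary_key=True)

# Placeholder URL: substitute your real connection string.
engine = create_engine("postgresql+psycopg2://user:password@localhost/mydb")

# Create the schema first; IF NOT EXISTS makes this idempotent on PostgreSQL.
with engine.connect() as connection:
    connection.execute(text("CREATE SCHEMA IF NOT EXISTS fooschema"))
    connection.commit()

Foo.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Foo(id_=0))
    session.commit()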

Related

sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unknown database "mydb"

I have a problem using flask_sqlalchemy in my unit test functions.
In the production environment I use a PostgreSQL database:
"SQLALCHEMY_DATABASE_URI": "postgresql://login:passwd@dburl:1234/mydatabase",
To work with the PostgreSQL schema, I declare __table_args__ in my entity definition to specify the schema:
class MyTable(Base):
    __tablename__ = 'my_tablename'
    __table_args__ = {'schema': 'mydbschema'}
    my_id = Column('my_id', Date, primary_key=True)
    ....
But in my unit tests I would like to use an in-memory database:
"SQLALCHEMY_DATABASE_URI": "sqlite://",
When I run my function I get this error:
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unknown database "mydbschema"
Does somebody know a workaround for this case?
I know that the OP found a workaround (though I'm not sure how it works), but for those who are still searching for a proper solution: you need to use the ATTACH DATABASE statement to make a named schema available. For example, if you are using pytest, you can use the following fixture for your test setup:
import pytest
from sqlalchemy import create_engine, text

@pytest.fixture
def setup_db():
    engine = create_engine('sqlite:///:memory:')
    with engine.connect() as conn:
        conn.execute(text("ATTACH DATABASE ':memory:' AS mydbschema;"))
        yield conn
Of course, you can define the fixture scope to suit your needs.
Similarly, you can use the DETACH DATABASE statement to detach and dissociate a named database from the connection (if necessary); in the case of an in-memory database, this destroys it.
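As a sketch under the same assumptions (the fixture name here is mine, not from the original answer), the detach fits naturally in the fixture's teardown after the yield:

import pytest
from sqlalchemy import create_engine, text

@pytest.fixture
def setup_db_with_teardown():
    engine = create_engine('sqlite:///:memory:')
    with engine.connect() as conn:
        conn.execute(text("ATTACH DATABASE ':memory:' AS mydbschema;"))
        yield conn
        # Teardown: detaching an in-memory schema database destroys it.
        conn.execute(text("DETACH DATABASE mydbschema;"))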
I found a workaround ...
import os

class MyTable(Base):
    __tablename__ = 'my_tablename'
    # Only set the schema in production; under test (TESTVAR set), use none.
    __table_args__ = {'schema': 'mydbschema'} if os.environ.get('TESTVAR') is None else {}
    my_id = Column('my_id', Date, primary_key=True)
    ....

How to recognise a sqlite database created with sqlalchemy using pydal?

I create a very simple database with sqlalchemy as follows:
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Person(Base):
    __tablename__ = 'person'
    id = Column(Integer, primary_key=True)
    name = Column(String(250), nullable=False)

engine = create_engine('sqlite:///sqlalchemy_example.db')

# Create all tables in the engine. This is equivalent to "Create Table"
# statements in raw SQL.
Base.metadata.create_all(engine)
Base.metadata.bind = engine

DBSession = sessionmaker(bind=engine)
session = DBSession()

# Insert a Person in the person table
new_person = Person(name='new person')
session.add(new_person)
session.commit()
and then I tried to read it using pyDAL:
from pydal import DAL, Field

db = DAL('sqlite://sqlalchemy_example.db', auto_import=True)
db.tables
>> []
db.define_table('person', Field('name'))
>> OperationalError: table "person" already exists
How do I access the table using pyDAL?
Thank you.
First, do not set auto_import=True, as that is only relevant if pyDAL *.table migration metadata files exist for the tables, which will not be the case here.
Second, pyDAL does not know the table already exists, and because migrations are enabled by default, it attempts to create the table. To prevent this, you can simply disable migrations:
# Applies to all tables.
db = DAL('sqlite://sqlalchemy_example.db', migrate_enabled=False)
or:
# Applies to this table only.
db.define_table('person', Field('name'), migrate=False)
If you would like pyDAL to take over migrations for future changes to this table, then you should run a "fake migration", which will cause pyDAL to generate a *.table migration metadata file for this table without actually running the migration. To do this, temporarily make the following change:
db.define_table('person', Field('name'), fake_migrate=True)
After leaving the above in place for a single request, the *.table file will be generated, and you can remove the fake_migrate=True argument.
Finally, note that pyDAL expects the id field to be an auto-incrementing integer primary key field.
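Under those assumptions, a minimal sketch of the full read path (file and table names taken from the question):

from pydal import DAL, Field

# Open the existing SQLite file without letting pyDAL run migrations.
db = DAL('sqlite://sqlalchemy_example.db', migrate_enabled=False)
db.define_table('person', Field('name'))

# Query the rows that SQLAlchemy inserted.
for row in db(db.person).select():
    print(row.id, row.name)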

Get existing table using SQLAlchemy MetaData

I have a table that already exists:
USERS_TABLE = Table("users", META_DATA,
                    Column("id", Integer, Sequence("user_id_seq"), primary_key=True),
                    Column("first_name", String(255)),
                    Column("last_name", String(255)))
I created this table by running this:
CONN = create_engine(DB_URL, client_encoding="UTF-8")
META_DATA = MetaData(bind=CONN, reflect=True)
# ... table code
META_DATA.create_all(CONN, checkfirst=True)
the first time it worked and I was able to create the table. However, the 2nd time around I got this error:
sqlalchemy.exc.InvalidRequestError: Table 'users' is already defined for this MetaData instance. Specify 'extend_existing=True' to redefine options and columns on an existing Table object.
which makes sense since the table users already exists. I'm able to see if the table exists like so:
TABLE_EXISTS = CONN.dialect.has_table(CONN, "users")
However, how do I actually get the existing table object? I can't find this anywhere in the documentation. Please help.
There are three different approaches here:
1. Assume that the required tables have already been created, reflect them, and fetch them from the MetaData.tables dictionary, like so:
from sqlalchemy import MetaData, create_engine

CONN = create_engine(DB_URL, client_encoding="UTF-8")
META_DATA = MetaData(bind=CONN, reflect=True)

USERS_TABLE = META_DATA.tables['users']
2. Remove the reflect flag from the MetaData initializer, since we don't use the reflected tables and, moreover, we are trying to create tables that have already been reflected:
from sqlalchemy import (MetaData, Table, Column, Integer, String, Sequence,
                        create_engine)

CONN = create_engine('sqlite:///db.sql')
META_DATA = MetaData(bind=CONN)

USERS_TABLE = Table("users", META_DATA,
                    Column("id", Integer, Sequence("user_id_seq"),
                           primary_key=True),
                    Column("first_name", String(255)),
                    Column("last_name", String(255)))

META_DATA.create_all(CONN, checkfirst=True)
3. Keep the reflected table, if it was previously created, by setting the keep_existing flag to True in the Table initializer:
from sqlalchemy import (MetaData, Table, Column, Integer, String, Sequence,
                        create_engine)

CONN = create_engine('sqlite:///db.sql')
META_DATA = MetaData(bind=CONN, reflect=True)

USERS_TABLE = Table("users", META_DATA,
                    Column("id", Integer, Sequence("user_id_seq"),
                           primary_key=True),
                    Column("first_name", String(255)),
                    Column("last_name", String(255)),
                    keep_existing=True)

META_DATA.create_all(CONN, checkfirst=True)
Which one to choose? It depends on your use case, but I prefer the second one, since it looks like you aren't using reflection, and it is the simplest modification: just remove the flag from the MetaData initializer.
P.S.
We can always perform reflection after initializing the MetaData object, using the MetaData.reflect method:
META_DATA.reflect()
We can also specify which tables to reflect with the only parameter (it may be any iterable of str objects):
META_DATA.reflect(only=['users'])
and many more.
This works pretty well for me:
import sqlalchemy as db

engine = db.create_engine("your_connection_string")
meta_data = db.MetaData(bind=engine)
meta_data.reflect()

USERS = meta_data.tables['users']

# View the columns present in the users table
print(USERS.columns)

# You can run SQLAlchemy queries (1.x-style select and execute)
query = db.select([
    USERS.c.id,
    USERS.c.first_name,
    USERS.c.last_name,
])
result = engine.execute(query).fetchall()
Note that using the reflect parameter in MetaData(bind=engine, reflect=True) is deprecated and will be removed in a future release; the code above takes care of that.
Add
__table_args__ = {'extend_existing': True}
right below __tablename__.
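As a sketch (a hypothetical declarative model; column names taken from the question's users table), that looks like:

from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    # Reuse and extend the Table object if "users" is already in the MetaData.
    __table_args__ = {"extend_existing": True}

    id = Column(Integer, primary_key=True)
    first_name = Column(String(255))
    last_name = Column(String(255))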
If you're using async SQLAlchemy, you can use:
metadata = MetaData()

async with engine.connect() as conn:
    await conn.run_sync(metadata.reflect, only=["harshit_table"])

# After reflection, fetch the table from the metadata;
# autoload_with cannot be given an AsyncEngine directly.
harshit_table = metadata.tables["harshit_table"]
print("tables: ", harshit_table, type(harshit_table))
I'm quite new to this, but what worked for me was this (the variables are as declared in the original question):
USERS_TABLE_NEW = Table("users", META_DATA, autoload_with=CONN)

Sqlalchemy drop_all(engine) Oracle sequence is not dropped

With Python + SQLAlchemy + Oracle, I am trying to drop all tables and recreate them. I am using an Oracle sequence on the id column for autoincrement, but drop_all is not dropping the sequence.
engine = create_engine('oracle://user:pass@host:port/db',
                       implicit_returning=False,
                       echo=False)
Base = declarative_base(bind=engine)

if DROP_AND_CREATE:
    Base.metadata.drop_all(bind=engine)

meta_data = Base.metadata
from domains import users
meta_data.create_all(engine, checkfirst=False)
A sample from the domains package:
class Users(Base):
    __tablename__ = 'users'
    id = Column(Integer, Sequence('users_id_seq'), primary_key=True)
    name = Column(String(255))
In the above, all the tables are dropped, but I can see that the sequences I am using are still present in the Oracle DB. If I delete them manually and run again, everything works fine.
The sequences created in Oracle are not getting dropped. Please help.
Error message:
sqlalchemy.exc.DatabaseError: (cx_Oracle.DatabaseError) ORA-00955: name is already used by an existing object
[SQL: 'CREATE SEQUENCE user_queries_id_seq']
Finally, after a few hours of googling... I am still learning the language and the framework, but this code works and does exactly what I want in my application.
Thanks all.
Base = declarative_base(bind=engine)

# Import the model modules *before* drop_all so their tables (and the
# sequences attached to their columns) are registered on Base.metadata
# and get dropped too.
from domains import users

if DROP_AND_CREATE:
    Base.metadata.drop_all(bind=engine, checkfirst=True)
    logger.info('Creating all registered tables.')
    Base.metadata.create_all(bind=engine, checkfirst=True)
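If a stale sequence is still left over from an earlier run, it can also be dropped explicitly. A sketch (the sequence name is taken from the model above):

from sqlalchemy import Sequence
from sqlalchemy.schema import DropSequence

# Emits DROP SEQUENCE users_id_seq; Oracle DDL takes effect immediately.
with engine.connect() as conn:
    conn.execute(DropSequence(Sequence('users_id_seq')))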

Getting SQLAlchemy to issue CREATE SCHEMA on create_all

I have a SqlAlchemy model with a schema argument like so:
Base = declarative_base()

class Road(Base):
    __tablename__ = "roads"
    __table_args__ = {'schema': 'my_schema'}
    id = Column(Integer, primary_key=True)
When I use Base.metadata.create_all(engine) it correctly issues a CREATE TABLE with the schema name on the front, like CREATE TABLE my_schema.roads (, but PostgreSQL rightly complains that the schema doesn't exist.
Am I missing a step to get SqlAlchemy to issue the CREATE SCHEMA my_schema or do I have to call this manually?
I have done it manually in my db init script like so:
from sqlalchemy.schema import CreateSchema

engine.execute(CreateSchema('my_schema'))
But this seems less magical than I was expecting.
I ran into the same issue and believe the "cleanest" way of issuing the DDL is something like this:
from sqlalchemy import event
from sqlalchemy.schema import CreateSchema
event.listen(Base.metadata, 'before_create', CreateSchema('my_schema'))
This will ensure that before anything contained in the metadata of your base is created, you have the schema for it. This does, however, not check if the schema already exists.
You can do CreateSchema('my_schema').execute_if(callback_=check_schema) if you can be bothered to write the check_schema callback (see "Controlling DDL Sequences" on should_create in the docs). Or, as an easy way out, just use DDL("CREATE SCHEMA IF NOT EXISTS my_schema") instead (for Postgres):
from sqlalchemy import DDL
event.listen(Base.metadata, 'before_create', DDL("CREATE SCHEMA IF NOT EXISTS my_schema"))
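On SQLAlchemy 2.0 and later, CreateSchema also accepts an if_not_exists flag, which achieves the same thing without raw DDL. A sketch, assuming your installed version supports the flag:

from sqlalchemy import event
from sqlalchemy.schema import CreateSchema

event.listen(
    Base.metadata,
    'before_create',
    CreateSchema('my_schema', if_not_exists=True),
)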
I wrote a function that creates the declared schemas, based on the accepted answer. It uses the schema value from the __table_args__ dict of each mapped class.
from sqlalchemy import event, DDL

# Import or write your mapped classes and configuration here

def init_db():
    for mapper in Base.registry.mappers:
        cls = mapper.class_
        if issubclass(cls, Base):
            table_args = getattr(cls, '__table_args__', None)
            # Assumes __table_args__ is a dict (it can also be a tuple).
            if table_args:
                schema = table_args.get('schema')
                if schema:
                    stmt = f"CREATE SCHEMA IF NOT EXISTS {schema}"
                    event.listen(Base.metadata, 'before_create', DDL(stmt))
    Base.metadata.create_all(bind=engine)
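Note that __table_args__ may also be a tuple whose last element is the options dict; a hedged helper (my addition, not part of the original answer) that handles both forms:

def get_schema(table_args):
    # __table_args__ may be a dict, or a tuple whose last element is a dict.
    if isinstance(table_args, dict):
        return table_args.get('schema')
    if isinstance(table_args, tuple) and table_args and isinstance(table_args[-1], dict):
        return table_args[-1].get('schema')
    return None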
