Something peculiar I've noticed is that any changes committed to the DB outside of the session (such as ones made in MySQL Workbench) are not recognised in the SQLAlchemy session. I have to close the session and open a new one for SQLAlchemy to recognise them.
For example, a row I deleted manually is still fetched by SQLAlchemy.
This is how I initialise the session:
from sqlalchemy import create_engine, MetaData
from sqlalchemy.orm import sessionmaker

engine = create_engine('mysql://{}:{}@{}/{}'.format(username, password, host, schema), pool_recycle=3600)
Session = sessionmaker(bind=engine)
session = Session()
metadata = MetaData()
How can I get SQLAlchemy to recognise them?
My SQLAlchemy version is 0.9.4 and my MySQL version is 5.5.34. We use only SQLAlchemy's Core (no ORM).
To be able to read data committed by other transactions, you'll need to set the transaction isolation level to READ COMMITTED. For SQLAlchemy and MySQL:
To set the isolation level using create_engine():
engine = create_engine(
    "mysql://scott:tiger@localhost/test",
    isolation_level="READ COMMITTED")
To set it using per-connection execution options:
connection = engine.connect()
connection = connection.execution_options(
    isolation_level="READ COMMITTED")
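Applied to the setup from the question, a minimal sketch (assuming the same username, password, host and schema variables) would look like this; sessions made from this engine will see rows committed by other tools such as MySQL Workbench:
engine = create_engine(
    'mysql://{}:{}@{}/{}'.format(username, password, host, schema),
    pool_recycle=3600,
    isolation_level="READ COMMITTED")  # each statement sees data committed by others
Session = sessionmaker(bind=engine)
session = Session()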
I'm trying to test my FastAPI app. It seems to me that all the settings are correct.
test_users.py
engine = create_engine(
    f"postgresql"
    f"://{settings.database_username}"
    f":{settings.database_password}"
    f"@{settings.database_hostname}"
    f":{settings.database_port}"
    f"/test_{settings.database_name}"
)
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base.metadata.create_all(bind=engine)
def override_get_db():
    try:
        db = TestingSessionLocal()
        yield db
    finally:
        db.close()
app.dependency_overrides[get_db] = override_get_db
client = TestClient(app)
def test_create_user():
    response = client.post(
        "/users/",
        json={"email": "nikita@gmail.com", "password": "password"}
    )
    new_user = schemas.UserOutput(**response.json())
    assert response.status_code == 201
    assert new_user.email == "nikita@gmail.com"
When I run pytest, I get this error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: FATAL: database "test_social_media_api" does not exist
Why is the code not creating the database?
With engine = create_engine("postgresql://...") you define a connection to an existing PostgreSQL database.
And with Base.metadata.create_all(bind=engine) you create the tables - according to your models - in the existing database.
So the code you have written does not create a database; it expects to be given an already existing database.
And that has to do with PostgreSQL itself.
PostgreSQL runs as a server, and a PostgreSQL server can run multiple databases. And each database has to be created explicitly.
Just telling SQLAlchemy the connection string is not enough.
It's possible to create a new database from Python itself by connecting to the PostgreSQL server (see https://www.tutorialspoint.com/python_data_access/python_postgresql_create_database.htm), or alternatively you can create it manually before you run your script. E.g. by running CREATE DATABASE databasename; inside psql (or any other database tool).
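For instance, a minimal sketch of creating the test database from Python with psycopg2 (the driver named in your error message) could look like this; the credentials and the superuser connection to the default postgres database are assumptions, not part of your setup:
import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

# connect to the maintenance database, not the one we want to create
conn = psycopg2.connect(dbname="postgres", user="postgres", password="secret", host="localhost")
conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)  # CREATE DATABASE can't run inside a transaction
with conn.cursor() as cur:
    cur.execute("CREATE DATABASE test_social_media_api")
conn.close()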
However, if you want to test against a running database, I would suggest using testcontainers. They will spawn a new PostgreSQL server with an empty database every time you run the tests.
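A rough sketch with the testcontainers package (the image tag is an assumption; Base comes from your models as in your test file):
from sqlalchemy import create_engine
from testcontainers.postgres import PostgresContainer

# spins up a throwaway PostgreSQL server in Docker for the duration of the block
with PostgresContainer("postgres:15") as postgres:
    engine = create_engine(postgres.get_connection_url())
    Base.metadata.create_all(bind=engine)  # create the tables in the fresh database
    # ... run the tests against this engine ...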
Notice that the example from the FastAPI documentation works differently.
They just use
SQLALCHEMY_DATABASE_URL = "sqlite:///./test.db"
engine = create_engine(
SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)
Base.metadata.create_all(bind=engine)
which creates the database.
This works because SQLite doesn't run as a server. It's just one file that represents the full database, and if the file doesn't exist, the SQLite database adapter will assume that the database is just empty and create a new file for you. PostgreSQL doesn't work like this, though.
I'm fairly new to the SQLAlchemy ORM. I'm using a MySQL database whose schema I imported from a .sql file. I created the engine and connected to the database. I bound both the MetaData and the Session objects to the engine. But when I ran:
for t in metadata.tables:
    print(t.name)
I got the following error:
fkey["referred_table"] = rec["TABLENAME"]
KeyError: 'TABLENAME'
So what am I doing wrong here? Is it something elementary?
Below is the full code:
import sqlalchemy
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy import *
engine = create_engine('mysql://sunnyahlawat:miq182@localhost/sqsunny')
engine.connect()
Session = sessionmaker(bind=engine)
session = Session()
metadata = MetaData(bind=engine, reflect=True)
#metadata.reflect(bind = engine)
for t in metadata.tables:
    print(t.name)
#print(engine.table_names())
If the database being referred to is a data dump, and the table in question has foreign keys linked to an external database which has not been exported and is not on the same server, this error can come up.
The foreign key constraint fails in such a case.
A possible solution is to drop the constraint - if this is only being tried out in a test environment.
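A minimal sketch of dropping such a constraint in MySQL (the table and constraint names here are placeholders; look up the real ones with SHOW CREATE TABLE):
from sqlalchemy import create_engine

engine = create_engine('mysql://sunnyahlawat:miq182@localhost/sqsunny')
with engine.connect() as conn:
    # drop the foreign key that points at the missing external database
    conn.execute("ALTER TABLE some_table DROP FOREIGN KEY fk_external_ref")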
My application does not update the database - all queries are SELECT statements. I'm struggling with how best to handle direct changes to the database (i.e. opening MySQLWorkbench and changing data there). Without session.commit(), my Flask application is returning stale data.
My solution right now is to have a session.commit() as the first line of each Flask endpoint, but I feel this is the incorrect way of handling this.
Session creation at start of app:
import sqlalchemy as db
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

engine = db.create_engine('mysql+pymysql://...')
connection = engine.connect()
metadata = db.MetaData()
Base = declarative_base()
Session = sessionmaker(autoflush=True)
Session.configure(bind=engine)
session = Session()
Use session.expire_all() to mark all session data as expired. Then, when you next access something, it will be fetched from the database again.
session.expire(object) does the same, but for a single object only.
session.refresh(some_object) expires the object and immediately reloads all of its data.
Nice article about that can be found here: https://www.michaelcho.me/article/sqlalchemy-commit-flush-expire-refresh-merge-whats-the-difference
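A small ORM sketch of the three calls (the User model is hypothetical, just to show the behaviour):
user = session.query(User).filter_by(id=1).one()
session.expire(user)     # user's attributes are reloaded on next access
session.expire_all()     # every object in the session reloads on next access
session.refresh(user)    # expire and re-SELECT this one object right away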
I have code that runs queries from a query list. These queries are long and take quite a long time to execute. Since I am executing them in a loop, the session seems to expire and I get an error telling me that the connection to the server was lost.
I then created the session as well as the engine inside the loop (closing the session and disposing of the engine at the end of each iteration), but I understand that creating a new connection is an expensive operation.
How can I reuse the connection in this case so that I do not have to create the session and engine each time?
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# an Engine, which the Session will use for connection resources
some_engine = create_engine('mysql://user:password@localhost/')

# create a configured "Session" class
Session = sessionmaker(bind=some_engine)

# create a Session
session = Session()

for long_query in long_query_list:
    # work with the session
    session.execute(long_query)
    session.commit()
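If the lost connections come from MySQL timing out idle connections between long queries, a sketch like this (pool_pre_ping is my suggestion, not part of the original answer) lets the same engine and session be reused across the whole loop, because the pool replaces dead connections transparently:
some_engine = create_engine(
    'mysql://user:password@localhost/',
    pool_pre_ping=True,   # test each connection with a lightweight ping before use
    pool_recycle=3600,    # retire connections older than an hour
)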
I am trying to create a database with the pg8000 driver for PostgreSQL, but I am unable to. Creating a DB manually and then connecting to it works fine for me, but I need to create the DB from my code. I am getting the error "sqlalchemy.exc.ProgrammingError: (pg8000.ProgrammingError)". I have tried the code below for creating a DB.
from sqlalchemy import create_engine

dburl = "postgresql+pg8000://user:pswd@myip:5432/postgres"
engine = create_engine(dburl)
conn = engine.connect()
conn.execute("COMMIT")
conn.execute("CREATE DATABASE qux")
I also tried the following:
from sqlalchemy import create_engine
from sqlalchemy.engine import url

settings = {"drivername": "postgresql+pg8000", "host": "myip", "port": 5432, "username": "user", "password": "pswd", "database": "MyTestDB"}
db = create_engine(url.URL(**settings))
db.execute("commit")
This is the exact error I am getting: sqlalchemy.exc.ProgrammingError: (pg8000.ProgrammingError) ('ERROR', '25001', 'CREATE DATABASE cannot run inside a transaction block') [SQL: 'create database workDB']
Please suggest how I can create this DB.
Here's a solution, working at the driver level with pg8000 directly (rollback() and the autocommit attribute live on the pg8000 connection, not on SQLAlchemy's Connection):
import pg8000

# connect to the default maintenance database with the same credentials
con = pg8000.connect(user="user", password="pswd", host="myip", port=5432, database="postgres")
con.rollback()          # make sure we're not in a transaction
con.autocommit = True   # turn on autocommit
cur = con.cursor()
cur.execute("CREATE DATABASE qux")
con.autocommit = False  # turn autocommit back off again
con.close()
The docs talk about this problem of executing commands that can't be run in a transaction. The thing is that pg8000 automatically executes a begin transaction before any execute() if there isn't already a transaction in progress. This is fine until you come to execute a command that can't be run inside a transaction, such as CREATE DATABASE. In that case you have to enter autocommit mode, in which pg8000 stops issuing the implicit begin, so the statement runs outside of any transaction.
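If you'd rather stay at the SQLAlchemy level, a sketch using the AUTOCOMMIT isolation level (same URL as above, assumed) should achieve the same thing, since it puts the underlying pg8000 connection into autocommit mode for that connection:
from sqlalchemy import create_engine

engine = create_engine("postgresql+pg8000://user:pswd@myip:5432/postgres")
conn = engine.connect().execution_options(isolation_level="AUTOCOMMIT")
conn.execute("CREATE DATABASE qux")  # no transaction is started in autocommit mode
conn.close()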