FastAPI testing database not creating database - python

I'm trying to test my FastAPI app. As far as I can tell, all the settings are correct.
test_users.py
engine = create_engine(
    f"postgresql"
    f"://{settings.database_username}"
    f":{settings.database_password}"
    f"@{settings.database_hostname}"
    f":{settings.database_port}"
    f"/test_{settings.database_name}"
)
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

Base.metadata.create_all(bind=engine)


def override_get_db():
    try:
        db = TestingSessionLocal()
        yield db
    finally:
        db.close()


app.dependency_overrides[get_db] = override_get_db

client = TestClient(app)
def test_create_user():
    response = client.post(
        "/users/",
        json={"email": "nikita@gmail.com", "password": "password"}
    )
    new_user = schemas.UserOutput(**response.json())
    assert response.status_code == 201
    assert new_user.email == "nikita@gmail.com"
When I run pytest, I get this error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: FATAL: database "test_social_media_api" does not exist
Why is the code not creating the database?

With engine = create_engine("postgresql://...") you define a connection to an existing PostgreSQL database.
And with Base.metadata.create_all(bind=engine) you create the tables - according to your models - in that existing database.
So the code you have written does not create a database; it expects to be given an already existing one.
That is down to PostgreSQL itself.
PostgreSQL runs as a server, and a single PostgreSQL server can host multiple databases, each of which has to be created explicitly.
Just handing SQLAlchemy the connection string is not enough.
It's possible to create a new database from Python itself by connecting to the PostgreSQL server (see https://www.tutorialspoint.com/python_data_access/python_postgresql_create_database.htm), or alternatively you can create it manually before you run your script, e.g. by running CREATE DATABASE databasename; inside psql (or any other database tool).
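If you want your test setup to do it for you, here is a minimal sketch of the Python route, reusing the settings object from the question and assuming psycopg2 (which your traceback shows is already installed). CREATE DATABASE has to be issued from an existing database and cannot run inside a transaction, so it connects to the built-in postgres maintenance database with autocommit on:
import psycopg2
from psycopg2 import sql

# Connect to the default "postgres" maintenance database first.
conn = psycopg2.connect(
    dbname="postgres",
    user=settings.database_username,
    password=settings.database_password,
    host=settings.database_hostname,
    port=settings.database_port,
)
conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction
with conn.cursor() as cur:
    cur.execute(
        sql.SQL("CREATE DATABASE {}").format(
            sql.Identifier(f"test_{settings.database_name}")
        )
    )
conn.close()
Run this once before Base.metadata.create_all(bind=engine). Note it raises an error if the database already exists (PostgreSQL has no IF NOT EXISTS for CREATE DATABASE), so in a real test suite you would drop the database first or catch that error.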
However, if you want to test against a real running database, I would suggest using testcontainers. It will spawn a fresh PostgreSQL server with an empty database every time you run the tests.
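A minimal sketch of that approach, assuming the testcontainers package is installed and a Docker daemon is running (the postgres:16 image tag is just an example):
from testcontainers.postgres import PostgresContainer

with PostgresContainer("postgres:16") as postgres:
    # get_connection_url() points at the throwaway server; the container
    # (and everything in it) is destroyed when the block exits.
    engine = create_engine(postgres.get_connection_url())
    Base.metadata.create_all(bind=engine)
    # ... run your tests against this engine ...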
Notice that the example from the FastAPI documentation works differently.
They just use
SQLALCHEMY_DATABASE_URL = "sqlite:///./test.db"

engine = create_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)
Base.metadata.create_all(bind=engine)
which creates the database.
This works because SQLite doesn't run as a server. The whole database is a single file, and if that file doesn't exist, the SQLite driver assumes the database is simply empty and creates the file for you. PostgreSQL doesn't work like this.
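You can demonstrate that behaviour with the standard library alone (brand_new.db is just a hypothetical file name):
import os
import sqlite3

assert not os.path.exists("brand_new.db")
conn = sqlite3.connect("brand_new.db")  # the file is created right here
conn.close()
assert os.path.exists("brand_new.db")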

Related

SQLAlchemy, attempt to write a readonly database

I have an API that I built using Python FastAPI that pulls from a SQLite database using SQLAlchemy, and it works fine in dev, but after I deployed it I'm getting:
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) attempt to write a readonly database
The code right now is:
SQLALCHEMY_DATABASE_URL = 'sqlite:///./etfs.db'
engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args = {'check_same_thread':False})
How do I connect to the database so that it isn't read-only? Or how do I create the database in sqlite so that it isn't read-only to begin with?

Having Trouble Connecting to Cloud SQL (PostgreSQL) using Python's SQLALCHEMY

I set up the Cloud SQL instance on Google Cloud Platform and followed the official instructions, but I don't seem to be able to connect to the Cloud SQL instance. As a sanity check I tried accessing the PostgreSQL db through Cloud Shell, and there I can connect successfully.
Could someone please help? I would be much obliged.
Code:
from sqlalchemy import create_engine
engine = create_engine('postgresql+psycopg2://<user>:<pass>@<public IP address>/<table>')
engine.connect()
Error:
Is the server running on host "XX.XX.XXX.XX" and accepting
TCP/IP connections on port XXXX?
I found another way to connect to a PostgreSQL GCP instance without using the Cloud SQL Proxy.
Code:
import sqlalchemy

username = ''  # DB username
password = ''  # DB password
host = ''      # Public IP address for your instance
port = '5432'
database = ''  # Name of database ('postgres' by default)

db_url = 'postgresql+psycopg2://{}:{}@{}:{}/{}'.format(
    username, password, host, port, database)

engine = sqlalchemy.create_engine(db_url)
conn = engine.connect()
I whitelisted my IP address before trying to connect. (https://cloud.google.com/sql/docs/postgres/connect-external-app#appaccessIP)
Use the Cloud SQL Proxy to connect to Cloud SQL from external applications.
In order to achieve this, please follow the relevant documentation.
The steps described consist of:
Enabling the Cloud SQL Admin API in your Cloud Console.
Installing the relevant proxy client according to your OS.
Using any of the available methods to authenticate the Cloud SQL Proxy.
Invoking the proxy with ./cloud_sql_proxy -instances=INSTANCE_CONNECTION_NAME=tcp:5432 & in your terminal, then connecting through the proxy by changing your code to use SQLAlchemy:
from sqlalchemy import create_engine
engine = create_engine('postgresql+psycopg2://DATABASE_USER:PASSWORD@localhost:5432/')
NOTE: the code above assumes you are not trying to connect to the proxy in a production environment and are using an authenticated Cloud SDK client to connect to the proxy.
This worked for me, using the Cloud SQL Proxy on my personal computer and uploading the code to Google App Engine standard.
import os

from flask import Flask

db_user = os.environ.get('CLOUD_SQL_USERNAME')
db_pass = os.environ.get('CLOUD_SQL_PASSWORD')
db_name = os.environ.get('CLOUD_SQL_DATABASE_NAME')
db_connection_name = os.environ.get('CLOUD_SQL_CONNECTION_NAME')

# On App Engine standard, connect through the unix socket;
# locally, connect to the Cloud SQL Proxy over TCP.
if os.environ.get('GAE_ENV') == 'standard':
    db_uri = f'postgresql+psycopg2://{db_user}:{db_pass}@/{db_name}?host=/cloudsql/{db_connection_name}'
else:
    db_uri = f'postgresql+psycopg2://{db_user}:{db_pass}@127.0.0.1:1234/{db_name}'

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = db_uri
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False
Depending on the database client library, the socket (/cloudsql/INSTANCE_CONNECTION_NAME/.s.PGSQL.5432) needs to be specified.
The docs have this example for SQLAlchemy:
db_user = os.environ["DB_USER"]
db_pass = os.environ["DB_PASS"]
db_name = os.environ["DB_NAME"]
db_socket_dir = os.environ.get("DB_SOCKET_DIR", "/cloudsql")
cloud_sql_connection_name = os.environ["CLOUD_SQL_CONNECTION_NAME"]

pool = sqlalchemy.create_engine(
    # Equivalent URL:
    # postgresql+pg8000://<db_user>:<db_pass>@/<db_name>
    # ?unix_sock=<socket_path>/<cloud_sql_instance_name>/.s.PGSQL.5432
    sqlalchemy.engine.url.URL.create(
        drivername="postgresql+pg8000",
        username=db_user,  # e.g. "my-database-user"
        password=db_pass,  # e.g. "my-database-password"
        database=db_name,  # e.g. "my-database-name"
        query={
            "unix_sock": "{}/{}/.s.PGSQL.5432".format(
                db_socket_dir,               # e.g. "/cloudsql"
                cloud_sql_connection_name)   # i.e. "<PROJECT-NAME>:<INSTANCE-REGION>:<INSTANCE-NAME>"
        }
    ),
    **db_config
)
Be aware that this example uses pg8000, which takes unix_sock instead of unix_socket as the socket identifier.

Google App Engine and Cloud SQL: Lost connection to MySQL server at 'reading initial communication packet' SQL 2nd Gen

I'm getting an error similar to other posts on this subject.
I tried switching from a 1st gen to a 2nd gen SQL server (both on us-central1), but it still doesn't work.
I copied my CLOUDSQL_PROJECT from the URL at the top of my project.
I copied my CLOUDSQL_INSTANCE from the properties section of the SQL page.
In my main.py, I'm trying to run Google's sample code, and it doesn't work (locally it does, of course):
if os.getenv('SERVER_SOFTWARE', '').startswith('Google App Engine/'):
    db = MySQLdb.connect(
        unix_socket='/cloudsql/{}:{}'.format(
            CLOUDSQL_PROJECT,
            CLOUDSQL_INSTANCE),
        user=user, passwd=password)
# When running locally, you can either connect to a local running
# MySQL instance, or connect to your Cloud SQL instance over TCP.
else:
    db = MySQLdb.connect(host=host, user=user, passwd=password)

cursor = db.cursor()
cursor.execute('SHOW VARIABLES')
for r in cursor.fetchall():
    self.response.write('{}\n'.format(r))
The documentation is slightly outdated. You should always be able to use the "Instance connection name" property from the SQL properties page to construct the unix socket path; just append that value to the "/cloudsql/" prefix.
For second generation instances, the connection name format is project:region:name. In your example, that maps to "hello-world-123:us-central1:sqlsomething3", and the unix socket path is "/cloudsql/hello-world-123:us-central1:sqlsomething3".
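Applied to the sample code from the question, a minimal sketch (using the example instance connection name from above):
# "Instance connection name" from the SQL properties page.
CLOUDSQL_CONNECTION_NAME = 'hello-world-123:us-central1:sqlsomething3'

db = MySQLdb.connect(
    unix_socket='/cloudsql/' + CLOUDSQL_CONNECTION_NAME,
    user=user, passwd=password)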

How to close a connection to MongoDB

I have a class that creates a MongoClient inside:
db = MongoDB('mydb', 'config')
I am successfully able to connect to the 'mydb' database and the 'config' collection, but after querying the collection I no longer need this connection. I then proceed to create a connection to another database and collection:
db = MongoDB('mapping', 'box_details')
In such a case, how can I close the previous connection to the DB? Or does it automatically get closed when the app exits?
I'd recommend opening the connection using pymongo.MongoClient, which returns a mongo_client object. mongo_client has an instance method close, allowing you to close the connection manually.
Please see the documentation for mongo_client.
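A minimal sketch, assuming a MongoDB instance on the default localhost port and the database/collection names from the question:
from pymongo import MongoClient

client = MongoClient('mongodb://localhost:27017/')
config = client['mydb']['config']    # database, then collection
print(config.count_documents({}))    # ... query the collection ...
client.close()                       # close the connection explicitly

# MongoClient is also a context manager, so it can close itself:
with MongoClient('mongodb://localhost:27017/') as client:
    box_details = client['mapping']['box_details']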

SQLAlchemy does not recognise changes to DB made outside of session

Something peculiar I've noticed is that any changes committed to the DB outside of the session (such as ones made in MySQL Workbench) are not recognised in the SQLAlchemy session. I have to close the session and open a new one for SQLAlchemy to see them.
For example, a row I deleted manually is still fetched by SQLAlchemy.
This is how I initialise the session:
engine = create_engine('mysql://{}:{}@{}/{}'.format(username, password, host, schema), pool_recycle=3600)
Session = sessionmaker(bind=engine)
session = Session()
metadata = MetaData()
How can I get SQLAlchemy to recognise them?
My SQLAlchemy version is 0.9.4 and my MySQL version is 5.5.34. We use only SQLAlchemy's Core (no ORM).
To be able to read data committed by other transactions, you'll need to set the transaction isolation level to READ COMMITTED. With SQLAlchemy and MySQL there are two ways to do this.
To set the isolation level using create_engine():
engine = create_engine(
    "mysql://scott:tiger@localhost/test",
    isolation_level="READ COMMITTED")
To set it using per-connection execution options:
connection = engine.connect()
connection = connection.execution_options(
    isolation_level="READ COMMITTED")
Source: the SQLAlchemy documentation on setting transaction isolation levels.
