I am trying to configure the tests. Following the Tortoise ORM documentation, I created this test configuration file:
import pytest
from fastapi.testclient import TestClient
from tortoise.contrib.test import finalizer, initializer
import app.main as main
from app.core.config import settings
@pytest.fixture(scope="session", autouse=True)
def initialize_tests(request):
    db_url = "postgres://USERNAME_HERE:SECRET_PASS_HERE@127.0.0.1:5432/test"
    initializer(
        [
            "app.models",
        ],
        db_url=db_url,
        app_label="models"
    )
    print("initialize_tests")
    request.addfinalizer(finalizer)
@pytest.fixture(scope="session")
def client():
    app = main.create_application()
    with TestClient(app) as client:
        print("client")
        yield client
And the test file looks like this:
def test_get(client):
    response = client.get("/v1/url/")
    assert response.status_code == 200
I try to run the tests, but I get this error:
asyncpg.exceptions._base.InterfaceError: cannot perform operation: another operation is in progress
I have found that some users don't use initializer and finalizer and do everything manually.
Testing in FastAPI using Tortoise-ORM
https://stackoverflow.com/a/66907531
But that doesn't look like a clean solution.
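For reference, the manual approach from those answers looks roughly like this; a sketch assuming Tortoise's Tortoise.init / generate_schemas / close_connections API and an in-memory SQLite test database (swap in your own db_url and model modules):

import asyncio

import pytest
from tortoise import Tortoise

@pytest.fixture(scope="session", autouse=True)
def initialize_tests(request):
    loop = asyncio.new_event_loop()

    async def _init():
        # manual equivalent of initializer(): connect and create the schema
        await Tortoise.init(
            db_url="sqlite://:memory:",  # assumption: any test DB URL works here
            modules={"models": ["app.models"]},
        )
        await Tortoise.generate_schemas()

    loop.run_until_complete(_init())

    def _teardown():
        # manual equivalent of finalizer(): drop the connections
        loop.run_until_complete(Tortoise.close_connections())
        loop.close()

    request.addfinalizer(_teardown)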
Question: Is there a way to make the tests work using initializer and finalizer?
I have a FastAPI application with several tests written in pytest.
Two particular tests are causing me issues: test_a calls a POST endpoint that creates a new entry in the database, and test_b gets these entries. test_b's results include the entry created by test_a, which is not the desired behaviour.
When I run the test individually (using VS Code's testing tab) it runs fine. However when running all the tests together and test_a runs before test_b, test_b fails.
My conftest.py looks like this:
import pytest
from fastapi.testclient import TestClient
from sqlmodel import Session, SQLModel, create_engine
from application.core.config import get_database_uri
from application.core.db import get_db
from application.main import app
#pytest.fixture(scope="module", name="engine")
def fixture_engine():
engine = create_engine(
get_database_uri(uri="postgresql://user:secret#localhost:5432/mydb")
)
SQLModel.metadata.create_all(bind=engine)
yield engine
SQLModel.metadata.drop_all(bind=engine)
#pytest.fixture(scope="function", name="db")
def fixture_db(engine):
connection = engine.connect()
transaction = connection.begin()
session = Session(bind=connection)
yield session
session.close()
transaction.rollback()
connection.close()
#pytest.fixture(scope="function", name="client")
def fixture_client(db):
app.dependency_overrides[get_db] = lambda: db
with TestClient(app) as client:
yield client
The file containing test_a and test_b also has a module-scoped pytest fixture that seeds the data using the engine fixture:
#pytest.fixture(scope="module", autouse=True)
def seed(engine):
connection = test_db_engine.connect()
seed_data_session = Session(bind=connection)
seed_data(seed_data_session)
yield
seed_data_session.rollback()
All tests use the client fixture, like so:
def test_a(client):
    ...
SQLAlchemy version is 1.4.41, FastAPI version is 0.78.0, and pytest version is 7.1.3.
My Observations
It seems the tests run fine on their own because SQLModel.metadata.drop_all(bind=engine) is called at the end of testing. However, I would like to avoid having to do this and instead rely only on rollback between tests.
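For illustration only, a minimal sketch of rollback-only isolation: seed through the same per-test db session that fixture_db rolls back, so the seed rows and the test's writes are undone together (this assumes seed_data accepts a Session and does not call commit itself):

@pytest.fixture(scope="function", autouse=True)
def seed(db):
    seed_data(db)  # seed rows land in the per-test transaction
    db.flush()     # make them visible to the test without committing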
What worked really well for me is using testcontainers: https://github.com/testcontainers/testcontainers-python.
#pytest.fixture(scope="module", name="session_for_db_in_testcontainer")
def db_engine():
"""
Creates testcontainer with Postgres db
"""
pg_container = PostgresContainer('postgres:latest')
pg_container.start()
# Fireup the SQLModel engine with the uri of the container
db_engine = create_engine(pg_container.get_connection_url())
sqlmodel_metadata.create_all(db_engine)
with Session(db_engine) as session_for_db_in_testcontainer:
# add some rows to start, for test get requests and posting existing data
add_data_to_test_db(database_input_path, session_for_db_in_testcontainer)
yield session_for_db_in_testcontainer
# Will be executed after the last test
session_for_db_in_testcontainer.close()
pg_container.stop()
This way, a (Postgres) DB is created during the test run; it lives for a session, a module, or a function, depending on the scope of the fixture. If you want, you can add test data to the db as well, as in the example.
In your case you might want to set the scope of this fixture to function. Then test_a and test_b should run independently.
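To wire the testcontainer session into the app under test, the dependency override from the question's conftest could be reused; a hedged sketch (app, get_db, and TestClient as in the question):

@pytest.fixture(name="client")
def fixture_client(session_for_db_in_testcontainer):
    # route all request-time DB access through the container-backed session
    app.dependency_overrides[get_db] = lambda: session_for_db_in_testcontainer
    with TestClient(app) as client:
        yield client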
I'm trying to build a FastAPI application fully covered with tests, using Python 3.9.
For this purpose I've chosen this stack:
FastAPI, uvicorn, SQLAlchemy, asyncpg, pytest (+ async, cov plugins), coverage and httpx AsyncClient
Here is my minimal requirements.txt
All tests run smoothly and I get the expected results.
But I've faced a problem: coverage isn't collected properly. It breaks after the first await keyword, when the coroutine returns control back to the event loop.
Here is a minimal setup to reproduce this behavior (it's also available on GitHub).
Application code main.py:
import sqlalchemy as sa
from fastapi import FastAPI
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from starlette.requests import Request
app = FastAPI()
DATABASE_URL = 'sqlite+aiosqlite://?cache=shared'
@app.on_event('startup')
async def startup_event():
    engine = create_async_engine(DATABASE_URL, future=True)
    app.state.session = AsyncSession(engine, expire_on_commit=False)
    app.state.engine = engine

@app.on_event('shutdown')
async def shutdown_event():
    await app.state.session.close()
@app.get('/', name="home")
async def get_home(request: Request):
    res = await request.app.state.session.execute(sa.text('SELECT 1'))
    # after this line coverage breaks
    row = res.first()
    assert str(row[0]) == '1'
    return {"message": "OK"}
The test setup conftest.py looks like this:
import asyncio
import pytest
from asgi_lifespan import LifespanManager
from httpx import AsyncClient
@pytest.fixture(scope='session')
async def get_app():
    from main import app
    async with LifespanManager(app):
        yield app

@pytest.fixture(scope='session')
async def get_client(get_app):
    async with AsyncClient(app=get_app, base_url="http://testserver") as client:
        yield client

@pytest.fixture(scope="session")
def event_loop():
    loop = asyncio.new_event_loop()
    yield loop
    loop.close()
The test is as simple as it gets (just check the status code is 200), test_main.py:
import pytest
from starlette import status
@pytest.mark.asyncio
async def test_view_health_check_200_ok(get_client):
    res = await get_client.get('/')
    assert res.status_code == status.HTTP_200_OK
pytest -vv --cov=. --cov-report term-missing --cov-report html
As a result I get this coverage:
Name           Stmts   Miss  Cover   Missing
--------------------------------------------
conftest.py       18      0   100%
main.py           20      3    85%   26-28
test_main.py       6      0   100%
--------------------------------------------
TOTAL             44      3    93%
The example code above uses aiosqlite instead of asyncpg, but the coverage failure reproduces consistently with both.
I've concluded the problem is with SQLAlchemy, because this example with asyncpg but without SQLAlchemy works like a charm.
It's an issue with SQLAlchemy 1.4 in coverage.py: https://github.com/nedbat/coveragepy/issues/1082, https://github.com/nedbat/coveragepy/issues/1012
You can try the --concurrency=greenlet option.
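If you drive coverage.py directly instead of through pytest-cov, the equivalent invocation would be (a sketch; coverage run accepts this flag):
coverage run --concurrency=greenlet -m pytest -vv
coverage report -m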
Here is what helped me: add the greenlet concurrency settings to your coverage config, in setup.cfg (section [coverage:run]) or .coveragerc (section [run]):
[coverage:run]
branch = True
concurrency =
    greenlet
    thread
I am developing a service with FastAPI and Tortoise-ORM.
When I use the interface generated by Swagger UI or curl, I can add and read the data successfully.
However, when I run pytest, tests fail with the following error message: tortoise.exceptions.ConfigurationError: No DB associated to model
Bearing in mind that the error only occurs when pytest is used, I believe the problem is some configuration that is wrong or missing in the test scripts, but I can't find the cause.
Does anyone have any ideas?
My structure is as follows:
src/
+-- api/
|   +-- __init__.py
|   +-- app.py
|   +-- main.py
|   +-- models.py
|   +-- routers.py
|   +-- schemas.py
+-- tests/
    +-- __init__.py
    +-- test_subjects.py
The test_subjects.py file is as follows:
import pytest
from fastapi.testclient import TestClient
from api.main import app

client = TestClient(app)

def test_create_subject():
    response = client.post(
        '/api/subject/',
        json={
            'name': 'Programming',
        },
    )

def test_read_subjects():
    response = client.get("/api/subjects/")
    assert response.status_code == 200
app.py:
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from tortoise.contrib.fastapi import register_tortoise
from tortoise import Tortoise

def get_application():
    _app = FastAPI(title='MyProject')
    _app.add_middleware(
        CORSMiddleware,
        allow_credentials=True,
        allow_methods=['*'],
        allow_headers=['*'],
    )
    return _app

app = get_application()

@app.on_event('startup')
async def startup():
    register_tortoise(
        app,
        db_url='sqlite://db.sqlite3',
        modules={'models': ['api.models']},
        generate_schemas=True,
        add_exception_handlers=True,
    )
main.py:
import uvicorn
from .app import app
from .routers import subjects
from .schemas.subjects import SubjectSchema

app.include_router(subjects.router)

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
In my case it wasn't the register_tortoise function that needed to be in the on_event('startup') handler; rather, another part of my code was trying to use the db before it was initialised. I moved that piece of code (the instantiation of the class that ran the query) inside an on_event('startup') block and everything started working. Basically, if you have any db queries that fire before register_tortoise, they will fail. (That's why it works with Swagger.)
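A minimal sketch of that fix, where SubjectIndex is a hypothetical class that runs a query when instantiated: create such objects only inside the startup hook, after register_tortoise has run:

@app.on_event('startup')
async def startup():
    register_tortoise(
        app,
        db_url='sqlite://db.sqlite3',
        modules={'models': ['api.models']},
        generate_schemas=True,
        add_exception_handlers=True,
    )
    # hypothetical: anything that queries the DB at construction time
    # must be created only after register_tortoise
    app.state.subject_index = SubjectIndex()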
I am trying to add some monitoring to a simple REST web service built with Flask and MongoEngine, and I've run into what I think is a gap in my understanding of how imports and MongoEngine work in Flask applications.
I'm following pymongo's documentation on monitoring : https://pymongo.readthedocs.io/en/3.7.2/api/pymongo/monitoring.html
I defined the following CommandListener in a separate file:
import logging
from pymongo import monitoring
log = logging.getLogger('my_logger')
class CommandLogger(monitoring.CommandListener):
    def started(self, event):
        log.debug("Command {0.command_name} with request id "
                  "{0.request_id} started on server "
                  "{0.connection_id}".format(event))

monitoring.register(CommandLogger())
I made an application_builder.py file to create my flask App, code looks something like this:
from flask_restful import Api
from flask import Flask
from command_logger import CommandLogger # <----
from db import initialize_db
from routes import initialize_routes
def create_app():
    app = Flask(__name__)
    api = Api(app)
    initialize_db(app)
    initialize_routes(api)
    return app
The monitoring only seems to work if I import CommandLogger in application_builder.py. I'd like to understand what is going on here: how does the import affect the monitoring registration?
Also, I'd like to extract monitoring.register(CommandLogger()) as a function and call it at a later stage in my code, something like def register(): monitoring.register(CommandLogger()).
But this doesn't seem to work; registration only works when it is in the same file as the CommandLogger class...
From MongoEngine's docs, it seems important that the listener gets registered before connecting MongoEngine:
"To use pymongo.monitoring with MongoEngine, you need to make sure that you are registering the listeners before establishing the database connection (i.e. calling connect)."
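That also explains the import behaviour above: importing command_logger executes its module body, so monitoring.register(...) runs at import time, before initialize_db opens the connection. If you pull registration out into a function, you just have to call it before connecting; a sketch (the database name 'mydb' is illustrative):

from pymongo import monitoring
from mongoengine import connect

from command_logger import CommandLogger

monitoring.register(CommandLogger())  # register the listener first...
connect('mydb')                       # ...then establish the connection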
This worked for me. I'm just initializing/registering it the same way as I did other modules to avoid circular imports.
# admin/logger.py
import logging
from pymongo import monitoring
log = logging.getLogger()
log.setLevel(logging.DEBUG)
logging.basicConfig(level=logging.DEBUG)
class CommandLogger(monitoring.CommandListener):
    # def methods...

class ServerLogger(monitoring.ServerListener):
    # def methods

class HeartbeatLogger(monitoring.ServerHeartbeatListener):
    # def methods

class TopologyLogger(monitoring.TopologyListener):
    # def methods

def initialize_logger():
    monitoring.register(CommandLogger())
    monitoring.register(ServerLogger())
    monitoring.register(HeartbeatLogger())
    monitoring.register(TopologyLogger())
# /app.py
from flask import Flask
from admin.toolbar import initialize_debugtoolbar
from admin.admin import initialize_admin
from admin.views import initialize_views
from admin.logger import initialize_logger
from database.db import initialize_db
from flask_restful import Api
from resources.errors import errors
app = Flask(__name__)
# imports requiring app
from resources.routes import initialize_routes
api = Api(app, errors=errors)
# Logger before db
initialize_logger()
# Database and Routes
initialize_db(app)
initialize_routes(api)
# Admin and Development
initialize_admin(app)
initialize_views()
initialize_debugtoolbar(app)
# /run.py
from app import app
app.run(debug=True)
Then, in any module...
from admin.logger import log
from db.models import User
# inside some class/view/queryset or however your objects are written...
log.info('Saving an item through MongoEngine...')
User(name='Foo').save()
What I'm trying to figure out now is how to integrate Flask DebugToolbar's logging panel with the monitoring messages from these listeners...
I'm using an application factory pattern, and when I try to run my test, I get "Attempted to generate a URL without the application context being pushed". I created a fixture to create the application:
@pytest.fixture
def app():
    yield create_app()
but when I run my test
def test_get_activation_link(self, app):
    user = User()
    user.set_password(self.VALID_PASS)
    generated_link = user.get_activation_link()
I get the above error (from the line of code url = url_for("auth.activate")). I'm also trying to figure out how to have the app creation run for every test without having to import it into every test, but I can't seem to find out whether that's possible.
This works for my app
import pytest
from xxx import create_app
@pytest.fixture
def client():
    app = create_app()
    app.config['TESTING'] = True
    with app.app_context():
        with app.test_client() as client:
            yield client

def test_smoke_homepage(client):
    """basic tests to make sure test setup works"""
    rv = client.get("/")
    assert b"Login" in rv.data
So, you missed the application context.
At this year's Flaskcon there was an excellent talk about the Flask context - I highly recommend this video.
https://www.youtube.com/watch?v=fq8y-9UHjyk
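Applying the same fix to the original app fixture would look like this (a sketch; create_app as in the question): push an application context around the yield so url_for can resolve routes. Putting the fixture in conftest.py also answers the second part of the question, since pytest discovers conftest.py fixtures automatically and no per-test imports are needed.

import pytest
from xxx import create_app

@pytest.fixture
def app():
    app = create_app()
    with app.app_context():  # makes url_for("auth.activate") resolvable
        yield app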