How does pytest deal with fixtures calling other fixtures? - python

I have two pytest fixtures, client and app. client calls app.
The test function test_register has arguments client and app and hence calls both fixtures.
My question is whether the instance of app used in test_register is always going to be the one that client received, and whether this is how pytest works in general (the assertion in test_register passes, so it holds in this case).
In other words: does pytest create an unrelated instance for each fixture argument of a test function, or does it resolve the fixtures so that the returned instances reference each other?
Here's the code:
@pytest.fixture
def app():
    app = create_app({
        'TESTING': True,
    })
    yield app

@pytest.fixture
def client(app):
    return app.test_client()

def test_register(client, app):
    assert client.application is app

All fixtures have a scope; the default is function, but there are also class, module and session scopes. Within each scope there will only ever be one instance created of a fixture.
So in your example both app and client use the function scope. When executing test_register, pytest enters the function scope of this test, creates the fixture instances and caches them. Hence both test_register and client get the same instance of app.
See the docs for more details on how this all works.
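One way to picture that caching: each scope has a cache, and a fixture is instantiated at most once per cache. Here's a minimal model of that idea (illustrative only; `resolve` and the dict layout are made up, not pytest internals):

```python
# fixture name -> (factory, names of the fixtures it depends on)
class App:
    pass

def make_app():
    return App()

def make_client(app):
    return ("client-of", app)   # stands in for app.test_client()

fixtures = {"app": (make_app, ()), "client": (make_client, ("app",))}

def resolve(name, cache):
    """Resolve a fixture by name, reusing any instance already in `cache`."""
    if name not in cache:
        func, deps = fixtures[name]
        cache[name] = func(*(resolve(d, cache) for d in deps))
    return cache[name]

# one cache per function scope => one instance of each fixture per test
cache = {}
client = resolve("client", cache)
app = resolve("app", cache)
assert client[1] is app   # same App instance, as in test_register
```

Because `client` asks the same cache for `app` that the test function does, both necessarily see the same object.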

Related

Pytest session scoped fixture yields multiple times - help me understand

Below is an example of pytest session- and function-scoped fixtures.
The first fixture sets up a single db connection, whilst the second one deletes the DB state to give a clean slate for each test function.
My understanding of the yield in the session-scoped fixture is wonky. I thought yield gives control over to the test functions once and then continues with the teardown. But if we mix in a function-scoped fixture that depends on the session-scoped fixture, it keeps reusing the same instance of the db, implying yield is yielding control multiple times (yet does not create the db multiple times). This is confusing to me. Can you help me out here with understanding this?
@pytest.fixture(scope="session")
def db():
    """CardsDB object connected to a temporary database"""
    with TemporaryDirectory() as db_dir:
        db_path = Path(db_dir)
        db_ = cards.CardsDB(db_path)
        yield db_
        db_.close()

@pytest.fixture(scope="function")
def cards_db(db):
    """CardsDB object that's empty"""
    db.delete_all()
    return db
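The behaviour in question comes from caching, not from re-yielding: the session-scoped generator is advanced to its yield exactly once, and the yielded value is then handed out on every later request. A rough sketch of that, with hypothetical names (not pytest internals):

```python
log = []

def db():
    log.append("setup")       # runs once per session
    yield "db-instance"
    log.append("teardown")    # runs once, at session end

cache = {}

def get_fixture(gen_func):
    # pytest-like caching: advance the generator to its yield only once
    if gen_func not in cache:
        gen = gen_func()
        cache[gen_func] = (gen, next(gen))
    return cache[gen_func][1]

# two function-scoped requests in the same session see the same object
a = get_fixture(db)
b = get_fixture(db)
assert a is b and log == ["setup"]   # setup ran once; yield did not re-run
```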

Local import as pytest fixture?

I need to import some functions locally within my tests (yes, the code base can be designed better to avoid this necessity, but let's assume we can't do that).
That means the first line of every test within a module looks like this example:
def test_something():
    from worker import process_message
    process_message()
Now I wanted to make this more DRY by creating the following fixture:
@pytest.fixture(scope="module", autouse=True)
def process_message():
    from worker import process_message
    return process_message
But I always get the error
Fixture "process_message" called directly. Fixtures are not meant to
be called directly, but are created automatically when test functions
request them as parameters. See
https://docs.pytest.org/en/stable/explanation/fixtures.html for more
information about fixtures, and
https://docs.pytest.org/en/stable/deprecations.html#calling-fixtures-directly
about how to update your code.
The linked documentation doesn't help me much.
How can I achieve what I want? Obviously I'd like the fixture to return the function handle.
The reason this happens is that you are calling the fixture directly from one of your tests. I assume that your test code with the fixture looks something like this:
import pytest

@pytest.fixture(scope="module", autouse=True)
def process_message():
    from worker import process_message
    return process_message

def test_something():
    process_message()
And then test_something fails with the specified exception.
The way to fix it is to add process_message as an argument to the test function, indicating that you are using it as a fixture:
def test_something(process_message):
    process_message()
By the way, since you have to request the process_message fixture in every test in order to call it, there is no point in using autouse=True and it can be removed.

What are the functional differences in these 3 pytest fixtures?

I'm relatively new to pytest-style unit testing, and I'm trying to learn more about pytest fixtures. I'm not passing a scope argument to the fixture, so I know that the scope is "function". Are there any functional differences in these 3 styles of simple fixtures? Why would one approach be favored over the others?
@pytest.fixture()
@patch('a.b.c.structlog.get_logger')
def fixture_classQ(mock_log):
    handler = MagicMock(spec=WidgetHandler)
    return ClassQ(handler)

@pytest.fixture()
def fixture_classQ():
    with patch('a.b.c.structlog.get_logger'):
        handler = MagicMock(spec=WidgetHandler)
        return ClassQ(handler)

@pytest.yield_fixture()
def fixture_classQ():
    with patch('a.b.c.structlog.get_logger'):
        handler = MagicMock(spec=WidgetHandler)
        yield ClassQ(handler)
Simple example usage of the fixture:
def test_classQ_str(fixture_classQ):
    assert str(fixture_classQ) == "This is ClassQ"
Thanks.
fixture 1
Starting with the first one, this creates a plain-data fixture. The mock is (imo misleadingly) only alive for the duration of the fixture function because it uses return.
Roughly, in order, here's what happens for that fixture:
pytest notices your fixture is used for the test function
it calls the fixture function
the mock decorator starts the patch
the mock decorator calls your actual function (which returns a value)
the mock decorator undoes the patch
pytest notices it wasn't a generator and so that's the value of your fixture
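The short lifetime of the patch in that sequence is easy to demonstrate outside pytest, using math.sqrt as a throwaway patch target (chosen here just for illustration, not part of the original code):

```python
from unittest import mock
import math

@mock.patch("math.sqrt")
def make_value(mock_sqrt):
    # the patch is active only while this function body runs...
    assert math.sqrt is mock_sqrt
    return "fixture-value"

value = make_value()
# ...and is undone as soon as it returns, i.e. before any test would run
assert math.sqrt(9) == 3.0
```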
fixture 2
The second is identical in behaviour to the first, except it uses the context-manager form of mock instead of the decorator. Personally I don't like the decorator form, but that's just me :D
fixture 3
(First, before I continue: pytest.yield_fixture is a deprecated alias for pytest.fixture; you can just use @pytest.fixture.)
The third does something different! The patch is alive for the duration of the test because the fixture has "yielded" in the middle of it. This is a way to create a setup + teardown fixture all in one. Here's roughly the execution:
pytest notices your fixture is used for the test function
pytest calls the fixture function
since it is a generator, it returns immediately without executing code
pytest notices it is a generator, calls next(...) on it
this causes the code to execute until the yield and then "pausing". you can think of it kind of as a co-routine
the __enter__ of the mock is called making the patch active
the value that is yielded is used as the value of the fixture
pytest then executes your test function
pytest then calls next(...) again on the generator to exhaust the fixture
this __exit__s the with statement, undoing the patch
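The whole sequence above can be emulated by driving a generator by hand, with math.sqrt again as a stand-in patch target (an assumption for illustration):

```python
from unittest import mock
import math

def fixture_gen():
    with mock.patch("math.sqrt") as m:
        m.return_value = 42
        yield "the-fixture-value"   # paused here while the test runs

gen = fixture_gen()           # creating the generator executes no code yet
value = next(gen)             # runs up to the yield: the patch is now active
assert math.sqrt(9) == 42     # the "test body" sees the patch
try:
    next(gen)                 # exhausting the generator exits the with-block
except StopIteration:
    pass
assert math.sqrt(9) == 3.0    # patch undone during teardown
```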
which to choose?
The best answer is: it depends. Since 1 and 2 are functionally equivalent, it's up to personal preference. Pick 3 if you need the patch to be active for the entire duration of your test. And don't use pytest.yield_fixture; just use pytest.fixture.

Autodiscover python decorators

I was wondering if there is a standardized approach or best practice to scan/autodiscover decorators like it is done here, but also in several other libs like Django and Flask. Usually a decorator provides extra/wrapped functionality right at the time the inner func is called.
In the example shown below, but also in Flask/Django (route decorators), the decorator is instead used to add overarching functionality, e.g. initially spawning a tcp client within the decorator logic and then calling the inner func only when a message is received, to process it.
Flask/Django register a url route where the inner func is only called later, when the url is requested. All examples require an initial registration (scan/discovery) of the decorator logic to also initially start the overarching functionality. To me this seems to be an alternative use of decorators, and I would like to understand the best-practice approach, if there is any.
See the Faust example below, where the decorator app.agent() automatically starts a listening (kafka stream) client within the asyncio event loop, and an incoming message is then processed by the inner function hello() later, only when a message is received; this seems to require an initial check/scan/discovery of the related decorator logic at the start of the script.
import faust

class Greeting(faust.Record):
    from_name: str
    to_name: str

app = faust.App('hello-app', broker='kafka://localhost')
topic = app.topic('hello-topic', value_type=Greeting)

@app.agent(topic)
async def hello(greetings):
    async for greeting in greetings:
        print(f'Hello from {greeting.from_name} to {greeting.to_name}')

@app.timer(interval=1.0)
async def example_sender(app):
    await hello.send(
        value=Greeting(from_name='Faust', to_name='you'),
    )

if __name__ == '__main__':
    app.main()
Nothing is "discovered". When you import a module from a package, all of that module-level code is executed. This is why we have if __name__ == '__main__' to stop certain code from being executed on import. The decorators will be "discovered" when you run your code.
I think the Flask blueprint is a nice example. Here you can see how it registers the url endpoints when you import modules. All it's doing is appending to a list:
def route(self, rule, **options):
    """Like :meth:`Flask.route` but for a blueprint. The endpoint for the
    :func:`url_for` function is prefixed with the name of the blueprint.
    """
    def decorator(f):
        endpoint = options.pop("endpoint", f.__name__)
        self.add_url_rule(rule, endpoint, f, **options)
        return f
    return decorator
The code runs, the decorators are evaluated and they need only keep some internal list of all the functions they decorate. These are stored in the Blueprint object.
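The same pattern in miniature, using a hypothetical Registry class (not Flask's actual code), shows that registration is just list-appending at import time:

```python
class Registry:
    def __init__(self):
        self.routes = []

    def route(self, rule):
        def decorator(f):
            self.routes.append((rule, f))  # registration at decoration time
            return f                       # the function itself is unchanged
        return decorator

app = Registry()

@app.route("/hello")
def hello():
    return "hi"

# merely executing/importing the module populated the registry
assert app.routes == [("/hello", hello)]
assert hello() == "hi"   # the decorator did not wrap the function
```

A framework then only needs to walk `app.routes` at startup to wire up whatever overarching machinery (servers, consumers, timers) the decorated functions should serve.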

pytest fixture with scope session running for every test

Correct me if I'm wrong, but if a fixture is defined with scope="session", shouldn't it be run only once per the whole pytest run?
For example:
import pytest
@pytest.fixture
def foo(scope="session"):
    print('foooooo')

def test_foo(foo):
    assert False

def test_bar(foo):
    assert False
I have some tests that rely on data retrieved from some APIs, and instead of querying the API in each test, I rather have a fixture that gets all the data at once, and then each test uses the data it needs. However, I was noticing that for every test, a request was made to the API.
That's because you're declaring the fixture wrong. scope should go into the pytest.fixture decorator parameters:
@pytest.fixture(scope="session")
def foo():
    print('foooooo')
In your code, the scope is left at its default value, function; that's why the fixture is being run for each test.
