Consider the following fixture:
@pytest.fixture(params=['current', 'legacy'])
def baseline(request):
    return request.param
I wonder if there is a way to launch pytest so it overrides the fixture parameter list with the value(s) given on the command line, i.e.:
pytest --baseline legacy tests/
The above should effectively result in params=['legacy'].
Go with dynamic parametrization via Metafunc.parametrize:
# conftest.py
import pytest

@pytest.fixture
def baseline(request):
    return request.param

def pytest_addoption(parser):
    parser.addoption('--baseline', action='append', default=[],
                     help='baseline (one or more possible)')

def pytest_generate_tests(metafunc):
    default_opts = ['current', 'legacy']
    baseline_opts = metafunc.config.getoption('baseline') or default_opts
    if 'baseline' in metafunc.fixturenames:
        metafunc.parametrize('baseline', baseline_opts, indirect=True)
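The test module used in the runs below is not shown in the answer; a minimal sketch of what it could look like (only the test name test_eggs is taken from the output):
# test_spam.py (hypothetical)
def test_eggs(baseline):
    # baseline receives whichever value pytest_generate_tests injected
    assert isinstance(baseline, str)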
Usage without parameters yields two default tests:
$ pytest test_spam.py -sv
...
test_spam.py::test_eggs[current] PASSED
test_spam.py::test_eggs[legacy] PASSED
Passing --baseline overwrites the defaults:
$ pytest test_spam.py -sv --baseline=foo --baseline=bar --baseline=baz
...
test_spam.py::test_eggs[foo] PASSED
test_spam.py::test_eggs[bar] PASSED
test_spam.py::test_eggs[baz] PASSED
You can also implement "always-in-use" defaults, so additional params are always added to them:
def pytest_addoption(parser):
    parser.addoption('--baseline', action='append', default=['current', 'legacy'],
                     help='baseline (one or more possible)')

def pytest_generate_tests(metafunc):
    baseline_opts = metafunc.config.getoption('baseline')
    if 'baseline' in metafunc.fixturenames and baseline_opts:
        metafunc.parametrize('baseline', baseline_opts, indirect=True)
Now the test invocation will always include current and legacy params:
$ pytest test_spam.py -sv --baseline=foo --baseline=bar --baseline=baz
...
test_spam.py::test_eggs[current] PASSED
test_spam.py::test_eggs[legacy] PASSED
test_spam.py::test_eggs[foo] PASSED
test_spam.py::test_eggs[bar] PASSED
test_spam.py::test_eggs[baz] PASSED
I have pytest tests whose results may depend on an environment variable. I want to test them with multiple values of this environment variable.
I want to have only one fixture that sets this environment variable, but I want to be able to configure those values per test, not per fixture.
How can I do it?
It can be achieved by using fixtures with indirect parametrization:
conftest.py
import pytest, os

@pytest.fixture(scope="function")
def my_variable(request, monkeypatch):
    """Set MY_VARIABLE environment variable, this fixture must be used with `parametrize`"""
    monkeypatch.setenv("MY_VARIABLE", request.param)
    yield request.param
test_something.py
import pytest, os

@pytest.mark.parametrize("my_variable", ["value1", "value2", "abc"], indirect=True)
class TestSomethingClassTests:
    """a few tests with the same `parametrize` values"""

    def test_aaa_1(self, my_variable):
        """test 1"""
        assert os.environ["MY_VARIABLE"] == my_variable

    def test_aaa_2(self, my_variable):
        """test 2"""
        assert True

@pytest.mark.parametrize("my_variable", ["value2", "value5", "qwerty"], indirect=True)
def test_bbb(my_variable):
    """test bbb"""
    assert os.environ["MY_VARIABLE"] == my_variable
How it looks in VSCode: (screenshot omitted)
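As an illustration (not actual captured output), collecting these tests should produce parametrized node IDs along these lines:
$ pytest test_something.py --collect-only -q
test_something.py::TestSomethingClassTests::test_aaa_1[value1]
test_something.py::TestSomethingClassTests::test_aaa_1[value2]
test_something.py::TestSomethingClassTests::test_aaa_1[abc]
test_something.py::TestSomethingClassTests::test_aaa_2[value1]
test_something.py::TestSomethingClassTests::test_aaa_2[value2]
test_something.py::TestSomethingClassTests::test_aaa_2[abc]
test_something.py::test_bbb[value2]
test_something.py::test_bbb[value5]
test_something.py::test_bbb[qwerty]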
Try this in conftest.py:
import pytest

def pytest_addoption(parser):
    parser.addoption("--env", action="store", default="sit")

@pytest.fixture(scope="session")
def env(request):
    return request.config.getoption("--env")
Run the tests with --env=xxx as the command line argument:
python -m pytest foo_test.py --env=sit
Then use the env fixture in the test:
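For illustration, a test consuming the fixture might look like this (the file and test names are assumptions, matching the invocation above):
# foo_test.py (illustrative)
def test_env_option(env):
    # env holds whatever was passed via --env; "sit" is the default
    assert env == "sit"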
Use case: In a pytest test suite I have a fixture which raises exceptions if command line options for its configuration are missing. I've written a test for this fixture using xfail:
import pytest
from <module> import <exception>

@pytest.mark.xfail(raises=<exception>)
def test_fixture_with_missing_options_raises_exception(rc_visard):
    pass
However, the output after running the tests does not report the test as passed, but as "xfailed" instead:
============================== 1 xfailed in 0.15 seconds ========================
In addition to that, I am not able to test whether the fixture raises the exception for specific missing command line options.
Is there a better approach to this? Can I mock the pytest command line options somehow, so that I do not need to invoke specific tests via pytest --<commandline-option-a> <test-file-name>::<test-name>?
initial setup
Suppose you have a simplified project with conftest.py containing the following code:
import pytest

def pytest_addoption(parser):
    parser.addoption('--foo', action='store', dest='foo', default='bar',
                     help='--foo should be always bar!')

@pytest.fixture
def foo(request):
    fooval = request.config.getoption('foo')
    if fooval != 'bar':
        raise ValueError('expected foo to be "bar"; "{}" provided'.format(fooval))
    return fooval
It adds a new command line arg --foo and a fixture foo returning the passed arg, or bar if not specified. If anything other than bar is passed via --foo, the fixture raises a ValueError.
You use the fixture as usual, for example
def test_something(foo):
    assert foo == 'bar'
Now let's test that fixture.
preparations
In this example, we need to do some simple refactoring first. Move the fixture and related code to a file named something other than conftest.py, for example my_plugin.py:
# my_plugin.py
import pytest

def pytest_addoption(parser):
    parser.addoption('--foo', action='store', dest='foo', default='bar',
                     help='--foo should be always bar!')

@pytest.fixture
def foo(request):
    fooval = request.config.getoption('foo')
    if fooval != 'bar':
        raise ValueError('expected foo to be "bar"; "{}" provided'.format(fooval))
    return fooval
In conftest.py, ensure the new plugin is loaded:
# conftest.py
pytest_plugins = ['my_plugin']
Run the existing test suite to ensure we didn't break anything; all tests should still pass.
activate pytester
pytest provides an extra plugin for writing plugin tests, called pytester. It is not activated by default, so you should do that manually. In conftest.py, extend the plugins list with pytester:
# conftest.py
pytest_plugins = ['my_plugin', 'pytester']
writing the tests
Once pytester is active, you get a new fixture available called testdir. It can generate and run pytest test suites from code. Here's what our first test will look like:
# test_foo_fixture.py
def test_all_ok(testdir):
    testdata = '''
    def test_sample(foo):
        assert True
    '''
    testconftest = '''
    pytest_plugins = ['my_plugin']
    '''
    testdir.makeconftest(testconftest)
    testdir.makepyfile(testdata)
    result = testdir.runpytest()
    result.assert_outcomes(passed=1)
It should be pretty obvious what happens here: we provide the test code as a string, and testdir generates a pytest project from it in a temporary directory. To ensure our foo fixture is available in the generated test project, we pass it in the generated conftest the same way as we do in the real one. testdir.runpytest() starts the test run, producing a result that we can inspect.
Let's add another test that checks whether foo will raise a ValueError:
def test_foo_valueerror_raised(testdir):
    testdata = '''
    def test_sample(foo):
        assert True
    '''
    testconftest = '''
    pytest_plugins = ['my_plugin']
    '''
    testdir.makeconftest(testconftest)
    testdir.makepyfile(testdata)
    result = testdir.runpytest('--foo', 'baz')
    result.assert_outcomes(error=1)
    result.stdout.fnmatch_lines([
        '*ValueError: expected foo to be "bar"; "baz" provided',
    ])
Here we execute the generated tests with --foo baz and afterwards verify that one test ended with an error and that the error output contains the expected error message.
Is there a way to save the value of a parameter provided by a pytest fixture?
Here is an example conftest.py:
# content of conftest.py
import pytest

def pytest_addoption(parser):
    parser.addoption("--parameter", action="store", default="default",
                     help="configuration file path")

@pytest.fixture
def param(request):
    parameter = request.config.getoption("--parameter")
    return parameter
Here is an example pytest module:
# content of my_test.py
def test_parameters(param):
    assert param == "yes"
OK - everything works fine, but is there a way to get the value of param outside the test - for example with some built-in pytest function like pytest.get_fixture_value["parameter"]?
EDITED - DETAILED EXPLANATION OF WHAT I WANT TO ACHIEVE
I am writing a module that deploys and then provides parameters to tests written in pytest. My idea is that if someone's test looks like this:
class TestApproachI:
    @load_params_as_kwargs(parameters_A)
    def setup_class(cls, param_1, param_2, ..., param_n):
        # code of setup_class

    def teardown_class(cls):
        # some code

    def test_01(self):
        # test code
And this someone gives me a configuration file that explains what parameters to run his code with. I will analyze those parameters (in some other script) and run his tests with the command pytest --parameters=path_to_serialized_python_tuple test_to_run, where this tuple contains the provided values for his parameters in the right order. I will tell that guy (with the tests) to add this decorator to all the tests he wants me to provide parameters for. This decorator would look like this:
class TestApproachI:
    # this path_to_serialized_tuple should be provided by 'pytest --parameters=path_to_serialized_python_tuple test_to_run'
    @load_params(path_to_serialized_tuple)
    def setup_class(cls, param_1, param_2, ..., param_n):
        # code of setup_class

    def teardown_class(cls):
        # some code

    def test_01(self):
        # test code
The decorator function would look like this:
from functools import wraps

def load_params(parameters):
    def decorator(func_to_decorate):
        @wraps(func_to_decorate)
        def wrapper(self):
            # deserialize the tuple and replace the test parameters with its values
            return func_to_decorate(self, *parameters)
        return wrapper
    return decorator
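A minimal sketch of the deserialization step, assuming the tuple was serialized with pickle (the file format and helper name are assumptions, not something the question specifies):
import pickle

def deserialize_parameters(path_to_serialized_tuple):
    # assumption: the file at this path contains a pickled tuple of parameter values
    with open(path_to_serialized_tuple, 'rb') as fh:
        return pickle.load(fh)
load_params could call such a helper before building the wrapper.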
Set that parameter as an OS environment variable, and then use it anywhere in your test through os.getenv('parameter').
So you can use it like this:
import os
import pytest

@pytest.fixture
def param(request):
    parameter = request.config.getoption("--parameter")
    os.environ["parameter"] = parameter
    return parameter

@pytest.mark.usefixtures('param')
def test_parameters(param):
    assert os.getenv('parameter') == "yes"
I am using pytest-lazy-fixture to get the value of any fixture:
first install it using pip install pytest-lazy-fixture or pipenv install pytest-lazy-fixture
then simply assign the fixture to a variable like this if you want:
fixture_value = pytest.lazy_fixture('fixture')
the fixture name has to be wrapped in quotation marks
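Note that pytest.lazy_fixture is normally resolved inside parametrization; a small sketch, assuming a fixture literally named fixture exists:
import pytest

@pytest.fixture
def fixture():
    return "some value"

# lazy_fixture defers resolution until the test runs,
# so the fixture's value becomes the parametrized argument
@pytest.mark.parametrize('value', [pytest.lazy_fixture('fixture')])
def test_lazy(value):
    assert value == "some value"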
You can use pytest's config.cache, like this:
def function_1(request):
    request.config.cache.set("user_data", "name")
    ...

def function_2(request):
    request.config.cache.get("user_data", None)
    ...
Here is more info about it
https://docs.pytest.org/en/latest/reference/reference.html#std-fixture-cache
https://docs.pytest.org/en/6.2.x/cache.html
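Applied to the original question, a sketch could cache the command line value inside the fixture and read it back elsewhere in the same run (the cache key name here is an arbitrary choice):
import pytest

@pytest.fixture
def param(request):
    parameter = request.config.getoption("--parameter")
    # store the value so other fixtures/tests can read it later in this run
    request.config.cache.set("my_suite/parameter", parameter)
    return parameter

def test_parameters(param, request):
    assert request.config.cache.get("my_suite/parameter", None) == param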
Admittedly this is not the best way to do it to start with, and more importantly, the fixture parameters are resolved (i.e. Options.get_option() is called) before everything else.
Recommendations and suggestions would be appreciated.
From config.py
class Options(object):
    option = None

    @classmethod
    def get_option(cls):
        return cls.option
From conftest.py
@pytest.yield_fixture(scope='session', autouse=True)
def session_setup():
    Options.option = pytest.config.getoption('--remote')
    yield

def pytest_addoption(parser):
    parser.addoption("--remote", action="store_true", default=False, help="Runs tests on a remote service.")

@pytest.yield_fixture(scope='function', params=Options.get_option())
def setup(request):
    if request.param is None:
        raise Exception("option is none")
Don't use a custom Options class; ask for the option directly from the config.
pytest_generate_tests may be used to parametrize a fixture-like argument for tests.
conftest.py
import pytest

def pytest_addoption(parser):
    parser.addoption("--pg_tag", action="append", default=[],
                     help=("Postgres server versions. "
                           "May be used several times. "
                           "Available values: 9.3, 9.4, 9.5, all"))

def pytest_generate_tests(metafunc):
    if 'pg_tag' in metafunc.fixturenames:
        tags = set(metafunc.config.option.pg_tag)
        if not tags:
            tags = ['9.5']
        elif 'all' in tags:
            tags = ['9.3', '9.4', '9.5']
        else:
            tags = list(tags)
        metafunc.parametrize("pg_tag", tags, scope='session')

@pytest.yield_fixture(scope='session')
def pg_server(pg_tag):
    # pg_tag is the parametrized parameter;
    # the fixture is called 1-3 times depending on the --pg_tag cmdline
    ...
Edit: Replaced old example with metafunc.parametrize usage.
There is an example in the latest docs on how to do this. It's a little buried and honestly I glazed over it the first time reading through the documentation: https://docs.pytest.org/en/latest/parametrize.html#basic-pytest-generate-tests-example.
Basic pytest_generate_tests example
Sometimes you may want to implement your own parametrization scheme or
implement some dynamism for determining the parameters or scope of a
fixture. For this, you can use the pytest_generate_tests hook which is
called when collecting a test function. Through the passed in metafunc
object you can inspect the requesting test context and, most
importantly, you can call metafunc.parametrize() to cause
parametrization.
For example, let’s say we want to run a test taking string inputs
which we want to set via a new pytest command line option. Let’s first
write a simple test accepting a stringinput fixture function argument:
# content of test_strings.py
def test_valid_string(stringinput):
    assert stringinput.isalpha()
Now we add a conftest.py file containing the addition of a command
line option and the parametrization of our test function:
# content of conftest.py
def pytest_addoption(parser):
    parser.addoption("--stringinput", action="append", default=[],
                     help="list of stringinputs to pass to test functions")

def pytest_generate_tests(metafunc):
    if 'stringinput' in metafunc.fixturenames:
        metafunc.parametrize("stringinput",
                             metafunc.config.getoption('stringinput'))
If we now pass two stringinput values, our test will run twice:
$ pytest -q --stringinput="hello" --stringinput="world" test_strings.py
..
2 passed in 0.12 seconds
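One caveat worth adding (this follows from metafunc.parametrize receiving an empty list, not from the quoted output above): if you run without any --stringinput, the parameter set is empty and pytest by default reports the test as skipped rather than failed, e.g.:
$ pytest -q -rs test_strings.py
# expect something like: 1 skipped (empty parameter set for "stringinput")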
I want to implement the following using external data (arguments) via pytest_generate_tests. This example works:
@pytest.mark.parametrize('case', [1, 2, 3, 4])
def test_regression(case):
    print(case)
    assert True
Imagine I retrieve test data via an argv option. So I've created conftest.py, added the option --data, added the data fixture, and added the pytest_generate_tests hook. Please note that if I do not declare the data fixture, this does not work (although in the linked example there is no fixture declaration): http://pytest.org/latest/example/parametrize.html#generating-parameters-combinations-depending-on-command-line
import pytest

def pytest_addoption(parser):
    parser.addoption('--data', action='store', default='', help='Specify testing data')

@pytest.fixture
def data(request):
    return request.config.getoption('--data')

def pytest_generate_tests(metafunc):
    if 'data' in metafunc.funcargnames:
        # imagine data.cases = [1, 2, 3, 4, 5]
        metafunc.parametrize('case', [1, 2, 3, 4, 5])
For example, I have an argument data that itself contains some test data and some test cases. So I define conftest.py the following way:
# conftest.py
import pytest

def pytest_addoption(parser):
    parser.addoption('--data', action='store', default='', help='Specify testing data')

@pytest.fixture
def data(request):
    return request.config.getoption('--data')

def pytest_generate_tests(metafunc):
    if 'data' in metafunc.fixturenames:
        # let's imagine data.cases = [1, 2, 3, 4, 5]
        metafunc.parametrize('case', [1, 2, 3, 4, 5])

# test.py (just removed the @pytest.mark.parametrize line)
def test_regression(case):
    print(case)
    assert True
The example above gives an error: fixture 'case' not found. But if I substitute case with data, it works:
# conftest.py
import pytest

def pytest_addoption(parser):
    parser.addoption('--data', action='store', default='', help='Specify testing data')

@pytest.fixture
def data(request):
    return request.config.getoption('--data')

def pytest_generate_tests(metafunc):
    if 'data' in metafunc.fixturenames:
        # let's imagine data.cases = [1, 2, 3, 4, 5]
        metafunc.parametrize('data', [1, 2, 3, 4, 5])

# test.py (just removed the @pytest.mark.parametrize line)
def test_regression(data):
    print(data)
    assert True
But I need the test parameter to be named case. What am I doing wrong?
I faced nearly the same problem today:
I cannot give you the actual root cause, but the problem seems to be that the function parameter passed to the test function is expected to be a fixture. So if you use data, it works because you are using a fixture.
If you use case, no fixture is found for case.
I solved this by doing the following:
def pytest_generate_tests(metafunc):
    if 'func' in metafunc.fixturenames:
        # do some stuff
        metafunc.parametrize('func', all_combinations)

def test_function_prototypes(func):
    assert func
This will throw the fixture 'func' not found error. I solved it by adding the following lines:
@pytest.fixture
def func(request):
    return request.param  # pass the param to the test function
I didn't find anything in the docs concerning the need to supply this function.
I also observed that commenting the fixture out again still lets the code work. I guess it's related to caching...