General issue:
I have an abstract model that I want to test with a concrete model instance, but I don't want to completely restructure my project/test layout. See these two answers for reference: First answer Second answer
I want to
A) Define models inside each test app folder and not define them inside the actual apps
B) Not have an additional/separate apps.py and settings.py configuration for each test folder.
I know this is possible because the Django project itself has a very similar test structure, but it uses a test-runner script that I can't entirely decipher.
The tests/ folder mirrors the app structure I have (blog/, projects/, richeditable/).
backend
├── projects
├── blog
├── richeditable
└── tests
    ├── __init__.py
    ├── conftest.py
    ├── blog
    │   └── ...
    ├── projects
    │   └── ...
    └── richeditable
        ├── __init__.py
        ├── test_model.py
        └── models.py   <-- defines a model that inherits from richeditable.models
# richeditable/models.py
from django.db import models

class Draftable(models.Model):
    blah = models.IntegerField()

    class Meta:
        abstract = True
# tests/richeditable/models.py
from django.db import models

from richeditable.models import Draftable

class DummyDraftable(Draftable):
    additional_field = models.BooleanField()
    # class Meta:
    #     app_label = "some_app"
    # ^--- Django does not seem to like having this blank and I don't know what to put to make this work
# tests/richeditable/test_model.py
import pytest

from .models import DummyDraftable

@pytest.fixture
def add_app(settings):
    settings.INSTALLED_APPS += ['backend.tests.richeditable']
    # presumably the above fixture would affect the apps/models
    # settings before the database is created, but that does not seem to be the case

def test_please_work(add_app, db):
    assert DummyDraftable.objects.create(blah=1, additional_field=True) is not None
My best guess, from what I've seen of the Django project's script, is that it loads each test folder as a module and adds it to INSTALLED_APPS at run time, before the test cases execute. However, you can't simply change INSTALLED_APPS, because models are being added and migrations have to be applied to the test database beforehand; there also seems to be a need to define an AppConfig (because Django loses its mind if you don't). I've tried including the app_label Meta field on the models, but it didn't work, or I may have done something wrong. The point is, I don't see the script creating an AppConfig, and somehow they don't have to declare Meta in their models.py.
Pytest-specific stuff:
Things get further complicated with pytest-django, because it doesn't use Django's TestRunner interface. This is how you would do it if that were the case (note the order of operations). I have already tried modifying the settings pytest fixture before instantiating the db fixture and its associated fixtures, but this never ends up loading the module, no matter what I do. From looking at the source code, it seems the settings are fixed in place based on what settings.py specifies, and modifying the settings fixture makes no difference to app loading.
So I ended up solving my own issue. As @hoefling mentioned, the best way is to create a separate settings file that extends your normal settings file, and to specify that as your pytest settings file.
# pytest.ini
[pytest]
addopts = --ds=config.settings.test
An important thing to note, as I found out, is that your test modules cannot share a name with an already existing app. So the structure I had, backend/tests/richeditable, is not allowed no matter what you do. I prepended each folder with test_ and it works fine. This also removes the need to set app_label on your models.
# testsettings.py
from .settings import *  # noqa

INSTALLED_APPS += [
    'backend.tests.test_projects',
    'backend.tests.test_blog',
    'backend.tests.test_richeditable',
]
backend
├── projects
├── blog
├── richeditable
└── tests
    ├── __init__.py
    ├── conftest.py
    ├── test_blog
    │   └── ...
    ├── test_projects
    │   └── ...
    └── test_richeditable
        ├── __init__.py
        ├── test_model.py
        └── models.py
I have an API built with FastAPI. In a single file (main.py), I have the call that creates the application:
from fastapi import FastAPI
# ...
app = FastAPI()
As well as all the endpoints:
@app.post("/sum")
async def sum_two_numbers(number1: int, number2: int):
    return {'result': number1 + number2}
But as the application gets larger, the file is becoming messy and hard to maintain. The obvious solution would be to keep function definitions in separate modules and just import them and use them in main.py, like this:
from mymodules.operations import sum_two_numbers
# ...
@app.post("/sum")
sum_two_numbers(number1: int, number2: int)
Only that doesn't work. I don't know if I'm doing it wrong or it can't be done, but I get this error from VSCode:
Expected function or class declaration after decorator | Pylance
(My program has so many errors that I haven't seen the actual interpreter complaint, but if that's important, I can try to debug it and post it here.)
So is this impossible to do and I have to stick to the one-file API, or it is possible to do, but I'm doing it wrong? If the second, what is the right way?
The common solution is to split your application into subrouters. Instead of registering your views directly on app, you create an instance of APIRouter (from fastapi import APIRouter) inside each of your modules, then you register these subrouters on your main application.
Inside a dedicated api module, such as api/pages.py:
from fastapi import APIRouter

router = APIRouter()

@router.get('')
async def get_pages():
    return ...
Then, in your main module, import the submodules and include their routers in the application:
from .api import (
    pages,
    posts,
    users,
)

app.include_router(pages.router, prefix='/pages')
app.include_router(posts.router, prefix='/posts')
app.include_router(users.router, prefix='/users')
Another powerful construct you can use is to have two dedicated base routers, one that requires authentication and one that doesn't:
from fastapi import APIRouter, Depends

unauthenticated_router = APIRouter()
authenticated_router = APIRouter(dependencies=[Depends(get_authenticated_user)])
.. and you can then register the different routes under each router, depending on whether you want to guard the route with an authenticated user or not. You'd have two subrouters inside each module, one for endpoints that require authentication and one for those that don't, and name them appropriately (if you have no public endpoints, just use authenticated_router as the single name).
unauthenticated_router.include_router(authentication.router, prefix='/authenticate')
unauthenticated_router.include_router(users.unauthenticated_router, prefix='/users', tags=['users'])
authenticated_router.include_router(users.router, prefix='/users')
Any subrouter registered under authenticated_router will have the get_authenticated_user dependency evaluated first, which in this case would raise a 401 error if the user isn't logged in. You can then authorize further, based on roles etc., in the dependencies for the view function. But this makes it very explicit whether you want your endpoint to end up in a chain that requires authentication or not.
So is this impossible to do and I have to stick to the one-file API, or it is possible to do, but I'm doing it wrong?
A one-file API is just for demo/test purposes; in the real world you almost always use a multi-file API, especially with a framework like FastAPI, where you work with different kinds of files such as Pydantic models, DB models, etc.
If the second, what is the right way?
There is no "right way"; there are only ways that fit your needs.
You can follow the advanced user guide to see a good example.
What the docs suggest:
.
├── app
│   ├── __init__.py
│   ├── main.py
│   ├── dependencies.py
│   ├── routers
│   │   ├── __init__.py
│   │   ├── items.py
│   │   └── users.py
│   └── internal
│       ├── __init__.py
│       └── admin.py
What I use when I have DB models, Pydantic models, etc.:
.
├── app
│   ├── __init__.py
│   ├── main.py
│   ├── dependencies.py
│   ├── routers
│   │   ├── __init__.py
│   │   ├── items.py
│   │   └── users.py
│   ├── models
│   │   ├── __init__.py
│   │   ├── items.py
│   │   └── users.py
│   ├── schemas
│   │   ├── __init__.py
│   │   ├── items.py
│   │   └── users.py
│   └── internal
│       ├── __init__.py
│       └── admin.py
Here, models holds the DB models and schemas holds the Pydantic models.
I am building a Python package which has a subpackage called "config", containing several files that define global variables used as configuration by other modules.
.
└── mypackage
├── base.py
├── config
│ ├── completion.py
│ ├── __init__.py
│ ├── locals.py
│ └── queries.py
├── encoders.py
├── exceptions.py
├── functions
│ ├── actions.py
│ ├── dummy.py
│ ├── __init__.py
│ ├── parsers.py
│ └── seekers.py
├── __init__.py
├── query.py
└── utils
├── dict.py
├── __init__.py
└── retry.py
For example, the file mypackage/config/queries.py has the following content:
INCLUDE_PARENTS = False
Whereas in the main file mypackage/base.py, I have a function which takes this config variable as a default argument:
import mypackage.config.queries as conf

def query(include_parent_=conf.INCLUDE_PARENTS, **kwargs):
    # do stuff depending on the include_parent_ argument
What I want, and what I haven't been able to find in other similar questions, is to be able to dynamically modify these variables in a Python/IPython console session. That is, I should be able to do the following in IPython:
In [1]: import mypackage as mp
In [2]: mp.config.INCLUDE_PARENTS = True # Its default value was False
In [3]: mp.query()
Out[3]: # result with include_parent_ argument set to True
In [4]: mp.config.INCLUDE_PARENTS = False # Now I set the value back to False
In [5]: mp.query()
Out[5]: # result with include_parent_ argument set to False
But I don't understand why I am not able to achieve it. I have tried importing the configuration variables in __init__.py with their associated namespace, but I never manage to change the global configuration variables dynamically, the way pandas does, for example.
The issue is that you are using conf.INCLUDE_PARENTS as a default parameter of a function. A default parameter is evaluated when the function is created, not when it is called. Thus, when you change the configuration value later, the default captured by the function does not change. The following should work as you expect.
def query(include_parent_=None, **kwargs):
    if include_parent_ is None:
        include_parent_ = conf.INCLUDE_PARENTS
    # do stuff depending on the include_parent_ argument
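The early-binding behaviour and the sentinel fix can be demonstrated in a few self-contained lines (a SimpleNamespace stands in for the mypackage.config.queries module):

```python
import types

# Stand-in for mypackage.config.queries
conf = types.SimpleNamespace(INCLUDE_PARENTS=False)

def query_broken(include_parent_=conf.INCLUDE_PARENTS):
    # the default was evaluated once, at definition time
    return include_parent_

def query_fixed(include_parent_=None):
    # sentinel default: the config is read at call time instead
    if include_parent_ is None:
        include_parent_ = conf.INCLUDE_PARENTS
    return include_parent_

conf.INCLUDE_PARENTS = True
print(query_broken())  # False: still the value captured at definition time
print(query_fixed())   # True: picks up the new value
```

The same early-binding rule is why mutable defaults like `def f(x=[])` share one list across calls; the None-sentinel pattern solves both problems.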
I am building a scraper with a tidy folder structure, but when I try to import the scraper class into views.py I get an error:
'module' object is not callable
This is the Tree:
├── api_services
│ ├── spiders
│ │ ├── spiderAtom.py
│ │ └── spiderEbis.py
│ └── views
│ └── viewApi.py
In the spiders folder I have this class:
class spiderAtom:
    def atom(self):
        string = "return this method"
        return string
and I am trying to import it in viewApi:
from django.http import HttpResponse

from ..spiders import spiderAtom

def atomApi(request):
    spider = spiderAtom()
    response = spider.atom()
    return HttpResponse(response)
But the way I am doing it is not working.
Add an __init__.py file to api_services (and to spiders), then import the class, not the module: from api_services.spiders.spiderAtom import spiderAtom. With from ..spiders import spiderAtom you are binding the spiderAtom module itself, and calling a module is what raises "'module' object is not callable".
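The module-versus-class confusion can be reproduced end to end with a throwaway package built on disk. This sketch mirrors the question's layout (with the __init__.py files added, and a self parameter on atom, which the original snippet omitted):

```python
import os
import sys
import tempfile

# Build a throwaway api_services/spiders package in a temp directory
root = tempfile.mkdtemp()
spiders_dir = os.path.join(root, "api_services", "spiders")
os.makedirs(spiders_dir)
for d in (os.path.join(root, "api_services"), spiders_dir):
    open(os.path.join(d, "__init__.py"), "w").close()
with open(os.path.join(spiders_dir, "spiderAtom.py"), "w") as f:
    f.write(
        "class spiderAtom:\n"
        "    def atom(self):\n"
        "        return 'return this method'\n"
    )
sys.path.insert(0, root)

# `from ..spiders import spiderAtom` binds the *module*; calling it fails:
from api_services.spiders import spiderAtom as spider_module
try:
    spider_module()
except TypeError as e:
    print(e)  # 'module' object is not callable

# Importing the *class* out of the module is what the answer suggests:
from api_services.spiders.spiderAtom import spiderAtom
print(spiderAtom().atom())  # return this method
```

The key point: because the file and the class share the name spiderAtom, `from ..spiders import spiderAtom` silently gives you the module, so you must reach one level deeper to get the class.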
NOTE: All details about my setup (Python version, modules, etc.) are listed at the bottom of the question.
Apologies in advance if this issue is blatant, but I've been wrestling with it for several days now. Hopefully someone can shed some new light.
I'm in the process of converting the unit tests for my personal project from unittest to pytest. Previously I was using the built-in unittest.mock module, but now I'm trying to use the pytest-mock plugin instead.
I have a sneaking feeling that my tests are leaking mock objects into one another.
Here's why:
High-level details:
# Python version
Python 3.5.2
# Pytest version ( and plugins )
pytest==3.0.7
pytest-benchmark==3.1.0a2
pytest-catchlog==1.2.2
pytest-cov==2.4.0
pytest-ipdb==0.1.dev2
pytest-leaks==0.2.2
pytest-mock==1.6.0
pytest-rerunfailures==2.1.0
pytest-sugar==0.8.0
pytest-timeout==1.2.0
python-dateutil==2.6.0
python-dbusmock==0.16.7
When I run my tests using the following command:
py.test --pdb --showlocals -v -R : -k test_subprocess.py
Everything is fine till we get to test_subprocess_check_command_type. At which point I get the following error:
# Set mock return types
# mock_map_type_to_command.return_value = int
# action
with pytest.raises(TypeError) as excinfo:
scarlett_os.subprocess.Subprocess(test_command,
name=test_name,
fork=test_fork,
> run_check_command=True)
E Failed: DID NOT RAISE <class 'TypeError'>
excinfo = <[AttributeError("'ExceptionInfo' object has no attribute 'typename'") raised in repr()] ExceptionInfo object at 0x7f8c380f9dc0>
mock_fork = <Mock name='mock_fork' id='140240122195184'>
mock_logging_debug = <Mock name='mock_logging_debug' id='140240128747640'>
mock_map_type_to_command = <Mock name='mock_map_type_to_command' id='140240122785112'>
mocker = <pytest_mock.MockFixture object at 0x7f8c329f07a8>
monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7f8c329f0810>
self = <tests.test_subprocess.TestScarlettSubprocess object at 0x7f8c32aaac20>
test_command = ['who', '-b']
test_fork = False
test_name = 'test_who'
tests/test_subprocess.py:267: Failed
tests/test_subprocess.py::TestScarlettSubprocess.test_subprocess_check_command_type ⨯ 100% ██████████
BUT!
If I filter out all of the other tests except for the problematic one then I get:
via py.test --pdb --showlocals -v -R : -k test_subprocess_check_command_type
pi@0728af726f1f:~/dev/bossjones-github/scarlett_os$ py.test --pdb --showlocals -v -R : -k test_subprocess_check_command_type
/usr/local/lib/python3.5/site-packages/_pdbpp_path_hack/pdb.py:4: ResourceWarning: unclosed file <_io.TextIOWrapper name='/usr/local/lib/python3.5/site-packages/pdb.py' mode='r' encoding='UTF-8'>
os.path.dirname(os.path.dirname(__file__)), 'pdb.py')).read(), os.path.join(
Test session starts (platform: linux, Python 3.5.2, pytest 3.0.7, pytest-sugar 0.8.0)
cachedir: .cache
benchmark: 3.1.0a2 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/pi/dev/bossjones-github/scarlett_os, inifile: setup.cfg
plugins: timeout-1.2.0, sugar-0.8.0, rerunfailures-2.1.0, mock-1.6.0, leaks-0.2.2, ipdb-0.1.dev2, cov-2.4.0, catchlog-1.2.2, benchmark-3.1.0a2
timeout: 60.0s method: signal
NOTE: DBUS_SESSION_BUS_ADDRESS environment var not found!
[DBUS_SESSION_BUS_ADDRESS]: unix:path=/tmp/dbus_proxy_outside_socket
tests/test_subprocess.py::TestScarlettSubprocess.test_subprocess_check_command_type ✓ 100% ██████████
Results (8.39s):
1 passed
190 deselected
pi@0728af726f1f:~/dev/bossjones-github/scarlett_os$
I also tried manually commenting out the following 2 tests and they allowed me to successfully run all the tests again:
test_subprocess_init
test_subprocess_map_type_to_command
Can anyone see anything blatantly wrong with my setup? I've read several blog posts on "where to mock" and looked at the docs themselves several times; I'm not sure what I'm missing. https://docs.python.org/3/library/unittest.mock.html
My Setup Details
Here is everything that might be required to solve this. Let me know if I need to provide any more information!
Also ... please excuse how messy my code looks and all of the comment blocks. I'm a big note-taker when I'm learning something new ... I'll make everything more Pythonic and cleaner in the near future :)
My code:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Scarlett Dbus Service. Implemented via MPRIS D-Bus Interface Specification."""
from __future__ import with_statement, division, absolute_import

import logging
import os
import sys

from scarlett_os.exceptions import SubProcessError
from scarlett_os.exceptions import TimeOutError
from scarlett_os.internal.gi import GObject
from scarlett_os.internal.gi import GLib

logger = logging.getLogger(__name__)
def check_pid(pid):
    """Check For the existence of a unix pid."""
    try:
        os.kill(pid, 0)
    except OSError:
        return False
    else:
        return True
class Subprocess(GObject.GObject):
    """
    GObject API for handling child processes.

    :param command: The command to be run as a subprocess.
    :param fork: If `True` this process will be detached from its parent and
        run independently. This means that no exited-signal will be emitted.
    :type command: `list`
    :type fork: `bool`
    """

    __gtype_name__ = 'Subprocess'

    __gsignals__ = {
        'exited': (GObject.SignalFlags.RUN_LAST, None, (GObject.TYPE_INT, GObject.TYPE_INT))
    }
    def __init__(self, command, name=None, fork=False, run_check_command=True):
        """Create instance of Subprocess."""
        GObject.GObject.__init__(self)
        self.process = None
        self.pid = None
        if not fork:
            self.stdout = True
            self.stderr = True
        else:
            self.stdout = False
            self.stderr = False
        self.forked = fork
        # Verify that command is properly formatted
        # and each argument is of type str
        if run_check_command:
            self.check_command_type(command)
        self.command = command
        self.name = name
        logger.debug("command: {}".format(self.command))
        logger.debug("name: {}".format(self.name))
        logger.debug("forked: {}".format(self.forked))
        logger.debug("process: {}".format(self.process))
        logger.debug("pid: {}".format(self.pid))
        if fork:
            self.fork()
    # TODO: Add these arguments so we can toggle stdout
    # def spawn_command(self, standard_input=False, standard_output=False, standard_error=False):
    def spawn_command(self):
        # DO_NOT_REAP_CHILD
        # Don't reap process automatically so it is possible to detect when it is closed.
        return GLib.spawn_async(self.command,
                                flags=GLib.SpawnFlags.SEARCH_PATH | GLib.SpawnFlags.DO_NOT_REAP_CHILD)
    def map_type_to_command(self, command):
        """Return: Map after applying type to several objects in an array"""
        # NOTE: In python3, many processes that iterate over iterables return iterators themselves.
        # In most cases, this ends up saving memory, and should make things go faster.
        # Because of that, we need to call list() over the map object
        return list(map(type, command))
    def check_command_type(self, command):
        types = self.map_type_to_command(command)
        if type(types) is not list:
            raise TypeError("Variable types should return a list in python3. Got: {}".format(types))
        # NOTE: str is a built-in function (actually a class) which converts its argument to a string.
        # string is a module which provides common string operations.
        # source: http://stackoverflow.com/questions/2026038/relationship-between-string-module-and-str
        for t in types:
            if t is not str:
                raise TypeError("Executables and arguments must be str objects. types: {}".format(t))
        logger.debug("Running Command: %r" % " ".join(command))
        return True
    def run(self):
        """Run the process."""
        # NOTE: DO_NOT_REAP_CHILD: the child will not be automatically reaped;
        # you must use g_child_watch_add yourself (or call waitpid or handle `SIGCHLD` yourself),
        # or the child will become a zombie.
        # source:
        # http://valadoc.org/#!api=glib-2.0/GLib.SpawnFlags.DO_NOT_REAP_CHILD
        # NOTE: SEARCH_PATH: argv[0] need not be an absolute path, it will be looked for in the user's PATH
        # source:
        # http://lazka.github.io/pgi-docs/#GLib-2.0/flags.html#GLib.SpawnFlags.SEARCH_PATH
        self.pid, self.stdin, self.stdout, self.stderr = self.spawn_command()
        logger.debug("command: {}".format(self.command))
        logger.debug("stdin: {}".format(self.stdin))
        logger.debug("stdout: {}".format(self.stdout))
        logger.debug("stderr: {}".format(self.stderr))
        logger.debug("pid: {}".format(self.pid))
        # close file descriptor
        self.pid.close()
        print(self.stderr)
        # NOTE: GLib.PRIORITY_HIGH = -100
        # Use this for high priority event sources.
        # It is not used within GLib or GTK+.
        watch = GLib.child_watch_add(GLib.PRIORITY_HIGH,
                                     self.pid,
                                     self.exited_cb)
        return self.pid
    def exited_cb(self, pid, condition):
        if not self.forked:
            self.emit('exited', pid, condition)
    def fork(self):
        """Fork the process."""
        try:
            # first fork
            pid = os.fork()
            if pid > 0:
                logger.debug('pid greater than 0 first time')
                sys.exit(0)
        except OSError as e:
            logger.error('Error forking process first time')
            sys.exit(1)
        # Change the current working directory to path.
        os.chdir("/")
        # Description: setsid() creates a new session if the calling process is not a process group leader.
        # The calling process is the leader of the new session,
        # the process group leader of the new process group,
        # and has no controlling terminal.
        # The process group ID and session ID of the calling process are set to the PID of the calling process.
        # The calling process will be the only process in this new process group and in this new session.
        # Return Value: On success, the (new) session ID of the calling process is returned.
        # On error, (pid_t) -1 is returned, and errno is set to indicate the error.
        os.setsid()
        # Set the current numeric umask and return the previous umask.
        os.umask(0)
        try:
            # second fork
            pid = os.fork()
            if pid > 0:
                logger.debug('pid greater than 0 second time')
                sys.exit(0)
        except OSError as e:
            logger.error('Error forking process second time')
            sys.exit(1)
My Test:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
test_subprocess
----------------------------------
"""

import os
import sys

import pytest

import scarlett_os

# import signal
# import builtins
# import re


class TestScarlettSubprocess(object):
    '''Unit tests for Scarlett Subprocess, subclass of GObject.GObject.'''
    def test_check_pid_os_error(self, mocker):
        # Feels like mocks are leaking into other tests,
        # stop mock before starting each test function
        mocker.stopall()
        # Setup mock objects
        kill_mock = mocker.MagicMock(name=__name__ + "_kill_mock_OSError")
        kill_mock.side_effect = OSError
        # patch things
        mocker.patch.object(scarlett_os.subprocess.os, 'kill', kill_mock)
        # When OSError occurs, return False
        assert not scarlett_os.subprocess.check_pid(4353634632623)
        # Verify that os.kill was only called once
        assert kill_mock.call_count == 1
    def test_check_pid(self, mocker):
        # Feels like mocks are leaking into other tests,
        # stop mock before starting each test function
        mocker.stopall()
        # Setup mock objects
        kill_mock = mocker.MagicMock(name=__name__ + "_kill_mock")
        mocker.patch.object(scarlett_os.subprocess.os, 'kill', kill_mock)
        result = scarlett_os.subprocess.check_pid(123)
        assert kill_mock.called
        # NOTE: test against signal 0
        # sending the signal 0 to a given PID just checks if any
        # process with the given PID is running and you have the
        # permission to send a signal to it.
        kill_mock.assert_called_once_with(123, 0)
        assert result is True
    # FIXME: I THINK THIS GUY IS LEAKING MOCK OBJECTS
    def test_subprocess_init(self, mocker):
        # Feels like mocks are leaking into other tests,
        # stop mock before starting each test function
        mocker.stopall()
        mock_check_command_type = mocker.MagicMock(name="mock_check_command_type")
        mock_check_command_type.return_value = True
        mock_fork = mocker.MagicMock(name="mock_fork")
        mock_logging_debug = mocker.MagicMock(name="mock_logging_debug")
        # mock
        mocker.patch.object(scarlett_os.subprocess.logging.Logger, 'debug', mock_logging_debug)
        mocker.patch.object(scarlett_os.subprocess.Subprocess, 'check_command_type', mock_check_command_type)
        mocker.patch.object(scarlett_os.subprocess.Subprocess, 'fork', mock_fork)
        # NOTE: On purpose this is an invalid cmd. Should be of type array
        test_command = ['who']
        test_name = 'test_who'
        test_fork = False
        s_test = scarlett_os.subprocess.Subprocess(test_command,
                                                   name=test_name,
                                                   fork=test_fork)
        # action
        assert s_test.check_command_type(test_command) is True
        mock_check_command_type.assert_called_with(['who'])
        assert not s_test.process
        assert not s_test.pid
        assert s_test.name == 'test_who'
        assert not s_test.forked
        assert s_test.stdout is True
        assert s_test.stderr is True
        mock_logging_debug.assert_any_call("command: ['who']")
        mock_logging_debug.assert_any_call("name: test_who")
        mock_logging_debug.assert_any_call("forked: False")
        mock_logging_debug.assert_any_call("process: None")
        mock_logging_debug.assert_any_call("pid: None")
        mock_fork.assert_not_called()
    # FIXME: I THINK THIS GUY IS LEAKING MOCK OBJECTS
    def test_subprocess_map_type_to_command(self, mocker):
        """Using the mock.patch decorator (removes the need to import builtins)"""
        # Feels like mocks are leaking into other tests,
        # stop mock before starting each test function
        mocker.stopall()
        mock_check_command_type = mocker.MagicMock(name="mock_check_command_type")
        mock_check_command_type.return_value = True
        mock_fork = mocker.MagicMock(name="mock_fork")
        mock_logging_debug = mocker.MagicMock(name="mock_logging_debug")
        # mock
        mocker.patch.object(scarlett_os.subprocess.logging.Logger, 'debug', mock_logging_debug)
        mocker.patch.object(scarlett_os.subprocess.Subprocess, 'check_command_type', mock_check_command_type)
        mocker.patch.object(scarlett_os.subprocess.Subprocess, 'fork', mock_fork)
        # NOTE: On purpose this is an invalid cmd. Should be of type array
        test_command = ["who", "-b"]
        test_name = 'test_who'
        test_fork = False
        # create subprocess object
        s_test = scarlett_os.subprocess.Subprocess(test_command,
                                                   name=test_name,
                                                   fork=test_fork)
        mocker.spy(s_test, 'map_type_to_command')
        assert isinstance(s_test.map_type_to_command(test_command), list)
        assert s_test.map_type_to_command.call_count == 1
        assert s_test.check_command_type(test_command)
        assert s_test.check_command_type(test_command) == mock_check_command_type.return_value
    def test_subprocess_check_command_type(self, mocker):
        """Using the mock.patch decorator (removes the need to import builtins)"""
        # Feels like mocks are leaking into other tests,
        # stop mock before starting each test function
        mocker.stopall()
        test_command = ["who", "-b"]
        test_name = 'test_who'
        test_fork = False
        # mock
        mock_map_type_to_command = mocker.MagicMock(name="mock_map_type_to_command")
        # mock_map_type_to_command.return_value = int
        mock_map_type_to_command.side_effect = [int, [int, int]]
        mock_fork = mocker.MagicMock(name="mock_fork")
        mock_logging_debug = mocker.MagicMock(name="mock_logging_debug")
        mocker.patch.object(scarlett_os.subprocess.logging.Logger, 'debug', mock_logging_debug)
        mocker.patch.object(scarlett_os.subprocess.Subprocess, 'map_type_to_command', mock_map_type_to_command)
        mocker.patch.object(scarlett_os.subprocess.Subprocess, 'fork', mock_fork)
        # action
        with pytest.raises(TypeError) as excinfo:
            scarlett_os.subprocess.Subprocess(test_command,
                                              name=test_name,
                                              fork=test_fork,
                                              run_check_command=True)
        assert str(excinfo.value) == "Variable types should return a list in python3. Got: <class 'int'>"
        with pytest.raises(TypeError) as excinfo:
            scarlett_os.subprocess.Subprocess(test_command,
                                              name=test_name,
                                              fork=test_fork,
                                              run_check_command=True)
        assert str(excinfo.value) == "Executables and arguments must be str objects. types: <class 'int'>"
My folder structure (note: I removed a couple of things since it was overly verbose):
pi@0728af726f1f:~/dev/bossjones-github/scarlett_os$ tree -I *.pyc
.
├── requirements_dev.txt
├── requirements_test_experimental.txt
├── requirements_test.txt
├── requirements.txt
├── scarlett_os
│ ├── automations
│ │ ├── __init__.py
│ │ └── __pycache__
│ ├── commands.py
│ ├── compat.py
│ ├── config.py
│ ├── const.py
│ ├── core.py
│ ├── emitter.py
│ ├── exceptions.py
│ ├── __init__.py
│ ├── internal
│ │ ├── debugger.py
│ │ ├── deps.py
│ │ ├── encoding.py
│ │ ├── formatting.py
│ │ ├── gi.py
│ │ ├── __init__.py
│ │ ├── path.py
│ │ ├── __pycache__
│ │ └── system_utils.py
│ ├── listener.py
│ ├── loader.py
│ ├── logger.py
│ ├── log.py
│ ├── __main__.py
│ ├── mpris.py
│ ├── player.py
│ ├── __pycache__
│ ├── receiver.py
│ ├── speaker.py
│ ├── subprocess.py
│ ├── tasker.py
│ ├── tools
│ │ ├── __init__.py
│ │ ├── package.py
│ │ ├── __pycache__
│ │ └── verify.py
│ └── utility
│ ├── audio.py
│ ├── dbus_runner.py
│ ├── dbus_utils.py
│ ├── distance.py
│ ├── dt.py
│ ├── file.py
│ ├── generators.py
│ ├── gnome.py
│ ├── __init__.py
│ ├── location.py
│ ├── __pycache__
│ ├── temperature.py
│ ├── threadmanager.py
│ ├── thread.py
│ ├── unit_system.py
│ └── yaml.py
├── setup.cfg
├── setup.py
├── tests
│ ├── common_integration.py
│ ├── common.py
│ ├── helpers
│ │ ├── __init__.py
│ │ ├── __pycache__
│ │ ├── test_config_validation.py
│ │ ├── test_entity.py
│ │ └── test_init.py
│ ├── __init__.py
│ ├── integration
│ │ ├── baseclass.py
│ │ ├── conftest.py
│ │ ├── __init__.py
│ │ ├── __pycache__
│ │ ├── README.md
│ │ ├── stubs.py
│ │ ├── test_integration_end_to_end.py
│ │ ├── test_integration_listener.py
│ │ ├── test_integration_mpris.py
│ │ ├── test_integration_player.py
│ │ ├── test_integration_tasker.py
│ │ ├── test_integration_tasker.py.enable_sound.diff
│ │ └── test_integration_threadmanager.py
│ ├── internal
│ │ ├── __init__.py
│ │ ├── __pycache__
│ │ ├── test_deps.py
│ │ ├── test_encoding.py
│ │ └── test_path.py
│ ├── performancetests
│ │ ├── baseclass.py
│ │ ├── __init__.py
│ │ └── __pycache__
│ ├── __pycache__
│ ├── run_all_tests
│ ├── run_dbus_tests.sh
│ ├── test_cli.py
│ ├── test_commands.py
│ ├── testing_config
│ │ └── custom_automations
│ │ ├── light
│ │ │ └── test.py
│ │ └── switch
│ │ └── test.py
│ ├── test_listener.py
│ ├── test_mpris.py
│ ├── test_player.py
│ ├── test_scarlett_os.py
│ ├── test_speaker.py
│ ├── test_subprocess.py
│ ├── test_tasker.py
│ ├── test_threadmanager.py
│ ├── tools_common.py
│ ├── unit_scarlett_os.py
│ └── utility
│ ├── __init__.py
│ ├── __pycache__
│ ├── test_dbus_utils.py
│ ├── test_distance.py
│ ├── test_dt.py
│ ├── test_gnome.py
│ ├── test_init.py
│ ├── test_location.py
│ ├── test_unit_system.py
│ └── test_yaml.py
67 directories, 256 files
pi@0728af726f1f:~/dev/bossjones-github/scarlett_os$
Other details (extended pip freeze, just in case of incompatibilities):
# Pip Freeze ( Just in case )
alabaster==0.7.10
appdirs==1.4.3
argh==0.26.2
asn1crypto==0.22.0
astroid==1.5.2
Babel==2.4.0
bleach==2.0.0
bumpversion==0.5.3
cffi==1.10.0
click==6.7
click-plugins==1.0.3
colorama==0.3.7
colorlog==2.10.0
coverage==4.3.4
coveralls==1.1
cryptography==1.8.1
Cython==0.25.2
decorator==4.0.11
docopt==0.6.2
docutils==0.13.1
ecdsa==0.13
entrypoints==0.2.2
Fabric3==1.12.post1
fancycompleter==0.7
fields==5.0.0
flake8==3.3.0
flake8-docstrings==1.0.3
flake8-polyfill==1.0.1
freezegun==0.3.8
gnureadline==6.3.3
graphviz==0.6
html5lib==0.999999999
hunter==1.4.1
idna==2.5
imagesize==0.7.1
ipdb==0.10.2
ipykernel==4.6.1
ipython==6.0.0
ipython-genutils==0.2.0
ipywidgets==6.0.0
isort==4.2.5
jedi==0.10.2
Jinja2==2.9.6
jsonschema==2.6.0
jupyter==1.0.0
jupyter-client==5.0.1
jupyter-console==5.1.0
jupyter-core==4.3.0
lazy-object-proxy==1.2.2
MarkupSafe==1.0
mccabe==0.6.1
mistune==0.7.4
mock==2.0.0
mock-open==1.3.1
mypy-lang==0.4.6
nbconvert==5.1.1
nbformat==4.3.0
notebook==5.0.0
objgraph==3.1.0
ordereddict==1.1
packaging==16.8
pandocfilters==1.4.1
paramiko==1.18.2
pathtools==0.1.2
pbr==1.10.0
pdbpp==0.8.3
pexpect==4.2.1
pickleshare==0.7.4
pluggy==0.4.0
plumbum==1.6.3
prompt-toolkit==1.0.14
psutil==5.2.2
ptyprocess==0.5.1
py==1.4.33
py-cpuinfo==3.2.0
pyasn1==0.2.3
pycodestyle==2.3.1
pycparser==2.17
pycrypto==2.6.1
pydbus==0.6.0
pydocstyle==2.0.0
pyflakes==1.5.0
pygal==2.3.1
pygaljs==1.0.1
Pygments==2.2.0
pygobject==3.22.0
pylint==1.7.1
pyparsing==2.2.0
pystuck==0.8.5
pytest==3.0.7
pytest-benchmark==3.1.0a2
pytest-catchlog==1.2.2
pytest-cov==2.4.0
pytest-ipdb==0.1.dev2
pytest-leaks==0.2.2
pytest-mock==1.6.0
pytest-rerunfailures==2.1.0
pytest-sugar==0.8.0
pytest-timeout==1.2.0
python-dateutil==2.6.0
python-dbusmock==0.16.7
pytz==2017.2
PyYAML==3.12
pyzmq==16.0.2
qtconsole==4.3.0
requests==2.13.0
requests-mock==1.3.0
rpyc==3.3.0
-e git+git@github.com:bossjones/scarlett_os.git@c14ffcde608da12f5c2d4d9b81a63c7e618b3eed#egg=scarlett_os
simplegeneric==0.8.1
six==1.10.0
snowballstemmer==1.2.1
Sphinx==1.5.5
stevedore==1.18.0
termcolor==1.1.0
terminado==0.6
testpath==0.3
tornado==4.5.1
tox==2.7.0
traitlets==4.3.2
typing==3.6.1
virtualenv==15.0.3
virtualenv-clone==0.2.6
virtualenvwrapper==4.7.2
voluptuous==0.9.3
watchdog==0.8.3
wcwidth==0.1.7
webencodings==0.5.1
widgetsnbextension==2.0.0
wmctrl==0.3
wrapt==1.10.10
xdot==0.7
Edit: (one more detail: why didn't I just use the patch context manager or decorators?)
pytest-mock has a pretty good section on their design choices, and why they decided to move away from nested with statements and decorators piled on top of each other. The link is here, but let me mention a couple of the reasons just in case:
- excessive nesting of with statements breaking the flow of test
- receiving the mocks as parameters doesn't mix nicely with pytest's approach of naming fixtures as parameters, or pytest.mark.parametrize;
So, if it is possible to make my code a bit cleaner using this plugin, I would like to make that happen. If that isn't possible, then maybe I need to reconsider things.
The error you get is that code under test hit AttributeError instead of TypeError.
The detail is that some object was assumed to have a .typename member, and it didn't.
I suspect once you solve that riddle, the rest will be just fine.
I see someone opened https://github.com/pytest-dev/pytest-mock/issues/84 (you?), let's wait for pytest devs to analyse it in case there's an incompatibility between 2 plugins.
Why not run your mocks with function decorators or context managers to make sure they get closed out? For example, in test_subprocess_map_type_to_command,
instead of doing all of this to mock scarlett_os.subprocess.Subprocess.check_command_type:
mock_check_command_type = mocker.MagicMock(name="mock_check_command_type")
mock_check_command_type.return_value = True
mocker.patch.object(scarlett_os.subprocess.Subprocess, 'check_command_type', mock_check_command_type)
Why not instead just use a context manager and do:
with mock.patch.object(
        scarlett_os.subprocess.Subprocess,
        'check_command_type',
        return_value=True):
    ...
It'll be much terser, and will make sure your mock doesn't leak.
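The no-leak behaviour of the context manager is easy to demonstrate with plain unittest.mock; here a toy Greeter class stands in for the real Subprocess class, which isn't importable outside the project:

```python
from unittest import mock

class Greeter:
    # toy stand-in for scarlett_os.subprocess.Subprocess
    def check_command_type(self, command):
        return "real result"

g = Greeter()

# Inside the with-block the method is replaced by a mock...
with mock.patch.object(Greeter, 'check_command_type', return_value=True):
    assert g.check_command_type(['who']) is True

# ...and restored automatically on exit, so nothing leaks into later tests
assert g.check_command_type(['who']) == "real result"
```

Because the patch is undone in `__exit__`, even a test that raises mid-block leaves the class unpatched for the next test.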
Even better, if your mocks apply to the whole function (I think some of them do), you can use a decorator at the top of the function:
@mock.patch('scarlett_os.subprocess.Subprocess.check_command_type',
            return_value=True)
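The decorator form has the same self-cleaning property, sketched here with a hypothetical Fake class standing in for the real target (the decorated function receives the mock as its first argument):

```python
from unittest import mock

class Fake:
    # hypothetical stand-in for scarlett_os.subprocess.Subprocess
    def check_command_type(self, command):
        return "real"

@mock.patch.object(Fake, 'check_command_type', return_value=True)
def run_one_test(mock_check):
    # the patch is active only while the decorated function runs
    assert Fake().check_command_type(['who']) is True
    assert mock_check.called

run_one_test()
print(Fake().check_command_type(['who']))  # real  (unpatched again afterwards)
```

Either style scopes the mock's lifetime to a block or a function, which is exactly what prevents one test's mocks from bleeding into the next.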