I want to test that a specific Celery task calls logger.info exactly once when the task is invoked with the delay() API.
And I want to do the test by patching logger.info.
I want to test as described here for the Product.order case: https://docs.celeryproject.org/en/latest/userguide/testing.html.
The setup is: Python 2.7 on Ubuntu 16.04, Celery 4.3.0, pytest 4.0.0, mock 3.0.3.
I have the following file system structure:
poc/
  prj/
    celery_app.py
    tests.py
celery_app.py:

from __future__ import absolute_import
from celery import Celery
from celery.utils.log import get_task_logger

app = Celery('celery_app')
logger = get_task_logger(__name__)

@app.task(bind=True)
def debug_task(self):
    logger.info('Request: {0!r}'.format(self.request))
tests.py:

from __future__ import absolute_import
from mock import patch
from prj.celery_app import debug_task

@patch('logging.Logger.info')
def test_log_info_is_called_only_once_when_called_sync(log_info):
    debug_task()
    log_info.assert_called_once()

@patch('logging.Logger.info')
def test_log_info_is_called_only_once_when_called_async(log_info):
    debug_task.delay()
    log_info.assert_called_once()
I expect both tests to succeed.
Instead, the first succeeds while the second fails with: AssertionError: Expected 'info' to have been called once. Called 0 times.
I expect the expression logger.info inside debug_task() to evaluate to <MagicMock name='info' id='someinteger'> in both cases; instead, in the second case it evaluates to <bound method Logger.info of <logging.Logger object at 0x7f894eeb1690>>, showing that no patching took place.
I know that in the second case the celery worker executes the task inside a thread.
I ask for a way to patch the logger.info call when debug_task.delay() is executed.
Thanks in advance for any answer.
I'm surely missing something about Flask and unit test integration (or maybe logger configuration).
When I try to unit test some class methods that use app.logger, I run into RuntimeError: Working outside of application context.
So a practical example:
utils.py:

import boto3
from flask import current_app as app

class CustomError(BaseException):
    type = "boto"

class BotoManager:
    def upload_to_s3(self, file):
        try:
            ...  # do something that can trigger a boto3 error
        except boto3.exceptions.Boto3Error as e:
            app.logger.error(e)
            raise CustomError()
test_utils.py:

import pytest
from utils import CustomError, BotoManager

def test_s3_manager_trigger_error():
    boto_manager = BotoManager()
    with pytest.raises(CustomError):
        boto_manager.upload_to_s3('file.txt')  # file doesn't exist, so it triggers the error
So the thing is that when I run it, it shows me the error:
RuntimeError: Working outside of application context.
That makes sense, because the app is not created and I'm not working within the app.
So I only see two possible solutions (spoiler: I don't like either of them):
Don't log anything with app.logger outside of the views (I think I could use the plain Python logging system, but that is not the desired behaviour).
Don't unit test the parts that use app.logger.
Did someone face this problem already? How did you solve it? Any other possible solution?
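A third option is to push an application context around the code under test, so current_app (and therefore app.logger) resolves. A minimal sketch; the bare Flask app and the helper function here are stand-ins for the question's create_app result and BotoManager method:

```python
from flask import Flask, current_app

app = Flask(__name__)  # stand-in for your real application factory

def upload_helper():
    # stand-in for the error path that logs via the app logger;
    # this line raises RuntimeError when no app context is pushed
    current_app.logger.error("boto error")

def test_upload_helper_logs():
    with app.app_context():  # makes current_app resolve inside the block
        upload_helper()
```

In pytest, this is usually wrapped in a fixture that pushes the context once (e.g. a fixture that does `with app.app_context(): yield app`), so individual tests don't repeat the `with` block.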
I have a simple Flask app and I want to test it using pytest.
my conftest.py:

@pytest.fixture
def app(self):
    app = create_app(TestingConfig)
    return app
my test_view.py:

class TestMainView:
    def test_ping(self, client):
        res = client.get(url_for('main.index'))
        assert res.status_code == 200
when I run the tests using pytest, it throws an error saying:
fixture 'client' not found
> available fixtures: app, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_xml_property, recwarn, tmpdir, tmpdir_factory
> use 'pytest --fixtures [testpath]' for help on them.
I have another test file where I test default configs, and it passes:

class TestTestingClass(object):
    app = create_app(TestingConfig)

    def test_app_is_test(self):
        '''test that the test environment is set up okay and works'''
        assert self.app.config['SECRET_KEY'] == 'something hard'
        assert self.app.config['DEBUG'] == True
        assert self.app.config['TESTING'] == True
        assert current_app is not None
        assert self.app.config['SQLALCHEMY_DATABASE_URI'] == 'sqlite:////tmp/testing.db'
edit:
I was able to pass an empty test by changing to:
conftest.py:

@pytest.fixture
def app():
    app = create_app(TestingConfig)
    return app

@pytest.fixture
def client(app):
    return app.test_client()
and my test file to:

def test_index(client):
    assert True
but I still can't pass test_index when it is:

def test_index(client):
    assert client.get(url_for('main.index')).status_code == 200
but this time, I'm getting an error that says:
RuntimeError: Attempted to generate a URL without the application context being pushed. This has to be executed when application context is available.
I tried many different things. I updated pip for this project's virtualenv and updated pytest and pytest-flask, but none of it worked.
I was able to pass the tests by:
1. removing pytest and pytest-flask from the virtualenv,
2. removing my system-wide installations of them,
3. strangely, I had a package named flask-pytest; I removed it (in the env),
4. installing them again system-wide,
5. installing them again in the virtualenv.
I don't know how this had anything to do with the tests, but it worked. The only difference is that I didn't install the said flask-pytest thing.
You need to delete your incorrect client fixture (see the implementation of pytest-flask for the correct one).
Afterwards, install pytest-flask and pytest inside the virtualenv; you'd better remove the system-wide ones to avoid confusion.
Then you should be able to run your tests.
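If you would rather not depend on pytest-flask's client fixture at all, a hand-written one is only a few lines. A sketch of such a conftest.py; `Flask(__name__)` stands in for the question's create_app(TestingConfig):

```python
# conftest.py (sketch)
import pytest
from flask import Flask

@pytest.fixture
def app():
    app = Flask(__name__)  # replace with create_app(TestingConfig)
    app.config["TESTING"] = True
    return app

@pytest.fixture
def client(app):
    # the Werkzeug test client drives the app without a running server
    return app.test_client()
```

Note that in tests it is often simpler to request literal paths (client.get('/')) rather than url_for, since url_for needs an application or request context to be pushed.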
I just had the same issue. The solution was:
pip uninstall pytest-flask
I have configured Celery and the backend:

celeryapp = Celery(
    'tasks_app', broker='amqp://guest@localhost//',
    backend='db+postgresql://guest@localhost:5432'
)
'results' appears disabled when I start the worker, but I read in another question here that that's not the issue.
The database is getting all the data correctly, but
result = AsyncResult(task_id)
raises
AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'
I found a more convenient way to do that.
result = celery.AsyncResult(task_id)
celery is the Celery instance of your application, not the celery module.
Try using this instead, where task is the name of your task function:
result = task.AsyncResult(task_id)
you can try:

from celery import result, Celery

app = Celery(backend='redis://localhost:6379/0')
res = result.AsyncResult(id='7037247e-f528-43ba-bce5-ee0e30704c58', app=app)
print(res.info)
As the error says, you should specify the value of backend, for example:

app = Celery("tasks", broker='mongodb://localhost:27017/test', backend='mongodb://localhost:27017/test1')
Try to also import the task in your AsyncResult script to let Celery know the backend setting. I faced a similar issue (AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for') with the backend well configured, and this helped a lot.

from <your celery app> import <tasks>  # add this one
from celery.result import AsyncResult

result = AsyncResult(task_id)
print(result.state)  # check whether it worked or not; it should
Those of you coming from a Django background might be tempted to use from celery.result import AsyncResult in the shell.
However, remember that in Django we use python manage.py shell, and Django does a lot of configuration behind the scenes.
In other applications that might not be the case, especially with a plain Python shell. That's why we have to explicitly specify our own Celery app.
e.g. if your main.py looks like this:

from celery import current_app
from fastapi import FastAPI

def create_app() -> FastAPI:
    app = FastAPI()
    celery_app = current_app
    celery_app.config_from_object(config.settings, namespace="CELERY")
You can use the code below in a normal Python shell:

(env) ss@nofoobar ~/Documents/fastapi-celery $ python
>>> from main import celery
>>> from celery.result import AsyncResult
>>> AsyncResult("e3d3ef1c-65a5-4045-87c1-014aa159f52f")
The Celery documentation mentions testing Celery within Django but doesn't explain how to test a Celery task if you are not using Django. How do you do this?
It is possible to test tasks synchronously using any unittest lib out there. I normally do two different test sessions when working with celery tasks. The first one (as I'm suggesting below) is completely synchronous and should be the one that makes sure the algorithm does what it should do. The second session uses the whole system (including the broker) and makes sure I'm not having serialization issues or any other distribution or communication problems.
So:
from celery import Celery

celery = Celery()

@celery.task
def add(x, y):
    return x + y
And your test:
from nose.tools import eq_

def test_add_task():
    rst = add.apply(args=(4, 4)).get()
    eq_(rst, 8)
Here is an update to my seven years old answer:
You can run a worker in a separate thread via a pytest fixture:
https://docs.celeryq.dev/en/v5.2.6/userguide/testing.html#celery-worker-embed-live-worker
According to the docs, you should not use "always_eager" (see the top of the page of the above link).
Old answer:
I use this:

with mock.patch('celeryconfig.CELERY_ALWAYS_EAGER', True, create=True):
    ...
Docs: https://docs.celeryq.dev/en/3.1/configuration.html#celery-always-eager
CELERY_ALWAYS_EAGER lets you run your task synchronously, and you don't need a celery server.
It depends on what exactly you want to test.
1. Test the task code directly: don't call task.delay(...), just call task(...) from your unit tests.
2. Use CELERY_ALWAYS_EAGER: this will cause your tasks to be called immediately at the point you say task.delay(...), so you can test the whole path (but not any asynchronous behavior).
For those on Celery 4 it's:

@override_settings(CELERY_TASK_ALWAYS_EAGER=True)

because the setting names have been changed and need updating if you choose to upgrade; see
https://docs.celeryproject.org/en/latest/history/whatsnew-4.0.html?highlight=what%20is%20new#lowercase-setting-names
unittest:

import unittest
from myproject.myapp import celeryapp

class TestMyCeleryWorker(unittest.TestCase):
    def setUp(self):
        celeryapp.conf.update(CELERY_ALWAYS_EAGER=True)
py.test fixtures:

# conftest.py
import pytest
from myproject.myapp import celeryapp

@pytest.fixture(scope='module')
def celery_app(request):
    celeryapp.conf.update(CELERY_ALWAYS_EAGER=True)
    return celeryapp

# test_tasks.py
def test_some_task(celery_app):
    ...
Addendum: make send_task respect eager mode

from celery import current_app

def send_task(name, args=(), kwargs={}, **opts):
    # https://github.com/celery/celery/issues/581
    task = current_app.tasks[name]
    return task.apply(args, kwargs, **opts)

current_app.send_task = send_task
As of Celery 3.0, one way to set CELERY_ALWAYS_EAGER in Django is:

from django.test import TestCase, override_settings
from .foo import foo_celery_task

class MyTest(TestCase):
    @override_settings(CELERY_ALWAYS_EAGER=True)
    def test_foo(self):
        self.assertTrue(foo_celery_task.delay())
Since Celery v4.0, py.test fixtures are provided to start a celery worker just for the test; it is shut down when done:

def test_myfunc_is_executed(celery_session_worker):
    # celery_session_worker: <Worker: gen93553@mymachine.local (running)>
    assert myfunc.delay().wait(3)
Among the other fixtures described at http://docs.celeryproject.org/en/latest/userguide/testing.html#py-test, you can change the celery default options by redefining the celery_config fixture this way:

@pytest.fixture(scope='session')
def celery_config():
    return {
        'accept_content': ['json', 'pickle'],
        'result_serializer': 'pickle',
    }
By default, the test worker uses an in-memory broker and result backend. No need to use a local Redis or RabbitMQ if not testing specific features.
Using pytest:

def test_add(celery_worker):
    mytask.delay()

If you use Flask, set the app config:

CELERY_BROKER_URL = 'memory://'
CELERY_RESULT_BACKEND = 'cache+memory://'
and in conftest.py:

@pytest.fixture
def app():
    yield app  # your actual Flask application

@pytest.fixture
def celery_app(app):
    from celery.contrib.testing import tasks  # need it
    yield celery_app  # your actual Flask-Celery application
In my case (and I assume many others), all I wanted was to test the inner logic of a task using pytest.
TL;DR: I ended up mocking everything away (OPTION 2).
Example Use Case:
proj/tasks.py

from celery import shared_task

@shared_task(bind=True)
def add_task(self, a, b):
    return a + b
tests/test_tasks.py

from proj.tasks import add_task

def test_add():
    assert add_task(1, 2) == 3, '1 + 2 should equal 3'
But since the shared_task decorator does a lot of celery internal logic, this isn't really a unit test.
So, for me, there were two options:
OPTION 1: Separate the internal logic

proj/tasks_logic.py

def internal_add(a, b):
    return a + b

proj/tasks.py

from celery import shared_task
from .tasks_logic import internal_add

@shared_task(bind=True)
def add_task(self, a, b):
    return internal_add(a, b)
This looks very odd, and besides making the code less readable, it requires you to manually extract and pass attributes that are part of the request (for instance the task_id, in case you need it), which makes the logic less pure.
OPTION 2: mocks

Mocking away celery internals:

tests/__init__.py

# noinspection PyUnresolvedReferences
from celery import shared_task
from mock import patch

def mock_signature(**kwargs):
    return {}

def mocked_shared_task(*decorator_args, **decorator_kwargs):
    def mocked_shared_decorator(func):
        func.signature = func.si = func.s = mock_signature
        return func
    return mocked_shared_decorator

patch('celery.shared_task', mocked_shared_task).start()
This then allows me to mock the request object (again, in case you need things from the request, like the id or the retries counter).
tests/test_tasks.py

from proj.tasks import add_task

class MockedRequest:
    def __init__(self, id=None):
        self.id = id or 1

class MockedTask:
    def __init__(self, id=None):
        self.request = MockedRequest(id=id)

def test_add():
    mocked_task = MockedTask(id=3)
    assert add_task(mocked_task, 1, 2) == 3, '1 + 2 should equal 3'
This solution is much more manual, but it gives me the control I need to actually unit test, without repeating myself and without losing the celery scope.
I see a lot of CELERY_ALWAYS_EAGER = True in unit test methods as a solution, but since version 5.0.5 there are a lot of changes that make most of the old answers deprecated and, for me, time-consuming nonsense. For everyone here searching for a solution: go to the docs and read the well-documented unit test examples for the new version:
https://docs.celeryproject.org/en/stable/userguide/testing.html
And regarding eager mode with unit tests, here is a quote from the actual docs:
Eager mode
The eager mode enabled by the task_always_eager setting is by definition not suitable for unit tests.
When testing with eager mode you are only testing an emulation of what happens in a worker, and there are many discrepancies between the emulation and what happens in reality.
Another option is to mock the task if you do not need the side effects of running it:

from unittest import mock

@mock.patch('module.module.task')
def test_name(self, mock_task): ...