Here is tasks.py:
from celery.task import task

@task
def add(x, y):
    return x + y
Here is celeryconfig.py:
print 'importing ' + __file__

BROKER_URL = "amqp://guest:guest@localhost:5672//"
CELERY_RESULT_BACKEND = "amqp"
CELERY_IMPORTS = ("tasks", )
Here is the file that I run.
tasks.py:
from tasks import add
result = add.delay(4, 4)
print result.wait()
The program just gets stuck in the wait() call.
Celeryd prints the following error:
Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see http://bit.ly/gLye1c for more information.
The full contents of the message body was:
{'retries': 0, 'task': 'tasks.add', 'args': (4, 4), 'expires': None, 'eta': None, 'kwargs': {}, 'id': '8c973638-4a87-4afa-8a78-958153066215'}
Traceback (most recent call last):
  File "d:\python26\lib\site-packages\celery-2.4.5-py2.6.egg\celery\worker\consumer.py", line 427, in receive_message
    eventer=self.event_dispatcher)
  File "d:\python26\lib\site-packages\celery-2.4.5-py2.6.egg\celery\worker\job.py", line 297, in from_message
    on_ack=on_ack, delivery_info=delivery_info, **kw)
  File "d:\python26\lib\site-packages\celery-2.4.5-py2.6.egg\celery\worker\job.py", line 261, in __init__
    self.task = registry.tasks[self.task_name]
  File "d:\python26\lib\site-packages\celery-2.4.5-py2.6.egg\celery\registry.py", line 66, in __getitem__
    raise self.NotRegistered(key)
NotRegistered: 'tasks.add'
When I run celeryd.py status, I see that tasks.add is not there.
D:\Python26\Lib\site-packages\celery-2.4.5-py2.6.egg\celery\bin>celeryctl.py inspect registered
<- registered
-> onfirenbpc: OK
* celery.backend_cleanup
* celery.chord
* celery.chord_unlock
* celery.ping
I have run this on Windows and on Linux as well; the problem is the same.
Does anyone know why?
Did you register your add method as a task? One way to do that is with the decorator:
from celery.decorators import task

@task
def add():
    pass
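Applied to the question above, the whole tasks.py stays just a few lines. This is only a sketch for Celery 2.x (the version in the traceback), where the decorator registers the function under the name tasks.add, which is exactly the name the worker says it cannot find:

from celery.task import task

@task
def add(x, y):
    # registered as "tasks.add" when the worker imports this module via CELERY_IMPORTS = ("tasks",)
    return x + y

The worker then has to be started from the directory that contains tasks.py (for example with celeryd --loglevel=INFO), so that "tasks" is importable under the same module name the client uses when it calls add.delay(4, 4).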
Related
I need to use Celery to schedule a Fabric task on a set of hosts.
My code is:
tasks.py
from fabric.api import execute
from fabric_tasks import poll
from api.client import APIClient as client
from celery import Celery

celery = Celery(broker='redis://')

@celery.task
def poll_all():
    actions = client().get_actions(status='SCHEDULED')
    ids = [a['id'] for a in actions]
    execute(poll, ids, hosts=hosts)
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'every-5-second': {
        'task': 'tasks.poll_all',
        'schedule': timedelta(seconds=5),
    },
}
celery -A tasks.celery -B -l info
The task fails with the following exception:
[2016-12-11 17:54:07,928: ERROR/PoolWorker-3] Task tasks.poll_actions[7b26a083-b450-4a90-8971-97f3dd2f4f5d] raised unexpected: AttributeError("'Process' object has no attribute '_authkey'",)
Traceback (most recent call last):
File "/Users/ocervell/.virtualenvs/ndc-v3.3/lib/python2.7/site-packages/celery/app/trace.py", line 367, in trace_task
R = retval = fun(*args, **kwargs)
File "/Users/ocervell/.virtualenvs/ndc-v3.3/lib/python2.7/site-packages/celery/app/trace.py", line 622, in __protected_call__
return self.run(*args, **kwargs)
File "/Users/ocervell/Drive/workspace/ccc/svn/build-support/branches/development-ocervell/modules/ndc/v3.3/tasks.py", line 44, in poll_all
return execute(poll, 'ls', hosts=hosts)
File "/Users/ocervell/.virtualenvs/ndc-v3.3/lib/python2.7/site-packages/fabric/tasks.py", line 387, in execute
multiprocessing
File "/Users/ocervell/.virtualenvs/ndc-v3.3/lib/python2.7/site-packages/fabric/tasks.py", line 269, in _execute
p = multiprocessing.Process(target=inner, kwargs=kwarg_dict)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/process.py", line 98, in __init__
self._authkey = _current_process._authkey
AttributeError: 'Process' object has no attribute '_authkey'
Temporary fix
The following hack, put in tasks.py, helped me work around this bug (essentially it sets the attributes whose absence caused the AttributeError):
from celery.signals import worker_process_init

@worker_process_init.connect
def fix_multiprocessing(**kwargs):
    from multiprocessing import current_process
    keys = {
        '_authkey': '',
        '_daemonic': False,
        '_tempdir': '',
    }
    for k, v in keys.items():
        try:
            getattr(current_process(), k)
        except AttributeError:
            setattr(current_process(), k, v)
Related:
https://github.com/celery/celery/issues/1709
Python multiprocessing job to Celery task but AttributeError
Alternatives?
I was wondering if there is a better way for Celery to schedule Fabric tasks without writing dirty hacks around the multiprocessing library?
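One workaround that is sometimes suggested for this combination (an assumption on my part, not something tested in this question) is to run the worker with the solo pool, so that Fabric's execute() is not spawning multiprocessing.Process objects inside a daemonized prefork child:

# single-process pool: no prefork children, so Fabric can use multiprocessing itself
celery -A tasks.celery worker -B -P solo -l info

The trade-off is that the solo pool handles one task at a time, which may or may not be acceptable for a poller that runs every 5 seconds.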
I'm new to Celery and was trying to use it in my app. Below is my basic app structure:
my_app
|-run.py
|-app
  |-mod1
  |-mod2
  |-tasks
    |-__init__.py
    |-email
    |-other_tasks_file
I want to confine all my background tasks to my tasks module. In the __init__.py of tasks I have:
from celery import Celery
celery = Celery('my_app', broker='redis://localhost:6379/0')
Within my tasks/email I have:
from app.tasks import celery

@celery.task
def send_email():
    # do stuff
    pass
From the terminal I start a worker using:
celery -A app.tasks worker --loglevel=DEBUG
But my task does not show up in Celery's task list. Also, when I run my task from the interpreter like so:
>>> from app.tasks import email
>>> email_result = email.send_email.delay()
I get the following response in my Celery terminal:
Received unregistered task of type 'app.tasks.emails.send_email'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see url for more information.
The full contents of the message body was:
{'kwargs': {}, 'taskset': None, 'id': '51e8f766-e772-4d85-bad0-5a6774ea541a', 'eta': None, 'timelimit': (None, None), 'args': [], 'retries': 0, 'task': 'app.tasks.emails.send_email', 'utc': True, 'errbacks': None, 'chord': None, 'expires': None, 'callbacks': None} (283b)
Traceback (most recent call last):
File "/usr/local/lib/python3.4/site-packages/celery/app/utils.py", line 235, in find_app
sym = symbol_by_name(app, imp=imp)
File "/usr/local/lib/python3.4/site-packages/celery/bin/base.py", line 492, in symbol_by_name
return symbol_by_name(name, imp=imp)
File "/usr/local/lib/python3.4/site-packages/kombu/utils/__init__.py", line 101, in symbol_by_name
return getattr(module, cls_name) if cls_name else module
AttributeError: 'module' object has no attribute 'tasks'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.4/site-packages/celery/worker/consumer.py", line 456, in on_task_received
strategies[name](message, body,
KeyError: 'app.tasks.channels.send_email'
I am using Python 3.4 and Celery 3.1.23.
If anyone needs it, I finally got it working.
What I needed to do was run a Celery worker for the actual file containing the task, in order for Celery to register the task:
celery -A app.tasks.emails worker --loglevel=DEBUG
because simply running
celery -A app.tasks worker --loglevel=DEBUG
(this is my wild guess) would not actually import my send_email() task. If anyone can give me an explanation for this, please do.
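The likely explanation: celery -A app.tasks only imports the package's __init__.py, and nothing in that __init__.py imports the email submodule, so send_email is never registered with that worker. A minimal sketch of one way to make the shorter command work, assuming the submodule's import path is app.tasks.email:

# app/tasks/__init__.py
from celery import Celery

celery = Celery('my_app',
                broker='redis://localhost:6379/0',
                include=['app.tasks.email'])  # import the submodule so its tasks get registered

With the include argument (or an explicit import of the submodule inside __init__.py), celery -A app.tasks worker should register the send_email task as well.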
I am trying to start a Celery worker server from the command line:
celery -A server application worker --loglevel=info
The code and folder path:
server.py
application/controllers/routes.py
server.py
from flask import Flask

app = Flask(__name__)
from application.controllers import routes
app.run(host='127.0.0.1', port=5051, debug=True)
routes.py
from flask import Flask
from celery import Celery
from server import app

app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'
celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

@celery.task()
def add_together(self, count):
    return "First success"

@app.route("/queing")
def testsfunction():
    count = 1
    add_together.delay(count)
    return "cool"
Traceback:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/2.7/bin/celery", line 11, in <module>
sys.exit(main())
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/__main__.py", line 30, in main
main()
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/bin/celery.py", line 81, in main
cmd.execute_from_commandline(argv)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/bin/celery.py", line 770, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/bin/base.py", line 309, in execute_from_commandline
argv = self.setup_app_from_commandline(argv)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/bin/base.py", line 477, in setup_app_from_commandline
user_preload = tuple(self.app.user_options['preload'] or ())
AttributeError: 'Flask' object has no attribute 'user_options'
I got this error when running a Celery worker in the terminal.
Just run Celery with this command instead of yours:
celery -A application.controllers.routes:celery worker --loglevel=info
This will solve your current problem; however, your code has plenty of mistakes. For example, if you want to have a self argument inside your add_together function, you should declare the task like this:
@celery.task(bind=True)
Otherwise it looks like a typo; change:
def add_together(self, count):
to:
def add_together(count):
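Putting those two fixes together, the task in routes.py could be written either way (a sketch based on the code in the question):

# variant 1: bound task, so "self" is the task instance
@celery.task(bind=True)
def add_together(self, count):
    return "First success"

# variant 2: plain task, no "self" argument
@celery.task()
def add_together(count):
    return "First success"

Either variant works with the worker command celery -A application.controllers.routes:celery worker --loglevel=info from the answer above.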
I've been reading through the answers to some previous questions as well as the Celery docs, and I just can't quite fix this issue. My tasks.py, celery.py, and settings.py are all contained within the same app, RBWebfiles, which is contained within the project called recruiting board. I'm attempting to create a periodic task that will run using celery beat's scheduler.
This is the portion of my settings.py that deals with celery:
from __future__ import absolute_import
import os
from datetime import timedelta

CELERY_IMPORTS = ('RBWebfiles.tasks',)
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERYBEAT_SCHEDULE = {
    'schedule-name': {
        'task': 'RBWebfiles.tasks.requestRefresher',
        'schedule': timedelta(seconds=30),
    },
}
BROKER_URL = 'django://'
I also added 'djcelery' to my installed apps within that file
This is my tasks.py file:
from __future__ import absolute_import
import datetime
from celery.task.base import periodic_task
from student.models import StudentAccount
from celery import Celery
from celery.utils.log import get_task_logger
import os

celery = Celery('tasks', broker='django://')
logger = get_task_logger(__name__)
os.environ['DJANGO_SETTINGS_MODULE'] = 'RBWebfiles.settings'

@periodic_task(run_every=datetime.timedelta(seconds=30))
def requestRefresher(self):
    logger.info("start task")
    for s in StudentAccount.objects.all():
        s.requestLimit = 0
        s.save()
    return None
And lastly, this is my celery.py file:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'conf.settings')
app = Celery('RBWebfiles.celery')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
app.conf.update(
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
I've tried uninstalling and reinstalling both celery and django-celery; I'm not sure if I just don't understand something or am making a massive error. I attempt to run it using the command: celery beat -A RBWebfiles
This is the traceback:
C:\Users\Lexie Infantino\PycharmProjects\recruitingboard>celery beat -A RBWebfiles
Traceback (most recent call last):
File "C:\Python34\lib\site-packages\celery\app\utils.py", line 235, in find_app
found = sym.app
AttributeError: 'module' object has no attribute 'app'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Python34\lib\runpy.py", line 170, in _run_module_as_main
"__main__", mod_spec)
File "C:\Python34\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Python34\Scripts\celery.exe\__main__.py", line 9, in <module>
File "C:\Python34\lib\site-packages\celery\__main__.py", line 30, in main
main()
File "C:\Python34\lib\site-packages\celery\bin\celery.py", line 81, in main
cmd.execute_from_commandline(argv)
File "C:\Python34\lib\site-packages\celery\bin\celery.py", line 769, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "C:\Python34\lib\site-packages\celery\bin\base.py", line 309, in execute_from_commandline
argv = self.setup_app_from_commandline(argv)
File "C:\Python34\lib\site-packages\celery\bin\base.py", line 469, in setup_app_from_commandline
self.app = self.find_app(app)
File "C:\Python34\lib\site-packages\celery\bin\base.py", line 489, in find_app
return find_app(app, symbol_by_name=self.symbol_by_name)
File "C:\Python34\lib\site-packages\celery\app\utils.py", line 240, in find_app
found = sym.celery
AttributeError: 'module' object has no attribute 'celery'
Make sure djcelery is in your INSTALLED_APPS and try starting celery beat with manage.py celerybeat instead.
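A minimal sketch of what that looks like, assuming a standard django-celery (djcelery) setup:

# settings.py
INSTALLED_APPS = [
    # ... existing apps ...
    'djcelery',
]

# then start the beat scheduler through Django's manage.py instead of the bare celery command:
# python manage.py celerybeat --loglevel=info

Going through manage.py means the Django settings (and therefore the djcelery database scheduler) are loaded before beat starts.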
I am trying to setup this basic example from the following doc:
http://flask.pocoo.org/docs/patterns/celery/
But so far I keep getting the below error:
AttributeError: 'Flask' object has no attribute 'user_options'
I am using celery 3.1.15.
from celery import Celery

def make_celery(app):
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task
    class ContextTask(TaskBase):
        abstract = True
        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)
    celery.Task = ContextTask
    return celery
Example:
from flask import Flask

app = Flask(__name__)
app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379'
)
celery = make_celery(app)

@celery.task()
def add_together(a, b):
    return a + b
Traceback:
Traceback (most recent call last):
File "/usr/local/bin/celery", line 11, in <module>
sys.exit(main())
File "/usr/local/lib/python2.7/dist-packages/celery/__main__.py", line 30, in main
main()
File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 81, in main
cmd.execute_from_commandline(argv)
File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 769, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 305, in execute_from_commandline
argv = self.setup_app_from_commandline(argv)
File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 473, in setup_app_from_commandline
user_preload = tuple(self.app.user_options['preload'] or ())
AttributeError: 'Flask' object has no attribute 'user_options'
The Flask Celery Based Background Tasks page (http://flask.pocoo.org/docs/patterns/celery/) suggests this to start celery:
celery -A your_application worker
The your_application string has to point to your application’s package or module that creates the celery object.
Assuming the code resides in application.py, explicitly pointing to the celery object (not just the module name) avoided the error:
celery -A application.celery worker
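To see why the explicit attribute matters (a sketch, assuming the question's code lives in application.py): with -A application, Celery picks an application object out of the module by convention and can end up with the Flask instance named app, which has no user_options; -A application.celery names the Celery instance unambiguously.

# application.py (from the question) ends up holding two "app-like" objects:
app = Flask(__name__)        # grabbed by "celery -A application", hence the user_options AttributeError
celery = make_celery(app)    # the actual Celery app; point the worker at it explicitly:
# celery -A application.celery worker --loglevel=info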
This worked for me:
celery -A my_app_module_name.celery worker
Rename app to flask_app and it will work.
Like this:
celery -A your_application worker
where your_application stands for:
your_application = Flask(__name__)
and the Python file is named your_application.py; then it will work.
By the way, Celery v4 is unsupported on Windows.