I'm trying to use Celery for background processing in my Django application. The Django version is 1.4.8 and the latest compatible Celery version is 3.1.25.
I use Redis as broker and result backend (redis-py client 3.1.0), with JSON as the serializer.
When I start the worker with
celery -A celery_app worker -l info
I get AttributeError: 'unicode' object has no attribute 'iteritems'.
My settings.py file:
BROKER_URL = 'redis://localhost'
CELERY_RESULT_BACKEND = 'redis://localhost/'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_STORE_ERRORS_EVEN_IF_IGNORED = True
celery_app.py:
import os
import sys
from django.conf import settings
from celery import Celery
project_root = os.path.dirname(__file__)
sys.path.insert(0, os.path.join(project_root, '../env'))
sys.path.insert(0, os.path.join(project_root, '../'))
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')
app = Celery('project')
app.config_from_object('project.settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, force=True)
tasks.py:
@celery_app.task
def sample_task(x):
    return 'Test response'
and that's how I run this task:
sample_task.delay({'key': 'test'})
And I get the following error:
File "/Users/user/project/venv/lib/python2.7/site-packages/redis/_compat.py", line 94, in iteritems
return x.iteritems()
AttributeError: 'unicode' object has no attribute 'iteritems'
full traceback:
[2019-01-31 16:43:08,909: ERROR/MainProcess] Unrecoverable error: AttributeError("'unicode' object has no attribute 'iteritems'",)
Traceback (most recent call last):
File "/Users/user/project/venv/lib/python2.7/site-packages/celery/worker/__init__.py", line 206, in start
self.blueprint.start(self)
File "/Users/user/project/venv/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
step.start(parent)
File "/Users/user/project/venv/lib/python2.7/site-packages/celery/bootsteps.py", line 374, in start
return self.obj.start()
File "/Users/user/project/venv/lib/python2.7/site-packages/celery/worker/consumer.py", line 280, in start
blueprint.start(self)
File "/Users/user/project/venv/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
step.start(parent)
File "/Users/user/project/venv/lib/python2.7/site-packages/celery/worker/consumer.py", line 884, in start
c.loop(*c.loop_args())
File "/Users/user/project/venv/lib/python2.7/site-packages/celery/worker/loops.py", line 76, in asynloop
next(loop)
File "/Users/user/project/venv/lib/python2.7/site-packages/kombu/async/hub.py", line 340, in create_loop
cb(*cbargs)
File "/Users/user/project/venv/lib/python2.7/site-packages/kombu/transport/redis.py", line 1019, in on_readable
self._callbacks[queue](message)
File "/Users/user/project/venv/lib/python2.7/site-packages/kombu/transport/virtual/__init__.py", line 534, in _callback
self.qos.append(message, message.delivery_tag)
File "/Users/user/project/venv/lib/python2.7/site-packages/kombu/transport/redis.py", line 146, in append
pipe.zadd(self.unacked_index_key, delivery_tag, time()) \
File "/Users/user/project/venv/lib/python2.7/site-packages/redis/client.py", line 2320, in zadd
for pair in iteritems(mapping):
File "/Users/user/project/venv/lib/python2.7/site-packages/redis/_compat.py", line 94, in iteritems
return x.iteritems()
AttributeError: 'unicode' object has no attribute 'iteritems'
I tried to find the issue on the internet and tried passing different params to the task. I don't know how to debug the Celery process and could not find a solution myself. Please help me.
It seems that this Celery version doesn't support redis-py 3.x (the zadd() signature changed in redis-py 3.0, which is what the traceback shows). Try installing redis-py 2.10.6: pip install redis==2.10.6.
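If you want to confirm which client version the worker is importing, a quick check like the following helps (a minimal sketch; the redis package in the traceback is the Python client, not the Redis server itself):

# Celery 3.1.x / kombu 3.x call zadd() with the old positional signature,
# which redis-py removed in 3.0, so the client must stay on the 2.10.x line.
import redis
print(redis.VERSION)  # expect (2, 10, 6) after running: pip install redis==2.10.6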
Related
I am extremely new to django-celery; its docs have been confusing to me and I have been following a tutorial. Here is just a basic setup, and I have run into an error I cannot track down:
AttributeError: 'EntryPoint' object has no attribute 'module_name'
full traceback:
Traceback (most recent call last):
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/app/base.py", line 1250, in backend
return self._local.backend
AttributeError: '_thread._local' object has no attribute 'backend'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/worker/worker.py", line 203, in start
self.blueprint.start(self)
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/bootsteps.py", line 112, in start
self.on_start()
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/apps/worker.py", line 136, in on_start
self.emit_banner()
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/apps/worker.py", line 170, in emit_banner
' \n', self.startup_info(artlines=not use_image))),
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/apps/worker.py", line 232, in startup_info
results=self.app.backend.as_uri(),
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/app/base.py", line 1252, in backend
self._local.backend = new_backend = self._get_backend()
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/app/base.py", line 955, in _get_backend
backend, url = backends.by_url(
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/app/backends.py", line 69, in by_url
return by_name(backend, loader), url
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/app/backends.py", line 47, in by_name
aliases.update(load_extension_class_names(extension_namespace))
File "/home/muhammad/Desktop/celery/env/lib/python3.8/site-packages/celery/utils/imports.py", line 146, in load_extension_class_names
yield ep.name, ':'.join([ep.module_name, ep.attrs[0]])
AttributeError: 'EntryPoint' object has no attribute 'module_name'
The celery.py, __init__.py and task.py are just the basic setup:
__init__.py:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
celery.py:
from __future__ import absolute_import, unicode_literals
from celery import Celery
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
app = Celery('core')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
task.py:
from celery import shared_task
@shared_task(bind=True)
def add(x, y):
    return x + y
The settings configuration for Celery is:
CELERY_BROKER_URL = 'localhost'
CELERY_ACCEPT_CONTENT = ['json',]
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'django-db'
I have also installed django_celery_results and added it to INSTALLED_APPS:
INSTALLED_APPS = [
    ...
    'celery_practice',
    'django_celery_results',
]
For your information, rabbitmq-server is running on localhost; that's why I have set CELERY_BROKER_URL to 'localhost':
rabbitmq-server.service - RabbitMQ Messaging Server
Loaded: loaded (/lib/systemd/system/rabbitmq-server.service; enabled; vendor preset: enabled)
Active: active (running) since Sun 2022-04-03 16:14:04 PKT; 24h ago
Main PID: 1005 (beam.smp)
Status: "Initialized"
Tasks: 91 (limit: 9090)
Any help would be appreciated, thanks!
I'm trying to write a celery application that passes numpy arrays (or any arbitrary objects) to the workers. As far as I can tell, this requires serialization to occur via pickle (NB: I'm aware of the security implications but this isn't a concern in this case).
However, even after trying every possible way I could find to allow pickle as a serializer, I keep getting the following kombu exception:
kombu.exceptions.ContentDisallowed: Refusing to deserialize untrusted
content of type pickle (application/x-python-serialize)
My current files are:
# tasks.py
from celery import Celery

app = Celery(
    'tasks',
    broker='redis://localhost',
    accept_content=['pickle'],
    task_serializer='pickle'
)

@app.task
def adding(x, y):
    return x + y

if __name__ == '__main__':
    import numpy as np
    adding.apply_async((np.array([1]), np.array([1])), serializer='pickle')
In addition I have a config file:
# celeryconfig.py
print('configuring...')
accept_content = ['pickle', 'application/x-python-serialize']
task_serializer = 'pickle'
result_serializer = 'pickle'
from kombu import serialization
serialization.register_pickle()
serialization.enable_insecure_serializers()
However, if I run the worker (celery -A tasks worker --loglevel=info) and then execute the code that makes an async call (python tasks.py), I get the following traceback. Am I missing something?
[2018-06-16 11:46:23,617: CRITICAL/MainProcess] Unrecoverable error: ContentDisallowed('Refusing to deserialize untrusted content of type pickle (application/x-python-serialize)',)
Traceback (most recent call last):
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/worker/worker.py", line 205, in start
self.blueprint.start(self)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/bootsteps.py", line 369, in start
return self.obj.start()
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", line 322, in start
blueprint.start(self)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/worker/consufrom celery import Celery
mer/consumer.py", line 598, in start
c.loop(*c.loop_args())
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/worker/loops.py", line 91, in asynloop
next(loop)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/asynchronous/hub.py", line 354, in create_loop
cb(*cbargs)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/transport/redis.py", line 1040, in on_readable
self.cycle.on_readable(fileno)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/transport/redis.py", line 337, in on_readable
chan.handlers[type]()
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/transport/redis.py", line 724, in _brpop_read
self.connection._deliver(loads(bytes_to_str(item)), dest)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/transport/virtual/base.py", line 983, in _deliver
callback(message)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/transport/virtual/base.py", line 633, in _callback
return callback(message)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/messaging.py", line 624, in _receive_callback
return on_m(message) if on_m else self.receive(decoded, message)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", line 572, in on_task_received
callbacks,
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/celery/worker/strategy.py", line 136, in task_message_handler
if body is None and 'args' not in message.payload:
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/message.py", line 207, in payload
return self._decoded_cache if self._decoded_cache else self.decode()
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/message.py", line 192, in decode
self._decoded_cache = self._decode()
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/message.py", line 197, in _decode
self.content_encoding, accept=self.accept)
File "/opt/anaconda/envs/Python3/lib/python3.6/site-packages/kombu/serialization.py", line 253, in loads
raise self._for_untrusted_content(content_type, 'untrusted')
kombu.exceptions.ContentDisallowed: Refusing to deserialize untrusted content of type pickle (application/x-python-serialize)
For anyone coming to this question: the answer was to load celeryconfig.py explicitly with the app.config_from_object method, so the worker actually picks up the pickle settings:
import celeryconfig
app.config_from_object(celeryconfig)
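Put together, a minimal tasks.py would look roughly like this (a sketch based on the question's code, assuming celeryconfig.py sits next to tasks.py on the import path):

# tasks.py
from celery import Celery

import celeryconfig  # the module above: accept_content, task_serializer, ...

app = Celery('tasks', broker='redis://localhost')
app.config_from_object(celeryconfig)  # the worker now accepts pickle as well

@app.task
def adding(x, y):
    return x + y

if __name__ == '__main__':
    import numpy as np
    adding.apply_async((np.array([1]), np.array([1])), serializer='pickle')

Remember to restart the worker after changing the config: the ContentDisallowed check happens on the consumer side.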
I'm trying to start a Celery worker from the command line:
celery -A server application worker --loglevel=info
The code and folder path:
server.py
application/controllers/routes.py
server.py
from flask import Flask

app = Flask(__name__)
from application.controllers import routes
app.run(host='127.0.0.1',port=5051,debug=True)
routes.py
from flask import Flask
from celery import Celery
from server import app

app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'

celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

@celery.task()
def add_together(self, count):
    return "First success"

@app.route("/queing")
def testsfunction():
    count = 1
    add_together.delay(count)
    return "cool"
Traceback:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/2.7/bin/celery", line 11, in <module>
sys.exit(main())
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/__main__.py", line 30, in main
main()
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/bin/celery.py", line 81, in main
cmd.execute_from_commandline(argv)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/bin/celery.py", line 770, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/bin/base.py", line 309, in execute_from_commandline
argv = self.setup_app_from_commandline(argv)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/celery/bin/base.py", line 477, in setup_app_from_commandline
user_preload = tuple(self.app.user_options['preload'] or ())
AttributeError: 'Flask' object has no attribute 'user_options'
I get this error when running a Celery worker in the terminal.
Just run Celery with this command instead of yours:
celery -A application.controllers.routes:celery worker --loglevel=info
This will solve your current problem; however, your code has plenty of mistakes. For example, if you want to have a self argument inside your add_together function, you should declare the task like this:
@celery.task(bind=True)
Otherwise it looks like a typo, and you should change
def add_together(self, count):
to
def add_together(count):
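A minimal sketch of the two valid combinations, assuming the same celery object defined in routes.py (the _bound name is only for illustration):

# Bound task: Celery passes the task instance (self) as the first argument.
@celery.task(bind=True)
def add_together_bound(self, count):
    return "First success"

# Unbound task: no self parameter at all.
@celery.task()
def add_together(count):
    return "First success"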
I've been reading through the answers to some previous questions as well as the Celery docs, and I just can't quite fix this issue. My tasks.py, celery.py, and settings.py are all contained within the same app, RBWebfiles, which is contained within a project called recruitingboard. I'm attempting to create a periodic task that will run using celery beat's scheduler.
This is the portion of my settings.py that deals with celery:
from __future__ import absolute_import
import os
from datetime import timedelta
CELERY_IMPORTS = ('RBWebfiles.tasks')
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERYBEAT_SCHEDULE = {
    'schedule-name': {
        'task': 'RBWebfiles.tasks.requestRefresher',
        'schedule': timedelta(seconds=30),
    },
}
BROKER_URL = 'django://'
I also added 'djcelery' to my installed apps within that file
This is my tasks.py file:
from __future__ import absolute_import
import datetime
from celery.task.base import periodic_task
from student.models import StudentAccount
from celery import Celery
from celery.utils.log import get_task_logger
import os
celery = Celery('tasks', broker='django://')
logger = get_task_logger(__name__)
os.environ['DJANGO_SETTINGS_MODULE'] = 'RBWebfiles.settings'
@periodic_task(run_every=datetime.timedelta(seconds=30))
def requestRefresher(self):
    logger.info("start task")
    for s in StudentAccount.objects.all():
        s.requestLimit = 0
        s.save()
    return None
and lastly this is my celery.py file:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'conf.settings')
app = Celery('RBWebfiles.celery')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda : settings.INSTALLED_APPS)
app.conf.update(
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
I've tried uninstalling and reinstalling both celery and django-celery; I'm not sure if I just don't understand something or I'm making a massive error. I attempt to run it using the command: celery beat -A RBWebfiles
this is the traceback:
C:\Users\Lexie Infantino\PycharmProjects\recruitingboard>celery beat -A RBWebfiles
Traceback (most recent call last):
File "C:\Python34\lib\site-packages\celery\app\utils.py", line 235, in find_app
found = sym.app
AttributeError: 'module' object has no attribute 'app'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Python34\lib\runpy.py", line 170, in _run_module_as_main
"__main__", mod_spec)
File "C:\Python34\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Python34\Scripts\celery.exe\__main__.py", line 9, in <module>
File "C:\Python34\lib\site-packages\celery\__main__.py", line 30, in main
main()
File "C:\Python34\lib\site-packages\celery\bin\celery.py", line 81, in main
cmd.execute_from_commandline(argv)
File "C:\Python34\lib\site-packages\celery\bin\celery.py", line 769, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "C:\Python34\lib\site-packages\celery\bin\base.py", line 309, in execute_from_commandline
argv = self.setup_app_from_commandline(argv)
File "C:\Python34\lib\site-packages\celery\bin\base.py", line 469, in setup_app_from_commandline
self.app = self.find_app(app)
File "C:\Python34\lib\site-packages\celery\bin\base.py", line 489, in find_app
return find_app(app, symbol_by_name=self.symbol_by_name)
File "C:\Python34\lib\site-packages\celery\app\utils.py", line 240, in find_app
found = sym.celery
AttributeError: 'module' object has no attribute 'celery'
Make sure djcelery is in your INSTALLED_APPS and try starting celery beat with python manage.py celerybeat instead.
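Alternatively, the AttributeError: 'module' object has no attribute 'celery' goes away once the app defined in RBWebfiles/celery.py is reachable from the module named with -A; you can also point the command directly at that module with celery beat -A RBWebfiles.celery. A minimal sketch, assuming the celery.py shown in the question:

# RBWebfiles/__init__.py
from __future__ import absolute_import

# Re-export the Celery instance so `celery beat -A RBWebfiles` can find it
# as an attribute of the package (find_app looks for `app`, then `celery`).
from .celery import app  # noqa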
When I use celery + gevent for tasks that use the subprocess module, I get the following stacktrace:
Traceback (most recent call last):
File "/home/venv/admin/lib/python2.7/site-packages/celery/task/trace.py", line 228, in trace_task
R = retval = fun(*args, **kwargs)
File "/home/venv/admin/lib/python2.7/site-packages/celery/task/trace.py", line 415, in __protected_call__
return self.run(*args, **kwargs)
File "/home/webapp/admin/webadmin/apps/loggingquarantine/tasks.py", line 107, in release_mail_task
res = call_external_script(popen_obj.communicate)
File "/home/webapp/admin/webadmin/apps/core/helpers.py", line 42, in call_external_script
return func_to_call(*args, **kwargs)
File "/usr/lib64/python2.7/subprocess.py", line 740, in communicate
return self._communicate(input)
File "/usr/lib64/python2.7/subprocess.py", line 1257, in _communicate
stdout, stderr = self._communicate_with_poll(input)
File "/usr/lib64/python2.7/subprocess.py", line 1287, in _communicate_with_poll
poller = select.poll()
AttributeError: 'module' object has no attribute 'poll'
My manage.py looks like the following (the monkey patching is done there):
#!/usr/bin/env python
from gevent import monkey
import sys
import os
if __name__ == "__main__":
if not 'celery' in sys.argv:
monkey.patch_all()
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "webadmin.settings")
from django.core.management import execute_from_command_line
sys.path.append(".")
execute_from_command_line(sys.argv)
Is there a reason why Celery tasks act as if they weren't patched properly?
P.S. The strange thing is that my local setup on macOS works fine, while I get these exceptions under CentOS (all package versions are the same, init and config scripts too).
There's no emulation for poll in gevent, so monkey.patch_all removes the polling mechanisms that gevent.select does not simulate: poll, epoll, kqueue, kevent. That's exactly what the traceback shows: subprocess communicate() tries to call select.poll(), which is no longer there after patching. See gevent.monkey – Make the standard library cooperative.
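One workaround is to use gevent's cooperative subprocess module inside the task instead of the standard library one. A minimal sketch, assuming gevent >= 1.0 (where gevent.subprocess is available):

from gevent import monkey
monkey.patch_all()

import select
print(hasattr(select, 'poll'))  # False: patch_all() has removed select.poll

# gevent.subprocess provides a cooperative Popen that does not depend on
# select.poll, so communicate() works under a patched interpreter.
from gevent import subprocess

popen_obj = subprocess.Popen(['echo', 'hello'], stdout=subprocess.PIPE)
stdout, stderr = popen_obj.communicate()
print(stdout)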