Celery can't find user module - python

I am using celery for an app that downloads data. The directory structure is:
proj/
    __init__.py
    celeryApp.py
    tasks.py
    startup.py
    fetcher.py
I run celery with celery worker --app=celeryApp:app -l info from inside the proj folder.
I run my app, which queues tasks, by running python startup.py. Both startup.py and fetcher.py import tasks.py. In startup.py I create an instance of a class Fetcher defined in fetcher.py and pass it as an argument to a task runFetcher in tasks.py.
This results in the error:
DecodeError: No module named fetcher
What changes would I need to make so that I can safely pass such an object to a task?
Update: I added import fetcher to tasks.py, which was fine to do since these tasks are closely related to the fetcher. Can anyone suggest a solution that does not require importing fetcher in tasks.py?
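For reference, the DecodeError comes from the worker deserializing the task arguments: the Fetcher instance is serialized by startup.py when the task is queued, and rebuilding it in the worker requires the fetcher module to be importable there, which the import in tasks.py guarantees. A minimal sketch of the tasks.py the update describes, assuming the Celery app lives in celeryApp.py as app (as in the worker command above); the fetch() call is purely illustrative:
# tasks.py -- sketch of the fix described in the update
from celeryApp import app
import fetcher  # ensures the worker can rebuild Fetcher instances it receives

@app.task
def runFetcher(fetcher_obj):
    # fetcher_obj is a fetcher.Fetcher instance that was serialized by
    # startup.py when the task was queued and deserialized here.
    fetcher_obj.fetch()  # illustrative only; call whatever Fetcher exposes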

Related

Module 'project_name' has no attribute 'celery'

I'm trying to set up a background task using Celery and RabbitMQ on Django, but I'm getting an error saying that my project has no attribute celery. I'm using PyCharm and installed Celery through it.
I'm new to Celery, but I've read a lot of articles about this issue (AttributeError: module 'module' has no attribute 'celery' seems the closest, but I'm not sure it's quite the same).
Project structure:
project_name
├── project_name
│   └── settings.py
├── app_name1
│   └── celery.py
└── app_name2
    └── tasks.py
I run the following command:
celery -A project_name worker -l info --pool=solo
But I get the following error:
Error: Invalid value for "-A" / "--app":
Unable to load celery application.
Module 'project_name' has no attribute 'celery'
celery.py file:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project_name.settings')
app = Celery('project_name')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
tasks.py file:
from __future__ import absolute_import, unicode_literals
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
Just for the record: put the celery.py file into the main project package (the one that also contains settings.py), not inside an app.
Also try entering the project directory in the terminal before running the command:
cd project_name
That worked for me.
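For reference, the layout the first suggestion describes (a sketch using the names from the question, mirroring the standard Celery-with-Django setup):
project_name
├── project_name
│   ├── __init__.py
│   ├── settings.py
│   └── celery.py      <- moved here, next to settings.py
├── app_name1
└── app_name2
    └── tasks.py
and project_name/project_name/__init__.py loads the app so that celery -A project_name can find the celery attribute:
# project_name/__init__.py
from .celery import app as celery_app

__all__ = ('celery_app',)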

How to use Dynaconf to configure Celery

Recently I discovered Dynaconf, a nice configuration management package that integrates well with Flask and Django. The Django app is running wonderfully with Dynaconf. However, the Celery app that my app depends on to run background tasks is not.
Here is the code for the configuration of the Celery app that was working before using Dynaconf:
from celery import Celery
app = Celery('KillerApp')
app.config_from_object('django.conf:settings', namespace='CELERY')
It seems that I need to change 'django.conf:settings' to something else. Any ideas?
You can pass in a string representing a module to import, or just pass in the configuration object directly; see the Celery.config_from_object() method documentation.
You'll have a module that sets up the Dynaconf() instance, e.g. if you have a package named acme_project with a config.py file in it with
from dynaconf import Dynaconf

settings = Dynaconf(
    settings_files=['settings.toml', '.secrets.toml'],
)
then you can import acme_project.config and find the settings object there. You can either import that object or let Celery do that by using 'acme_project.config:settings' as the value you pass to app.config_from_object(). The namespace argument tells Celery to expect all settings to be prefixed with CELERY_, exactly like the way this works with Django. Use this if you plan to use the Dynaconf-managed settings to configure multiple components, not just Celery.
E.g., if you used:
app.config_from_object('acme_project.config:settings', namespace='CELERY')
then your settings.toml or settings.yaml or whatever file format you picked would need to use CELERY_ as a prefix for all the settings.
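As a concrete sketch of the other option, passing the object itself instead of the string (module names assume the acme_project/config.py shown above):
# celery setup -- a sketch, not the only way to wire this up
from celery import Celery
from acme_project.config import settings  # the Dynaconf() instance

app = Celery('KillerApp')
# Equivalent to app.config_from_object('acme_project.config:settings', ...);
# with namespace='CELERY' only keys prefixed with CELERY_ are picked up.
app.config_from_object(settings, namespace='CELERY')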
If you are using the Django plugin for Dynaconf, you can keep using django.conf:settings directly, since Dynaconf patches the Django settings object.
If you still have problems, I recommend opening an issue on the Dynaconf repo and trying to use your application's settings module directly.
For example, if you have an app called foo, your DJANGO_SETTINGS_MODULE might be foo.settings, and then for Celery you can use:
app.config_from_object('foo.settings:settings')
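For completeness, a sketch of the Django-plugin setup this assumes (the app name foo is illustrative). The Dynaconf Django integration is added at the very end of foo/settings.py:
import dynaconf  # noqa
settings = dynaconf.DjangoDynaconf(__name__)  # noqa
and the Celery module can then load that patched settings object:
from celery import Celery

app = Celery('foo')
app.config_from_object('foo.settings:settings')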

Importing python modules from different directories

I have a flask app with essentially the following structure:
app/
    __init__.py
    myapp.py
    common/
        tool1.py
        tool2.py
    web/
        __init__.py
        views.py
        api/
            api_impl.py
    worker/
        __init__.py
        worker.py
        tasks.py
In myapp.py I initialize an important object that I use in several places, and I can access it from common/tool1.py and web/api/api_impl.py with from myapp import object. I've been able to use tool1 and tool2 in multiple places in web/ and in myapp.py by importing with from common.tool1 import tool1_def.
Other relevant facts: myapp.py contains an import web statement for the blueprints, app/__init__.py and worker/__init__.py are empty, and web/__init__.py contains the blueprint definitions for the routes.
I can run the app with gunicorn with no issues, but when I try to run my worker with python app/worker/worker.py I get the error ModuleNotFoundError: No module named 'myapp'. The worker.py is trying to import the same object defined in myapp.py.
I just don't understand why the app runs fine but the worker doesn't! I'm definitely not fully understanding the import system in this case, and everything I've read online doesn't seem to fully clarify it.
Your working imports imply that the project root is the app folder. As such, you need to launch your worker from this folder (or add it to the PYTHONPATH environment variable):
python worker/worker.py
Or
python -m worker.worker
In addition, the __init__.py in the app folder should be removed, as app is not a package but the project root.
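For the PYTHONPATH variant, assuming the project root lives at /path/to/app (the path is illustrative), something like this works from any directory:
export PYTHONPATH=/path/to/app
python -m worker.worker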
One method we use regularly for development and debugging (but which also works in production) is the following. You can add this to each offending module, or to all modules if you choose.
import os
import sys

# Add your project root to sys.path.
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.realpath(__file__))))

from common import tool1
from common import tool2
from worker import worker
# etc ...
When performing an import, Python looks through sys.path for the module you're trying to import; the line above inserts your project root as the first entry in sys.path, so it is searched first.
Another advantage is that these imports stay explicit to your project. For example, if you have a local workers module you want to import, but there is also a workers package in site-packages, this approach finds the one in your local project first.

Importing celery task within Django view

I am trying to execute a Celery task that, among other things, communicates with a specific Django view through a pipe.
I have been trying all day to import the celery tasks file (tasks.py) from the Django views (views.py) without any success.
I have also checked the file permissions, but that was not the issue.
I added the path of the file I want to import (tasks.py) to sys.path and then imported it, but I keep getting an ImportError. However, importing another script in the same folder (script1) succeeds.
views.py
...
sys.path.append('/home/celery')
import script1  # SUCCEEDS
# Trying to import "/home/celery/tasks.py" here
import tasks  # FAILS
...
tasks.py
...
@app.task
def start_operation(client, **kwargs):
    # Pipe comes here
...
The celery directory structure is the following:
/home
└── celery
    ├── script1.py
    └── tasks.py
And the Django proyect directory structure:
/myapp
├── myapp
├── views.py
Thanks in advance.

python: ImportError: cannot import name celery [duplicate]

This question already has answers here:
How to access a standard-library module in Python when there is a local module with the same name? [duplicate]
(2 answers)
Importing installed package from script with the same name raises "AttributeError: module has no attribute" or an ImportError or NameError
(2 answers)
Closed 4 years ago.
This might have nothing to do with celery but here is my problem:
I have an app structured like this:
/app
    /__init__.py
    /api_1.0/foo.py
/proj
    /__init__.py
    /celery.py
    /tasks.py
So in celery.py I create a celery app:
flask = create_app(os.getenv('FLASKCONFIG') or None)

celery = Celery(__name__,
                broker=flask.config['CELERY_BROKER_URL'],
                include=['proj.tasks'])
celery.conf.update(flask.config)
and tasks.py contains the Celery tasks; one of them is list_users.
In foo.py I try to use the task:
from proj import tasks
but this causes an import problem when I run:
celery -A proj worker --loglevel=info
error message:
from proj.celery import celery
ImportError: cannot import name celery
Strangely enough, if I remove the creation of the Flask app and simply create a Celery app, the problem goes away.
It looks like a circular import problem. How can I avoid this?
