Celery versus djcelery - python

I am confused about the differences between these two packages while trying to set up Celery on my Django project.
What are the differences between the two, if any? When reading tutorials online I see both used, and I'm not sure which would be best for me. It appears that djcelery is like celery but tailored for Django? But celery doesn't need to be included in INSTALLED_APPS while djcelery does.
Thank you

django-celery was a project that provided Celery integration for Django, but it is no longer required.
You don't have to install django-celery anymore. Since Celery 3.1, Django is supported out of the box.
So to install Celery you can use pip:
pip install -U celery
This is a note from Celery's First Steps with Django tutorial:
Note:
Previous versions of Celery required a separate library to work with
Django, but since 3.1 this is no longer the case. Django is supported
out of the box now so this document only contains a basic way to
integrate Celery and Django. You will use the same API as non-Django
users so it’s recommended that you read the First Steps with Celery
tutorial first and come back to this tutorial. When you have a working
example you can continue to the Next Steps guide.
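The integration the note refers to boils down to a small celery.py module inside your Django project package. A minimal sketch, following the Celery 3.1-era docs and assuming a project package named proj (adjust the names to your project):

```python
# proj/celery.py - minimal wiring, assuming your project package is named "proj"
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# read Celery configuration from the Django settings module
app.config_from_object('django.conf:settings')

# discover tasks.py modules in all installed apps
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```

You would then start a worker with `celery -A proj worker -l info` from the project root.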

When using Django with older Celery versions, you install django-celery from PyPI; Celery will be installed as a dependency.
django-celery hooks your Django project into Celery, which is a more general tool used with a variety of application stacks.
Here is Celery's getting started with Django guide, which describes installing django-celery and setting up your first tasks.

Previous versions of Celery required a separate library to work with Django, but since 3.1 this is no longer the case. Django is supported out of the box now so this document only contains a basic way to integrate Celery and Django. You’ll use the same API as non-Django users: https://docs.celeryproject.org/en/latest/django/first-steps-with-django.html#configuring-your-django-project-to-use-celery

How to enable DjangoIntegration in sentry

I am using the "new" sentry-sdk 0.9.0.
The SDK is initialized as follows:
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration

sentry_sdk.init(integrations=[DjangoIntegration()], dsn="...")
The events and exceptions do arrive at sentry.io. However, I'm getting the following warnings:
We recommend you update your SDK from version 0.9.0 to version 0.9.2
We recommend you enable the 'django' integration
We recommend you enable the 'tornado' integration
The first one is because I haven't upgraded to 0.9.2 yet. I'm not using tornado, so this warning surprises me. And when it comes to the django integration recommendation, I'm puzzled.
Any ideas or suggestions what I am missing?
Thanks!!
I'm the guy who implemented those alerts. OP and I had a private conversation on this and the verdict is that those alerts are just not 100% reliable and can be ignored if they make no sense.
The alerts just take the installed packages and look for any package for which we have an integration that is not enabled yet. This approach has problems when you e.g. use Django and Celery, but only enable the Django integration in the web worker and the Celery integration in the background worker (as far as I understood, this is not what OP ran into, though).
I think the way forward is to make those alerts permanently dismissable, because I don't see a way right now to make them accurate. The motivation is to inform people about integrations they might want to use, not to tell them what they have to do.
That said, I am interested in cases where those alerts show nonsense. Feel free to post here or write me at markus#sentry.io.
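The heuristic described above can be sketched roughly like this (a hypothetical illustration of the idea only, not Sentry's actual code; all names here are made up):

```python
# Hypothetical sketch of the alert heuristic described above:
# suggest every integration whose package is installed but not yet enabled.
KNOWN_INTEGRATIONS = {
    "django": "django",
    "flask": "flask",
    "tornado": "tornado",
    "celery": "celery",
}

def suggested_integrations(installed_packages, enabled_integrations):
    """Return names of integrations worth suggesting to the user."""
    return sorted(
        name
        for name, package in KNOWN_INTEGRATIONS.items()
        if package in installed_packages and name not in enabled_integrations
    )
```

This also shows why the split-worker scenario misfires: a web worker that has celery installed but deliberately enables only the Django integration would still be told to enable the Celery integration.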
In your case you need to install sentry-sdk[django]:
pip3 install 'sentry-sdk[django]'
If you hit the same warning with Flask, then:
pip3 install 'sentry-sdk[flask]'

Lightweight message queue to use with Celery 4.0

I'm currently using Celery 3.1 in a Django project on Python 2.7.
So far, we were using the Django ORM as a broker for development and staging environments. That was convenient, because you could pretty much just check out the sources, install the dependencies, run the migrations and celery worker would just work out of the box.
I'm thinking about how to set that up after upgrading to Celery 4.x, since the Django ORM broker has been removed. Are there any message queues that don't require any local setup beyond pip-installing a package and don't need a separately launched server?
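For context, the Django ORM broker setup being removed looks roughly like this under Celery 3.1 (a sketch from memory of the old docs, so check your existing settings against it):

```python
# settings.py - Celery 3.1 Django ORM broker (removed in Celery 4.x)
BROKER_URL = 'django://'

INSTALLED_APPS = [
    # ... your apps ...
    'kombu.transport.django',  # the ORM transport that Celery 4.x drops
]
```

This is the piece that has no direct equivalent after the upgrade, since the transport lived in kombu and was removed along with the Django ORM result backend.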

task runner/queue/scheduling on openshift with django

A few days ago I asked how to send email with Django and OpenShift, but I guess it was too broad since it was closed.
So in this question I would like to know what task runner/queue/scheduling system I should use in general for Django on OpenShift. Unfortunately I have not seen any tutorial for Django and OpenShift.
It looks like Celery is too complex to install easily on OpenShift.
Here are a few task-related django-packages.
OpenShift also provides IronWorker as a scheduling app in its marketplace. I tried to add the free edition to my app, but I'm struggling to set it up.
So my question is: what queue system should I use with Django and OpenShift? What do you use on OpenShift? Currently I only need to send personalised weekly emails, but my app works with a few API providers and relatively large data, so I might use it more in the future.
There is no Iron/OpenShift guide for Python, but there is one for Ruby that should give you the general concept: https://github.com/iron-io/ironmq-openshift-quickstart
You can also read our blog post about the topic: http://www.iron.io/blog/2013/02/ironio-openshift-paas-20-for-enterprise.html

Can celery celerybeat use a Database Scheduler without Django?

I have a small infrastructure plan that does not include Django. But, because of my experience with Django, I really like Celery. All I really need is Redis + Celery to make my project. Instead of using the local filesystem, I'd like to keep everything in Redis. My current architecture uses Redis for everything until it is ready to dump the results to AWS S3. Admittedly I don't have a great reason for using Redis instead of the filesystem. I've just invested so much into architecting this with Docker and scalability in mind, it feels wrong not to.
I was searching for a non-Django database scheduler too a while back, but it looked like there was nothing else. So I took the Django scheduler code and modified it to use SQLAlchemy. It should be even easier to make it use Redis instead.
It turns out that you can!
First I created this little project from the tutorial on celeryproject.org.
That went great so I built a Dockerized demo as a proof of concept.
Things I learned from this project:

Docker
- using --link to create network connections between containers
- running commands inside containers

Dockerfile
- using FROM to build images iteratively
- using official images
- using CMD for images that "just work"

Celery
- using Celery without Django
- using celerybeat without Django
- using Redis as a queue broker
- project layout
- task naming requirements

Python
- proper project layout for setuptools/setup.py
- installation of the project via pip
- using entry_points to make console_scripts accessible
- using setuid and setgid to de-escalate privileges for the celery daemon
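The "Celery without Django" and "Redis as a queue broker" pieces above fit in a single module. A minimal sketch (module and task names are examples, not taken from the linked project):

```python
# app.py - standalone Celery app: no Django, Redis as the broker,
# with a beat schedule driving a periodic task.
from celery import Celery

app = Celery('demo', broker='redis://localhost:6379/0')

# celerybeat schedule: run the task below every 10 seconds
app.conf.beat_schedule = {
    'say-hello-every-10s': {
        'task': 'app.hello',   # task names default to <module>.<function>
        'schedule': 10.0,
    },
}

@app.task
def hello():
    print('hello from celerybeat')
```

With Redis running locally, `celery -A app worker --beat --loglevel=info` starts a worker with an embedded beat scheduler.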

How would I go about plugging mongoengine into pyramid?

I've created a basic app using the pyramid_mongodb scaffold; however, I'd like to include mongoengine. I'm wondering what I should actually keep from the scaffold's code.
This isn't an answer regarding the scaffold, but I wouldn't recommend using it: it's not really usable with root_factory and so on, and the subscribers aren't really needed either.
I wrote an addon for pyramid. It's called pyramid_mongo.
Documentation:
http://packages.python.org/pyramid_mongo/
Github:
https://github.com/llacroix/pyramid_mongo
I saw your question today and felt it could be a good addition to the plugin.
I just pushed it to GitHub, so you need to clone it from there for now; installing with pip will give you the old version without mongoengine support.
In other words, in your config, do everything as in the docs and add something like:
mongo.mongoengine=true
It will attach the mongo connection from the config to mongoengine. All the other APIs will work with or without mongoengine, and mongoengine itself should work. I only added it today, so it doesn't support multiple connections or multiple databases yet; I could add support for multiple dbs as well. But I suspect mongoengine may do some things on its own that could conflict with my plugin, such as authorization.
Once I write tests, I'll push it to PyPI, and it will be possible to install it with pip or easy_install. For now, pull it from GitHub.
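If you'd rather not depend on an addon at all, you can also wire mongoengine up yourself in the Pyramid app factory. A minimal sketch; the settings key 'mongo.uri' is an example name I made up, not a pyramid_mongo setting:

```python
# __init__.py of your Pyramid project - connecting mongoengine directly.
# The settings key 'mongo.uri' is just an illustrative name; pick your own.
from pyramid.config import Configurator
import mongoengine

def main(global_config, **settings):
    # establish the default mongoengine connection at startup
    mongoengine.connect(
        host=settings.get('mongo.uri', 'mongodb://localhost/mydb')
    )
    config = Configurator(settings=settings)
    config.scan()
    return config.make_wsgi_app()
```

Your mongoengine Document classes then work in any view without further plumbing, since mongoengine keeps the connection in module-level state.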
