I have made a small Django app to fit all my needs. I will use it at my company to track simple tasks for a couple of mechanical engineers. Now, the only thing left is to send scheduled emails from my Django app (every day at noon; if someone is behind with work, he would get an email). Since I'm using Windows and I'll deploy this app on Windows, I can't use a cron job (that only works on Linux, as I've seen on forums), which would be simple and easy. The only way I've found so far is django-celery-beat. This is not so easy to set up, and I need to run a 'worker' each time I run my server. This is a bit above my level and I would need to learn a lot more (and it needs a message broker, like RabbitMQ, which I would also need to run and learn to implement).
I was wondering, is there an easier way to send a simple email every day at noon? I don't want to install additional apps; I wish to keep it as simple as possible.
You can do it by Dockerizing Django with Redis and Celery.
Dockerizing is the process of packing, deploying, and running applications using Docker containers.
Please use the links below to read more about Dockerizing:
Dockerizing
Dockerizing Django with Postgres, Redis and Celery
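For the specific "email at noon" requirement, the moving parts are smaller than the stack sounds: a Celery app pointed at Redis, one task, and one beat schedule entry. A minimal sketch, assuming a project named myproject and a hypothetical notify_late_engineers task (both names are illustrative, not from the linked tutorial):

    # myproject/celery.py -- Celery app with a daily noon schedule.
    # Assumes a Redis broker on localhost:6379 (e.g. the Docker container
    # from the linked tutorial) and importable Django settings.
    import os

    from celery import Celery
    from celery.schedules import crontab

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

    app = Celery("myproject", broker="redis://localhost:6379/0")
    app.config_from_object("django.conf:settings", namespace="CELERY")
    app.autodiscover_tasks()

    @app.task(name="tasks.notify_late_engineers")
    def notify_late_engineers():
        # Find engineers who are behind and email them (illustrative).
        from django.core.mail import send_mail
        send_mail("You are behind with work", "Please update your tasks.",
                  "noreply@example.com", ["engineer@example.com"])

    app.conf.beat_schedule = {
        "late-work-email": {
            "task": "tasks.notify_late_engineers",
            "schedule": crontab(hour=12, minute=0),  # every day at noon
        },
    }

One caveat worth knowing: Celery dropped official Windows support in version 4, though workers are commonly run on Windows with the --pool=solo option; the Docker route in the answer above sidesteps this entirely.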
I realize similar questions have been asked, however they have all been about a specific problem, whereas I don't even know how I would go about doing what I need to.
That is: from my Django webapp I need to scrape a website periodically while my webapp runs on a server. The first options that I found were 'django-background-tasks' (which doesn't seem to work the way I want it to) and 'celery-beat', which recommends getting another server, if I understood correctly.
I figured just running a separate thread would work, but I can't seem to make that work without it interrupting the server and vice versa, and it's not the "correct" way of doing it.
Is there a way to run a task periodically without the need for a separate server and a request to be made to an app in Django?
'celery-beat', which recommends getting another server, if I understood correctly.
You can host celery (and any other needed components) on the same server as your Django app. They would be separate processes entirely.
It's not an uncommon setup to have a Django app + celery worker(s) + message queue all bundled into the same server deployment. Deploying on separate servers may be ideal, just as it would be ideal to distribute your Django app across many servers, but is by no means necessary.
I'm not sure if this is the "correct" way, but it was a cheap and easy way for me to do it. I just created custom Django management commands and have them run via a scheduler such as cron, or in my case I just utilized Heroku Scheduler for my app.
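Since the answer names the technique (a custom management command triggered by a scheduler), a minimal sketch might help; the app and command names here are illustrative:

    # tasks/management/commands/check_overdue.py
    # Run with `python manage.py check_overdue`, e.g. from a cron or
    # Heroku Scheduler entry.
    from django.core.management.base import BaseCommand


    class Command(BaseCommand):
        help = "Do the periodic work (scraping, emails, etc.)."

        def handle(self, *args, **options):
            # This runs inside the full Django environment, so models,
            # settings, and django.core.mail are all available here.
            self.stdout.write(self.style.SUCCESS("Done."))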
I'm building a web app using the Django framework. There isn't much user interaction, with only a few static links, a navbar, and a few plots which come from the app itself. The main part of the app comes from a Python script which reads data from an external source, does some data processing on it, and then writes to my Django database. After writing to the database, a new page is created with information about the database entry. Note that there is no user interaction, so no starting or stopping the task. I want the task to run in the background 24/7.
Thus far I've looked into celery and django-background-tasks. Celery seems like a bit much for my use case. I don't think I need a broker service, as I just want to run one task which the user will never interact with. Additionally, I don't need multiple workers. django-background-tasks seems like a good lightweight alternative, but it seems it does not support indefinite tasks without having to refresh the task every once in a while (ideally I don't want this). Is there a tool that is better suited for this use case? Or am I just completely misunderstanding celery and django-background-tasks?
Update
Thanks for the comments, everyone! So I looked up what was mentioned by @D Malan, and I think a tool like Supervisord might fit my use case. That is, I can run a Python script separately from the Django application in the background and then have the Python script interact with the Django application. The only problem I have now is getting the process to interact with the Django app. @deceze mentioned invoking a management command from the Python script. So would this involve creating a subprocess from the Python script, calling a custom Django management command which updates my database? Or can I use django.core.management.call_command from a Python file separate from the Django application? If this is the case, how would it know where to get the command from?
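Regarding the last question: yes, a standalone Python file can use call_command, as long as it points at the project's settings and calls django.setup() before importing anything that touches models. That is how it "knows where to get the command from": Django then discovers commands exactly as manage.py does. A minimal sketch, with myproject.settings and update_database as illustrative names:

    # standalone_runner.py -- runs outside the web server, e.g. under Supervisord.
    import os

    import django

    # Point at the project's settings before touching any Django machinery.
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
    django.setup()

    from django.core.management import call_command

    # Resolved exactly like `python manage.py update_database` would be:
    # Django scans the management/commands/ package of each installed app.
    call_command("update_database")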
I have a web application that runs locally only (it does not run on a remote server). The web application is basically just a UI to adjust settings and see some information about the main application. A web UI was used, as opposed to a native application, due to portability and ease of development. Now, in order to start and stop the main application, I want to achieve this through a button in the web application. However, I couldn't find a suitable way to start an asynchronous and managed task locally. I saw there is a library called Celery; however, that seems to be suited to a distributed environment, which mine is not.
My main need is to be able to start/stop the task, as well as to check if the task is running (so I can display that in the UI). Is there any way to achieve this?
celery can work just fine locally. Distributed is just someone else's computer after all :)
You will have to install all the same requirements and the like. You can kick off workers by hand, or as a service, just like in the celery docs.
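For the start/stop/status part specifically: once a local worker is running (started by hand with something like `celery -A proj worker`, as in the docs), the views only need to dispatch the task and then query or revoke it by id. A minimal sketch, assuming a hypothetical long_running_job task in proj.tasks:

    # Sketch of the view-side logic: start, check, and stop a task by id.
    from celery.result import AsyncResult

    from proj.celery import app              # your Celery application
    from proj.tasks import long_running_job  # hypothetical task

    # Start: returns immediately with a handle to the running task.
    result = long_running_job.delay()
    task_id = result.id  # store this (session, DB) so the UI can query it

    # Status: e.g. PENDING, SUCCESS, FAILURE, REVOKED (STARTED is reported
    # too if the task_track_started setting is enabled).
    state = AsyncResult(task_id, app=app).state

    # Stop: revoke the task; terminate=True also kills it mid-run.
    app.control.revoke(task_id, terminate=True)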
I have a small infrastructure plan that does not include Django. But, because of my experience with Django, I really like Celery. All I really need is Redis + Celery for my project. Instead of using the local filesystem, I'd like to keep everything in Redis. My current architecture uses Redis for everything until it is ready to dump the results to AWS S3. Admittedly I don't have a great reason for using Redis instead of the filesystem. I've just invested so much into architecting this with Docker and scalability in mind that it feels wrong not to.
I was searching for a non-Django database scheduler too a while back, but it looked like there was nothing else. So I took the Django scheduler code and modified it to use SQLAlchemy. It should be even easier to make it use Redis instead.
It turns out that you can!
First I created this little project from the tutorial on celeryproject.org.
That went great so I built a Dockerized demo as a proof of concept.
Things I learned from this project:

Docker:
- using --link to create network connections between containers
- running commands inside containers

Dockerfile:
- using FROM to build images iteratively
- using official images
- using CMD for images that "just work"

Celery:
- using Celery without Django (see the sketch after this list)
- using Celerybeat without Django
- using Redis as a queue broker
- project layout
- task naming requirements

Python:
- proper project layout for setuptools/setup.py
- installation of project via pip
- using entry_points to make console_scripts accessible
- using setuid and setgid to de-escalate privileges for the celery daemon
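As referenced in the list above, here is a minimal sketch of Celery and Celerybeat running without Django, with Redis as the broker; the module and task names are illustrative, not taken from the linked demo:

    # celery_app.py -- a standalone Celery app, no Django anywhere.
    # Needs a running Redis, e.g. `docker run -p 6379:6379 redis`.
    from celery import Celery
    from celery.schedules import crontab

    app = Celery("demo", broker="redis://localhost:6379/0")


    @app.task(name="demo.say_hello")  # explicit name avoids __main__ naming surprises
    def say_hello():
        print("hello from celery")


    app.conf.beat_schedule = {
        "hello-every-minute": {
            "task": "demo.say_hello",
            "schedule": crontab(),  # crontab() with no args = every minute
        },
    }

    # Run the worker and the scheduler in two shells:
    #   celery -A celery_app worker --loglevel=info
    #   celery -A celery_app beat --loglevel=info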
I'd like to run periodic tasks in my Django project, but I don't want all the complexity of celery/django-celery (with celerybeat) bundled into my project.
I'd also like to store the config, with the times and which command to run, within my SCM.
My production machine is running Ubuntu 10.04.
While I could learn and use cron, I feel like there should be a higher-level (user-friendly) way to do it. (Much like UFW is to iptables.)
Is there such thing? Any tips/advice?
Thanks!
There are several Django-based scheduling apps, such as django-chronograph, django-chroniker, and django-cron. I forked django-chronograph into django-chroniker to fix a few bugs and extend it for my own use case. I still use Celery in some projects, but as you point out, it's a bit overcomplicated and has a large stack.
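Of those, django-cron keeps the schedule in code, which satisfies the SCM requirement: you declare job classes, register their paths in the CRON_CLASSES setting, and trigger them with `python manage.py runcrons` from a single cron entry. A sketch based on django-cron's documented pattern (the class and code values here are illustrative):

    # myapp/cron.py -- a django-cron job class.
    from django_cron import CronJobBase, Schedule


    class NightlyReport(CronJobBase):
        RUN_EVERY_MINS = 24 * 60  # once a day
        schedule = Schedule(run_every_mins=RUN_EVERY_MINS)
        code = "myapp.nightly_report"  # unique identifier for this job

        def do(self):
            # The actual periodic work goes here.
            pass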
In my personal opinion, I would learn how to use cron. This won't take more than 5 to 10 minutes, and it's an essential tool when working on a Linux server.
What you could do is set up a cron job that requests one page of your Django instance every minute, and have the Django view figure out what time it is and what needs to be done, depending on the configuration stored in your database. This is the approach I've seen in other similar applications.
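A minimal sketch of that pattern; the cron entry is shown as a comment, and the URL, view, and model names are illustrative:

    # Cron entry on the server (requests the page every minute):
    #   * * * * * curl -s http://localhost:8000/run-scheduled-tasks/ > /dev/null

    # myapp/views.py -- the view decides what is due and runs it.
    from django.http import HttpResponse
    from django.utils import timezone


    def run_scheduled_tasks(request):
        now = timezone.now()
        # In a real app, look up due jobs in the database, e.g.
        # ScheduledJob.objects.filter(next_run__lte=now), run them,
        # and advance their next_run timestamps.
        return HttpResponse("ok")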