Python Django - Variables shared between multiple users

I am working on a project written in Python, using Django to build the website. I have a function that pulls information from a website and puts it into a dictionary. When users refresh the browser, the site shows the latest state of that dictionary; so far the updates are triggered by the browser refresh, but this is only for testing.
So, after several headaches I finally managed to install Celery and make it work. I have my website running with "python manage.py runserver", and at the same time two Celery processes running: "celery -A tasks worker -l info --pool=solo" and "celery -A tasks beat --loglevel=info". Everything seemed to be working until I realized that the dictionary is being updated, but not for all users who access the website; it looks like each user has their own instance of the dictionary.
So the idea is to have Celery update the dictionary with the information pulled from the website, and to have all users simply see what is inside that dictionary variable. Can I do this without a database or a file being written every time the update function is called?

Coming back to this: it turns out my problem was Celery. Celery was running its own instance of my program when I ran the commands mentioned above, so the web server started with "python manage.py runserver" had nothing to do with the Celery tasks and never saw their updates. I stopped using Celery and now use a background scheduler to do the same thing; everything works now.
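For reference, a minimal sketch of the background-scheduler approach using APScheduler's BackgroundScheduler (assuming that is the scheduler meant here; fetch_updates() and the 60-second interval are placeholders, not the original code):

from apscheduler.schedulers.background import BackgroundScheduler

shared_data = {}  # module-level dict that the Django views read from

def fetch_updates():
    # Pull information from the external website and store it in the
    # shared dictionary; placeholder for the real update logic.
    shared_data["last_update"] = "..."

scheduler = BackgroundScheduler()
scheduler.add_job(fetch_updates, "interval", seconds=60)
scheduler.start()

Note that this shares state only because the scheduler runs inside the same process as the web server; if the site is later served by multiple worker processes, each would again hold its own copy of the dictionary, which is why a cross-process store (cache or database) is usually recommended.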

Related

Flask with Asynchronous Cronjob in the Background

So what I am trying to do is host my Flask API and have a script running on the server in the background all the time, even when no user is accessing the API.
The script in the background should be executed once a minute to update things in the database.
What you're interested in are "cronjobs".
You can check out a library like Celery to get started. In particular, you'll want to look at Celery "beat".
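For illustration, a minimal sketch of a Celery beat schedule, assuming a tasks.py module, a Redis broker, and a hypothetical update_database task:

from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def update_database():
    # The real database-update logic would go here.
    pass

app.conf.beat_schedule = {
    "update-db-every-minute": {
        "task": "tasks.update_database",
        "schedule": 60.0,  # seconds
    },
}

Run a worker ("celery -A tasks worker") together with the scheduler ("celery -A tasks beat") and the task fires once a minute, whether or not anyone is hitting the API.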
Use Docker and create a cron job in it.

Good way to run ONE indefinite process on Django Framework

I'm building a web app using the Django framework. There isn't much user interaction, only a few static links, a navbar, and a few plots which come from the app itself. The main part of the app is a Python script which reads data from an external source, does some data processing on it, and then writes to my Django database. After writing to the database, a new page is created with information about the database entry. Note that there is no user interaction, so no starting or stopping the task; I want the task to run in the background 24/7.
Thus far I've looked into Celery and django-background-tasks. Celery seems like a bit much for my use case: I don't think I need a broker service, as I just want to run one task which the user will never interact with, and I don't need multiple workers. Django-background-tasks seems like a good lightweight alternative, but it does not appear to support indefinite tasks without refreshing the task every once in a while (ideally I'd avoid this). Is there a tool better suited for this use case, or am I just completely misunderstanding Celery and django-background-tasks?
Update
Thanks for the comments, everyone! I looked up what was mentioned by @D Malan, and I think a tool like Supervisord might fit my use case: I can run a Python script separately from the Django application in the background and then have that script interact with the Django application. The only problem I have now is getting the process to interact with the Django app. @deceze mentioned invoking a management command from the Python script. Would this involve creating a subprocess from the Python script that calls a custom Django management command to update my database? Or can I use django.core.management.call_command from a Python file separate from the Django application? If so, how would it know where to find the command?
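A minimal sketch of that second option (all names here, myproject, myapp, and update_db, are hypothetical). The command itself lives at myapp/management/commands/update_db.py:

from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Reads data from the external source and writes it to the database"

    def handle(self, *args, **options):
        # Data processing and database writes go here.
        self.stdout.write("database updated")

A standalone script can then use call_command, provided it configures Django first; pointing DJANGO_SETTINGS_MODULE at the project settings and calling django.setup() is how the script knows where to find the command:

import os
import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()

from django.core.management import call_command
call_command("update_db")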

Windows: Python Daemon Won't Run on IIS but Runs Locally

I have a Django web application hosted on IIS. A subprocess should be consistently running alongside the web application. When I run the application locally using
python manage.py runserver
the background task runs perfectly while the application is running. However, hosted on IIS the background task does not appear to run. How do I make the task run even when hosted on IIS?
In the manage.py file of Django I have the following code:
import subprocess
import sys

from django.core.management import execute_from_command_line

def run_background():
    # Note: the Popen keyword is "creationflags" (plural), not "creationflag".
    return subprocess.Popen(["python", "background.py"],
                            creationflags=subprocess.CREATE_NEW_PROCESS_GROUP)

run_background()
execute_from_command_line(sys.argv)
What can be done to make the background task always run even on IIS?
Celery is a classic option for a background task manager.
https://pypi.org/project/celery/
Alternatively, I've used a library called schedule when I wanted something a little more lightweight. Note that schedule is still in its infancy; if you need something that will keep being supported down the line, go with Celery to be safe.
https://pypi.org/project/schedule/
Without knowing the context of your project, I can't say which I would choose, but they're both good options for task management.
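For illustration, a minimal sketch with schedule, assuming the periodic work lives in a hypothetical update() function:

import time
import schedule

def update():
    # The periodic background work goes here.
    print("running background task")

schedule.every(1).minutes.do(update)

while True:
    schedule.run_pending()
    time.sleep(1)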
On Windows, you can use Task Scheduler to automatically start your background process when Windows starts, using an arbitrary user account.
This was the "officially suggested" solution for Celery 3 on Windows until a few years ago, and I believe it can easily be adapted to run any process.
You can find a detailed explanation here:
https://www.calazan.com/windows-tip-run-applications-in-the-background-using-task-scheduler/

Cron not running my task without the command python manage.py runcrons

I managed to make a function that sends lots of emails to every user in my Django application; for that I used the django-cron package.
I need to send the emails in a particular hour of the day, so I added in my function the following:
RUN_AT_TIMES = ['14:00']
schedule = Schedule(run_at_times=RUN_AT_TIMES)
The problem is that this function is only called if I run the command:
python manage.py runcrons
What can I do to make the application keep working after a single call of the command python manage.py runcrons?
P.S.: I need this application to work in Heroku as well.
As described in point 6 of the docs' installation guide, you need to set up a cron job to execute the command. The package takes away the annoyance of setting up separate cron jobs for all your commands, but it does not eliminate cron entirely.
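For example, a single crontab entry along these lines (the paths are placeholders) runs runcrons every five minutes, and django-cron then decides which of your jobs are actually due:

*/5 * * * * /path/to/venv/bin/python /path/to/project/manage.py runcrons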
EDIT: After seeing your update: as far as I understand, whether crons work on Heroku depends on your plan (I'm really not sure about that), but there are some add-ons that help with this, Heroku Scheduler for example.

Docker/Django/Celery/RabbitMQ execute old versions of my code that was removed

I am running my Django app inside a Docker container. I am using a background job as well as a periodic job with Celery + RabbitMQ, running in separate containers from the main app.
Everything works locally with "heroku local".
When running my app on an Ubuntu instance on DigitalOcean, I noticed that the background task and periodic task were executing old versions of my code. Specifically, I removed a field from my Django model last week, and the old code still references that deleted field, so an error occurs, even though my new code no longer makes any reference to the missing field.
Here are a few things I tried:
Rebuild and restart Docker (didn't work)
Delete all .pyc files (didn't work)
Purge all Celery tasks (didn't work)
Restart my DigitalOcean instance (didn't work)
Move all my code and Docker environment to a brand new DigitalOcean instance (THIS WORKED!)
I have encountered this problem twice now, and I am hoping to find a better solution than moving to a new machine every time it happens. I am guessing that Celery or RabbitMQ has cached the old code somewhere I am not aware of.
Thanks in advance!
Related to this, but none of the solutions worked for me: Celery/Rabbitmq/Django - Old tasks being executed without being called in my code
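For anyone hitting the same thing, a sketch of a clean-rebuild sequence targeting the usual culprits (the service name "worker" and app name "app" are placeholders for your project's names); the idea is to force fresh images and drop any queued tasks that still reference the old code:

docker-compose down
docker-compose build --no-cache
docker-compose up -d --force-recreate
docker-compose exec worker celery -A app purge -f

The --no-cache flag matters because a plain rebuild can reuse cached image layers that still contain the old source, which would match the symptoms described above.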
