Flask with Asynchronous Cronjob in the Background - python

What I am trying to do is host my Flask API and have a script running on the server in the background all the time, even when no user is accessing the API.
The background script should be executed once a minute to update things in the database.

What you're interested in are "cronjobs".
You can check out a library like Celery to get started. In particular, you'll want to look at Celery "beat".
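For example, a minimal Celery beat setup might look like the sketch below; the Redis broker URL and the task body are assumptions, so adjust them for your setup:

from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")  # assumed broker URL

@app.task
def update_database():
    # put the once-a-minute database update here
    pass

# Tell beat to enqueue the task every 60 seconds
app.conf.beat_schedule = {
    "update-db-every-minute": {
        "task": "tasks.update_database",
        "schedule": 60.0,
    },
}

You would then run a worker with an embedded beat scheduler alongside your Flask app, e.g. celery -A tasks worker -B.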

Use Docker and create a cron job in it.
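If you go that route, the container runs cron as its foreground process with an entry along these lines (the file path and script name are assumptions):

# /etc/cron.d/update-db inside the container (hypothetical path)
* * * * * root /usr/local/bin/python /app/update_db.py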

Related

Deploy the scheduler application on multiple servers without running all of them

I have a Python app that has a scheduler, and I want to deploy it on multiple servers.
Problem:
If I deploy my app to multiple servers, all the schedulers run, but I only need one of them to.
I don't want to define a field in the database and use it to decide whether the scheduler should run; I am looking for a solution that doesn't save anything anywhere.
Thanks.
Disable the scheduler in the app and schedule it from outside via a microservice. For example, if you want an open-source option you can use Airflow or Prefect; if you are on AWS you can use EventBridge and Lambda.
A microservice is well suited for this purpose.
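As a rough illustration of the AWS route, an EventBridge schedule rule (e.g. rate(1 minute)) could invoke a Lambda like the sketch below; the endpoint URL is an assumption:

import urllib.request

TASK_URL = "https://example.com/internal/run-scheduled-task"  # assumed endpoint

def handler(event, context):
    # EventBridge fires this on schedule; since no app instance runs its own
    # scheduler, the task is triggered exactly once no matter how many
    # servers the app is deployed to.
    req = urllib.request.Request(TASK_URL, method="POST")
    with urllib.request.urlopen(req) as resp:
        return {"status": resp.status}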

Running "tasks" periodically with Django without seperate server

I realize similar questions have been asked, but they have all been about a specific problem, whereas I don't even know how I would go about doing what I need to.
That is: from my Django webapp I need to scrape a website periodically while the webapp runs on a server. The first options I found were "django-background-tasks" (which doesn't seem to work the way I want it to) and "celery-beat", which recommends getting another server if I understood correctly.
I figured just running a separate thread would work, but I can't seem to make that work without it interrupting the server and vice versa, and it's not the "correct" way of doing it.
Is there a way to run a task periodically without the need for a separate server or a request being made to an app in Django?
"celery-beat", which recommends getting another server if I understood correctly.
You can host celery (and any other needed components) on the same server as your Django app. They would be separate processes entirely.
It's not an uncommon setup to have a Django app + celery worker(s) + message queue all bundled into the same server deployment. Deploying on separate servers may be ideal, just as it would be ideal to distribute your Django app across many servers, but is by no means necessary.
I'm not sure if this is the "correct" way, but it was a cheap and easy way for me to do it: I just created custom Django management commands and have them run via a scheduler such as cron (in my case, I utilized Heroku Scheduler for my app).
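To sketch that approach (the app and command names here are hypothetical), a custom management command is just a file under your app's management/commands/ directory:

# myapp/management/commands/scrape_site.py
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Scrape the target site and update the database"

    def handle(self, *args, **options):
        # put the scraping / database-update logic here
        self.stdout.write("Scrape complete")

Your scheduler then just runs python manage.py scrape_site on whatever interval you need, e.g. a crontab entry like * * * * * /path/to/venv/bin/python /path/to/manage.py scrape_site.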

Good way to run ONE indefinite process on Django Framework

I'm building a web app using the Django framework. There isn't much user interaction: only a few static links, a navbar, and a few plots which come from the app itself. The main part of the app is a Python script which reads data from an external source, does some data processing on it, and then writes to my Django database. After writing to the database, a new page is created with information about the database entry. Note that there is no user interaction, so there is no starting or stopping the task; I want it to run in the background 24/7.
Thus far I've looked into Celery and django-background-tasks. Celery seems like a bit much for my use case: I don't think I need a broker service, since I just want to run one task that the user will never interact with, and I don't need multiple workers. django-background-tasks seems like a good lightweight alternative, but it does not appear to support indefinite tasks without having to refresh the task every once in a while (ideally I don't want this). Is there a tool that is better suited for this use case, or am I just completely misunderstanding Celery and django-background-tasks?
Update
Thanks for the comments, everyone! I looked up what was mentioned by @D Malan, and I think a tool like Supervisord might fit my use case: I can run a Python script separate from the Django application in the background and then have that script interact with the Django application. The only problem I have now is getting the process to interact with the Django app. @deceze mentioned invoking a management command from the Python script. Would this involve creating a subprocess from the Python script that calls a custom Django management command to update my database? Or can I use django.core.management.call_command from a Python file separate from the Django application? If so, how would it know where to find the command?
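For the last question: yes, call_command works from a standalone script, as long as Django is configured first so it knows where your settings, and therefore your installed apps and their commands, live. A minimal sketch, with the settings module and command name as assumptions:

# standalone_runner.py -- run under Supervisord, outside the web process
import os
import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # assumed
django.setup()  # loads INSTALLED_APPS, which is where commands are discovered

from django.core.management import call_command

call_command("update_database")  # hypothetical custom command name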

Windows: Python Daemon Won't Run on IIS but Runs Locally

I have a Django web application hosted on IIS. A subprocess should be consistently running alongside the web application. When I run the application locally using
python manage.py runserver
the background task runs perfectly while the application is running. However, when hosted on IIS the background task does not appear to run. How do I make the task run even when hosted on IIS?
In the manage.py file of Django I have the following code:
import subprocess
import sys
from django.core.management import execute_from_command_line

def run_background():
    # Note: the Popen keyword is 'creationflags' (plural), not 'creationflag'
    return subprocess.Popen(["python", "background.py"], creationflags=subprocess.CREATE_NEW_PROCESS_GROUP)

run_background()
execute_from_command_line(sys.argv)
What can be done to make the background task always run even on IIS?
Celery is a classic option for a background task manager.
https://pypi.org/project/celery/
Alternatively, I've used a library called schedule when I wanted something a little more lightweight. Note that schedule is still in its infancy. If you need something that is going to maintain support down the line, go with celery to be safe.
https://pypi.org/project/schedule/
Without knowing the context of your project, I can't say which I would choose, but they're both good options for task management.
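For reference, a schedule-based loop is only a few lines; this is a sketch with a placeholder job body:

import time
import schedule

def background_task():
    print("running periodic task")  # replace with real work

schedule.every(1).minutes.do(background_task)  # run once a minute

while True:
    schedule.run_pending()
    time.sleep(1)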
On Windows, you can use Task Scheduler to automatically start your background process when Windows starts, using an arbitrary user account.
This was the "officially suggested" solution for Celery 3 on Windows until some years ago, and I believe it can be easily adapted to run any process.
You can find a detailed explanation here:
https://www.calazan.com/windows-tip-run-applications-in-the-background-using-task-scheduler/
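If you prefer the command line over the GUI, the same setup can be scripted with schtasks; the task name and paths below are assumptions:

schtasks /Create /TN "DjangoBackgroundTask" /TR "C:\path\to\python.exe C:\path\to\background.py" /SC ONSTART /RU SYSTEM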

Openshift, python, mongodb, and cron guidance needed

I have a Python web app that essentially allows two computers to talk with one another. If a session ends abruptly, the record is still stored in pymongo. I want to be able to run a cron job to clean up old records, but I am not clear on how to do that; I can't figure out how to use bash to talk to pymongo...
What else could I do, call Python from the cron job?
You could write a Python script using pymongo (or any other MongoDB client library) that does the necessary cleanup and configure cron to run it regularly.
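A minimal sketch of such a script, assuming a "sessions" collection with a "created_at" timestamp field (adjust the names and the cutoff for your schema):

# cleanup_sessions.py
from datetime import datetime, timedelta
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # adjust the connection string
sessions = client["myapp"]["sessions"]  # assumed database/collection names

# Delete session records older than one day
cutoff = datetime.utcnow() - timedelta(days=1)
result = sessions.delete_many({"created_at": {"$lt": cutoff}})
print(f"Removed {result.deleted_count} stale records")

A crontab entry such as 0 * * * * /usr/bin/python3 /path/to/cleanup_sessions.py would then run it hourly.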
Here is an article on how to get cron up and running on OpenShift:
https://www.redhat.com/openshift/community/blogs/getting-started-with-cron-jobs-on-openshift
