Deploy the scheduler application on multiple servers without running all of them - python

I have a Python app that has a scheduler, and I want to deploy it on multiple servers.
Problem:
If I deploy my app to multiple servers, all the schedulers run, but I only need one of them to.
* I don't want to define a field in the database and check it to decide whether the scheduler should run or not; I am looking for another solution that doesn't save anything anywhere.
Thanks.

Disable the scheduler and trigger it from outside via a microservice. For example, if you want an open-source option, you can use Airflow or Prefect. If you are on AWS, you can use EventBridge and Lambda as the microservice for this purpose.
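To make that concrete, here is a stdlib-only sketch of the idea: each app instance disables its own scheduler and instead exposes a trigger endpoint, and the external scheduler (cron, Airflow, Prefect, EventBridge -> Lambda, ...) calls that endpoint, so only the one instance that receives the request runs the job. The path and the job body are illustrative assumptions.

```python
# Sketch: expose a trigger endpoint instead of running an in-process
# scheduler on every server. Only the instance that receives the HTTP
# call executes the job. /run-job and run_scheduled_job are assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_scheduled_job():
    # Replace with the real work the scheduler used to do.
    return "job done"

class TriggerHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path == "/run-job":
            body = run_scheduled_job().encode()
            self.send_response(200)
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet

# To serve: HTTPServer(("0.0.0.0", 8000), TriggerHandler).serve_forever()
```

In a real deployment the endpoint would sit behind the load balancer (so the external scheduler hits exactly one instance) and should be protected, e.g. with a shared secret header.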

Related

Flask with Asynchronous Cronjob in the Background

What I am trying to do is host my Flask API and have a script running on the server in the background all the time, even when no user is accessing the API.
The background script should be executed once a minute to update things in the database.
What you're interested in are "cronjobs".
You can check out a library like Celery to get started. In particular, you'll want to look at Celery "beat".
Use Docker and create a CronJob in it

Send scheduled email django

I have made a small Django app to fit all my needs. I will use it at my company to track simple tasks of a couple of mechanical engineers. Now, the only thing left is to send scheduled emails in my Django app (every day at noon; if someone is behind with work, he would get an email). Since I'm using Windows and I'll deploy this app on Windows, I can't use cron jobs (they only work on Linux, as I've seen on forums), which would be simple and easy. The only way I've found so far is django-celery-beat. This is not so easy to set up, and I need to run a 'worker' each time I run my server. This is a bit above my level and I would need to learn a lot more (and it needs a message broker, like RabbitMQ, which I would also need to run and learn to implement).
I was wondering, is there an easier way to send a simple email every day at noon? I don't want to install additional apps; I wish to keep it as simple as possible.
You can do it by Dockerizing Django with Redis and Celery.
Dockerizing is the process of packing, deploying, and running applications using Docker containers.
Please use the link below to read more about Dockerizing:
Dockerizing
Dockerizing Django with Postgres, Redis and Celery
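If Docker plus Celery still feels heavy for one daily email, a dependency-free in-process sketch is also possible, assuming some long-running process stays up on the Windows machine; the job function here is a placeholder assumption.

```python
# Stdlib-only sketch: sleep until the next noon, run the job, repeat.
# Assumes a long-running process, e.g. a script started alongside the
# server; the job passed to run_daily is a placeholder.
import datetime
import time

def seconds_until_next_noon(now=None):
    """Return how many seconds remain until the next 12:00."""
    now = now or datetime.datetime.now()
    noon = now.replace(hour=12, minute=0, second=0, microsecond=0)
    if now >= noon:
        noon += datetime.timedelta(days=1)
    return (noon - now).total_seconds()

def run_daily(job):
    """Call job() every day at noon, forever."""
    while True:
        time.sleep(seconds_until_next_noon())
        job()  # e.g. find engineers behind on work and email them
```

The obvious trade-off is that a crash or reboot silently stops the schedule, which is exactly the kind of failure Celery beat or a system scheduler handles for you.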

Running "tasks" periodically with Django without seperate server

I realize similar questions have been asked, but they have all been about a specific problem, whereas I don't even know how I would go about doing what I need to.
That is: from my Django webapp I need to scrape a website periodically while the webapp runs on a server. The first options I found were "django-background-tasks" (which doesn't seem to work the way I want it to) and 'celery-beat', which recommends getting another server, if I understood correctly.
I figured just running a separate thread would work, but I can't seem to make that work without it interrupting the server and vice versa, and it's not the "correct" way of doing it.
Is there a way to run a task periodically in Django without needing a separate server or a request being made to the app?
'celery-beat', which recommends getting another server, if I understood correctly.
You can host celery (and any other needed components) on the same server as your Django app. They would be separate processes entirely.
It's not an uncommon setup to have a Django app + celery worker(s) + message queue all bundled into the same server deployment. Deploying on separate servers may be ideal, just as it would be ideal to distribute your Django app across many servers, but is by no means necessary.
I'm not sure if this is the "correct" way, but it was a cheap and easy way for me to do it. I just created custom Django management commands and have them run via a scheduler such as cron; in my case, I used the Heroku Scheduler for my app.

Deploy a stand-alone python script on PaaS service

I have a Python script that is supposed to run once every few days to annotate some data in a remote database.
Which PaaS services (GAE, Heroku, etc.) allow a stand-alone Python script to be deployed and executed via some sort of cron scheduler?
GAE has a cron jobs module, and Heroku has the Heroku Scheduler. Both are fairly easy to use and configure; you can check the documentation of both. As I have no other information on what you want to do, I can't say whether one would be more suitable for you than the other.
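On App Engine, for example, the schedule goes into a `cron.yaml` file deployed with the app; App Engine then issues an HTTP GET to the configured URL on that schedule, so the script runs behind a request handler. The description, URL, and weekly schedule below are assumptions for illustration:

```yaml
cron:
- description: "annotate data in the remote database"
  url: /tasks/annotate
  schedule: every monday 09:00
```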

running scheduled job in django app deployed on heroku

I have deployed a Django app on Heroku, and so far it works fine. Now I have to schedule a task (a Python script) to run once a day. The job would take data from the Heroku database, perform some calculations, and post the results back to the database. The solutions I have found for this usually use Rails on Heroku. I am confused about whether I should do it using a cron-jobs extension in Django or the scheduled-jobs option in Heroku. Since the application is on Heroku, I thought of using the latter, but I can't find any help on adding Python jobs to it. Kindly help.
I suggest you create a Django management command for your project, like python manage.py run_this_once_a_day, and use the Heroku Scheduler to trigger it on a schedule.
