I managed to write a function that sends emails to every user in my Django application; for that I used the django-cron package.
I need to send the emails at a particular hour of the day, so I added the following to my function:
RUN_AT_TIMES = ['14:00']
schedule = Schedule(run_at_times=RUN_AT_TIMES)
The problem is that this function is only called if I run the command:
python manage.py runcrons
What can I do to make the application work after one single call of the command python manage.py runcrons?
P.S.: I need this application to work in Heroku as well.
As described in the docs' installation guide at point 6, you need to set up a cron job to execute the command. The package takes away the annoyance of setting up separate cron jobs for all your commands, but does not eliminate cron entirely.
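For reference, the cron entry itself is short. A sketch of a crontab line that runs the command every 5 minutes (the interpreter and project paths are assumptions — substitute your own virtualenv and project paths):

```
*/5 * * * * /path/to/venv/bin/python /path/to/project/manage.py runcrons >> /var/log/runcrons.log 2>&1
```

django-cron then checks RUN_AT_TIMES on each invocation and only executes your job once the scheduled time has been reached.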
EDIT: after seeing your update: as I understand it, running cron jobs on Heroku depends on your plan (I'm really not sure about that), but there are some add-ons that help with this. Heroku Scheduler, for example.
I am working on a project written in Python, using Django to build a website. I have a function that pulls information from a website and puts it into a dictionary. When users refresh the browser, the site shows the latest version of that dictionary; so far the updates are triggered by the browser, but this is only for testing.
So, after several headaches I finally managed to install Celery and make it work. I have my website running with "python manage.py runserver", and at the same time two Celery processes: "celery -A tasks worker -l info --pool=solo" and "celery -A tasks beat --loglevel=info". Everything seemed to be working until I realized that the dictionary is being updated, but not for all users who access the website; it looks like each user has their own instance of the dictionary.
So the idea is to have Celery update the dictionary with the information pulled from the website, and have all users simply see what is inside the dictionary variable. Can I do this without a database or a file being written every time the update function is called?
Coming back to this: it seems my problem was Celery. It looks like Celery was running its own instance of my program when I ran the commands mentioned above, so the web server I started with "python manage.py runserver" had nothing to do with the Celery tasks. I stopped using Celery and now use a background scheduler for the same purpose; everything works now.
I have a Django web application hosted on IIS. A subprocess should run consistently alongside the web application. When I run the application locally using
python manage.py runserver
the background task runs perfectly while the application is running. However, hosted on IIS the background task does not appear to run. How do I make the task run even when hosted on IIS?
In my Django project's manage.py file I have the following code:
def run_background():
    return subprocess.Popen(
        ["python", "background.py"],
        creationflags=subprocess.CREATE_NEW_PROCESS_GROUP,
    )

run_background()
execute_from_command_line(sys.argv)
What can be done to make the background task always run even on IIS?
Celery is a classic option for a background task manager.
https://pypi.org/project/celery/
Alternatively, I've used a library called schedule when I wanted something a little more lightweight. Note that schedule is still in its infancy. If you need something that will keep being supported down the line, go with Celery to be safe.
https://pypi.org/project/schedule/
Without knowing the context of your project, I can't say which I would choose, but they're both good options for task management.
On Windows, you can use Task Scheduler to automatically start your background process when Windows starts, using an arbitrary user account.
This was the "officially suggested" solution for Celery 3 on Windows until some years ago, and I believe it can be easily adapted to run any process.
You can find a detailed explanation here:
https://www.calazan.com/windows-tip-run-applications-in-the-background-using-task-scheduler/
I have a Django-based function that needs to run only once, when the app boots up. The tricky part is that:
The code in question heavily uses the Django ORM, so apps have to be ready at that point,
The code should run only once - e.g. not once per worker, but exactly once per "website" (regardless of whether it's deployed via gunicorn with a dozen workers, or run locally via the built-in development web server),
The code should only run when the app boots, but NOT when running other management commands. The code can take a while to complete, and I don't want it to trigger every time I run makemigrations or shell.
I could, theoretically, just introduce a locking mechanism and let it run somewhere in the AppConfig.ready() method, but that would still run in all management commands as well.
Since the app is packaged in Docker, I've also been thinking about simply wrapping the code in a separate management command and running it in the entrypoint, just before the app starts. That seems to do the trick, but it only happens automatically in that particular container - if somebody runs a local development server to work on the app on their own, they might not be aware that an additional command needs to be run.
I've searched through the documentation and it doesn't look like Django has a way to do this natively on its own. Perhaps there's a better approach that I can't think of?
I had exactly the same need. I run the startup code in AppConfig.ready(), gated by an environment variable that you set when you don't want the startup code to execute.
SKIP_STARTUP = settings.DEBUG or os.environ.get('SKIP_STARTUP')

class myapp(AppConfig):
    def ready(self):
        if not SKIP_STARTUP:
            <startup stuff>
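If you also want to skip the startup code for management commands automatically, rather than remembering to set the variable each time, one option is to inspect sys.argv. A sketch, assuming a gunicorn deployment (gunicorn never goes through manage.py, so its processes pass the check); the function name is illustrative:

```python
import os
import sys

def should_run_startup():
    """Run startup code only for actual server processes, not for
    management commands like makemigrations or shell."""
    if os.environ.get('SKIP_STARTUP'):
        return False
    # 'runserver' covers local development; gunicorn does not go through
    # manage.py at all, so sys.argv[0] won't contain 'manage.py' there.
    if 'manage.py' in sys.argv[0]:
        return 'runserver' in sys.argv
    return True
```

This still runs once per worker under gunicorn, so it does not replace the locking mechanism mentioned above - it only filters out management commands.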
I deployed a Django app using the free tier of Heroku. Now I need to run some background tasks, so I chose django-background-tasks. As per the documentation, I have to run the python manage.py process_tasks command after starting the project with python manage.py runserver. So I added a Procfile as below:
worker: python manage.py process_tasks
web: gunicorn CYC_Heroku.wsgi
But I couldn't scale the app because I'm using the free tier. Can I do the same without paying money / without a credit card?
Heroku Scheduler will allow you to run background tasks for free at one of the following frequencies: every 10 minutes, every hour, or every day. It will use the same dyno type that you use for your web dyno, so if you're using a free dyno to run your app, it will also use a free dyno to run your scheduled tasks.
Once you add it to your app, open it from your Heroku app's Resources view. Add a new job and enter python manage.py process_tasks as the command, and select your desired frequency.
Hopefully you can make this work for your use case!
Actually, you can set up a clock process on Heroku using APScheduler now.
I just tried it and it works great.
You can set up the time as you like, 1 minute is also allowed.
And it's free.
How do I create a schedule on OpenShift hosting to run a Python script that parses RSS feeds and sends filtered information to my email? Is this feature available? Please help if you work with the free tier of this hosting. I have a script that works fine, but I don't know how to run it every 10 minutes to catch freelance jobs. Alternatively, does anyone know a free Python hosting that can schedule scripts?
You are looking for the add-on cartridge called cron. However, by default the cron cartridge only supports jobs that run every minute or every hour. You would have to write a job that runs every minute and checks whether a 10-minute interval has been reached before executing your script.
Make sense?
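The minutely guard described above can be sketched in a few lines (the script call is a placeholder):

```python
from datetime import datetime

def should_run(now=None):
    """True only on 10-minute boundaries (minute 0, 10, 20, ...)."""
    now = now or datetime.now()
    return now.minute % 10 == 0

if should_run():
    # Replace with a call to your RSS-parsing script.
    print("running RSS check")
```

The cron cartridge then invokes this wrapper every minute, and the wrapper only does real work six times an hour.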
rhc cartridge add cron -a yourAppName
Then you will have a cron directory in application directory under .openshift for placing the cron job.
You could do something like this, but set up for 10 minutes instead of 5: https://github.com/openshift-quickstart/openshift-cacti-quickstart/blob/master/.openshift/cron/minutely/cactipoll