Django cron job doesn't work? - python

I have followed the following links for running a cron job:
django-cron
custom management commands
All of these approaches work if I run commands like:
python manage.py crontab add
or
python manage.py runcron
but I don't want the cron jobs to run separately from the Django server.
I mean, I want to run the Django server and have it call a certain function by itself while the server is running, for example every (say) 5 minutes.

If I understand correctly, you can't use django-cron if Django isn't running, so create a bash script that checks whether Django is running and starts it if it isn't.
#!/bin/bash
MYPROG="myprog"
RESTART="myprog params"
PGREP="/usr/bin/pgrep"
# find the myprog pid
$PGREP ${MYPROG}
# if it is not running, restart it
if [ $? -ne 0 ]
then
    $RESTART
fi
Then for your cron:
*/5 * * * * /myscript.sh
This is off the cuff and not tailored to your setup, but it's as good a starting point as any.

Let me see if I can elaborate:
On Linux you'll use cron; on Windows you'll use at. These are system services, not to be confused with Django; they are essentially scheduled tasks.
A custom management command is essentially a script that you point the cron job to.
You'll need to do some homework on what a cron job is (if you're using Linux), how to schedule a recurring task, and how to have it issue the custom command (a minimal sketch of such a command follows below). This is my understanding of what you're trying to do. If it's not, you need to clarify.
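For reference, here is a minimal sketch of such a custom management command that cron could point to; the app name, file path and task are illustrative, not taken from the question:
# myapp/management/commands/do_periodic_task.py  (illustrative path)
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Work that the cron job triggers"

    def handle(self, *args, **options):
        # replace this with the function you want run every 5 minutes
        self.stdout.write("periodic task ran")
Cron would then call it with python manage.py do_periodic_task on whatever schedule you choose, for example every 5 minutes as in the entry above.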

Related

How do I run some Python code in the background

So I'm making a website using Django, and there is some code that I have set up to run every hour. However, when I call this as a function in views.py, the web page doesn't load until the function has finished, which isn't ideal for my website. I want it to be a background process that runs outside my views file.
A traditional approach, if you use some UNIX-like system, is writing a management command that does what you want to do, and running that management command (with ./manage.py <your_custom_command>) from cron.
Run it from some account that has access to the source code and relevant database tables. Don't run it as root. If you use python virtualenv for your site, you should activate it before running manage.py.
You could for example run crontab -e as that user, and add an entry like
# m h dom mon dow command
# run your command every hour, at minute 00
0 * * * * bash -c '. /your-site/env/bin/activate;/your-site/manage.py your_custom_command'
Custom management commands:
https://docs.djangoproject.com/en/3.2/howto/custom-management-commands/
A common tool for handling longer-running jobs is Celery: https://docs.celeryproject.org/en/stable/django/first-steps-with-django.html
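As a rough sketch of that Celery approach, assuming the project is wired up as in the linked first-steps guide (the project and app names here are illustrative, not from the question): a shared task plus a celery beat schedule can replace the blocking call in views.py.
# yourapp/tasks.py  (illustrative)
from celery import shared_task

@shared_task
def hourly_job():
    # the work that currently blocks the view goes here
    pass

# proj/celery.py  (illustrative; follows the Celery first-steps-with-Django layout)
import os
from celery import Celery
from celery.schedules import crontab

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")
app = Celery("proj")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

# run the task every hour at minute 00 via celery beat
app.conf.beat_schedule = {
    "run-hourly-job": {
        "task": "yourapp.tasks.hourly_job",
        "schedule": crontab(minute=0),
    },
}
You would then run a worker and the beat scheduler alongside the web server, e.g. celery -A proj worker and celery -A proj beat.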

Running Python script at Regular intervals using Cron in Virtual Machine (Google Cloud Platform)

Can anyone provide me with the steps of how to execute a Python script at regular intervals in Cron inside Virtual Machine (using Google Cloud Platform)?
I read this link https://cloud.google.com/appengine/docs/standard/python/config/cron
but still could not figure out how to get it to work.
Regarding step (1), "Create the cron.yaml file in the root directory of your application (alongside app.yaml)": does that mean we have to create both cron.yaml and app.yaml? I do not see those files. What does app.yaml contain?
If you are using a virtual machine as you suggest, then those instructions you've linked may not be relevant as they are for App Engine.
With a Compute Engine VM you should use the inbuilt Linux cron functionality. For these instructions, I'm going to assume that you want to execute the script every 10 minutes. You can adapt this value for your needs.
Here is how you should proceed if you want to execute a script via a cron job on a GCP virtual machine.
1) Run this command to enter the crontab configuration page.
crontab -e
Note: the above command will allow you to edit the crontab configuration for the user you are logged in as. If you would like to execute the script as the root user, add 'sudo' to the start of the command to edit the crontab configuration for the root user.
2) In the cron configuration, you will be able to add an entry for intervals in minutes, hours, days of month, month and day of the week. On the same line, you can add the command you would like to execute- in your case a command to execute your python script.
As an example, if you wanted to run the script every 10 minutes with python, you would add an entry such as this:
*/10 * * * * /usr/bin/python /path/to/your/python/script.py
3) Once you've saved the crontab configuration and exited the file, you need to restart the cron service for your changes to take effect. You can do this by running the following command.
sudo systemctl restart cron
There is plenty of useful information available if you would like to learn more about running cron jobs in Linux.
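For completeness, the script that the cron entry points at can be a plain Python file; a minimal illustrative sketch (the path, log location and work done are assumptions, not from the question) might be:
#!/usr/bin/python
# /path/to/your/python/script.py  (illustrative)
import datetime
import logging

logging.basicConfig(filename="/tmp/script.log", level=logging.INFO)  # illustrative log path

def main():
    # replace with the actual work the job should do
    logging.info("cron run at %s", datetime.datetime.now())

if __name__ == "__main__":
    main()
Make sure the file is readable by the user whose crontab you edited (and executable if you drop the explicit /usr/bin/python from the cron line).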

Django crontab not working in Elastic Beanstalk

My cron jobs work fine on localhost, but when I deploy, they are not getting added.
Here is the relevant part of settings.py:
CRONJOBS = [
    ('*/1 * * * *', 'push.cron.my_scheduled_job'),
]
In development, cron works perfectly by doing this:
python manage.py crontab add
python manage.py crontab run 2e847f370afca8feeddaa55d5094d128
But when I deploy it to the server using.. the cron jobs don't get added automatically. How do I add the cron jobs to the server?
I just managed to get this working.
First I wrote the script as a "Django Custom Management Command".
Then I established an SSH connection, which starts in the directory "/home/ec2-user", and entered "crontab -e" to edit the crontab.
In the crontab file, just add the following line (replace MY_CUSTOM_MANAGEMENT_COMMAND with your own file):
source /opt/python/run/venv/bin/activate && python manage.py MY_CUSTOM_MANAGEMENT_COMMAND
Then you're done.
You didn't mention this in your question, but there's something I would like to point out, because I've seen it in some well-known blogs: you don't need a worker tier for this; crontab works just fine in the web server tier. Use the worker tier if you have some heavy background processing.
Your cron jobs running on localhost are not related to your server. You will need to run them separately on the server, pretty much in the same manner as you do locally.
## I am assuming that you have already activated your virtual env
python manage.py crontab add  # returns a hash
python manage.py crontab run "hash"  # put the hash here without quotes
You could automate this by writing and running some sort of bash script.
I would recommend that you use celery-beat instead of crontab if you want some automation.
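For reference, the dotted path in the CRONJOBS setting above has to resolve to an importable function; a minimal sketch of push/cron.py, reusing the names from the question, could look like this:
# push/cron.py
import logging

logger = logging.getLogger(__name__)

def my_scheduled_job():
    # whatever should happen every minute goes here
    logger.info("my_scheduled_job ran")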

Running python cron script as non-root user

I have a small problem running a Python script under a specific user account on my CentOS 6 box.
My cron.d/cronfile looks like this:
5 17 * * * reports /usr/local/bin/report.py > /var/log/report.log 2>&1
The account reports exists and all the files that are to be accessed by that script are chowned and chgrped to reports. The python script is chmod a+r. The python script starts with a #!/usr/bin/env python.
But this is not the problem. The problem is that I see nothing in the logfile. The python script doesn't even start to run! Any ideas why this might be?
If I change the user to root instead of reports in the cronfile, it runs fine. However I cannot run it as root in production servers.
If you have any questions please ask :)
Edit:
If I do sudo -u reports python report.py it works fine.
Cron jobs run with the permissions of the user that the cron job was setup under.
I.E. Whatever is in the cron table of the reports user, will be run as the reports user.
If you're having to use sudo to get the script to run when logged in as reports, then the script likely won't run as a cron job either. Can you run this script when logged in as reports without sudo? If not, then the cron job can't either. Make sense?
Check your logs - are you getting permissions errors?
There are a myriad of reasons why your script would need certain privs, but an easy way to fix this is to set the cron job up under root instead of reports. The longer way is to see what exactly is requiring elevated permissions and fix that. Is it file permissions? A protected command? Maybe adding reports to certain groups would allow you to run it under reports instead of root.
* Be ULTRA careful if/when you set up cron jobs as root.

Django Apache - Run script as Root

My Django project calls a Python file at a scheduled time using the "at" scheduler. This is executed within my models.py:
command = 'echo "python /path/to/script.py params" | /usr/bin/at -t [time] &> path/to/at.log'
status = os.system(command)
where [time] is the scheduled time.
It works perfectly when I run it within the Django dev server (I usually run it as root, but it also works with other users).
But when I deploy my application on Apache using mod_wsgi, it doesn't work. The at logs show that the job was scheduled, but it never executes.
I tried everything from changing the ownership to www-data and adjusting permissions, to making the script executable by all users, to setting it setuid root (a huge security issue).
The last thing I want to do is run apache as root user.
Use cron or celery for scheduled tasks. If you need to run something as root, it would make sense to rewrite your script as a simple daemon and run that as root; you can pass commands to it pretty easily with zeromq.
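As a rough sketch of that daemon idea using pyzmq (the socket address, message format and file names are assumptions, not from the answer): a small daemon run as root listens on a local socket, and the Django process sends it commands instead of calling os.system.
# root_daemon.py  (illustrative) - started as root, e.g. from an init script
import zmq

context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://127.0.0.1:5555")  # illustrative local address

while True:
    command = socket.recv_string()
    # validate the command and do the privileged work (e.g. hand it to at) here
    socket.send_string("accepted: " + command)

# and from the Django side (e.g. in models.py), instead of os.system:
# client = zmq.Context.instance().socket(zmq.REQ)
# client.connect("tcp://127.0.0.1:5555")
# client.send_string('python /path/to/script.py params')
# reply = client.recv_string()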
