I have a Python script that runs in several modes. One of those modes monitors certain files, and if those files have been modified, the script restores them. The way I do this is to run the script every minute via cron.
Another cron job exists (actually the same script called with a different argument) to remove the script from the crontab when the scheduled time has elapsed. Initially, I was attempting to work with a crontab in /etc/cron.d. The script behaves as expected if executed on the command line, but does not edit the crontab when it is run from cron.
I then switched to writing a temporary file and executing crontab tempfile (via subprocess.Popen) from the script. This doesn't work either, as the crontab is simply not created. Executing crontab tempfile from the command line, using the temporary file created by the script, works as expected.
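When this kind of invocation works from a shell but silently fails under cron, the usual culprit is cron's minimal environment (in particular its restricted PATH), and the error message disappears because nothing reads stderr. A minimal sketch of invoking the crontab command while capturing its error output — the function name and the `crontab_cmd` parameter are my own, added for illustration:

```python
import subprocess

def install_crontab(tempfile_path, crontab_cmd="crontab"):
    """Run `crontab <file>` and surface any error output.

    Under cron's restricted PATH you may need an absolute path
    such as "/usr/bin/crontab" for crontab_cmd.
    """
    result = subprocess.run(
        [crontab_cmd, tempfile_path],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # When run from cron, anything printed ends up in cron's mail/log,
        # so this is often the only clue about why installation failed.
        print("crontab failed:", result.stderr)
    return result.returncode, result.stderr
```

Checking the return code and stderr this way should at least tell you whether cron is failing to find the binary or the binary is rejecting the file.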
I can't use the python-crontab library as this is a commercial project and that library is GPLed.
Are there any inherent limitations in cron that prevent either approach from working?
The GPL is not anti-commercial. python-crontab can be used in commercial products and services. You must only follow the copyleft rules, which state that the code itself can't be made proprietary. You can sell it as much as you like, and as the author I encourage you to make money from my work.
That error aside, it doesn't look like your problem requires python-crontab anyway. You could just open the files yourself; if that doesn't work, it was never going to work with python-crontab either.
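For the /etc/cron.d route, "opening the files yourself" can be as small as writing one line in the six-field cron.d format (note that cron.d entries, unlike per-user crontabs, require a user field, and cron may ignore a file that lacks a trailing newline). A minimal sketch — the function name and example paths are mine, not part of any library:

```python
def write_cron_d_entry(path, schedule, user, command):
    # Files in /etc/cron.d use the six-field format: five time fields,
    # then the user to run as, then the command.  The trailing newline
    # matters: cron may ignore a last line without one.
    with open(path, "w") as f:
        f.write(f"{schedule} {user} {command}\n")

# Example (writing under /etc/cron.d requires root):
# write_cron_d_entry("/etc/cron.d/restore-files", "* * * * *",
#                    "root", "/usr/bin/python3 /opt/restore.py")
```

Removing the job at the scheduled end time is then just deleting the file.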
Related
I have a Python/Locust script that performs various requests. The Jenkins build succeeds, but from the console output I see that several requests are made, all of only one type (e.g. login). When I run the script locally it behaves correctly. I don't think the problem is in the Jenkinsfile either, because I've used the same file for other scripts. Has such a thing ever happened to you? How did you solve it?
I was wondering: is there an easy way to pull all the dependencies of my Python script into a file (or files) that I could include with the script in order to run it on Airflow? This would be a huge time saver for me.

To be clear, it CANNOT be an exe; the Airflow scheduler runs Python scripts, not executables.

(My company uses the Airflow scheduler, and the team that supports it is very slow. Every time I need a new package installed on Airflow as a dependency of a script, it takes months of confusion, multiple tickets and wasted time. I don't have the access level to fix it myself, and they never give that access level to developers.)
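One common workaround is to "vendor" the dependencies: install them into a directory that ships alongside the script, then put that directory on sys.path before any third-party imports. A sketch under the assumption that the script and its vendor directory are deployed together — the helper name and directory name are mine:

```python
import os
import sys

def add_vendor_dir(script_path, name="vendor"):
    """Prepend <script dir>/<name> to sys.path so vendored packages win."""
    vendor = os.path.join(os.path.dirname(os.path.abspath(script_path)), name)
    sys.path.insert(0, vendor)
    return vendor

# On your own machine, vendor the dependencies next to the script:
#     pip install --target=vendor -r requirements.txt
# Then, at the top of the script Airflow runs:
#     add_vendor_dir(__file__)
# After that, `import requests` etc. resolve against ./vendor first.
```

One caveat: packages with compiled extensions must be vendored on a platform matching the Airflow workers, or they will fail to import there.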
I have a Python script which I am able to run locally from my machine, and it works fine. I am using Spyder for coding. This script needs to run daily at a certain time, so I created a basic task under Task Scheduler.

The way I am doing this is to put pythonw.exe in the task's action and add my Python script as an action argument.

The thing is, Task Scheduler triggers the task, but the script does not produce any output. I am reading files from a shared drive in my script.

I also tried creating a batch file and running that instead, but I have the same issue.
Not sure what is wrong.
Can anyone help me with this?
I created the batch file, put that under Task Scheduler, and it worked fine. It looks like I had an issue with access permissions, as I have two accounts on my laptop.
Thanks Everyone for the help.
I managed to make a function that sends lots of emails to every user in my Django application, for that I used the django-cron package.
I need to send the emails in a particular hour of the day, so I added in my function the following:
RUN_AT_TIMES = ['14:00']
schedule = Schedule(run_at_times=RUN_AT_TIMES)
The problem is that this function is only called if I run the command:
python manage.py runcrons
What can I do to make the application work after one single call of the command python manage.py runcrons?
P.S.: I need this application to work in Heroku as well.
As described in point 6 of the docs' installation guide, you need to set up a cron job to execute the command. The package takes away the annoyance of setting up separate cron jobs for all your commands, but it does not eliminate cron entirely.
EDIT: after seeing your update: as I understand it, working with cron jobs on Heroku depends on your plan (I'm really not sure about that), but there are some add-ons that help with that, Heroku Scheduler for example.
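On a plain server, that cron job amounts to a single crontab entry that runs the management command on a short interval; django-cron then only fires jobs whose RUN_AT_TIMES window has been reached. A sketch of such an entry (the 5-minute interval and the paths are placeholders you would adapt):

```
*/5 * * * * /path/to/venv/bin/python /path/to/project/manage.py runcrons >> /var/log/django_cron.log 2>&1
```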
Does anyone know of a working and well documented implementation of a daemon using python? Please post a link here if you know of a project that fits these two requirements.
Three options I can think of:
Make a cron job that calls your script. Cron is a common name for a GNU/Linux daemon that periodically launches scripts according to a schedule you set. You add your script into a crontab or place a symlink to it into a special directory and the daemon handles the job of launching it in the background. You can read more at Wikipedia. There is a variety of different cron daemons, but your GNU/Linux system should have one installed already.
A Pythonic approach (a library, for example) that lets your script daemonize itself. Yes, it will require a simple event loop (where your events are timer firings, possibly provided by a sleep function). Here is the one I recommend and use: A simple unix/linux daemon in Python.
Use the Python multiprocessing module. The nitty-gritty of forking a process etc. is hidden in this implementation. It's pretty neat.
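A minimal sketch of option 3: the periodic work loops in a background process while the parent stays in charge. The function names and the 60-second interval are illustrative, not from any particular recipe:

```python
import multiprocessing
import time

def periodic_worker(interval_seconds):
    # A simple event loop: do the work, then sleep until the next tick.
    while True:
        # ... do the periodic work here ...
        time.sleep(interval_seconds)

if __name__ == "__main__":
    # daemon=True means the child is terminated automatically when the
    # parent exits, so it cannot outlive the controlling script.
    worker = multiprocessing.Process(
        target=periodic_worker, args=(60,), daemon=True
    )
    worker.start()
    worker.join()  # block here; a real parent would do other work instead
```

Note that a multiprocessing "daemon" process is not a Unix daemon: it dies with its parent, which is the opposite of detaching from it.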
I wouldn't recommend 2 or 3 because you would in fact be duplicating cron's functionality. The Linux system paradigm is to let multiple simple tools interact to solve your problems. Unless there are additional reasons why you should make a daemon (beyond triggering periodically), choose the other approach.
Also, if you daemonize with a loop and a crash happens, make sure that you have logs which will help you debug, and devise a way for the script to start again. If the script is instead added as a cron job, it will simply trigger again after the interval you set.
If you just want to run a daemon, consider Supervisor, a daemon that itself controls and manages daemons.
If you want to look at the nitty-gritty, you can check out Supervisor's launch script or some of the responses to this lazyweb request.
Check this link for a double-fork daemon: http://code.activestate.com/recipes/278731-creating-a-daemon-the-python-way/
The code is readable and well documented. You may want to look at chapter 13 of W. Richard Stevens' book 'Advanced Programming in the UNIX Environment' for detailed information on Unix daemons.
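The core of that double-fork recipe condenses to the following sketch. This is illustrative only: the full recipe also handles pid files and signals, which are omitted here.

```python
import os
import sys

def daemonize():
    """Detach from the controlling terminal via the classic double fork."""
    if os.fork() > 0:       # first fork: the original parent returns to the shell
        os._exit(0)
    os.setsid()             # become session leader, drop the controlling tty
    if os.fork() > 0:       # second fork: the session leader exits, so the
        os._exit(0)         # grandchild can never reacquire a terminal
    os.chdir("/")           # don't keep any mounted filesystem busy
    os.umask(0)
    sys.stdout.flush()
    sys.stderr.flush()
    devnull = os.open(os.devnull, os.O_RDWR)
    for fd in (0, 1, 2):    # detach stdin/stdout/stderr from the terminal
        os.dup2(devnull, fd)
```

After `daemonize()` returns, the surviving process is an orphaned, terminal-less grandchild of the original, which is exactly the state APUE's chapter 13 describes.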