Automatically run a Python script twice a day - python

I have a Python script that uses the paramiko module to run various Linux commands on a remote server over SSH. All the output is saved to a text file, and the script runs properly. Now I want to run this script automatically twice a day, at 11am and 5pm, every day.
How can I run this script automatically every day at the given times without launching it manually each time? Is there any software or module for this?
Thanks for your help.

If you're running Windows, your best bet would be to create a Scheduled Task to execute Python itself, passing the path to your script as an argument.
If you're using OSX or Linux, cron is your friend. References abound on how to create scheduled jobs in crontab; any of them is a good start for setting up cron tasks.
One thing to mention is permissions. If you're running this from a Linux machine, you'll want to ensure the cron job runs under the right account (it's best practice not to use your own).
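For example, a minimal crontab sketch for the 11am/5pm case from the question, assuming the script lives at /home/user/ssh_report.py (the path and log file are placeholders, not from the question):

# m h dom mon dow  command - run at 11:00 and 17:00 every day
0 11,17 * * * /usr/bin/python3 /home/user/ssh_report.py >> /home/user/ssh_report.log 2>&1

Add the entry with crontab -e under the account the job should run as.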

Assuming you are running on a *nix system, cron is definitely a good option. If you are running a Linux system that uses systemd, you could try creating a timer unit. It is probably more work than cron, but it has some advantages.
I won't go through all the details here, but basically (a minimal sketch of both units follows the list):
Create a service unit that runs your program.
Create a timer unit that activates the service unit at the prescribed times.
Start and enable the timer unit.
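As a rough sketch, assuming the script lives at /home/user/ssh_report.py and both files go in /etc/systemd/system/ (names, paths, and times are placeholders):

# /etc/systemd/system/ssh-report.service
[Unit]
Description=Run the SSH report script

[Service]
Type=oneshot
ExecStart=/usr/bin/python3 /home/user/ssh_report.py

# /etc/systemd/system/ssh-report.timer
[Unit]
Description=Run the SSH report twice a day

[Timer]
OnCalendar=*-*-* 11:00:00
OnCalendar=*-*-* 17:00:00
Persistent=true

[Install]
WantedBy=timers.target

Start and enable the timer with systemctl enable --now ssh-report.timer; the timer activates the service of the same name. Compared to cron, you get logging via journalctl, and Persistent=true runs a missed job if the machine was off at the scheduled time.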

Related

How do you schedule some python scripts to run regularly on a Windows PC?

I have some Python scripts that I would like to run daily from a Windows PC.
My current workflow is:
The desktop PC stays on all day, every day, except for a weekly restart over the weekend.
After the restart I open VS Code and run a little bash script ./start.sh that kicks off the tasks.
The above works reasonably fine, but it is also fairly painful. I need to re-run start.sh whenever I close VS Code (e.g. for an update). Also, the processes use some local Python libraries, so I need to stop them whenever I want to update those libraries.
As for how to do this properly, four tools came to mind:
Windows Scheduler
Airflow
Prefect (https://www.prefect.io/)
Rocketry (https://rocketry.readthedocs.io/en/stable/)
However, I can't quite get my head around the fundamental issue that if Prefect/Airflow/Rocketry run on my PC, there is nothing that will restart them after the PC reboots. I'm also not sure these tools will give me the isolation I'd prefer.
Docker came to mind: I could put each task into a Docker image and run them via some form of Docker swarm or something like that. But I'm not sure whether I'm re-inventing the wheel.
I'm 100% sure I'm not the first person in this situation. Could anyone point me to a guide on how this could be done well?
Note:
I am not considering running the python scripts in the cloud. They interact with local tools that are only licenced for my PC.
You can definitely use Prefect for that - it's very lightweight and seems to match what you're looking for. You install it with pip install prefect, start the Orion API server with prefect orion start, and once you create a Deployment and start an agent with prefect agent start -q default, you can even configure the schedule from the UI.
For more information about Deployments, check our FAQ section.
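As a rough illustration, a Prefect 2 flow is just decorated Python; the CLI commands above then create a Deployment with a schedule, and the agent picks it up. The flow below is a placeholder sketch, not part of the original answer:

# flows.py - minimal Prefect 2 flow sketch; requires: pip install prefect
from prefect import flow, task

@task
def run_daily_job():
    # placeholder for whatever start.sh currently kicks off
    print("running the daily job")

@flow
def daily_jobs():
    run_daily_job()

if __name__ == "__main__":
    # runs the flow once locally; the schedule itself lives on the Deployment
    daily_jobs()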
It sounds like Rocketry could also be suitable. Rocketry can shut itself down using a task. You could create a task that:
Runs on the main thread and process (blocking starting new tasks)
Waits or terminates all the currently running tasks (use the session)
Calls session.shut_down(), which sets a shutdown flag for the scheduler.
There is also an app configuration, shut_cond, which is simply a condition: if the condition is true, the scheduler exits, so you can alternatively use that.
Then, after the line app.run(), you simply have a line that runs the shutdown -r (restart) command in the shell, for example using the subprocess library. Then you need something that starts Rocketry again when the restart has completed. For this, perhaps this could be an answer: https://superuser.com/a/954957, or use the Windows scheduler to set up a simple startup task that starts Rocketry.
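A minimal sketch of that idea, using only the pieces named above (the condition string, task name, and the Windows shutdown command are assumptions to adapt):

# restart_via_rocketry.py - sketch only; requires: pip install rocketry
import subprocess
from rocketry import Rocketry
from rocketry.args import Session

app = Rocketry()

@app.task("daily after 03:00", execution="main")
def shut_down_scheduler(session=Session()):
    # runs on the main thread/process, so no new tasks are started meanwhile;
    # waiting for or terminating still-running tasks is omitted in this sketch
    session.shut_down()  # sets the shutdown flag for the scheduler

if __name__ == "__main__":
    app.run()
    # the scheduler has exited - restart the machine (Windows syntax shown)
    subprocess.run(["shutdown", "-r", "-t", "0"])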
Especially if you had Linux machines (Raspberry Pis, for example), you could integrate Rocketry with FastAPI and make a small cluster in which the Rocketry apps communicate with each other; just set the script with Rocketry as a startup service. One machine could be a backup that calls another machine's API, which runs the Linux restart command. The backup then executes tasks until the primary machine answers requests again (is up and running).
As the author of the library I'm possibly biased toward my own projects, but Rocketry is very capable on complex scheduling problems; that's the purpose of the project.
You can use schtasks on Windows to schedule tasks such as running a bash script or a Python script, and it's pretty reliable too.
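For example, a hedged sketch that registers two daily runs, similar to the 11am/5pm case in the first question (task names and paths are placeholders):

schtasks /Create /TN "MyScriptMorning" /SC DAILY /ST 11:00 /TR "C:\Python311\python.exe C:\scripts\myscript.py"
schtasks /Create /TN "MyScriptEvening" /SC DAILY /ST 17:00 /TR "C:\Python311\python.exe C:\scripts\myscript.py"

schtasks /Query /TN "MyScriptMorning" shows the task, and schtasks /Delete /TN "MyScriptMorning" removes it.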

Automation of a PowerShell script that invokes a Python script doesn't work (via Task Scheduler)

I am trying to create a bunch of automations on my PC with Windows, and I encountered some obstacles while trying to automate a Powershell script with Windows Task Scheduler.
Right now, I have managed to set up a scheduled task to run the actual script, but although the whole script works as intended when I start it manually, it doesn't work right when invoked via Task Scheduler.
The PowerShell script is very simple; it is meant to invoke a certain Python script. I also create a log transcript for testing purposes:
python dataset_creator.py --to_import yes --to_export yes
Start-Transcript -Path "<path>\transcript0.txt"
When invoked manually, the Python script works, but via Task Scheduler the only part that works is the transcript creation (ergo, no Python script is run). Task Scheduler itself reports that the task executed properly. This is how I set up the action:
Program: %SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe
Arguments: -NoProfile -NoLogo -NonInteractive -ExecutionPolicy Bypass -File "C:\<path>\powershell_script.ps1"
For now, I have set it to run under the SYSTEM account; I have also tried my admin user account, but that doesn't work either.
Could you suggest potential issues? I have scrolled through several articles and nothing works. I also tried to skip the PowerShell part (i.e. set up the task to call the Python script directly), but that hasn't worked so far either, and I initially wanted to put a number of Python script invocations into a single PowerShell file.
I am also open to suggestions for Windows Task Scheduler alternatives.

What are good ways to deploy and manage Python scripts on a production server?

I've written a lot of Python scripts. Now I want to run them on another computer that runs non-stop, crawling and analyzing data and updating an SQL database.
Normally I open a command prompt and run the scripts:
python [script directory]
But with many scripts I have to open many command prompts, and every script starts its own Python interpreter, so it ends up a huge mess using a lot of memory.
What should I do to manage these scripts?
You haven't specified what OS your server runs, but assuming it's a Linux server, you should probably look into a process management tool such as Supervisord or systemd. These are tools designed to run and monitor your program automatically, and even restart it if it crashes.
If you're using Ubuntu 16.04, it comes with systemd out of the box; however, I personally find Supervisord easier to configure and use for simple tasks.
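For example, a minimal Supervisord program section (all names and paths are placeholders for your crawler script; on Debian/Ubuntu it would typically go into /etc/supervisor/conf.d/crawler.conf):

[program:crawler]
command=/usr/bin/python3 /opt/crawler/crawler.py
directory=/opt/crawler
autostart=true
autorestart=true
stdout_logfile=/var/log/crawler.out.log
stderr_logfile=/var/log/crawler.err.log

One such section per script keeps everything under a single supervisord process, and supervisorctl status shows all of them at a glance.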
These tools won't necessarily help with your memory consumption issues, however. Sure, you can place caps on memory use for a process, but that's not really going to help you if it stops your program from working. You're probably best off re-evaluating your code and looking for ways to reduce its memory footprint, or using a server with more RAM.
EDIT:
You've just added that the OS is Windows 10, which makes the above irrelevant. You can use the Windows Task Scheduler to automatically execute long-running tasks.
You can use pythonw *.py and it will run in the background, without a console window.

How to run a Python package as a scheduled daemon job?

I have created a Python package and released it on PyPI, say spamandeggs. This package is cross-platform (Windows, GNU/Linux, macOS) and is aimed at updating the user with certain information periodically (say every 5 minutes). The package can be run from the command line through the command spamtheeggs.
Here are the issues I am facing:
Question 01: How to daemonize the script running through the spamtheeggs command?
Problem: Following this answer, I tried using schedule in my script. This works fine for scheduling but the execution is not daemonized. The terminal is busy for the entirety of the process.
I would like to know a way to daemonise the Python package.
Question 02: How to add the command as a cron job for scheduled execution?
Solution 01: One way to do this would be to write an installation guide describing the process (editing crontab using crontab -e, etc.).
Drawback: Not appealing.
Solution 02: As the author of the package, I want to be able to add this command to the user's crontab (after getting user's confirmation, obviously).
Options:
Write a Python script to schedule another script (Is this even possible? A sketch of one approach follows below.)
Use a task scheduler which can also daemonize.
I would like to know which option is suitable (if any) and any tips on how I can go about working on them.
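On the "Python script to schedule another script" option: this is possible on Linux/macOS, for example with the python-crontab package, which edits the user's crontab programmatically. A hedged sketch (python-crontab is not mentioned in the question; the command and interval are placeholders):

# install_job.py - sketch only; requires: pip install python-crontab; POSIX only
from crontab import CronTab

def install_cron_job():
    cron = CronTab(user=True)                      # the current user's crontab
    job = cron.new(command="spamtheeggs",          # the package's CLI entry point
                   comment="spamandeggs updater")  # marker so the job can be found/removed later
    job.minute.every(5)                            # run every 5 minutes
    cron.write()                                   # persist the modified crontab

if __name__ == "__main__":
    install_cron_job()

This covers the cron half only; it does not daemonize anything and does nothing on Windows.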
List of resources I have read so far:
How to run a python background process periodically
Scheduling Python Script to run every hour accurately
How to package a Python daemon with setuptools
Execute python Script on Crontab
running a python script with cron
Creating a Cron Job - Linux / Python
How to make Python script run as service?
Note: I would appreciate if the solution is applicable over all the 3 platforms.
P.S: This is my first attempt at cron and daemon jobs.
The python-daemon package can help you; it works in 2 modes:
using a runner (I think this is just what you want): http://www.gavinj.net/2012/06/building-python-daemon-process.html
or running using daemon.DaemonContext directly (a more traditional way)
python-daemon has no documentation, but I've found some code that may help you: http://www.programcreek.com/python/example/10392/daemon.DaemonContext
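A minimal sketch of the daemon.DaemonContext route, combined with the schedule module the question already uses (the work function is a placeholder; POSIX only):

# spam_daemon.py - sketch only; requires: pip install python-daemon schedule
import time
import daemon
import schedule

def update_user():
    pass  # placeholder for the periodic work spamtheeggs does

def main():
    schedule.every(5).minutes.do(update_user)
    while True:
        schedule.run_pending()
        time.sleep(1)

if __name__ == "__main__":
    # detach from the terminal and keep running in the background
    with daemon.DaemonContext():
        main()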

How to create a "watchdog" for a python script running on a shared host (no ssh access or shell scripts)

My friends and I have written a simple Telegram bot in Python. The script runs on a remote shared host. The problem is that for some reason the script stops from time to time, and we want some sort of mechanism to check whether it is running and to restart it if necessary.
However, we don't have SSH access, we can't run bash scripts, and I couldn't find a way to install Supervisord. Is there a way to achieve the same result using a different method?
P.S. I would appreciate it if you gave a detailed explanation, as I'm a newbie hobbyist. However, I have no problem with researching and learning new things.
You can have a small supervisor Python script whose only purpose is to start (and restart) your main application Python script. When your application crashes, the supervisor notices and restarts it.
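A minimal sketch of such a supervisor, assuming the bot's entry point is bot.py in the same directory (adjust the file name and interpreter for your host):

# watchdog.py - restarts bot.py whenever it exits; sketch only
import subprocess
import sys
import time

while True:
    exit_code = subprocess.call([sys.executable, "bot.py"])
    print(f"bot.py exited with code {exit_code}; restarting in 5 seconds")
    time.sleep(5)  # avoid a tight restart loop if the bot crashes immediately

The shared host then only needs to keep watchdog.py itself running, for example via whatever process manager or cron-like feature the hosting panel offers.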
