Running a Python script as a task from Task Scheduler - python

I have a Python script that runs fine locally on my machine. I am using Spyder for coding. The script needs to run daily at a certain time, so I created a basic task under Task Scheduler.
In the task's action, I set the program to pythonw.exe and add my Python script as the argument.
The problem is that Task Scheduler triggers the task, but the script does not produce any output. The script reads files from a shared drive.
I also tried creating a batch file and running that instead, but I hit the same issue.
Not sure what is wrong.
Can anyone help me with this?

I created the batch file and put that under Task Scheduler, and it worked fine. It looks like I had an issue with access permissions, as I have two accounts on my laptop.
Thanks everyone for the help.
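A note for anyone hitting the same symptom: pythonw.exe runs without a console window, so any exception disappears silently. A minimal sketch that logs failures to a file next to the script (the UNC path and file names are placeholders, not from the original script):

    import logging
    from pathlib import Path

    # Log next to the script itself, so the log is found no matter which
    # working directory Task Scheduler happens to start the process in.
    script_dir = Path(__file__).resolve().parent
    logging.basicConfig(filename=script_dir / "task.log", level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")

    def main():
        # Placeholder: read input from the shared drive via an absolute UNC
        # path, never a path relative to the current working directory.
        data_file = Path(r"\\server\share\input.xlsx")
        logging.info("Reading %s", data_file)
        # ... actual processing ...

    if __name__ == "__main__":
        try:
            main()
        except Exception:
            logging.exception("Script failed")  # full traceback lands in task.log
            raise

Checking task.log after a scheduled run usually reveals whether the failure is a permissions problem (as it was here) or a path problem.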

Related

Jenkins runs only part of a Python script

I have a Python/Locust script that performs various requests. The Jenkins build succeeds, but from the console output I see that several requests are made, all of one type (e.g. login). When I run it locally, the script behaves correctly. I don't think the problem is in the Jenkinsfile, because I've used the same file for other scripts. Has such a thing ever happened to you? How did you solve it?
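A hedged guess at what might be happening: in Locust, the mix of request types is driven by task weights, so if only one method is registered as a task, the console shows a single request type. A minimal sketch of a user class with several weighted tasks (the host and endpoints are illustrative, not from the original script):

    from locust import HttpUser, task, between

    class WebsiteUser(HttpUser):
        host = "https://example.com"  # placeholder target
        wait_time = between(1, 3)

        def on_start(self):
            # Runs once per simulated user, before any task is scheduled.
            self.client.post("/login", json={"user": "demo", "password": "demo"})

        @task(3)  # picked roughly three times as often as view_item
        def browse(self):
            self.client.get("/items")

        @task(1)
        def view_item(self):
            self.client.get("/items/1")

If the Jenkins build picks up a different locustfile revision, or the other methods lack the @task decorator there, only the login traffic would appear.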

Where to run a Python 3.8 script online

I have a folder with different folders in it.
In one of those folders I have a Python script.
The script reads an Excel file (in the same folder), scrapes information from the internet, updates the Excel file, and creates another Excel file in the main directory.
My question is:
As I can't keep my computer running non-stop, I imagine it's possible (easy? and free?) to upload all my folders to a website that will let me run my Python (3.8) script. Do you have any suggestions? Which website would be appropriate? Pythonanywhere.com?
Plus, I'd like to run this script every morning at 6 am.
Thank you for your answers! :)
Yes, you could use PythonAnywhere -- free accounts allow you to create one scheduled task, which can run once a day. If you have an account, you can set it up on the "Tasks" page.
Some public cloud providers, such as GCP, AWS, and Azure, offer free-tier VMs. Simply run the code on one of those and set up a cron job. Even though the network usage will probably still cost you a few cents a month, this is a very cheap way to go. You could also consider setting up a FaaS solution at very low cost.
As @Klaus said, this is not a programming question. If you are on Linux you can use crontab to schedule your process.
And if you want to run it in the cloud, you can use free services like Heroku.
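For the 6 am requirement on a Linux VM, a crontab entry along these lines would do it (the interpreter and paths are placeholders):

    # m h dom mon dow  command
    0 6 * * * /usr/bin/python3.8 /home/user/project/script.py >> /home/user/project/cron.log 2>&1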

Running Caffe creates high load on ksoftirqd/0

I'm running Caffe using Python on AWS.
The script uses the GPU, loads an existing model, and checks the model's output for each image URL.
On the first few tries the script ran well. Then it got stuck, at a different phase each time.
Using 'top' I could see that ksoftirqd/0 takes about 93% of the CPU whenever the process is stuck.
I don't think there is a bug in my script, because it originally ran well on the server. Rebooting the server sometimes helps, but later we hit the same problem again.
Killing all Python processes on the server doesn't help. Only rebooting does.
Any ideas what I can do here?
It seems like you are experiencing a very high network load.
What exactly are you trying to download from URLs?
Are there any other processes running at the same time on the machine?
It is difficult to diagnose your problem without the specifics of your script.
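ksoftirqd burns CPU servicing software interrupts, and heavy network receive traffic (NET_RX) is the usual suspect. To confirm which softirq class is spinning while the process is stuck, one can sample /proc/softirqs; a small diagnostic sketch:

    import time

    def softirq_counts():
        # /proc/softirqs has one row per softirq type, one column per CPU.
        counts = {}
        with open("/proc/softirqs") as f:
            next(f)  # skip the CPU header row
            for line in f:
                name, *cols = line.split()
                counts[name.rstrip(":")] = sum(int(c) for c in cols)
        return counts

    before = softirq_counts()
    time.sleep(5)
    after = softirq_counts()
    for name, start in before.items():
        delta = after[name] - start
        if delta:
            print(f"{name:10s} {delta:>12d} events in 5s")

A huge NET_RX delta would point at the image downloads (or some other network consumer) rather than at Caffe itself.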

Running a Python script that logs into a Spark EC2 cluster and then runs a script

Is there documentation on writing a script that can log into a Spark cluster and run a script there? I've been able to launch clusters with Linux Bash scripts, but I'm wondering if there is anything more general (Python would be great). I would like a script that reads certain parameters and then runs everything automatically, without the user having to log in. I want this to be as easy and intuitive as possible (so someone less tech-savvy can just start the script without having to deal with Spark or AWS), or to be able to run it in the background of a web app.
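There is no single canonical document for this, but one common approach is to drive SSH from Python with paramiko and run spark-submit on the master node. A minimal sketch (the host name, key path, user name, and remote command are all placeholders):

    import paramiko

    def run_on_cluster(host, key_file, command):
        # Connect to the cluster's master node and run a command, just as a
        # user would after ssh'ing in manually.
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname=host, username="ec2-user", key_filename=key_file)
        try:
            stdin, stdout, stderr = client.exec_command(command)
            return stdout.read().decode(), stderr.read().decode()
        finally:
            client.close()

    out, err = run_on_cluster(
        "ec2-0-0-0-0.compute-1.amazonaws.com",  # placeholder master address
        "/path/to/key.pem",
        "spark-submit /home/ec2-user/job.py --some-param value",
    )
    print(out or err)

Wrapping this in a small CLI (argparse) or a web endpoint would give the less tech-savvy user a one-command entry point.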

Modifying a crontab from a cron job

I have a Python script that runs in several modes. One of those modes monitors certain files, and if those files have been modified, the script restores them. The way I do this is to run the script every minute via cron.
Another cron job exists (actually the same script called with a different argument) to remove the script from the crontab when the scheduled time has elapsed. Initially, I was attempting to work with a crontab in /etc/cron.d. The script behaves as expected if executed on the command line, but does not edit the crontab when it is run from cron.
I then switched to writing a temporary file and executing crontab tempfile (via subprocess.Popen) from the script. This doesn't work either, as the crontab is simply not created. Executing crontab tempfile from the commandline and using the temporary file created by the script works as expected.
I can't use the python-crontab library as this is a commercial project and that library is GPLed.
Are there any inherent limitations in cron that prevent either approach from working?
The GPL is not anti-commercial. python-crontab can be used in commercial products and services. You must only follow the copyleft rules, which state that the code itself can't be made proprietary. You can sell it as much as you like, and as the author I encourage you to make money from my work.
Besides that misconception, it doesn't look like your problem requires python-crontab anyway. You could just open the files yourself, and if that doesn't work, it was never going to work with python-crontab either.
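As a debugging step, capturing crontab's exit status and stderr when the script runs under cron should show why the install fails; cron jobs run with a stripped-down environment and, in /etc/cron.d, under whatever user the entry names. A sketch following the temp-file approach from the question (the example entry is a placeholder):

    import subprocess
    import tempfile

    def install_crontab(entries):
        # Write the desired crontab to a temp file, then hand it to crontab(1).
        with tempfile.NamedTemporaryFile("w", suffix=".cron", delete=False) as f:
            f.write("\n".join(entries) + "\n")  # crontab requires a trailing newline
            path = f.name
        # Capture stderr: under cron there is no terminal attached, so
        # crontab's own error message would otherwise vanish.
        result = subprocess.run(["crontab", path], capture_output=True, text=True)
        if result.returncode != 0:
            raise RuntimeError(f"crontab failed ({result.returncode}): {result.stderr}")

    install_crontab(["* * * * * /usr/bin/python3 /opt/monitor.py --check"])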
