Python Task Scheduler Permission Error

I am trying to run a Python script that downloads Qualtrics survey responses and stores them as a CSV. As part of the script it calls zipfile.ZipFile. When I run the program inside the VS Code terminal window it downloads and unzips the file just fine. However, when I run the script through Windows Task Scheduler I run into this error:
<class 'PermissionError'>
Traceback (most recent call last):
File "C:\Users\sdr-research\Dropbox (PCC)\SDR6\Qualtrics_Data\QualtricsSurveyResponses\mindshareResponsesPull.py", line 54, in <module>
zipfile.ZipFile(io.BytesIO(requestDownload.content)).extractall("C:\\Users\sdr-research\Dropbox (PCC)\SDR6\Qualtrics_Data\QualtricsSurveyResponses")
File "C:\Python36\lib\zipfile.py", line 1501, in extractall
self._extract_member(zipinfo, path, pwd)
File "C:\Python36\lib\zipfile.py", line 1555, in _extract_member
open(targetpath, "wb") as target:
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\sdr-research\\Dropbox (PCC)\\SDR6\\Qualtrics_Data\\QualtricsSurveyResponses\\Mindshare English v2.1.csv'
I am running Python 3.6 on Windows 10. I have tried changing the task's user account from my logged-in account to the administrator's, putting the file's path in as the program to run, and putting python.exe as the program to run with the file's path as the first argument, both with and without the file's folder as the "start in" parameter. When I put the file name into Task Scheduler I make sure to put quotation marks around the path, because there is a space in it.
Any ideas?

I had a similar issue with a Python script not being able to create a log file when launched from Task Scheduler (when I scheduled Selenium to do a daily browser task). It was solved by checking "Run with highest privileges" in the General tab of the task.
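Separately, it may be worth tidying the extraction call itself: the path in the traceback mixes doubled and single backslashes, which probably isn't the cause of the PermissionError but is easy to get wrong. A minimal sketch of the download-and-extract step using raw strings for the Windows path (the URL and API-token header are placeholders, not the real Qualtrics endpoint):
import io
import zipfile
import requests

# Placeholder endpoint and token -- the real Qualtrics export URL and
# X-API-TOKEN value are not shown in the question.
download_url = "https://example.qualtrics.com/API/v3/responseexports/FILE_ID/file"
headers = {"X-API-TOKEN": "YOUR_TOKEN"}

# A raw string avoids accidental escape sequences in the Windows path.
extract_dir = r"C:\Users\sdr-research\Dropbox (PCC)\SDR6\Qualtrics_Data\QualtricsSurveyResponses"

response = requests.get(download_url, headers=headers)
response.raise_for_status()

# Unzip the downloaded bytes straight from memory into the target folder.
with zipfile.ZipFile(io.BytesIO(response.content)) as archive:
    archive.extractall(extract_dir)
If it still fails only under Task Scheduler, the remaining suspects are the account the task runs as and something (for example Dropbox syncing) holding the existing CSV open.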

Related

Python on RPi: Errno 2 No such file or directory: 'config.json' but config.json is in the same folder as the main.py script

Hey all,
I am trying to run a Python script from GitHub, https://github.com/dudisgit/gmod_toolgun_prop, for a project with a functioning screen, and I put a command at the end of the .bashrc file
python3 /home/pi/gmod_toolgun_prop-main/main.py
so that the code executes as soon as the RPi powers up. When I run the script in the Thonny Python IDE on my RPi 2B it executes with no problem and the screen works. However, when I open a terminal I get an error message from the code run by the .bashrc file:
Traceback (most recent call last):
File "/home/pi/gmod_toolgun_prop-main/main.py", line 381, in <module>
main()
File "/home/pi/gmod_toolgun_prop-main/main.py", line 357, in main
with open(args.config) as config_file:
FileNotFoundError: [Errno 2] No such file or directory: 'config.json'
However, the config.json file is in the same folder as the main.py file as shown below:
Screenshot of file explorer showing config.json is in the same folder as main.py
And here's the code referenced in the error message as line 357:
with open(args.config) as config_file:
config = json.load(config_file)
The entire main.py script, along with the config.json file and the other relevant files, is in the GitHub link from the first paragraph.
I am fairly new to Python, so I don't understand what could be causing this error or how this script handles opening the config.json file.
I have tried creating a custom service, but it spits out the same error. Crontab and the local .bashrc file just don't work for this at all. This is the furthest I have got with getting it to execute on boot.
Maybe it helps to use the absolute path of config.json, i.e. (according to your description)
/home/pi/gmod_toolgun_prop-main/config.json
instead of the simple filename config.json.
If Python says the file is not there, then the file is not there. The only question is, where is "there"?
The filename in this case is config.json. Since there's no / (no directory name), the name is taken to be relative to the current working directory. That might or might not be the same as the directory of the main Python module, here /home/pi/gmod_toolgun_prop-main/main.py.
You can verify that by printing the current working directory just before opening the file; os.getcwd() will tell you. Or use strace(1) to show the interpreter's attempts to open config.json.
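Putting those two suggestions together, a minimal sketch (assuming the layout described in the question) that logs the working directory and resolves config.json relative to main.py itself, so the open no longer depends on where the script was launched from:
import json
import os

# Show where the interpreter is actually running from -- launched from
# .bashrc this is typically /home/pi, not the script's folder.
print("Current working directory:", os.getcwd())

# Build the path to config.json from this script's own location instead of
# relying on the current working directory.
script_dir = os.path.dirname(os.path.abspath(__file__))
config_path = os.path.join(script_dir, "config.json")

with open(config_path) as config_file:
    config = json.load(config_file)
Alternatively, you could keep the script unchanged and cd /home/pi/gmod_toolgun_prop-main in .bashrc before running it.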

Getting botocore.exceptions.NoCredentialsError: Unable to locate credentials, especially when using crontab

I understand there are a couple of answers for this error already. However, my issue is different, hence the question.
I am pushing audio files from a Raspberry Pi to AWS S3. The upload works without any problem when I run the script manually. However, the same script, when run from a crontab entry, throws the above error.
My python code is as follows:
import argparse
import os

import boto3

# Parse the path of the file to upload from the command line.
ap = argparse.ArgumentParser()
ap.add_argument("-f", "--filetoupload", required=True, help="file to upload")
args = vars(ap.parse_args())

# Use the bare file name as the S3 object key.
file = os.path.basename(args['filetoupload'])

client = boto3.client('s3', region_name='ap-south-1')
print("[INFO:] Uploading file to cloud")
client.upload_file(args['filetoupload'], 'MyS3Bucket', file)
print("[INFO:] File upload completed successfully")
I am calling this Python script from a bash script:
python /home/pi/s3upload.py --filetoupload /home/pi/upload/${FILENAME}.mp3
The mp3 file gets created and, as said above, when I run the script manually it works without any trouble.
I checked the config and credentials files within the .aws folder. As indicated in other answers on SO, they start with [default] and have the right credentials set up.
The complete error message is as follows:
[INFO:] Uploading file to cloud
Traceback (most recent call last):
File "/home/pi/s3upload.py", line 10, in <module>
client.upload_file(args['filetoupload'],'MyS3Bucket',file)
File "/usr/local/lib/python2.7/dist-packages/boto3/s3/inject.py", line 110, in upload_file
extra_args=ExtraArgs, callback=Callback)
File "/usr/local/lib/python2.7/dist-packages/boto3/s3/transfer.py", line 279, in upload_file
future.result()
File "/usr/local/lib/python2.7/dist-packages/s3transfer/futures.py", line 73, in result
return self._coordinator.result()
File "/usr/local/lib/python2.7/dist-packages/s3transfer/futures.py", line 233, in result
raise self._exception
botocore.exceptions.NoCredentialsError: Unable to locate credentials
script finished
When crontab runs a script, the $HOME variable is not necessarily set to what your interactive shell uses; in some setups it ends up as /.
AWS looks for its credentials in a location which resolves to $HOME/.aws.
So you need to either move your .aws directory to wherever $HOME points, which is probably not a good idea, or redefine $HOME to point to the folder that contains your .aws directory.
The easiest way to do this, if you are running your Python script from a shell script, is to add:
export HOME=/home/pi
to the start of your shell script.
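Alternatively, you can sidestep $HOME entirely by telling botocore where the credentials live from inside the Python script, before the client is created. A sketch, assuming the files really are in /home/pi/.aws as described:
import os
import boto3

# Point botocore at the credentials and config files explicitly, so the
# upload no longer depends on what $HOME is set to under cron. These
# environment variables are read when the client resolves its credentials.
os.environ["AWS_SHARED_CREDENTIALS_FILE"] = "/home/pi/.aws/credentials"
os.environ["AWS_CONFIG_FILE"] = "/home/pi/.aws/config"

client = boto3.client("s3", region_name="ap-south-1")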

Python Script can see the Y:/ Drive when run from IDLE but not the command line

I have a Python script that builds a file by searching through folders and pulling in a list of files. The script runs fine and works as expected when I open and run it in IDLE, but when I run it in a command-line window I get this error:
C:\Windows\system32>python "C:\Users\ntreanor\Documents\RV Scripts\Server RV Sequence.py"
Traceback (most recent call last):
File "C:\Users\ntreanor\Documents\RV Scripts\Server RV Sequence.py", line 69,
in <module>
for foldername in os.listdir(pngFolders):
WindowsError: [Error 3] The system cannot find the path specified:
'Y:/20_temp_script_testing/pr126 movs\\04_comp_pngs/*.*'
In case it's not obvious: yes, the path does exist. It not only works in IDLE, I also double-checked and the path definitely exists.
I also tried to create folders with a script that runs as a daemon and got a similar result:
Traceback (most recent call last):
File "D:\shotgun\shotgunEventDaemon.py", line 888, in process
self._callback(self._shotgun, self._logger, event, self._args)
File "D:\shotgun\plugins\CreateAssetFolders.py", line 72, in createAssetFolders
os.makedirs(folder)
File "D:\Python27\Lib\os.py", line 150, in makedirs
makedirs(head, mode)
File "D:\Python27\Lib\os.py", line 150, in makedirs
makedirs(head, mode)
File "D:\Python27\Lib\os.py", line 150, in makedirs
makedirs(head, mode)
File "D:\Python27\Lib\os.py", line 157, in makedirs
mkdir(name, mode)
WindowsError: [Error 3] The system cannot find the path specified: 'Y:/'
This is what the script logged as a folder right before that:
Making folder:
Y:/07_design/04_environmental_elements\eec005-08_insect_ladybird_red_7_spots_wide
(the reason it's saying Y:/ and not the whole path is that os.makedirs works backwards, attempting to create each missing parent folder until it can't go back any further, and that's when the exception is thrown)
Are the environment variables of the command-line window somehow affecting the drive mapping that should be pointing the script to the right location?
The issue is likely that IDLE and your command line are running with different levels of privilege. Mapped network drives are not automatically available to all user contexts. There is a Super User question on this here, and plenty of other resources cover this topic. In short, a mapped network drive is only available to programs running at the level the mapping was made at.
If you mapped the network drive through the Windows UI, then it will be mapped for un-elevated programs. However, if it was mapped with net use, then it depends on the elevation of the command prompt when the mapping was made.
Disabling UAC will also change this behaviour, as will use of an elevated (or not) command prompt, which may explain why some PCs display different behaviour.
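If the drive letter turns out not to be visible in the context the script runs in, one workaround is to address the share by its UNC path instead of the Y: letter. A sketch; \\fileserver\share is a made-up name standing in for whatever the Y: mapping actually points to:
import os

# Hypothetical UNC path for the Y: mapping -- replace with the real server
# and share name.
base = r"\\fileserver\share\20_temp_script_testing\pr126 movs\04_comp_pngs"

for foldername in os.listdir(base):
    print(foldername)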
I think part of your problem is also that you're passing a wildcard pattern to os.listdir. os.listdir takes its argument as a literal path and doesn't expand * in any way, so it expects the value to be an existing directory. You should change your code to pass the directory itself and then walk or filter its contents.
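For example, a sketch based on the path from the first traceback, keeping the directory and the wildcard separate:
import glob
import os

folder = r"Y:\20_temp_script_testing\pr126 movs\04_comp_pngs"

# os.listdir wants a plain directory, not a wildcard pattern.
for foldername in os.listdir(folder):
    print(foldername)

# If you do want pattern matching, glob expands the wildcard for you.
for path in glob.glob(os.path.join(folder, "*.*")):
    print(path)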

Python - "IOError: [Errno 13] Permission denied" when running cron job but not when running from command line

I have SSHed from my local machine (a Mac) to a remote machine called “ten-thousand-dollar-bill” as the user “chilge”.
I want to run a Python script in the folder “/afs/athena.mit.edu/c/h/chilge/web_scripts” that generates and saves a .png image to the folder “/afs/athena.mit.edu/c/h/chilge/www/TAF_figures/KORD/1407”. When I run the script from the command line, the image is generated and saved without any issues. When I run the script as a cron job, though (the crontab resides in “/afs/athena.mit.edu/c/h/chilge/cron_scripts”), I get the following error:
Traceback (most recent call last):
File "/afs/athena.mit.edu/user/c/h/chilge/web_scripts/generate_plots.py", line 15, in
save_taffig(taf,fig)
File "/afs/athena.mit.edu/user/c/h/chilge/web_scripts/plotting.py", line 928, in save_taffig
fig.savefig(os.getcwd()+'/'+savename+'.png')
File "/usr/lib64/python2.7/site-packages/matplotlib/figure.py", line 1084, in savefig
self.canvas.print_figure(*args, **kwargs)
File "/usr/lib64/python2.7/site-packages/matplotlib/backend_bases.py", line 1923, in print_figure
**kwargs)
File "/usr/lib64/python2.7/site-packages/matplotlib/backends/backend_agg.py", line 443, in print_png
filename_or_obj = file(filename_or_obj, 'wb')
IOError: [Errno 13] Permission denied: '/afs/athena.mit.edu/user/c/h/chilge/www/TAF_figures/KORD/1407/140723-1200_AMD_140723-1558.png'
I believe I’ve correctly changed the permissions of all of the necessary directories, but I’m still getting this error. I am not sure why the script would run fine from the command line but fail when I try to run it as a cron job.
(Also, I’m not sure if this will be relevant, but I don’t have sudo permissions on the remote machine.)
Maybe another program has the file you want to overwrite open?
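It can also help to log what the cron environment actually looks like and to save to an explicit directory rather than one built from os.getcwd(). A sketch of a hypothetical helper (save_taffig_to is not part of the question's code), using the output directory named in the question:
import os

def save_taffig_to(fig, savename, out_dir):
    # Log what the cron job actually sees -- the working directory and write
    # access often differ from an interactive shell.
    print("cwd:", os.getcwd())
    print("output dir writable:", os.access(out_dir, os.W_OK))
    # Save to an explicit directory instead of relying on os.getcwd().
    fig.savefig(os.path.join(out_dir, savename + ".png"))
If the directory reports as writable under cron but savefig still fails, the difference is more likely in the job's credentials or environment than in the directory permissions themselves.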

Python script runs with double-click and IDLE but not windows CMD shell

I have a problem where if I double-click my script (.py), or open it with IDLE, it will compile and run correctly. However, if I try to run the script from my Windows command line, using
C:\> "C:\Software_Dev\Python 2.7.1\python.exe" C:\path\to\script\script.py
I get...
Traceback (most recent call last):
File "C:\path\to\script\script.py", line 66, in <module>
a.CheckTorrent()
File "C:\path\to\script\script.py", line 33, in script
self.WriteLog(fileName)
File "C:\path\to\script\script.py", line 54, in WriteLog
myFile = open(r'%s' %(filename), 'w')
IOError: [Errno 13] Permission denied: './TorrentMonitor.log'
So my question is, why am I getting permission errors when I run this script from the command line in Windows 7 but not when I double-click it? What's the difference between those two processes?
Thanks in advance!
The script is trying to write a file into the current directory. In the example above, you're starting it from C:\, where you probably don't have write permissions.
cd to a directory that you own, and you should be able to run that command just fine.
This is because when you double-click the file (or run it from IDLE), the current working directory is the directory that contains your script. When starting it from the command line, it's C:\, which you don't seem to have write access to.
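If the script should work no matter where it is launched from, another option is to build the log path from the script's own location. A sketch, reusing the TorrentMonitor.log name from the traceback:
import os

# Write the log next to the script itself, so the current working directory
# no longer matters.
log_path = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                        "TorrentMonitor.log")

with open(log_path, "w") as log_file:
    log_file.write("log entry\n")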
