I uploaded a few files to Google Drive using my Gmail account. Now my customers have to go to the Drive folder and download those files. The files have read and write access, and customers can download them without a Gmail account.
I looked into how I can automate this download process using Python and found PyDrive, a wrapper for the Google Drive REST API. For this API to work, I need to get credentials from the Google Console.
I want to send that Python script to users so that they can download the files, but:
Is it necessary to send the credentials as well?
Can users not download the files without a Gmail account?
Can't I just write a script, send it to customers, and have them run it without any credentials or a Gmail account?
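If the files are shared as "anyone with the link can view", no Gmail account or API credentials are needed at all: a script can fetch each file's direct-download URL. A minimal sketch, assuming the third-party requests library; the file ID in the example is a placeholder:

```python
import requests  # third-party: pip install requests

def direct_download_url(file_id: str) -> str:
    """Build the direct-download URL for a publicly shared Drive file."""
    return f"https://drive.google.com/uc?export=download&id={file_id}"

def download_public_file(file_id: str, dest_path: str) -> None:
    """Download a file shared as 'anyone with the link' -- no OAuth needed."""
    response = requests.get(direct_download_url(file_id), stream=True)
    response.raise_for_status()
    with open(dest_path, "wb") as out:
        for chunk in response.iter_content(chunk_size=32768):
            out.write(chunk)

if __name__ == "__main__":
    # 'FILE_ID' is a placeholder -- replace with a real Drive file ID.
    download_public_file("FILE_ID", "report.pdf")
```

Note that for large files Google may insert a virus-scan confirmation page that needs extra handling; PyDrive (and its credentials) is only required for private files or for listing folder contents via the API.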
I am new to SharePoint. I have written a simple Python script that connects to SharePoint and uploads files (.aspx and other front-end files) from a folder on my local machine to a specific folder on a SharePoint site.
To allow the script to communicate with SharePoint, I have created an App principal using the SharePoint App-Only model. I did this by calling appregnew.aspx, for example: https://spo.test.com/sites/MYSITE/_layouts/15/appregnew.aspx
Then, I granted the permissions below to the App principal through appinv.aspx, for example: https://spo.test.com/sites/MYSITE/_layouts/15/appinv.aspx
<AppPermissionRequests AllowAppOnlyPolicy="true">
<AppPermissionRequest Scope="http://sharepoint/content/sitecollection/web" Right="FullControl"/>
</AppPermissionRequests>
Next, I use the Client ID and Client Secret in the Python script to establish communication with SharePoint and upload files to a specific folder (the folder already exists and is not created by the program). An example path to which files are uploaded: https://spo.test.com/sites/MYSITE/Shared%20Documents/TeamDocs2
Note: the script uses the Python library 'Office365-REST-Python-Client' to communicate with SharePoint.
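For reference, the upload portion of such a script looks roughly like the sketch below. It is a sketch only: the site URL, Client ID/Secret, folder path, and file name are placeholders, and the exact Office365-REST-Python-Client method names may differ between library versions.

```python
import os

def server_relative_url(site_path: str, folder: str) -> str:
    """Join the site's server-relative path and a folder path, e.g.
    '/sites/MYSITE' + 'Shared Documents/TeamDocs2'."""
    return site_path.rstrip("/") + "/" + folder.strip("/")

if __name__ == "__main__":
    # Third-party: pip install Office365-REST-Python-Client
    from office365.runtime.auth.client_credential import ClientCredential
    from office365.sharepoint.client_context import ClientContext

    site_url = "https://spo.test.com/sites/MYSITE"        # placeholder
    ctx = ClientContext(site_url).with_credentials(
        ClientCredential("CLIENT_ID", "CLIENT_SECRET"))    # placeholders

    target = ctx.web.get_folder_by_server_relative_url(
        server_relative_url("/sites/MYSITE", "Shared Documents/TeamDocs2"))
    local_path = "index.aspx"                              # placeholder
    with open(local_path, "rb") as f:
        target.upload_file(os.path.basename(local_path),
                           f.read()).execute_query()
```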
The script can successfully authenticate itself and upload the files to the folder on SharePoint. But when I manually go to the SharePoint folder and click on an .aspx file (e.g. index.aspx), the file gets downloaded instead of rendered.
There is no issue with the file itself, i.e. it is not corrupted: when I manually upload the same file to the same folder, it renders fine.
Regarding the permissions for the App principal, I've already granted 'FullControl' at the 'sitecollection/web' scope. I also tried changing the scope from 'http://sharepoint/content/sitecollection/web' to 'http://sharepoint/content/sitecollection', but that didn't work either.
Can somebody please help me with this? Thanks in advance.
The reason the .aspx page is being downloaded comes down to security risk mitigation in SharePoint. If you consider that JavaScript (.js) files and .aspx files are executable in the browser, it should be self-evident that allowing users to upload such files to SharePoint poses a risk. Because of this, Microsoft has disabled custom script on all modern sites by default. You can choose to override this setting, but it should be done with extreme caution.
I'm creating a CLI tool to move files around in a user's Google Drive space. I'm using Python and the Google Drive API Python SDK, and I've created this repo.
Now I have to run this tool every midnight to move files from one folder to another, with input from the user. Locally, I can create credentials.json, retrieve credentials from it, generate the token.json file, and use it to authenticate against the Drive API. But in a public CI environment, I would save my credentials in a GitHub Secret and pass them to my tool at runtime using an option.
Can I do that?
Are there any security issues?
I would like to publish this tool on some forums (like the Python subreddit) to get suggestions and improvements, but before doing that, I'd like to make this move-file function as complete as possible.
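Yes, this is a common pattern. One approach, sketched below under the assumption that you switch to a service-account key (the variable name GDRIVE_SA_KEY and all IDs are placeholders): store the JSON key in a GitHub Secret, expose it to the job as an environment variable, and parse it at runtime instead of reading a file from disk.

```python
import json
import os

def load_service_account_info(env_var: str = "GDRIVE_SA_KEY") -> dict:
    """Parse the service-account JSON key stored in an environment
    variable (populated from a GitHub Secret in CI)."""
    return json.loads(os.environ[env_var])

if __name__ == "__main__":
    # Third-party: pip install google-api-python-client google-auth
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_info(
        load_service_account_info(),
        scopes=["https://www.googleapis.com/auth/drive"])
    drive = build("drive", "v3", credentials=creds)

    # Moving a file in Drive v3 means swapping its parent folders
    # ('FILE_ID' and the folder IDs are placeholders).
    drive.files().update(fileId="FILE_ID",
                         addParents="DEST_FOLDER_ID",
                         removeParents="SRC_FOLDER_ID").execute()
```

The main security concern is leakage: never echo the secret in logs, and prefer a service account that only has access to the folders involved, so a compromised CI job cannot touch the rest of the Drive space.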
I have the access token and the file ID of the file I need to download. I got both after authenticating the user in a React application and using the File Picker API.
I need to pass them to a Python script that can download the file without user intervention (without asking for the user's permission again).
Any suggestions on how to do it?
I do not have a shareable link; I have the file ID and the access token obtained after authentication. I need to download the file through another application written in Python.
The answers on Stack Overflow for downloading a Drive file using Python all perform OAuth authentication first, but in my case I need to download the file without that authentication step.
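Since a valid access token is already in hand, no new OAuth flow is required: the Drive v3 files endpoint accepts the token as a Bearer header, and alt=media returns the file's content. A minimal sketch, assuming the third-party requests library; the file ID and token are placeholders passed in from the React app:

```python
import requests  # third-party: pip install requests

DRIVE_API = "https://www.googleapis.com/drive/v3/files"

def media_url(file_id: str) -> str:
    """URL that returns the file's content rather than its metadata."""
    return f"{DRIVE_API}/{file_id}?alt=media"

def download_with_token(file_id: str, access_token: str, dest_path: str) -> None:
    """Download a Drive file using an already-issued OAuth access token."""
    headers = {"Authorization": f"Bearer {access_token}"}
    response = requests.get(media_url(file_id), headers=headers, stream=True)
    response.raise_for_status()
    with open(dest_path, "wb") as out:
        for chunk in response.iter_content(chunk_size=32768):
            out.write(chunk)

if __name__ == "__main__":
    # Both arguments are placeholders handed over by the React app.
    download_with_token("FILE_ID", "ACCESS_TOKEN", "downloaded.bin")
```

Keep in mind that access tokens are short-lived (typically about an hour), so for unattended use the Python side must receive a fresh token, or a refresh token, from the authenticating application.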
I want a Python script that uploads files to Google Drive periodically in the background.
I have created a separate Google Drive account for this purpose. The files should be uploaded to this account, not the user's personal account, and the script should upload them without asking the user to log in to their Google account. How can I achieve this? All the samples I see open the login page in the browser.
As of now, I am using this code:
https://gist.github.com/macieksk/038b201a54d9e804d1b5
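One way to avoid the browser login entirely, as an alternative to the user-OAuth flow in that gist, is a service account: a robot identity whose JSON key the script loads directly. The sketch below assumes the target folder is owned by or shared with the service account; the key file, file names, and folder ID are all placeholders.

```python
import time

def file_metadata(name: str, folder_id: str) -> dict:
    """Drive v3 metadata that places the upload in a specific folder."""
    return {"name": name, "parents": [folder_id]}

if __name__ == "__main__":
    # Third-party: pip install google-api-python-client google-auth
    from google.oauth2 import service_account
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    creds = service_account.Credentials.from_service_account_file(
        "service_account.json",  # placeholder key-file name
        scopes=["https://www.googleapis.com/auth/drive.file"])
    drive = build("drive", "v3", credentials=creds)

    while True:  # naive periodic loop; cron/systemd timers also work
        media = MediaFileUpload("backup.zip", resumable=True)  # placeholder
        drive.files().create(body=file_metadata("backup.zip", "FOLDER_ID"),
                             media_body=media).execute()
        time.sleep(3600)  # upload once an hour
```

For the files to land in the dedicated account's Drive, share the target folder with the service account's e-mail address from that account; the service account then uploads there without any interactive login.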
I want to upload images to Microsoft Azure through a Python script and show those images in a dashboard built on the Django admin interface. Since I am sending pictures, I figured I should use FTP. This is the code:
import ftplib

# Open an FTP session (hostname and credentials are placeholders)
session = ftplib.FTP('server.address.com', 'USERNAME', 'PASSWORD')
with open('kitten.jpg', 'rb') as file:           # file to send
    session.storbinary('STOR kitten.jpg', file)  # send the file
session.quit()                                   # close the FTP session
Now, I don't know how to set up an FTP server in Azure, or how I would fetch those images from the server into my dashboard. I don't know much about deployment, so any link or guide for doing this would be welcome.
It sounds like you are trying to create a Django app to upload and show images via FTP on Azure Web Apps.
In my experience, the only feasible way to do that on Azure Web Apps is reading and writing images via Kudu FTP; please refer to the official wiki page for Kudu FTP, and set up the username and password via the Azure portal.
However, I don't think this is a best practice for serving images on Azure Web Apps: FTP normally has concurrency limits for uploading and downloading that are not suitable for your scenario, and the storage exposed via FTP on an Azure Web App is meant for the app itself, not for serving resources.
So my suggestion is to use Azure Blob Storage to read and write images; the Django framework supports integrating with Azure Storage via simple configuration. Please refer to the Django documentation reference for Azure Storage to learn how to do it.
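For instance, with the django-storages package, pointing Django's default file storage at Blob Storage is a few settings. A minimal settings.py sketch, where the account name, key, and container are placeholders:

```python
# settings.py -- minimal sketch, assuming pip install django-storages[azure];
# 'mystorageaccount', the key, and 'media' are placeholders for your values.
DEFAULT_FILE_STORAGE = "storages.backends.azure_storage.AzureStorage"
AZURE_ACCOUNT_NAME = "mystorageaccount"
AZURE_ACCOUNT_KEY = "<account-key>"
AZURE_CONTAINER = "media"
```

With this in place, ImageField/FileField uploads from the admin should go straight to the Blob container, and no FTP server is involved at all.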