Using Python to recursively download files from a SharePoint site

Has anyone been able to recursively download files from a SharePoint site using Python? I want to download essentially an entire folder structure and all of its files, primarily image files (.jpg, .png), and store them on my desktop.
Has anyone been able to do this?
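
A minimal sketch of one way to do this, assuming the Office365-REST-Python-Client library (the same one used in a question below); the site URL, credentials, document library, and local target folder are placeholders:

import os
from office365.runtime.auth.user_credential import UserCredential
from office365.sharepoint.client_context import ClientContext

site_url = "https://contoso.sharepoint.com/sites/MYSITE"  # placeholder
ctx = ClientContext(site_url).with_credentials(
    UserCredential("user@contoso.com", "password")  # placeholder credentials
)

def download_folder(folder, local_dir):
    # Recursively mirror a SharePoint folder into a local directory,
    # keeping only the image files
    os.makedirs(local_dir, exist_ok=True)
    folder.expand(["Files", "Folders"]).get().execute_query()
    for f in folder.files:
        if f.name.lower().endswith((".jpg", ".png")):
            with open(os.path.join(local_dir, f.name), "wb") as fh:
                f.download(fh).execute_query()
    for sub in folder.folders:
        download_folder(sub, os.path.join(local_dir, sub.name))

root = ctx.web.get_folder_by_server_relative_url("Shared Documents")
download_folder(root, os.path.expanduser("~/Desktop/sharepoint_dump"))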

Related

Unable to render aspx files when uploaded to SharePoint programmatically

I am new to SharePoint. I have written a simple Python script that connects to SharePoint and uploads files (aspx and other front-end files) from a folder on my local machine to a specific folder on a SharePoint site.
To allow the script to communicate with SharePoint, I have created an App principal using the SharePoint App-Only model. I did this by calling appregnew.aspx, for example: https://spo.test.com/sites/MYSITE/_layouts/15/appregnew.aspx
Then, I granted the permissions below to the App principal through appinv.aspx, for example: https://spo.test.com/sites/MYSITE/_layouts/15/appinv.aspx
<AppPermissionRequests AllowAppOnlyPolicy="true">
  <AppPermissionRequest Scope="http://sharepoint/content/sitecollection/web" Right="FullControl"/>
</AppPermissionRequests>
Next, I use the Client ID and Client Secret in the Python script to authenticate to SharePoint and upload files to a specific folder (the folder already exists and is not created by the program), for example: https://spo.test.com/sites/MYSITE/Shared%20Documents/TeamDocs2
Note: This script uses Python library 'Office365-REST-Python-Client' to communicate with SharePoint
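
For context, a minimal sketch of how such an app-only upload is typically done with that library (the actual script may differ; client_id and client_secret are the values generated by appregnew.aspx):

from office365.runtime.auth.client_credential import ClientCredential
from office365.sharepoint.client_context import ClientContext

site_url = "https://spo.test.com/sites/MYSITE"
ctx = ClientContext(site_url).with_credentials(
    ClientCredential(client_id, client_secret)  # values from appregnew.aspx
)
# The target folder already exists; the script does not create it
folder = ctx.web.get_folder_by_server_relative_url("Shared Documents/TeamDocs2")
with open("index.aspx", "rb") as f:
    folder.upload_file("index.aspx", f.read()).execute_query()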
The script authenticates successfully and uploads the files to the folder on SharePoint. But when I then go to the SharePoint folder manually and click on an aspx file, for example index.aspx, the file gets downloaded instead of rendered.
There is no issue with the file itself, i.e. it is not corrupted: when I manually upload the same file to the same folder, it renders fine.
Regarding permissions for the App principal, I have already granted 'FullControl' at the 'sitecollection/web' scope. I also tried changing the scope from 'http://sharepoint/content/sitecollection/web' to 'http://sharepoint/content/sitecollection', but that did not work either.
Can somebody please help me with this? Thanks in advance.
The reason the .aspx page is being downloaded is a security-risk mitigation in SharePoint. If you consider that JavaScript (.js) and .aspx files are executable in the browser, it should be self-evident that allowing users to upload such files to SharePoint poses a risk. Because of this, Microsoft has disabled custom script on all modern sites by default. You can choose to override this setting (for example, via the DenyAddAndCustomizePages parameter of the Set-SPOSite PowerShell cmdlet), but it should be done with extreme caution.

How to download a file from google drive without showing this action in the change history

If I have access to a folder with some files in Google Drive, is it possible to download this folder without any trace showing up in the history? The owner of the folder should not see that the file was downloaded. If it is possible in Python, please tell me how to do it.
I was thinking of using a script (with the Google API) or some service/site to solve this problem.
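
One hedged note: downloading a file through the Drive API does not modify the file, so nothing is added to its revision history; whether a view is surfaced elsewhere (for example, in the Activity dashboard for Google Docs editor files) depends on the file type and its settings. A minimal download sketch with google-api-python-client, where creds is an already-authorized credential object and the file id is a placeholder:

import io
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

service = build("drive", "v3", credentials=creds)  # your own OAuth credentials
request = service.files().get_media(fileId="FILE_ID")  # placeholder id
buf = io.BytesIO()
downloader = MediaIoBaseDownload(buf, request)
done = False
while not done:
    status, done = downloader.next_chunk()  # downloads the file in chunks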

Save files directly to Sharepoint

I have working code that sends a GET request to a server and downloads files to a local folder. Now I need these files to be available on SharePoint. I can upload the files to SharePoint after they are downloaded locally, but is there a way to run my GET request and save the file directly to SharePoint? I am okay with Python or Node.
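
One option, sketched in Python with the Office365-REST-Python-Client library (the URLs, credentials, and file name are placeholders): since the GET response body is already in memory, you can pass the bytes straight to the SharePoint upload call without writing a local file:

import requests
from office365.runtime.auth.client_credential import ClientCredential
from office365.sharepoint.client_context import ClientContext

resp = requests.get("https://example.com/report.pdf", timeout=30)  # placeholder URL
resp.raise_for_status()

ctx = ClientContext("https://contoso.sharepoint.com/sites/MYSITE").with_credentials(
    ClientCredential(client_id, client_secret)  # placeholder app credentials
)
folder = ctx.web.get_folder_by_server_relative_url("Shared Documents")
folder.upload_file("report.pdf", resp.content).execute_query()  # no local copy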

How to download numpy array files from an online drive

I have a dataset that contains hundreds of NumPy arrays.
I am trying to save them to an online drive so that I can run code against this dataset remotely from a server. I cannot access the server's drive; I can only run code scripts and use the terminal. I have tried Google Drive and OneDrive and looked up how to generate a direct download link from those drives, but it did not work.
In short, I need to be able to fetch those files from my Python scripts. Could anyone give some hints?
You can get the download URLs very easily from Drive. I assume that you have already uploaded the files into a Drive folder. Then you can easily set up a scenario to download the files in Python. First you need a Python environment that can connect to Drive. If you don't currently have one, you can follow this guide. That guide will walk you through installing the required libraries, setting up credentials, and running a sample script. Once you can run the sample script you can make minor modifications to reach your goal.
To download the files you are going to need their ids. I am assuming that you already know them, but if you don't, you can retrieve them by running a Files.list on the folder where you keep the files, using '{FOLDER ID}' in parents as the q parameter.
Then you only have to run a Files.get request for each file id; you will find the download URL in the webContentLink property. Feel free to leave a comment if you need further clarification.
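
A minimal sketch of those two calls with google-api-python-client, where creds comes from the quickstart guide above and the folder id is a placeholder:

from googleapiclient.discovery import build

service = build("drive", "v3", credentials=creds)

folder_id = "FOLDER_ID"  # placeholder
# Files.list: find the ids of all files in the folder
results = service.files().list(
    q=f"'{folder_id}' in parents",
    fields="files(id, name)"
).execute()

# Files.get: fetch the webContentLink (download URL) for each file
for f in results.get("files", []):
    meta = service.files().get(fileId=f["id"], fields="webContentLink").execute()
    print(f["name"], meta.get("webContentLink"))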

Uploading and listing files from a sharable google drive link with Pydrive

I have searched through the APIs but haven't been able to find a way to upload and download (list) files programmatically in a shared folder (rather than my own Google Drive). Is there a way to do this?
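
One approach, sketched with PyDrive under the assumption that you know the shared folder's id (the long token in the sharable link); the folder id and file name are placeholders:

from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LocalWebserverAuth()  # opens a browser for the OAuth consent flow
drive = GoogleDrive(gauth)

shared_folder_id = "SHARED_FOLDER_ID"  # placeholder, taken from the sharable link

# List the files in the shared folder
file_list = drive.ListFile(
    {"q": f"'{shared_folder_id}' in parents and trashed=false"}
).GetList()
for f in file_list:
    print(f["title"], f["id"])

# Upload a file into the shared folder by naming it as the parent
upload = drive.CreateFile({"title": "photo.jpg",
                           "parents": [{"id": shared_folder_id}]})
upload.SetContentFile("photo.jpg")  # local file, placeholder name
upload.Upload()

Note that the folder must be shared with write access for the upload to succeed.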
