How to download numpy array files from an online drive - python

I have a dataset containing hundreds of numpy arrays.
I am trying to save them to an online drive so that I can run my code with this dataset remotely from a server. I cannot access the server's drive; I can only run code scripts and use the terminal. I have tried Google Drive and OneDrive, and looked up how to generate a direct download link from those drives, but it did not work.
In short, I need to be able to get those files from my python scripts. Could anyone give some hints?

You can get the download URLs quite easily from Drive. I assume that you have already uploaded the files into a Drive folder; from there, downloading them from Python is straightforward to set up. First you need a Python environment that can connect to Drive. If you don't have one yet, you can follow this guide, which installs the required libraries, sets up credentials, and runs a sample script. Once the sample script works, only minor modifications are needed to reach your goal.
To download the files you are going to need their ids. I assume that you already know them, but if you don't, you can retrieve them with a Files.list request on the folder where you keep the files, using '{ FOLDER ID }' in parents as the q parameter.
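For reference, a minimal sketch of that listing call with google-api-python-client, assuming `service` is the authenticated Drive v3 client from the setup guide and the folder id is a placeholder:

    folder_id = "FOLDER_ID"  # placeholder: the id of your Drive folder

    # List the files in the folder, asking only for id and name
    response = service.files().list(
        q=f"'{folder_id}' in parents",
        fields="files(id, name)",
    ).execute()

    for f in response.get("files", []):
        print(f["id"], f["name"])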
Then, to download a file, you only have to run a Files.get request with the file id; you will find the download URL in the webContentLink property. Feel free to leave a comment if you need further clarification.
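A sketch of that step with the same assumed `service` client. Note that webContentLink is a browser-facing URL; for a purely scripted download, files().get_media streams the bytes directly:

    import io
    from googleapiclient.http import MediaIoBaseDownload

    file_id = "FILE_ID"  # placeholder: one id from the listing above

    # Read the webContentLink property described above
    meta = service.files().get(fileId=file_id, fields="webContentLink").execute()
    print(meta["webContentLink"])

    # Scripted download of the file contents via the API
    request = service.files().get_media(fileId=file_id)
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()

    with open("array.npy", "wb") as fh:  # then e.g. np.load("array.npy")
        fh.write(buf.getvalue())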

Related

How to download a file from google drive without showing this action in the change history

If I have access to a folder with some files in Google Drive, is it possible to download this folder without any trace of it appearing in the change history? The owner of the folder should not see that the file was downloaded. If this is possible in Python, please tell me how to do it.
I have been thinking about a script (using the Google API) or some service/site to solve this problem.

Accessing public google drive file from python colab

I have the following link:
https://drive.google.com/file/d/XXXX/view?usp=sharing
It's a csv file with public access which I want to use as a dataset.
I don't know its name; I want to access it and read its content as simply as possible.
This notebook should run in different environments, so it should download the file on execution.
I've tried pydrive, but the authentication was a mess.
Quick help would be appreciated; snippets are welcome.
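One minimal approach, assuming the file really is shared publicly ("anyone with the link") and that FILE_ID stands for the id segment of the sharing URL, is to read Drive's direct-download endpoint with pandas:

    import pandas as pd

    file_id = "FILE_ID"  # the XXXX segment from .../file/d/XXXX/view?usp=sharing
    url = f"https://drive.google.com/uc?export=download&id={file_id}"

    # Works for small, publicly shared files; large files hit a
    # virus-scan confirmation page and need a helper such as gdown.
    df = pd.read_csv(url)
    print(df.head())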

Fetching Google Sheet data from locally mounted drive via gspread

I am working on a Google Colab notebook that requires the user to mount google drive using the colab.drive python library. They then input relative paths on the local directory tree (/content/drive/... by default on that mount) to files of interest for analysis. Now, I want to use a Google Sheet they can create as a configuration file. There is lots of info on how to authenticate gspread and fetch a sheet from its HTTPS url, but I can't find any info on how to access the .gsheet file using gspread that is already mounted on the local filesystem of the colab runtime.
There are many tutorials using this flow: https://colab.research.google.com/notebooks/io.ipynb#scrollTo=yjrZQUrt6kKj , but I don't want to make the user authenticate twice (having already done so for the initial mount), and I don't want to make them input some files as relative paths and others as HTTPS URLs.
I had thought this would be much like using gspread to work with Google Sheets on my own locally mounted drive, but I haven't seen that workflow anywhere either. Any pointers in that direction would help me out as well.
Thank you!
Instead of reading the .gsheet file from Colab's mounted drive, you can store the sheet in the user's Drive and fetch it from there via the API when needed. That way, for as long as the kernel is running, you won't have to re-authenticate the user.
I also couldn't find anything that authenticates into Colab from another device, so you may need to modify your flow a bit.
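For reference, the usual Colab-side gspread handshake looks roughly like this (the sheet title is a placeholder; whether this prompt can be merged with the drive.mount one is exactly the open question above):

    from google.colab import auth
    from google.auth import default
    import gspread

    auth.authenticate_user()  # Colab's built-in auth prompt
    creds, _ = default()      # reuse the resulting credentials
    gc = gspread.authorize(creds)

    # Placeholder title: the user's configuration sheet in their Drive
    ws = gc.open("analysis_config").sheet1
    print(ws.get_all_records())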

How to run Github code (with python) automatically on MyBinder or Google Colab without downloading the Sample code?

MyBinder and Colaboratory make it possible for people to run our examples from the website directly in their browser, without any download required.
When I work on Binder, loading our data takes a huge amount of time. So I need to run the Python code on the website directly.
I'm not sure whether I totally get the question. If you want to avoid having to download the data from another source, you can add the data to the git repo which you use to start Binder. It should look something like this: https://github.com/lschmiddey/book_recommender_voila
However, if your dataset is too big to be uploaded to your git repo, you have to get the data onto the provided Binder server somehow, which usually means downloading it onto the Binder server at startup so that other users can work with your notebook, as sketched below.
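A minimal sketch of that startup download, with a placeholder URL (it could live in the first notebook cell or in the repo's postBuild script):

    import os
    import urllib.request

    # Placeholder URL: wherever the dataset is actually hosted
    DATA_URL = "https://example.com/dataset.npz"
    DATA_PATH = "dataset.npz"

    # Fetch once per Binder session; skip if already present
    if not os.path.exists(DATA_PATH):
        urllib.request.urlretrieve(DATA_URL, DATA_PATH)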

How can I automate a spreadsheet which is stored in Google Drive without downloading it?

How can I automate a spreadsheet which is stored in Google Drive without downloading it?
https://automatetheboringstuff.com/chapter12/
If you have Read permission on a folder (called View permission on Google Drive), you can open and run code from a workbook stored there; however, technically the file is still downloaded to your local machine, likely into a temporary folder.
If you have Write permission on a folder (called Edit permission on Google Drive), you can replace/overwrite a file that's stored there.
View: People can view, but can’t change or share the file with others.
Comment: People can make comments and suggestions, but can’t change or share the file with others.
Edit: People can make changes, accept or reject suggestions, and share the file with others.
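If the goal is to drive the sheet through the Sheets API rather than a synced local copy, a library like gspread can write to it in place. A minimal sketch, assuming a service account whose client email has Edit access to the spreadsheet (the file and sheet names are placeholders):

    import gspread

    # Placeholder key file: a service-account JSON credential
    gc = gspread.service_account(filename="service_account.json")
    ws = gc.open("My Spreadsheet").sheet1

    # Write a cell directly in Drive, with no manual download step
    ws.update_acell("A1", "updated in place")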
More Information:
Google Support : Share files from Google Drive
Microsoft.com : Windows Temporary Files
