Add member(s) to Google Drive Shared Drive with Python

How can I add a member to a Google Drive's Shared Drive?
I am working on a Colab and tried to search but nothing useful came up.
I tried using PyDrive, but:
I can't select the whole Shared Drive, only a folder
I can't update permissions on the folder even though I can do it through the Google Drive web GUI
This is what I'm doing now (error 400):
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
from google.colab import drive as drv
drv.mount('/gdrive', force_remount=True)
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
%cd /gdrive/Shared\ drives/
file_list = drive.ListFile({'q': " title = 'FolderTitleOrSharedDriveTitle'"}).GetList()
for file1 in file_list:
    print('title: %s, id: %s' % (file1['title'], file1['id']))
    file = file1
print(file['title'])
file.GetPermissions()
new_permission = {
    'type': 'user',
    'value': 'usertogrant@access.com',
    'role': 'reader'
}
permission = file.auth.service.permissions().insert(fileId=file['id'], body=new_permission, supportsAllDrives=True).execute(http=file.http)
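For what it's worth, the Drive API v3 treats shared-drive membership as a permission created on the shared drive's own ID (not on a folder inside it), with supportsAllDrives=True. Below is a minimal sketch using google-api-python-client in Colab, not PyDrive; the shared drive ID is a placeholder:
from google.colab import auth
from google.auth import default
from googleapiclient.discovery import build

auth.authenticate_user()
creds, _ = default()
service = build('drive', 'v3', credentials=creds)

shared_drive_id = 'your_shared_drive_id'  # placeholder: the ID of the shared drive itself
new_permission = {
    'type': 'user',
    'role': 'reader',  # or 'writer', 'fileOrganizer', 'organizer'
    'emailAddress': 'usertogrant@access.com',
}
service.permissions().create(
    fileId=shared_drive_id,       # the shared drive is the target, not a folder
    body=new_permission,
    supportsAllDrives=True,       # required for shared-drive items
).execute()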

Related

How to set folder path when downloading from google drive

I am trying to download some files from a Google Drive folder to the local folder /home/lungsang/Desktop/gdrive/ABC. Can you modify the code below so that I can achieve this? PS: Right now it just downloads into the root folder :)
import streamlit as st
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
folder = '1tuQxaiDOdbfv1JHXNAln2nbq1IvBOrmP'
file_list = drive.ListFile({'q': f"'{folder}' in parents and trashed=false"}).GetList()
for index, file in enumerate(file_list):
    print(index+1, 'file Downloaded : ', file['title'])
    file.GetContentFile(file['title'])
In your script, how about the following modification? Please add the path as follows.
Modified script:
import streamlit as st
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
path = "/home/lungsang/Desktop/gdrive/ABC/" # Added
folder = '1tuQxaiDOdbfv1JHXNAln2nbq1IvBOrmP'
file_list = drive.ListFile({'q': f"'{folder}' in parents and trashed=false"}).GetList()
for index, file in enumerate(file_list):
    print(index+1, 'file Downloaded : ', file['title'])
    file.GetContentFile(path + file["title"]) # Modified
This modification assumes that the directory /home/lungsang/Desktop/gdrive/ABC/ already exists. Please be careful about this.
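If you want the script to create that directory when it is missing, a small standard-library guard before the download loop should do it:
import os

path = "/home/lungsang/Desktop/gdrive/ABC/"
os.makedirs(path, exist_ok=True)  # creates the folder if it does not exist yet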

How to download file from google drive folder?

I have a script that gets a list of files from google drive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LoadCredentialsFile("mycreds.txt")
if gauth.credentials is None:
    gauth.LocalWebserverAuth()
elif gauth.access_token_expired:
    gauth.Refresh()
else:
    gauth.Authorize()
gauth.SaveCredentialsFile("mycreds.txt")
drive = GoogleDrive(gauth)
folder = "1CNtWRS005fkX6vlZowZiXYITNGifXPKS"
file_list = drive.ListFile({'q': f"'{folder}' in parents"}).GetList()
for file in file_list:
    print(file['title'])
-> 1.txt
It only lists files from my own Drive, but I need the script to list files from a folder I merely have access to ("Available to me"). I have the folder ID, but if I substitute it into the folder variable, nothing happens.
I think gdown could help you.
pip install gdown
Then you could try something like this:
import gdown
id = "folderId..."
gdown.download_folder(id=id, quiet=True, use_cookies=False)
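If you also want to control where the folder lands, gdown's download_folder accepts an output path (the path below is a placeholder):
import gdown

folder_id = "folderId..."  # placeholder
gdown.download_folder(id=folder_id, output="downloads/", quiet=True, use_cookies=False)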

How to work with Google Colab efficiently?

I am trying to train a neural network on Colab, using a GPU there. I am now wondering whether I am on the right path and whether all the steps I am doing are necessary, because the process I am following does not seem very efficient to me.
# Install the PyDrive wrapper & import libraries.
# This only needs to be done once per notebook.
!pip install -U -q PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
# Authenticate and create the PyDrive client.
# This only needs to be done once per notebook.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
import os
# Choose a local (Colab) directory to store the data.
local_root_path = os.path.expanduser("~/data")
os.makedirs(local_root_path, exist_ok=True)

def ListFolder(google_drive_id, destination):
    file_list = drive.ListFile({'q': "'%s' in parents and trashed=false" % google_drive_id}).GetList()
    counter = 0
    for f in file_list:
        # If it is a directory, create the directory and fetch the files inside it
        if f['mimeType'] == 'application/vnd.google-apps.folder':
            folder_path = os.path.join(destination, f['title'])
            os.makedirs(folder_path)
            print('creating directory {}'.format(folder_path))
            ListFolder(f['id'], folder_path)
        else:
            fname = os.path.join(destination, f['title'])
            f_ = drive.CreateFile({'id': f['id']})
            f_.GetContentFile(fname)
            counter += 1
    print('{} files were downloaded to {}'.format(counter, destination))

ListFolder("1s1Ks_Gf_cW-F-RwXFjBu96svbmqiXB0o", local_root_path)
These commands connect the Colab notebook with my Google Drive and store the data in Colab. Because I have a lot of images (more than 180k), storing the data in Colab takes very, very long, and the connection sometimes breaks. I am now wondering whether it is necessary to copy all the data from my Google Drive to Colab.
If not, what do I have to do instead to work with the data from Google Drive?
If yes, is there a way to do this more efficiently?
Or is there maybe a completely different way I should work with Colab?
You can access files directly on your Google Drive without copying them into the notebook environment.
Execute this code in one cell:
from google.colab import drive
drive.mount('/content/gdrive')
And try:
!ls /content/gdrive
Now you can copy your files from/to the /content/gdrive directory and they will appear in your Google Drive.
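For training, this means you can point your data pipeline at the mounted path instead of copying 180k images into Colab first. A small sketch; the dataset directory below is a placeholder:
import os

data_dir = '/content/gdrive/My Drive/training_images'  # placeholder path
image_files = [os.path.join(data_dir, f) for f in os.listdir(data_dir)]
print(len(image_files), 'images readable in place, without copying them to Colab')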

How to be able to import every file in a folder in google drive?

I read an article about how to import files from Google Drive into the Google Colab environment. For each file, we go through these steps, as the article says:
1 - Get a shareable link
2 - then we extract the id section of the link.
3 - after that we use this code to be able to import
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
downloaded = drive.CreateFile({'id':"your_file_ID"})
downloaded.GetContentFile('your_file_name.csv')
So then I can do import file. I wanted to ask: is there any way to get access to the main folder and start importing like from shared_file.subfolder.some_module import func1, class1?
What I really need is to authenticate just once and avoid repeating all the steps above for each file in a folder. Even automating the above steps would help.
Thanks
If the folder is in your own Google Drive, it's easier. Otherwise, you can add that folder to your Google Drive first (it won't take your space quota).
Then you can mount it with
from google.colab import drive
drive.mount('gdrive')
Now you can access that folder by changing the current directory.
import os
os.chdir("/content/gdrive/My Drive/that_folder")
Now you can import your_library.py easily, because it's in the current directory.
from your_library import *
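Alternatively, if you would rather not change the working directory, adding the folder to sys.path gives you the package-style imports from the question. The path is a placeholder, and the subfolders need __init__.py files to be importable as packages:
import sys

sys.path.insert(0, "/content/gdrive/My Drive/that_folder")  # placeholder path
# Now this style of import should work:
# from shared_file.subfolder.some_module import func1, class1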

google drive api to upload all pdfs to google drive

I am using PyDrive to upload PDF files to my Google Drive folder. I want to send all *.pdf files in a local folder at once with this code, but I am not sure where to go from here. Should I use glob? If so, I would like to see an example, please.
Working code that sends one file to the designated Google Drive folder:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
folder_id = 'google_drive_id_goes_here'
f = drive.CreateFile({'title': 'testing_pdf',
                      'mimeType': 'application/pdf',
                      'parents': [{'kind': 'drive#fileLink', 'id': folder_id}]})
f.SetContentFile('/Users/Documents/python/google_drive/testing.pdf')
f.Upload()
You can't upload all the files at once: creating a file is a single API operation, and PyDrive has no mechanism for uploading more than one at a time.
You're going to have to put this in a loop and upload each file as you go.
import os

directory = 'the/directory/you/want/to/use'
for filename in os.listdir(directory):
    if filename.endswith(".txt"):
        f = open(os.path.join(directory, filename))
        lines = f.read()
        print(lines[10])
        continue
    else:
        continue
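Applied to the PDF question with glob, as asked, a hedged sketch; the folder ID and local path are placeholders taken from the question:
import glob
import os
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)

folder_id = 'google_drive_id_goes_here'             # placeholder
local_dir = '/Users/Documents/python/google_drive'  # placeholder

# One CreateFile/Upload call per PDF in the local directory.
for pdf_path in glob.glob(os.path.join(local_dir, '*.pdf')):
    f = drive.CreateFile({
        'title': os.path.basename(pdf_path),
        'mimeType': 'application/pdf',
        'parents': [{'kind': 'drive#fileLink', 'id': folder_id}],
    })
    f.SetContentFile(pdf_path)
    f.Upload()
    print('uploaded', pdf_path)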
