How to use Google Colab local runtime and connect to Google Drive

I am running Google Colab with a local runtime. I use this command to start Jupyter:
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0
When I previously connected I was able to mount Google Drive with this code in the notebook:
from google.colab import drive
drive.mount('/content/drive')
I was able to connect to Google Drive and access files while on a local runtime.
However, now I am getting this error:
ModuleNotFoundError: No module named 'google.colab'
I see other people have this problem, and some suggest using PyDrive. But I am certain I was connected to Google Drive without using PyDrive.
I suspect the first command I ran to start Jupyter was different when Google Drive was able to connect.
Is there a specific flag I have to add to that first command?

The google.colab libraries only exist on managed backends.
When running on your local machine, you'll want to instead use Drive's existing sync apps available here: https://www.google.com/drive/download/
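For example, once the Drive sync app is installed, the synced files are just ordinary local paths, so a notebook cell can read them directly instead of calling drive.mount. This is only a sketch; the sync-folder location below is an assumption and varies by OS and Drive for desktop configuration.

from pathlib import Path

# Typical Drive for desktop sync location on macOS/Windows (an assumption --
# adjust to wherever your Drive folder actually lives).
drive_root = Path.home() / "Google Drive" / "My Drive"

if not drive_root.exists():
    raise FileNotFoundError(f"No Drive folder at {drive_root}; adjust drive_root")

# List the synced files just as you would any local directory.
for path in sorted(drive_root.iterdir()):
    print(path.name)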

Related

Google Drive: File Load Error for .ipynb. Permission denied: Google Drive/My Drive/project/code/.ipynb

I am working with other colleagues on a project through Google Drive. I need to access this specific project through Drive instead of GitHub because we also have some folders with data (that we modify using the code folder files). But now, when I try to access the .ipynb files that I had created through my local host and Jupyter Lab / Notebook, I get the following error:
File Load Error for file.ipynb.
Permission denied: Google Drive/My Drive/project/code/file.ipynb
I have tried several alternatives but I cannot solve it, and I would like to work from my local host instead of loading all the data into Google Colab.
Note: I am working on a Mac computer.

Importing google.colab in VM Engine Doesn't Let me Run Jupyter Notebook in Google Colab?

I have a script on Google Colab that needs a lot of memory, so I tried linking it to a VM instance on Google Cloud. According to this documentation, you create the instance, run it, authenticate the Jupyter notebook server, and connect the link to the local runtime. However, because I am using a local runtime to run everything, I have to import certain libraries. I was able to import tensorflow and matplotlib without any trouble and was able to connect to the server, but when I imported google.colab it did not work. I've tested this three times with different VM instances, and every time it is the same. Is there a way to fix this issue? Or, if not, another way to access the datasets I need without using Google Colab? (code is below)
Thanks in advance
import pathlib
from google.colab import drive
drive.mount("/content/gdrive")
data_dir_ = "gdrive/My Drive/training"
data_dir_training = pathlib.Path(data_dir_)
This is also how my notebook looks in Google Colab when I try running the code without importing google.colab:
Figured it out: pip installing google-colab locally pulls in outdated libraries, so you have to pip upgrade them as well.
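For anyone hitting the same thing, a rough sketch of that upgrade step run from inside a notebook cell could look like the following. The package list is an assumption, not an exhaustive one; check which libraries the google-colab install actually downgraded in your environment.

import subprocess
import sys

# Re-upgrade libraries that the local google-colab package may have pinned
# to old versions (assumed list; adjust for your environment).
for pkg in ["ipykernel", "ipython", "notebook", "pandas"]:
    subprocess.check_call([sys.executable, "-m", "pip", "install", "--upgrade", pkg])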

Automatically mount and authorise Google Drive in Colab [duplicate]

I'm looking for a way to automate the authentication process when connecting a colab-session to my google drive.
I'd prefer to use the built-in tools for this one, instead of PyDrive.
In short: have the following cell run without having to manually authenticate by logging in and copying the authorisation code from the dialogue.
from google.colab import drive
drive.mount('/content/drive/')
Automatic mounting of your Drive files is now supported for Colab notebooks which aren't shared by multiple people.
To enable this for a notebook, create a new Drive notebook, open the file browser, and click the 'Mount Drive' button.
You'll see a permissions dialog the first time you do this. After you complete the permissions once, you'll see your Drive mounted in the file browser.
Better still, if you reload the notebook later and reconnect, your Drive will mount automatically with no more drive.mount copy/paste required. Your Drive files will just be there.
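If you want the mount cell itself to be a no-op once Drive is already available, a small guard works. This is just a sketch, and the /content/drive/MyDrive path assumes the standard mount location on a hosted Colab runtime.

import os
from google.colab import drive

# Only trigger the mount (and any permissions prompt) if Drive
# isn't already available from a previous automatic mount.
if not os.path.isdir('/content/drive/MyDrive'):
    drive.mount('/content/drive')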

Is there a way to directly import or open local files from Microsoft Azure Notebooks without uploading them to the Azure cloud?

I am trying to find a way to run commands that either open applications/files from my local machine or import files from my local machine into Microsoft Azure Notebooks, without uploading them directly to the Azure cloud. Does anyone know if this is possible, or where a better place to ask would be?
No, you cannot. Jupyter runs as a micro-server hosted on Azure, so you can only use data that is already on the Azure cloud. Been there, wished for that, but no.

Is Jupyter replicating my data to a cloud server?

This is a pretty simple question: if you download Jupyter as part of Anaconda, how is your data being secured? When I run Jupyter it goes straight to an HTML page, but that page displays my local folders on the servers I am connected to.
If I make a notebook, will that notebook be stored on a cloud server? Where does it go, and how can I keep all of my files (notebooks) local?
Jupyter uses no cloud services, and should make no external requests when you are running it locally. The best way to think of a local install of Jupyter Notebook is as a desktop application that happens to use your web browser for its UI. It talks to the local filesystem and relays that data to your browser over HTTP on localhost.
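One way to convince yourself of this is to look at the notebook file itself: it is plain JSON sitting in your working directory, not something stored remotely. The filename below is a placeholder for one of your own notebooks.

import json
from pathlib import Path

# Replace with the name of a notebook you created (placeholder).
nb_path = Path("Untitled.ipynb")

# A notebook is just a local JSON file; nothing here touches the network.
nb = json.loads(nb_path.read_text())
print(nb_path.resolve())              # absolute path on your own disk
print(len(nb.get("cells", [])), "cells")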
