How to use the tensorflow_federated library in Google Colab? - python

I am trying to use the tensorflow_federated library in Google Colab but cannot figure out how to do this. I have searched a lot on the internet, and every answer says you don't need to install this library in Google Colab and can use it directly, but I am not able to do so. Can anyone who has used this library in Google Colab tell me how to install it or use it directly?

In the TensorFlow Federated Tutorials Overview there are links to 10+ Colab notebooks. Navigating to any of them and clicking Run in Google Colab opens the notebook in Colab and demonstrates how to use the library.
Notably, TensorFlow Federated is not installed by default in Google Colab; rather, all of the notebooks start with the following cell:
!pip install --quiet --upgrade tensorflow-federated
!pip install --quiet --upgrade nest-asyncio
import nest_asyncio
nest_asyncio.apply()
which installs TensorFlow Federated as well as the nest-asyncio package, which is needed because TensorFlow Federated uses asyncio inside Colab's own event loop.
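As a quick sanity check after running that cell, you can build and invoke a trivial federated computation; this mirrors the hello-world pattern used at the start of the TFF tutorials and is only an illustrative sketch, not required setup:
import tensorflow_federated as tff
# Trivial federated computation; the tutorials show it returning b'Hello, World!'
tff.federated_computation(lambda: 'Hello, World!')()
If this runs without errors, the library is installed and usable from the notebook.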

Related

Google Research's model Mol_dqn not working on Google Colab. I am getting "No module named tensorflow.contrib" error. What can I do?

On Google Colab I am trying to implement Mol_dqn from the paper Optimization of Molecules via Deep Reinforcement Learning. I have used the code from Google Research's GitHub here.
The model relies on TensorFlow version 1, which Google Colab no longer supports.
How can I get the model to run? How could I update the code scripts to run on TensorFlow 2? Is this the only option?
When I try to execute one of the Python scripts, I get the error "ModuleNotFoundError: No module named 'tensorflow.contrib'".
I have tried uninstalling TensorFlow and reinstalling version 1.5, but Google Colab would not allow it.
I tried the command
%tensorflow_version 1.x
but Google Colab no longer supports it.
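For context, the usual first step when porting TF1 code to a TF2-only runtime is the compatibility layer, though it does not cover tensorflow.contrib, which was removed entirely; this is only a rough sketch, not a fix for Mol_dqn specifically:
# TF2 ships a compat layer that restores most of the TF1 API surface...
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()
# ...but anything under tensorflow.contrib has no compat equivalent and must be
# rewritten by hand (e.g. replaced with TensorFlow Addons or the code it wrapped).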

Running TensorFlow 2 setup.py in Google Colab loads forever and then times out

I am using Google Colab and am experiencing an issue when installing packages with python/pip following this TensorFlow guide. It loads forever, and then I get a timeout error when running the TensorFlow 2 setup script.
I then tried to open their own notebook, which results in the same error: it just loads forever and then throws a timeout. The line that loads forever is: python -m pip install .
I am new to TensorFlow and Google Colab so I'm not sure how to debug this properly. How do I install the TensorFlow Object Detection API on Google Colab?
This is apparently a known issue as seen here.
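For context, the install sequence from the Object Detection API guide, as it is usually run in a Colab cell, looks roughly like this (treat it as a sketch of the steps the question refers to):
!git clone https://github.com/tensorflow/models.git
%cd models/research
!protoc object_detection/protos/*.proto --python_out=.
!cp object_detection/packages/tf2/setup.py .
# This is the step that loads forever and eventually times out in the question above.
!python -m pip install .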

How to permanently install Rapids on Google colab?

Is there a way to install RAPIDS permanently on Google Colab? I have tried many solutions given on Stack Overflow and other websites, but nothing works. It is a very big library, and it is frustrating to download it every time I want to work in Colab.
I tried this code from RAPIDS, but it is also not working. When I close Colab and start again later, I get ModuleNotFoundError: No module named 'cudf'.
# Install RAPIDS
!git clone https://github.com/rapidsai/rapidsai-csp-utils.git
!bash rapidsai-csp-utils/colab/rapids-colab.sh stable

import sys, os, shutil

sys.path.append('/usr/local/lib/python3.7/site-packages/')
os.environ['NUMBAPRO_NVVM'] = '/usr/local/cuda/nvvm/lib64/libnvvm.so'
os.environ['NUMBAPRO_LIBDEVICE'] = '/usr/local/cuda/nvvm/libdevice/'
os.environ["CONDA_PREFIX"] = "/usr/local"

for so in ['cudf', 'rmm', 'nccl', 'cuml', 'cugraph', 'xgboost', 'cuspatial']:
    fn = 'lib' + so + '.so'
    source_fn = '/usr/local/lib/' + fn
    dest_fn = '/usr/lib/' + fn
    if os.path.exists(source_fn):
        print(f'Copying {source_fn} to {dest_fn}')
        shutil.copyfile(source_fn, dest_fn)

# fix for BlazingSQL import issue
# ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.26' not found (required by /usr/local/lib/python3.7/site-packages/../../libblazingsql-engine.so)
if not os.path.exists('/usr/lib64'):
    os.makedirs('/usr/lib64')
for so_file in os.listdir('/usr/local/lib'):
    if 'libstdc' in so_file:
        shutil.copyfile('/usr/local/lib/' + so_file, '/usr/lib64/' + so_file)
        shutil.copyfile('/usr/local/lib/' + so_file, '/usr/lib/x86_64-linux-gnu/' + so_file)
A solution has been suggested that uses pip to install libraries - How do I install a library permanently in Colab? - but RAPIDS can't be installed using pip; it can only be installed using conda. This is the code to install it:
conda create -n rapids-0.19 -c rapidsai -c nvidia -c conda-forge \
rapids-blazing=0.19 python=3.7 cudatoolkit=11.0
I tried to include the Google Drive path (nb_path) in this code using the --prefix flag, as the linked answer suggests for !pip install --target=$nb_path jdc, but I am getting a syntax error.
Can anyone tell me how to set this nb_path to the conda create code above?
For reference, the conda target path for RAPIDS install is /usr/local. We use a different location in the RAPIDS-Colab install script to get it to work.
At the moment, I'm not aware of any way for a user to permanently install RAPIDS into Google Colab. Google Colab isn't designed for persisting libraries - or any data, for that matter - that aren't preinstalled in the environment. While you have a decent-looking workaround there for pip libraries and datasets with Google Drive mounting, RAPIDS is a little more tricky, as we update quite a bit of the Colab environment just to get RAPIDS to install. What you propose is an interesting path to explore. We do encourage and work with RAPIDS community members in our Slack channel who try new methods and improve some of our community code, like the RAPIDS-Colab installation script.
Just remember, the RAPIDS + Google Colab effort was never meant to be more than a fun, easy way to "try RAPIDS out". For Google Cloud users, GCP is supposed to be the next step. While it's heartening to see the usage grow over time, Google would need to create a Colab instance that has RAPIDS preinstalled for what you want to happen. You should let them know you want this:
1. Open any Colab notebook.
2. Go to the Help menu and select "Send feedback...".
In the meantime, if you need a ready-to-go instance, there are some inexpensive, RAPIDS-enabled, quick start options on the horizon.
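For completeness, here is a minimal sketch of the pip-plus-Drive workaround mentioned above (the folder name is only an example); it persists plain pip packages across sessions, but as explained, it does not cover conda-only installs like RAPIDS:
import os, sys
from google.colab import drive

# Mount Drive and add a package folder there to this runtime's import path.
drive.mount('/content/drive')
nb_path = '/content/drive/MyDrive/colab_site_packages'  # example location on Drive
os.makedirs(nb_path, exist_ok=True)
sys.path.insert(0, nb_path)

# Run the install once; later sessions only need the mount and sys.path lines above.
!pip install --target=$nb_path jdc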

package in google colab

A simple question: do different notebooks in Google Colab share the same packages?
I ran !pip list and wanted to find kora, a module that I had just installed with !pip install kora.
As you can see, I can find kora in one notebook but can't find it in another notebook. Why?
You will need to install it each time, in every new notebook.
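Each notebook runs in its own virtual machine, so a fresh runtime starts from the preinstalled packages only. A minimal per-notebook pattern (using kora from the question as the example package):
!pip install kora   # repeat in every new runtime
!pip show kora      # confirms the package is visible in this runtime only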

Error using TensorFlow 2.0 on Google Colab

I am using Google Colab. The link for the project is here: https://colab.research.google.com/drive/1K8aaNq5ZTXQM1zzhaWICuuY5nA06Qn7z?usp=sharing
The error:
I am learning from a course I found online on https://www.udemy.com/. I do not know much about TensorFlow and I don't know why I am getting this error. You may take a look inside the project and help me find out what I am doing wrong here.
Google Colab specifies
We recommend against using pip install to specify a particular TensorFlow version for both GPU and TPU backends.
You can read it here
So I would suggest you import the dependencies for the project directly:
import numpy as np
import datetime
import tensorflow as tf
from tensorflow.keras.datasets import fashion_mnist
tf.__version__
gives output
2.3.0
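Building on those imports, a small illustrative continuation (assuming the course notebook uses Fashion-MNIST, as the import suggests) that works with the preinstalled TensorFlow:
# Load the dataset straight from tf.keras; no extra installs needed.
(X_train, y_train), (X_test, y_test) = fashion_mnist.load_data()
X_train, X_test = X_train / 255.0, X_test / 255.0  # simple normalization for illustration
print(X_train.shape)  # (60000, 28, 28)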
This might not be a direct solution to your issue, but it seems worth noting: when I tried your notebook, I had an issue importing 'export_saved_model'.
The solution to that seems to be in this GitHub repo:
https://github.com/tensorflow/models/issues/8450
Also, it may be worth noting that you don't need to install TensorFlow in Colab, as per this tutorial:
https://colab.research.google.com/notebooks/tensorflow_version.ipynb
Hope this helps. Good luck with your course.
