I am a newbie and am currently trying out the Python notebook at https://github.com/TessFerrandez/research-papers/tree/master/facenet on Google Colaboratory.
I added
!pip install face-recognition
!git clone https://github.com/TessFerrandez/research-papers.git
%cd research-papers/facenet
at the start of the notebook so that I can import the right utils.
However, in one of the cells below, I am unable to run the code. I get this error message:
RuntimeError                              Traceback (most recent call last)
<ipython-input-14-45bae69bfbbe> in <module>()
15 # Initialize the OpenFace face alignment utility
---> 16 alignment = AlignDlib('models/landmarks.dat')
/content/research-papers/facenet/research-papers/facenet/align.py in __init__(self, facePredictor)
88 self.detector = dlib.get_frontal_face_detector()
---> 89 self.predictor = dlib.shape_predictor(facePredictor)
RuntimeError: Unable to open models/landmarks.dat
Do you know where to find models/landmarks.dat so that AlignDlib will not throw an error?
Do I have to install OpenFace in Google Colaboratory, or upload the model from somewhere?
You can download and decompress the necessary file with this code.
!wget http://dlib.net/files/shape_predictor_68_face_landmarks.dat.bz2
!bunzip2 "shape_predictor_68_face_landmarks.dat.bz2"
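Note that AlignDlib looks for models/landmarks.dat, while the decompressed file is named shape_predictor_68_face_landmarks.dat, so you will likely also need to move and rename it. A minimal sketch (the models directory name comes from the notebook's own code):

```python
import os
import shutil

src = "shape_predictor_68_face_landmarks.dat"
dst = os.path.join("models", "landmarks.dat")

# Create the directory AlignDlib looks in, then move the
# decompressed predictor to the name it expects
os.makedirs("models", exist_ok=True)
if os.path.exists(src):
    shutil.move(src, dst)
```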
Related
I am working on an object detection model in google colab and I'm following most of the instructions outlined here.
In order to train the model, I am trying to use:
!python model_main_tf2.py \
    --model_dir=models/SSD_640 \
    --pipeline_config_path=models/SSD_640/pipeline.config
However, I am getting the following error:
/content/drive/MyDrive/Workspace
2021-06-28 11:33:33.510377: I tensorflow/stream_executor/platform/default/dso_loader.cc:53] Successfully opened dynamic library libcudart.so.11.0
Traceback (most recent call last):
File "model_main_tf2.py", line 32, in <module>
from object_detection import model_lib_v2
ImportError: cannot import name 'model_lib_v2' from 'object_detection' (/usr/local/lib/python3.7/dist-packages/object_detection/__init__.py)
The mentioned file model_lib_v2.py is available at the following path: Workspace/models/research/model_lib_v2.py
I've tried adjusting the working directory to point to the workspace folder with:
%cd '/content/drive/MyDrive/Workspace'
I have also tried reinstalling object_detection, but this has not helped. I also made sure that the paths are correct; there do not seem to be any inconsistencies.
Any help on this would be much appreciated! Thanks in advance.
model_lib_v2.py has to be in the object_detection folder.
Try adding the directory that contains object_detection (models/research) to your PYTHONPATH.
You can get the files from here.
https://github.com/tensorflow/models/tree/master/research/object_detection
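In a Colab notebook you can do this from Python by extending sys.path. A sketch, assuming the models repo lives under the Workspace folder mentioned in the question (adjust the path to your own clone):

```python
import sys

# Example path; point this at your own clone of tensorflow/models
research_dir = "/content/drive/MyDrive/Workspace/models/research"

# Make `from object_detection import model_lib_v2` resolvable
if research_dir not in sys.path:
    sys.path.append(research_dir)
```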
I am trying to create a simple chatbot in Google Colab. The code ran successfully the first time, but when I tried to run it again the next day it showed the following error:
FileNotFoundError Traceback (most recent call last)
in ()
4
5 trainer= ChatterBotCorpusTrainer(chatbot)
----> 6 trainer.train("chatterbot.corpus.english")
2 frames
/usr/local/lib/python3.7/dist-packages/chatterbot/corpus.py in read_corpus(file_name)
56 raise OptionalDependencyImportError(message)
57
---> 58 with io.open(file_name, encoding='utf-8') as data_file:
59 return yaml.load(data_file)
60
FileNotFoundError: [Errno 2] No such file or directory: '/root/chatterbot_corpus/data/english'
I had not made any changes in the code. What is the problem?
pip install chatterbot
pip install chatterbot_corpus
from chatterbot import ChatBot
from chatterbot.trainers import ChatterBotCorpusTrainer
chatbot = ChatBot('mybot')
trainer = ChatterBotCorpusTrainer(chatbot)
trainer.train("chatterbot.corpus.english")
This error arises when you connect to a new runtime, because it does not contain any of the previously installed packages or files. I would suggest keeping the pip install commands in a separate code block and running it first, before running the rest of the code. Also, if you had any files in the directory, you will have to re-upload them.
So I just ran your code in my Colab, and the problem is this:
!pip install chatterbot
!pip install chatterbot_corpus
Just add the '!' prefix and it will work.
It gives this error when I execute the training code.
ModuleNotFoundError
Traceback (most recent call last)
<ipython-input-12-08472a50f5e6> in <module>()
6 import logging
7 logging.getLogger('tensorflow').disabled = True
----> 8 import input_data
9 import resnet_utils
10 import resnet_v2
ModuleNotFoundError: No module named 'input_data'
---------------------------------------------------------------------------
NOTE: If your import is failing due to a missing package, you can
manually install dependencies using either !pip or !apt.
To view examples of installing some common dependencies, click the
"Open Examples" button below.
I'm getting this error when I try to run this code:
word_embedding_matrix = np.load(open("word_embedding_matrix.npy", 'rb'))
FileNotFoundError
Traceback (most recent call last)
in ()
----> 1 word_embedding_matrix = np.load(open("word_embedding_matrix.npy", 'rb'))
FileNotFoundError: [Errno 2] No such file or directory: 'word_embedding_matrix.npy'
The most common reason for that is you didn't mount your google drive to your notebook. You can do that easily by running the following code in the first cell of your notebook:
import os
from google.colab import drive
drive.mount('/content/gdrive')
ROOT = "/content/gdrive/My Drive/"
os.chdir(ROOT)
This code enables your notebook to access your Google Drive, so you can use it just as you would your local disk. When you then run os.listdir('.'), you should find word_embedding_matrix.npy among the returned items.
Source: official documentation
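If you want a clearer failure message when the file is missing (for example because Drive was not mounted), a small wrapper like this can help. The helper name is mine, not from the original code:

```python
import os

def load_embedding_matrix(path="word_embedding_matrix.npy"):
    """Load the saved matrix, raising a descriptive error if it's absent."""
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{path} not found in {os.getcwd()}; "
            "mount Google Drive and os.chdir() to the folder containing it first."
        )
    import numpy as np  # imported lazily so the existence check runs without numpy
    return np.load(path)
```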
I am trying to use the Google BigQuery Python library, but whenever I run import bq I get the following error:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-21-923a4eec0450> in <module>()
----> 1 import bq
/Users/tahirfayyaz/anaconda/python.app/Contents/lib/python2.7/site-packages/bq.py in <module>()
31 import oauth2client.tools
32
---> 33 from google.apputils import app
34 from google.apputils import appcommands
35 import gflags as flags
ImportError: No module named google.apputils
I have installed and even upgraded google-apputils but I still get this error.
The way Google Cloud tools are distributed has changed a bit; you'll be able to download a current version of the software via the Cloud SDK:
* https://developers.google.com/cloud/sdk/
The SDK will install a hermetic environment that contains bigquery as well as all of its dependencies, like oauth2client and google.apputils. It doesn't use ez-install anymore.
You can add the SDK to your PATH to pick up the current bq.py program.
export PATH=$SDKROOT/platform/bigquery:$PATH
You can add the SDK to your PYTHONPATH if you're trying to import something directly as in your example above.
export PYTHONPATH=$SDKROOT/platform/bigquery:$PYTHONPATH
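If you prefer to do this from Python rather than the shell, the equivalent is roughly the following sketch. SDKROOT here is a stand-in for wherever you installed the Cloud SDK:

```python
import os
import sys

# Stand-in for your actual SDK install location
sdkroot = os.environ.get("SDKROOT", "/opt/google-cloud-sdk")
bq_dir = os.path.join(sdkroot, "platform", "bigquery")

# Mirror the PYTHONPATH export so `import bq` can find the module
if bq_dir not in sys.path:
    sys.path.insert(0, bq_dir)
```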