I tried to import keras-vggface like this:
from keras_vggface.vggface import VGGFace
But it always gives me this error:
"ModuleNotFoundError: No module named 'keras_vggface'"
I tried to install keras_vggface with pip like this:
!pip install --user keras-vggface
!pip install keras-vggface
!pip install git+https://github.com/rcmalli/keras-vggface.git
I think you need to install it as below:
!pip install keras_vggface
This works like a charm.
Hope it's useful.
I was struggling with VGGFace in Colab as well, and resolved the problem by selecting TensorFlow 1.x in Colab with %tensorflow_version 1.x, as advised by the Colab warning and the VGGFace GitHub repo:
"VGGFAce works only with 1.x TensorFlow backend AND without eager execution!!!"
EDIT: but you have to run !pip install keras_vggface first.
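Putting the two notes together, a minimal Colab cell could look like the sketch below (assuming the 1.x runtime is still selectable via the %tensorflow_version magic):
# Select the TF 1.x runtime before importing TensorFlow (Colab-only magic)
%tensorflow_version 1.x
!pip install keras_vggface
import tensorflow as tf
print(tf.__version__)  # expect a 1.x release, e.g. 1.15
from keras_vggface.vggface import VGGFace
model = VGGFace()  # default VGG16-based model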
Running this code:
import tensorflow as tf
from transformers import BertTokenizer, TFBertForSequenceClassification
model = TFBertForSequenceClassification.from_pretrained("bert-base-uncased")
Outputs the following error:
ImportError:
TFBertForSequenceClassification requires the TensorFlow library but it was not found in your environment. Checkout the instructions on the
installation page: https://www.tensorflow.org/install and follow the ones that match your environment.
However, that's not true; I do have the TensorFlow library installed and imported:
> print(tf.__version__)
'2.7.0'
This answers the case where TensorFlow is indeed installed and imported, yet the error still appears.
BERT requires the tensorflow-gpu package, so you need to install it to do what you are attempting here.
If you are in a notebook, run:
!pip install tensorflow-gpu
Otherwise on the command line run:
pip install tensorflow-gpu
Hope this is helpful to others out there facing issues!
import tensorflow as tf
from transformers import BertTokenizer, TFBertForSequenceClassification
Worked with
Tensorflow-2.7.0
huggingface-hub-0.2.1
tokenizers-0.10.3
transformers-4.15.0
Python-3.7
Install Tensorflow and Transformers
!pip install tensorflow
!pip install transformers
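After installing, it can help to verify that transformers actually sees TensorFlow before loading the model; a small check along these lines (using the library's is_tf_available helper) makes the failure obvious:
import tensorflow as tf
import transformers

print(tf.__version__)
print(transformers.is_tf_available())  # should print True; if False, restart the runtime after installing

from transformers import TFBertForSequenceClassification
model = TFBertForSequenceClassification.from_pretrained("bert-base-uncased")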
I believe you are on macOS. If so, installing the latest version of transformers (v4.21.1) should fix this issue. (I am also running an M1 Pro, if that helps.)
For me, the latest update was giving an error about incompatible checkpoint weight files, so I had to install transformers v4.2.2, which gave me the above error.
To fix it, you need to do an editable install: https://huggingface.co/docs/transformers/installation#installing-from-source
i.e.
git clone https://github.com/huggingface/transformers.git
cd transformers
pip install -e .
After cloning, but before running pip install -e ., make the following changes:
In transformers/setup.py Line 134, change to "tokenizers==0.6.0"
In transformers/src/transformers/file_utils.py Line 87, change "tensorflow" to "tensorflow-macos"
Finally, in transformers/src/transformers/dependency_versions_check.py comment out Line 41 ("require_version_core(deps[pkg])").
For the final step, upgrade the tokenizers library to a newer version (say 0.12.1). You can't have the newer tokenizers library installed before building the transformers package - it fails to build the wheel.
Reference: https://id2thomas.medium.com/apple-silicon-experiment-1-installing-huggingface-transformers-2e45392d3d0f
https://github.com/apple/tensorflow_macos/issues/144
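If you prefer not to edit the files by hand, the second change can also be applied with a small read/replace/write snippet; the exact quoting inside file_utils.py is an assumption here, so inspect the file before running:
# Hypothetical sketch: swap the "tensorflow" dependency name for "tensorflow-macos"
# in the cloned transformers checkout (verify the exact string in your copy first)
filename = "transformers/src/transformers/file_utils.py"
text = open(filename).read()
open(filename, "w").write(text.replace('"tensorflow"', '"tensorflow-macos"'))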
Transformers requires TensorFlow version >= 2.3.
To see the installed TensorFlow version, run the Python script below:
import tensorflow as tf
print(tf.__version__)
If the TensorFlow version is indeed too low, you can upgrade it.
For the CPU version of TensorFlow,
run this command in a notebook:
!pip install "tensorflow>=2.3"
or run this command in a command window:
pip install "tensorflow>=2.3"
For the GPU version of TensorFlow,
run this command in a notebook:
!pip install "tensorflow-gpu>=2.3"
or run this command in a command window:
pip install "tensorflow-gpu>=2.3"
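After upgrading, a quick sanity check of the version and (for the GPU build) of device visibility confirms the install took effect; restart the notebook kernel first so the new package is picked up:
import tensorflow as tf

print(tf.__version__)  # expect 2.3 or later
print(tf.config.list_physical_devices("GPU"))  # non-empty list if the GPU build can see a device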
There are several tutorials online that import a VGGFace model from keras_vggface like this:
from keras_vggface.vggface import VGGFace
However, I get the following error:
ModuleNotFoundError: No module named 'keras.engine.topology'
This problem happens on my local machine, but also on Google Colab after installing keras_vggface with
!pip install keras_vggface
I solved this issue in Google Colab by changing the import from
from keras.engine.topology import get_source_inputs
to
from keras.utils.layer_utils import get_source_inputs
in /usr/local/lib/python3.7/dist-packages/keras_vggface/models.py
!pip install git+https://github.com/rcmalli/keras-vggface.git
!pip install keras_applications --no-deps
filename = "/usr/local/lib/python3.7/dist-packages/keras_vggface/models.py"
text = open(filename).read()
open(filename, "w+").write(text.replace('keras.engine.topology', 'tensorflow.keras.utils'))
import tensorflow as tf
from keras_vggface.vggface import VGGFace
vggface = VGGFace(model='resnet50') # or VGGFace() as default
This worked for me on Colab.
I think you need to install it as below:
!pip install keras_vggface
It should work
I am going to use Bindsnet for a spiking neural network, and I installed it with
!pip install bindsnet in a Jupyter notebook.
My Python version is 3.6.
when I run:
from bindsnet.network import Network
It returns below error message:
ModuleNotFoundError: No module named 'bindsnet.network'
Can you please let me know how to solve this?
It turned out that the plain pip install was not successful, so I changed the installation command to:
!pip install bindsnet --ignore-installed
and now I can run from bindsnet.network import Network with no error.
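To confirm the reinstall really replaced the broken copy, a quick smoke test like the one below can help (the empty Network() call is just a minimal check):
import bindsnet
print(bindsnet.__file__)  # shows which installed copy is being imported

from bindsnet.network import Network
net = Network()  # empty network; layers and connections are added afterwards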
On Google Colab I'm getting an error when trying to import GeoJSON from IPython.display.
Any help on how to properly import it would be appreciated.
I found the issue to be caused by the Colab runtime having an older version of IPython installed.
pip freeze
Output
ipython==5.5.0
ipython-genutils==0.2.0
ipython-sql==0.3.9
ipywidgets==7.5.1
Since the Colab runtime was created some time ago, updating the module fixed the issue for me:
pip install -U IPython
After that, restart your runtime and the changes should be reflected:
pip freeze
Output
ipython==7.16.1
ipython-genutils==0.2.0
ipython-sql==0.3.9
Exactly how to execute this in Google Colab: update IPython as above, restart the runtime, then import IPython again.
Now you are able to run from IPython.display import GeoJSON.
Google Colaboratory has an older version of IPython installed (v5.5.0 as of the time I'm writing this). The best way to fix this is to include the following line in the Colab notebook before your import statements:
!pip3 install --upgrade IPython
This should fix it for you!
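A minimal end-to-end cell, assuming you restart the runtime after the upgrade so the new IPython is actually loaded, might look like this (the coordinates are just placeholder data):
!pip3 install --upgrade IPython
# restart the runtime here, then:
import IPython
print(IPython.__version__)  # should now be well above 5.5.0

from IPython.display import GeoJSON
GeoJSON(data={"type": "Point", "coordinates": [-122.4, 37.8]})  # displays the object; rendering depends on the frontend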
I'm using Python 3.8.2, Keras 2.3.1 and TensorFlow 2.2.0rc4.
Just with the following code:
import keras
from keras.models import sequential
I get this error:
AttributeError: partially initialized module 'keras.backend' has no attribute 'eager' (most likely due to a circular import)
If I use:
import tensorflow
or
from tensorflow.keras import ....
I get a new error:
AttributeError: partially initialized module 'tensorflow.python.framework.ops' has no attribute 'register_tensor_conversion_function' (most likely due to a circular import)
Full traceback: (posted as screenshots in the original question, not reproduced here)
My suggestion is to reinstall the packages. Sometimes this happens due to an installation problem.
Use the following commands to do so:
Uninstall tensorflow
pip uninstall tensorflow
Requires the latest pip
pip install --upgrade pip
To install keras as separate package
pip install Keras
Current stable release for CPU and GPU
pip install tensorflow
Try this and hope this helps you.
It's an install problem, most likely; K.eager was introduced in Keras 2.3.0 (and is included in Keras 2.3.1), so your Python interpreter is somehow reading code from 2.2.5 or earlier.
A possible culprit is an Anaconda mishap. First run conda uninstall keras. Then, in the anaconda3 directory, search "keras" and delete all results. Lastly, run conda install -c conda-forge keras, which should download version 2.3.1. You might need to run similar steps with TensorFlow (in fact, it's better you do, and first reinstall TensorFlow then Keras).
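Once reinstalled, a quick check confirms that the interpreter is picking up a Keras version that actually has K.eager:
import keras
import keras.backend as K

print(keras.__version__)    # expect 2.3.0 or later
print(hasattr(K, "eager"))  # True on 2.3.x; False means an older copy is still being imported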
P.S. Your code is probably from keras import Sequential, as lowercase sequential should error differently.