'Adam' object has no attribute 'Adam' - python

This is how I imported the modules
from keras.layers import Conv2D, BatchNormalization, Activation
from keras.models import Model, Input
from keras import optimizer_v2
from keras.optimizer_v2 import adam
import keras.backend as K
But I'm getting this error.
I tried using adam = adam(learning_rate=0.00001), but it doesn't work either.

I tried the following steps and it worked
import keras
adam = keras.optimizer_v2.adam.Adam(learning_rate=0.00001)
deblur_CNN.compile(optimizer= adam, loss= 'mean_squared_error')
deblur_CNN.load_weights('/content/drive/My Drive/Colab Notebooks/FDmodels/deblur_cnn_weights.h5')
After specifying the exact path to the adam.Adam class, it worked fine.
Still, if anyone wants to add something, feel free.
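For what it's worth, on current TensorFlow releases you can also reach Adam through the public tf.keras path instead of the internal optimizer_v2 module. A minimal sketch, assuming TF 2.x and the same deblur_CNN model as above:
import tensorflow as tf

adam = tf.keras.optimizers.Adam(learning_rate=0.00001)  # public path, no optimizer_v2 needed
deblur_CNN.compile(optimizer=adam, loss='mean_squared_error')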

Related

Unable to import SGD and Adam from 'tensorflow.python.keras.optimizers'

Trying to run:
import tensorflow as tf
from tensorflow import keras
from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Flatten, Dense
from tensorflow.python.keras.optimizers import SGD, Adam
import numpy as np
print(tf.__version__)
I get this error:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-8-f05f8f753c47> in <module>()
4 from tensorflow.python.keras.models import Sequential
5 from tensorflow.python.keras.layers import Flatten, Dense
----> 6 from tensorflow.python.keras.optimizers import SGD, Adam
7
8 import numpy as np
ImportError: cannot import name 'SGD' from 'tensorflow.python.keras.optimizers' (/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/optimizers.py)
---------------------------------------------------------------------------
NOTE: If your import is failing due to a missing package, you can
manually install dependencies using either !pip or !apt.
---------------------------------------------------------------------------
I'm studying machine learning in Google Colab.
I pasted the example code, ran it, and got this error message.
I found similar errors on Google, but nothing that solved this problem.
I tried 'from tensorflow.keras.optimizers import SGD, Adam', 'from tf.keras.optimizers import SGD, Adam', and 'from keras.optimizers import SGD, Adam'.
None of them worked.
try this:
from tensorflow.python.keras.optimizer_v1 import SGD
You need to use the exact updated import path when importing the model (Sequential), layers (Flatten, Dense), and optimizers (SGD, Adam).
Please try again using the below code in new Google Colab notebook.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.models import Sequential  # note: 'python' removed from each import path
from tensorflow.keras.layers import Flatten, Dense
from tensorflow.keras.optimizers import SGD, Adam
import numpy as np
print(tf.__version__)
Output:
2.8.2

AttributeError: 'Node' object has no attribute 'output_masks'

I use the Keras pretrained model VGG16. The problem is that after configuring TensorFlow to use the GPU, I get an error that I didn't have before when using the CPU.
The error is the following one:
Traceback (most recent call last):
File "/home/guillaume/Documents/Allianz/ConstatOrNotConstatv3/train_network.py", line 109, in <module>
model = LeNet.build(width=100, height=100, depth=3, classes=5)
File "/home/guillaume/Documents/Allianz/ConstatOrNotConstatv3/lenet.py", line 39, in build
output = model(pretrainedOutput)
File "/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py", line 443, in __call__
previous_mask = _collect_previous_mask(inputs)
File "/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py", line 1311, in _collect_previous_mask
mask = node.output_masks[tensor_index]
AttributeError: 'Node' object has no attribute 'output_masks'
I get it after executing this code:
pretrained_model = VGG16(
    include_top=False,
    input_shape=(height, width, depth),
    weights='imagenet'
)
for layer in pretrained_model.layers:
    layer.trainable = False

model = Sequential()
# first (and only) set of FC => RELU layers
model.add(Flatten())
model.add(Dense(200, activation='relu'))
model.add(Dropout(0.5))
model.add(BatchNormalization())
model.add(Dense(400, activation='relu'))
model.add(Dropout(0.5))
model.add(BatchNormalization())
# softmax classifier
model.add(Dense(classes, activation='softmax'))

pretrainedInput = pretrained_model.input
pretrainedOutput = pretrained_model.output
output = model(pretrainedOutput)
model = Model(pretrainedInput, output)
EDIT 1: I've got Keras 2.2.2 and TensorFlow 1.10.0rc1. I also tried with Keras 2.2.0 and got the same error. The thing is that the Python environment I use works fine with other, non-pretrained NNs.
EDIT 2: I'm able to connect two homemade models. The problem only appears with pretrained ones, and not only with VGG16.
You're likely importing tf.keras.layers or tf.keras.applications or other Keras modules from tensorflow.keras, and mixing these objects with objects from the "pure" keras package, which is not compatible (depending on the versions, etc.).
I recommend seeing if you can import and run everything from the "pure" keras modules; don't use tf.keras while debugging, as the two are not necessarily compatible. I had the same problem, and this solution is working for me.
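Before debugging further, it can help to check which packages and versions are actually in play. A minimal diagnostic sketch, assuming both the standalone keras package and tensorflow are installed:
import tensorflow as tf
import keras  # the standalone "pure" Keras package, if installed

print('tensorflow:', tf.__version__)
print('tf.keras  :', tf.keras.__version__)  # the Keras bundled with TensorFlow
print('keras     :', keras.__version__)     # the standalone keras package
If the two Keras versions differ, mixing objects from them is usually what triggers this error.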
I had the same error when I imported keras and tensorflow.keras simultaneously:
from tensorflow.keras.optimizers import Adam
from keras.utils import multi_gpu_model
I solved this problem by changing the code to:
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.utils import multi_gpu_model
I had a similar issue, but with a different architecture. As people suggested, it's important not to mix keras with tensorflow.keras, so try swapping code like:
from keras.preprocessing import image
from keras.models import Model
from keras.layers import Dense, GlobalAveragePooling2D
from keras import backend as K
to:
from tensorflow.keras.preprocessing import image
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras import backend as K
Also make sure you don't use keras.something inside your code (not only in the imports). Hope it helps :)
For reference, I used Keras 2.2.4 with TensorFlow 1.10.0.
If you are importing VGG16 from tensorflow.keras.applications.vgg16, then import all the models and layers from tensorflow as well:
from tensorflow.keras.applications.vgg16 import VGG16
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten
Otherwise, if you are importing from keras, like:
from keras.applications.vgg16 import VGG16
then import the models from keras too:
from keras.models import Sequential
from keras.layers import Dense
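For reference, here is the questioner's VGG16 + classifier-head setup written entirely against tensorflow.keras. This is a sketch assuming TF 2.x; height, width, depth and classes are placeholder values standing in for the original ones:
from tensorflow.keras.applications.vgg16 import VGG16
from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten, BatchNormalization

height, width, depth, classes = 100, 100, 3, 5  # placeholders, match your data

# frozen convolutional base
pretrained_model = VGG16(include_top=False, input_shape=(height, width, depth), weights='imagenet')
for layer in pretrained_model.layers:
    layer.trainable = False

# fully-connected head on top of the base
head = Sequential([
    Flatten(),
    Dense(200, activation='relu'),
    Dropout(0.5),
    BatchNormalization(),
    Dense(400, activation='relu'),
    Dropout(0.5),
    BatchNormalization(),
    Dense(classes, activation='softmax'),
])

model = Model(pretrained_model.input, head(pretrained_model.output))
Because every object comes from tensorflow.keras, the Node/mask incompatibility between the two Keras packages cannot occur.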

How to import keras.engine.topology in Tensorflow?

I want to import keras.engine.topology in Tensorflow.
I used to add the word tensorflow at the beginning of every Keras import if I want to use the Tensorflow version of Keras.
For example: instead of writing:
from keras.layers import Dense, Dropout, Input
I just write the following code and it works fine :
from tensorflow.keras.layers import Dense, Dropout, Input
But that's not the case for this specific import:
from tensorflow.keras.engine.topology import Layer, InputSpec
And I'm getting the following error message:
No module named 'tensorflow.keras.engine'
You can import Layer and InputSpec from TensorFlow as follows:
from tensorflow.python.keras.layers import Layer, InputSpec
UPDATE: 30/10/2019
from tensorflow.keras.layers import Layer, InputSpec
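Since Layer and InputSpec are almost always imported to build custom layers, here is a minimal custom-layer sketch using the updated import. The layer itself (ScaleLayer, a learned scalar multiplier) is purely illustrative and not taken from the question:
import tensorflow as tf
from tensorflow.keras.layers import Layer, InputSpec

class ScaleLayer(Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.input_spec = InputSpec(min_ndim=2)  # require at least 2-D inputs

    def build(self, input_shape):
        # single trainable scalar that rescales the input
        self.scale = self.add_weight(name='scale', shape=(), initializer='ones', trainable=True)
        super().build(input_shape)

    def call(self, inputs):
        return inputs * self.scale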
In the keras_vggface/models.py file, change the import from:
from keras.engine.topology import get_source_inputs
to:
from keras.utils.layer_utils import get_source_inputs
In order to import keras.engine you may try using:
import tensorflow.python.keras.engine
Note that you cannot import topology from tensorflow.python.keras.engine, though.
I solved this issue by replacing from keras.engine.topology import get_source_inputs with from keras.utils.layer_utils import get_source_inputs.

"Could not interpret optimizer identifier" error in Keras

I got this error when I tried to modify the learning rate parameter of the SGD optimizer in Keras. Did I miss something in my code, or was my Keras not installed properly?
Here is my code:
from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense, Flatten, GlobalAveragePooling2D, Activation
import keras
from keras.optimizers import SGD
model = Sequential()
model.add(Dense(64, kernel_initializer='uniform', input_shape=(10,)))
model.add(Activation('softmax'))
model.compile(loss='mean_squared_error', optimizer=SGD(lr=0.01), metrics=['accuracy'])
and here is the error message:
Traceback (most recent call last):
  File "C:\TensorFlow\Keras\ResNet-50\test_sgd.py", line 10, in <module>
    model.compile(loss='mean_squared_error', optimizer=SGD(lr=0.01), metrics=['accuracy'])
  File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\models.py", line 787, in compile
    **kwargs)
  File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\engine\training.py", line 632, in compile
    self.optimizer = optimizers.get(optimizer)
  File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\optimizers.py", line 788, in get
    raise ValueError('Could not interpret optimizer identifier:', identifier)
ValueError: ('Could not interpret optimizer identifier:', <keras.optimizers.SGD object at 0x000002039B152FD0>)
The reason is that you are using the tensorflow.python.keras API for the model and layers, and keras.optimizers for SGD. These are two different Keras implementations (the one bundled with TensorFlow and the standalone keras package), and they cannot work together. You have to change everything to one version; then it should work.
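For comparison, the same snippet written entirely against tensorflow.keras would look like this. A sketch, assuming TF 2.x, where the lr argument has been renamed to learning_rate:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.optimizers import SGD

model = Sequential()
model.add(Dense(64, kernel_initializer='uniform', input_shape=(10,)))
model.add(Activation('softmax'))
model.compile(loss='mean_squared_error',
              optimizer=SGD(learning_rate=0.01),  # SGD from the same package as the model
              metrics=['accuracy'])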
I am a bit late here, but your issue is that you have mixed the tensorflow.keras and keras APIs in your code. The optimizer and the model must come from the same package. Use the Keras API for everything, as below:
from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM, BatchNormalization
from keras.callbacks import TensorBoard
from keras.callbacks import ModelCheckpoint
from keras.optimizers import adam
# Set Model
model = Sequential()
model.add(LSTM(128, input_shape=(train_x.shape[1:]), return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())
# Set Optimizer
opt = adam(lr=0.001, decay=1e-6)
# Compile model
model.compile(
    loss='sparse_categorical_crossentropy',
    optimizer=opt,
    metrics=['accuracy']
)
I have used adam in this example; please substitute the optimizer you need, following the pattern above.
Hope this helps.
This problem is mainly caused by version differences: the tensorflow.keras version may not be the same as the keras one, which leads to the error mentioned by @Priyanka.
For me, whenever this error arises, I pass in the name of the optimizer as a string, and the backend figures it out.
For example instead of
tf.keras.optimizers.Adam
or
keras.optimizers.Adam
I do
model.compile(optimizer='adam', loss=keras.losses.binary_crossentropy, metrics=['accuracy'])
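Note that a string identifier uses the optimizer's default hyperparameters. If you need a custom learning rate, instantiate the class from the same package your model came from; a sketch assuming tf.keras is used throughout:
from tensorflow.keras.optimizers import Adam

model.compile(optimizer=Adam(learning_rate=1e-4),  # an instance, not the class and not a string
              loss='binary_crossentropy',
              metrics=['accuracy'])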
from tensorflow.keras.optimizers import SGD
This works well.
Since TensorFlow 2.0, there is a new API available directly via tensorflow:
https://www.pyimagesearch.com/2019/10/21/keras-vs-tf-keras-whats-the-difference-in-tensorflow-2-0/
This solution works for tensorflow==2.2.0rc2, Keras==2.2.4 (on Win10).
Please also note that the version above uses learning_rate as the parameter name, no longer lr.
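For example, a minimal sketch assuming the TF 2.x API:
from tensorflow.keras.optimizers import SGD

opt = SGD(learning_rate=0.01)  # 'lr' was the parameter name in older versions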
For some libraries (e.g. keras_radam) you'll need to set up an environment variable before the import:
import os
os.environ['TF_KERAS'] = '1'
import tensorflow
import your_library
In my case it was because I missed the parentheses. I am using tensorflow_addons, so my code was like:
model.compile(optimizer=tfa.optimizers.LAMB, loss='binary_crossentropy',
              metrics=['binary_accuracy'])
And it gives:
ValueError: ('Could not interpret optimizer identifier:', <class 'tensorflow_addons.optimizers.lamb.LAMB'>)
Then I changed my code to:
model.compile(optimizer=tfa.optimizers.LAMB(), loss='binary_crossentropy',
              metrics=['binary_accuracy'])
and it works.
Recently, in the latest update of the Keras API (2.5.0), importing the Adam optimizer shows the following error:
from keras.optimizers import Adam
ImportError: cannot import name 'Adam' from 'keras.optimizers'
Instead, use the following to import optimizers (e.g. Adam):
from keras.optimizers import adam_v2
optimizer = adam_v2.Adam(learning_rate=lr, decay=lr/epochs)
Model.compile(loss='--', optimizer=optimizer, metrics=['--'])
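An alternative that avoids the internal adam_v2 module is to import the optimizer from the TF-bundled Keras instead. A sketch, assuming lr and epochs are defined as above and a TF version that still accepts the decay argument:
from tensorflow.keras.optimizers import Adam

optimizer = Adam(learning_rate=lr, decay=lr / epochs)  # 'decay' is removed in the newest optimizer API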
Running the Keras documentation example https://keras.io/examples/cifar10_cnn/ and installing the latest Keras and TensorFlow versions (at the time of this writing, tensorflow 2.0.0a0 and Keras 2.2.4), I had to explicitly import the optimizer the example is using. Specifically, the line at the top of the example:
opt = tensorflow.keras.optimizers.rmsprop(lr=0.0001, decay=1e-6)
was replaced by:
from tensorflow.keras.optimizers import RMSprop
opt = RMSprop(lr=0.0001, decay=1e-6)
In recent versions the API "broke", and keras.stuff in a lot of cases became tensorflow.keras.stuff.
Use one style in one kernel and try not to mix
from keras.optimizers import sth
with
from tensorflow.keras.optimizers import sth
I tried the following and it worked for me:
from keras import optimizers
sgd = optimizers.SGD(lr=0.01)
model.compile(loss='mean_squared_error', optimizer=sgd)
use
from tensorflow.keras import optimizers
instead of
from keras import optimizers
Try changing your import lines to
from keras.models import Sequential
from keras.layers import Dense, ...
Your imports seem a little strange to me. Maybe you could elaborate more on that.
I had a misplaced parenthesis and got this error.
Initially it was:
x=Conv2D(filters[0],(3,3),use_bias=False,padding="same",kernel_regularizer=l2(reg),x))
The corrected version was
x=Conv2D(filters[0],(3,3),use_bias=False,padding="same",kernel_regularizer=l2(reg))(x)
Just give the optimizer as a string, for example:
optimizer = 'sgd'  # or 'rmsprop'
I got the same error message and resolved this issue, in my case, by replacing the assignment of optimizer:
optimizer=keras.optimizers.Adam
with its instance instead of the class itself:
optimizer=keras.optimizers.Adam()
I tried everything in this thread to fix it, but nothing worked. However, I managed to fix it for myself. For me, the issue was that passing the optimizer class itself, i.e. tensorflow.keras.optimizers.Adam, caused the error, but passing an instance, i.e. tensorflow.keras.optimizers.Adam(), worked. So my code looks like:
model.compile(
    loss=tensorflow.keras.losses.categorical_crossentropy,
    optimizer=tensorflow.keras.optimizers.Adam()
)
Looking at the TensorFlow GitHub, I am not the only one with this error where instantiating the optimizer rather than passing the class fixed it.

How to import keras from tf.keras in Tensorflow?

import tensorflow as tf
import tensorflow
from tensorflow import keras
from keras.layers import Dense
I am getting the below error
from keras.layers import Input, Dense
Traceback (most recent call last):
File "<ipython-input-6-b5da44e251a5>", line 1, in <module>
from keras.layers import Input, Dense
ModuleNotFoundError: No module named 'keras'
How do I solve this?
Note: I am using Tensorflow version 1.4
Use the keras module from tensorflow like this:
import tensorflow as tf
Import classes
from tensorflow.python.keras.layers import Input, Dense
or use directly
dense = tf.keras.layers.Dense(...)
EDIT Tensorflow 2
from tensorflow.keras.layers import Input, Dense
and the rest stays the same.
Try from tensorflow.python import keras
With this, you can change Keras-dependent code to TensorFlow with a one-line change.
You can also try from tensorflow.contrib import keras. This works on tensorflow 1.3.
Edit: for tensorflow 1.10 and above you can use import tensorflow.keras as keras to get Keras inside TensorFlow.
To keep it simple, I will show the two versions of the same code, in keras and in tf.keras. The example here is a simple neural network model with a few layers.
In Keras (v2.1.5)
from keras.models import Sequential
from keras.layers import Dense, Dropout

def get_model(n_x, n_h1, n_h2):
    model = Sequential()
    model.add(Dense(n_h1, input_dim=n_x, activation='relu'))
    model.add(Dense(n_h2, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(4, activation='softmax'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    print(model.summary())
    return model
In tf.keras (v1.9)
import tensorflow as tf

def get_model(n_x, n_h1, n_h2):
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Dense(n_h1, input_dim=n_x, activation='relu'))
    model.add(tf.keras.layers.Dense(n_h2, activation='relu'))
    model.add(tf.keras.layers.Dropout(0.5))
    model.add(tf.keras.layers.Dense(4, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    print(model.summary())
    return model
Or the layers can be imported the following way instead of the fully-qualified tf.keras.layers form used above:
from tensorflow.keras.layers import Dense
The official documentation of tf.keras
Note: TensorFlow Version is 1.9
It's not ideal to downgrade every time; you may need to make the following changes instead, as shown below:
# Tensorflow
import tensorflow as tf
# Keras
from tensorflow.keras.models import Sequential, Model, load_model, save_model
from tensorflow.keras.callbacks import ModelCheckpoint
from tensorflow.keras.layers import Dense, Activation, Dropout, Input, Masking, TimeDistributed, LSTM, Conv1D, Embedding
from tensorflow.keras.layers import GRU, Bidirectional, BatchNormalization, Reshape
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.layers import Reshape, Dropout, Dense, Multiply, Dot, Concatenate, Embedding
from tensorflow.keras import optimizers
from tensorflow.keras.callbacks import ModelCheckpoint
The point is that instead of using
from keras.layers import Reshape, Dropout, Dense, Multiply, Dot, Concatenate, Embedding
you need to use
from tensorflow.keras.layers import Reshape, Dropout, Dense, Multiply, Dot, Concatenate, Embedding
Starting from TensorFlow 2.0, only PyCharm versions > 2019.3 are able to recognise tensorflow and keras inside tensorflow (tensorflow.keras). Francois Chollet himself (author of Keras) recommends that everybody switches to tensorflow.keras in place of plain keras.
There is also one important thing to mention here:
IMPORTANT NOTE FOR TF >= 2.0
There is an ongoing issue with JetBrains (in fact, on the TensorFlow side); this error seems to come up from time to time (https://youtrack.jetbrains.com/issue/PY-53599).
It sometimes happens that PyCharm is not able to correctly import/recognize keras inside tensorflow or other imports.
Depending on Python + TF + PyCharm versions, you may have to alternate between the following import types:
from tensorflow.keras.models import Model
OR
from tensorflow.python.keras.models import Model
This worked for me in tensorflow==1.4.0:
from tensorflow.python import keras
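For completeness, a minimal usage sketch with that import, assuming your TF 1.x build exposes the models and layers submodules under tensorflow.python.keras:
from tensorflow.python import keras

model = keras.models.Sequential()
model.add(keras.layers.Dense(10, input_shape=(4,), activation='relu'))
model.add(keras.layers.Dense(1))
model.compile(optimizer='adam', loss='mse')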
I had a similar problem importing those libs. I am using Anaconda Navigator 1.8.2 with Spyder 3.2.8.
My code is the following:
import matplotlib.pyplot as plt
import tensorflow as tf
import numpy as np
import math
#from tf.keras.models import Sequential # This does not work!
from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import InputLayer, Input
from tensorflow.python.keras.layers import Reshape, MaxPooling2D
from tensorflow.python.keras.layers import Conv2D, Dense, Flatten
I get the following error:
from tensorflow.python.keras.models import Sequential
ModuleNotFoundError: No module named 'tensorflow.python.keras'
I solved this by removing tensorflow.python from the imports.
With this code I solved the error:
import matplotlib.pyplot as plt
import tensorflow as tf
import numpy as np
import math
#from tf.keras.models import Sequential # This does not work!
from keras.models import Sequential
from keras.layers import InputLayer, Input
from keras.layers import Reshape, MaxPooling2D
from keras.layers import Conv2D, Dense, Flatten
I had the same problem with Tensorflow 2.0.0 in PyCharm. PyCharm did not recognize tensorflow.keras; I updated my PyCharm and the problem was resolved!
