Loss function stops working when y_pred is modified - python

I am trying to create a custom loss function, but as soon as I create a copy of the y_pred tensor (the model predictions), the loss function stops working.
This function works:
def custom_loss(y_true, y_pred):
    y_true = tf.cast(y_true, dtype=y_pred.dtype)
    loss = binary_crossentropy(y_true, y_pred)
    return loss
The output is
Epoch 1/10
26/26 [==============================] - 5s 169ms/step - loss: 56.1577 - accuracy: 0.7867 - val_loss: 14.7032 - val_accuracy: 0.9185
Epoch 2/10
26/26 [==============================] - 4s 159ms/step - loss: 18.6890 - accuracy: 0.8762 - val_loss: 9.4140 - val_accuracy: 0.9185
Epoch 3/10
26/26 [==============================] - 4s 158ms/step - loss: 13.7425 - accuracy: 0.8437 - val_loss: 7.7499 - val_accuracy: 0.9185
Epoch 4/10
26/26 [==============================] - 4s 159ms/step - loss: 10.5267 - accuracy: 0.8510 - val_loss: 6.1037 - val_accuracy: 0.9185
Epoch 5/10
26/26 [==============================] - 4s 160ms/step - loss: 7.5695 - accuracy: 0.8544 - val_loss: 3.9937 - val_accuracy: 0.9185
Epoch 6/10
26/26 [==============================] - 4s 159ms/step - loss: 5.1320 - accuracy: 0.8538 - val_loss: 2.6940 - val_accuracy: 0.9185
Epoch 7/10
26/26 [==============================] - 4s 160ms/step - loss: 3.3265 - accuracy: 0.8557 - val_loss: 1.6613 - val_accuracy: 0.9185
Epoch 8/10
26/26 [==============================] - 4s 160ms/step - loss: 2.1421 - accuracy: 0.8538 - val_loss: 1.0443 - val_accuracy: 0.9185
Epoch 9/10
26/26 [==============================] - 4s 160ms/step - loss: 1.3384 - accuracy: 0.8601 - val_loss: 0.5159 - val_accuracy: 0.9184
Epoch 10/10
26/26 [==============================] - 4s 173ms/step - loss: 0.6041 - accuracy: 0.8895 - val_loss: 0.3164 - val_accuracy: 0.9185
testing
**********Testing model**********
training AUC : 0.6204090733263475
testing AUC: 0.6196677312833667
But this version is not working:
def custom_loss(y_true, y_pred):
    y_true = tf.cast(y_true, dtype=y_pred.dtype)
    y_p = tf.identity(y_pred)
    loss = binary_crossentropy(y_true, y_p)
    return loss
I get this output:
Epoch 1/10
26/26 [==============================] - 11s 179ms/step - loss: 1.3587 - accuracy: 0.9106 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 2/10
26/26 [==============================] - 4s 159ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 3/10
26/26 [==============================] - 4s 158ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 4/10
26/26 [==============================] - 4s 158ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 5/10
26/26 [==============================] - 4s 158ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 6/10
26/26 [==============================] - 4s 158ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 7/10
26/26 [==============================] - 4s 159ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 8/10
26/26 [==============================] - 4s 159ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 9/10
26/26 [==============================] - 4s 160ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
Epoch 10/10
26/26 [==============================] - 4s 159ms/step - loss: 1.2572 - accuracy: 0.9185 - val_loss: 1.2569 - val_accuracy: 0.9185
testing
**********Testing model**********
training AUC : 0.5
testing AUC : 0.5
Is there a problem with tf.identity() that is causing the issue?
Or is there another way to copy tensors that I should be using?
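For reference, whether tf.identity() blocks gradients can be checked in isolation with tf.GradientTape; a minimal standalone sketch (unrelated to the model above):

```python
import tensorflow as tf

# Differentiate y = 2 * identity(x) with respect to x; if tf.identity
# blocked gradients, the result would be None rather than 2.0.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = 2.0 * tf.identity(x)

grad = tape.gradient(y, x)
print(float(grad))  # 2.0, so the gradient flows through tf.identity
```

If this prints 2.0 on your setup, the stalled training is unlikely to be caused by tf.identity itself.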

Related

Tensorflow fit history is noisier than expected

I've fitted a model, and this is the plot I get with the following code:
hist = model.fit(
    xs, ys, epochs=300, batch_size=100, validation_split=0.1,
    callbacks=[K.callbacks.EarlyStopping(patience=30)]
)
plt.figure(dpi=200)
plt.plot(hist.history["loss"])
plt.plot(hist.history["val_loss"])
plt.legend(["loss", "val_loss"])
However, the training logs are the following:
...
Epoch 200/300
9/9 [==============================] - 3s 384ms/step - loss: 514.2175 - val_loss: 584.2152
Epoch 201/300
9/9 [==============================] - 3s 385ms/step - loss: 510.9814 - val_loss: 581.8872
Epoch 202/300
9/9 [==============================] - 3s 391ms/step - loss: 518.9771 - val_loss: 582.4727
Epoch 203/300
9/9 [==============================] - 3s 383ms/step - loss: 521.8132 - val_loss: 582.9196
Epoch 204/300
9/9 [==============================] - 4s 393ms/step - loss: 516.8439 - val_loss: 584.0792
Epoch 205/300
9/9 [==============================] - 3s 391ms/step - loss: 513.7325 - val_loss: 582.6438
Epoch 206/300
9/9 [==============================] - 3s 390ms/step - loss: 514.4469 - val_loss: 583.5629
Epoch 207/300
9/9 [==============================] - 3s 391ms/step - loss: 522.0557 - val_loss: 581.7162
Epoch 208/300
9/9 [==============================] - 3s 389ms/step - loss: 518.6336 - val_loss: 582.8070
Epoch 209/300
9/9 [==============================] - 3s 391ms/step - loss: 518.0827 - val_loss: 582.4284
Epoch 210/300
9/9 [==============================] - 3s 389ms/step - loss: 514.1886 - val_loss: 582.4635
Epoch 211/300
9/9 [==============================] - 3s 390ms/step - loss: 514.4373 - val_loss: 582.1906
Epoch 212/300
9/9 [==============================] - 3s 391ms/step - loss: 514.9708 - val_loss: 582.1699
Epoch 213/300
9/9 [==============================] - 3s 388ms/step - loss: 521.1622 - val_loss: 582.3545
Epoch 214/300
9/9 [==============================] - 3s 388ms/step - loss: 513.5198 - val_loss: 582.7703
Epoch 215/300
9/9 [==============================] - 3s 392ms/step - loss: 514.6642 - val_loss: 582.3327
Epoch 216/300
9/9 [==============================] - 3s 392ms/step - loss: 514.0926 - val_loss: 583.9896
Epoch 217/300
9/9 [==============================] - 3s 385ms/step - loss: 520.9324 - val_loss: 583.9265
Epoch 218/300
9/9 [==============================] - 4s 397ms/step - loss: 510.2536 - val_loss: 584.6587
Epoch 219/300
9/9 [==============================] - 4s 394ms/step - loss: 515.7706 - val_loss: 583.2895
Epoch 220/300
9/9 [==============================] - 3s 391ms/step - loss: 520.9758 - val_loss: 582.2515
Epoch 221/300
9/9 [==============================] - 3s 386ms/step - loss: 517.8850 - val_loss: 582.1981
Epoch 222/300
9/9 [==============================] - 4s 395ms/step - loss: 514.2051 - val_loss: 583.0013
Epoch 223/300
9/9 [==============================] - 4s 402ms/step - loss: 509.3330 - val_loss: 583.7137
Epoch 224/300
9/9 [==============================] - 3s 388ms/step - loss: 516.6832 - val_loss: 582.0773
Epoch 225/300
9/9 [==============================] - 3s 387ms/step - loss: 515.5243 - val_loss: 582.2585
Epoch 226/300
9/9 [==============================] - 3s 389ms/step - loss: 517.6601 - val_loss: 582.3940
Epoch 227/300
9/9 [==============================] - 3s 388ms/step - loss: 515.7537 - val_loss: 582.3862
Epoch 228/300
9/9 [==============================] - 4s 394ms/step - loss: 516.1107 - val_loss: 582.7234
Epoch 229/300
9/9 [==============================] - 3s 389ms/step - loss: 517.5703 - val_loss: 583.3829
Epoch 230/300
9/9 [==============================] - 3s 388ms/step - loss: 516.7491 - val_loss: 583.4712
Epoch 231/300
9/9 [==============================] - 4s 392ms/step - loss: 520.6753 - val_loss: 583.2650
Epoch 232/300
9/9 [==============================] - 3s 388ms/step - loss: 516.1927 - val_loss: 581.9255
Epoch 233/300
9/9 [==============================] - 4s 393ms/step - loss: 512.5476 - val_loss: 583.1275
Epoch 234/300
9/9 [==============================] - 4s 392ms/step - loss: 513.5744 - val_loss: 583.0643
Epoch 235/300
9/9 [==============================] - 3s 385ms/step - loss: 520.2017 - val_loss: 582.6875
Epoch 236/300
9/9 [==============================] - 3s 386ms/step - loss: 518.7263 - val_loss: 583.0582
Epoch 237/300
9/9 [==============================] - 3s 382ms/step - loss: 521.4882 - val_loss: 582.8977
In fact, if I extract the training loss from the logs with a regex and plot it, I get a different-looking plot.
What am I missing about History?
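As an aside, the per-epoch training loss can be pulled out of console logs like the ones above with a short regex; a sketch, where the `log` string stands in for the captured output:

```python
import re

# Two epochs of captured Keras console output (stand-in sample).
log = """Epoch 200/300
9/9 [==============================] - 3s 384ms/step - loss: 514.2175 - val_loss: 584.2152
Epoch 201/300
9/9 [==============================] - 3s 385ms/step - loss: 510.9814 - val_loss: 581.8872"""

# "- loss:" does not match "- val_loss:" (which is preceded by "_"),
# so only the training losses are captured.
losses = [float(v) for v in re.findall(r"- loss: ([\d.]+)", log)]
print(losses)  # [514.2175, 510.9814]
```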

Val_accuracy not increasing

I'm new to this technology, so I was trying to build a model on an image dataset.
I have used this architecture:
model = keras.Sequential()
model.add(layers.Conv2D(filters=6, kernel_size=(3, 3), activation='relu', input_shape=(32,32,1)))
model.add(layers.AveragePooling2D())
model.add(layers.Conv2D(filters=16, kernel_size=(3, 3), activation='relu'))
model.add(layers.AveragePooling2D())
model.add(layers.Flatten())
model.add(layers.Dense(units=120, activation='relu'))
model.add(layers.Dense(units=84, activation='relu'))
model.add(layers.Dense(units=1, activation = 'sigmoid'))
The accuracy and loss seem pretty good, but not the validation accuracy:
Epoch 1/50
10/10 [==============================] - 17s 2s/step - loss: 20.8554 - accuracy: 0.5170 -
val_loss: 0.8757 - val_accuracy: 0.5946
Epoch 2/50
10/10 [==============================] - 14s 1s/step - loss: 1.5565 - accuracy: 0.5612 -
val_loss: 0.8725 - val_accuracy: 0.5811
Epoch 3/50
10/10 [==============================] - 14s 1s/step - loss: 0.8374 - accuracy: 0.6293 -
val_loss: 0.8483 - val_accuracy: 0.5405
Epoch 4/50
10/10 [==============================] - 14s 1s/step - loss: 1.0340 - accuracy: 0.5748 -
val_loss: 1.6252 - val_accuracy: 0.5135
Epoch 5/50
10/10 [==============================] - 14s 1s/step - loss: 1.1054 - accuracy: 0.5816 -
val_loss: 0.7324 - val_accuracy: 0.6486
Epoch 6/50
10/10 [==============================] - 15s 1s/step - loss: 0.5942 - accuracy: 0.7041 -
val_loss: 0.7412 - val_accuracy: 0.6351
Epoch 7/50
10/10 [==============================] - 15s 2s/step - loss: 0.6041 - accuracy: 0.6939 -
val_loss: 0.6918 - val_accuracy: 0.6622
Epoch 8/50
10/10 [==============================] - 14s 1s/step - loss: 0.4944 - accuracy: 0.7687 -
val_loss: 0.7083 - val_accuracy: 0.6216
Epoch 9/50
10/10 [==============================] - 14s 1s/step - loss: 0.5231 - accuracy: 0.7007 -
val_loss: 1.0332 - val_accuracy: 0.5270
Epoch 10/50
10/10 [==============================] - 14s 1s/step - loss: 0.5133 - accuracy: 0.7313 -
val_loss: 0.6859 - val_accuracy: 0.5811
Epoch 11/50
10/10 [==============================] - 14s 1s/step - loss: 0.6177 - accuracy: 0.6735 -
val_loss: 1.0781 - val_accuracy: 0.5135
Epoch 12/50
10/10 [==============================] - 14s 1s/step - loss: 0.9852 - accuracy: 0.6701 -
val_loss: 3.0853 - val_accuracy: 0.4865
Epoch 13/50
10/10 [==============================] - 13s 1s/step - loss: 1.0099 - accuracy: 0.6259 -
val_loss: 1.8193 - val_accuracy: 0.5000
Epoch 14/50
10/10 [==============================] - 13s 1s/step - loss: 0.7179 - accuracy: 0.7041 -
val_loss: 1.5659 - val_accuracy: 0.5135
Epoch 15/50
10/10 [==============================] - 14s 1s/step - loss: 0.4575 - accuracy: 0.7857 -
val_loss: 0.6865 - val_accuracy: 0.5946
Epoch 16/50
10/10 [==============================] - 14s 1s/step - loss: 0.6540 - accuracy: 0.7177 -
val_loss: 1.7108 - val_accuracy: 0.5405
Epoch 17/50
10/10 [==============================] - 13s 1s/step - loss: 1.3617 - accuracy: 0.6156 -
val_loss: 1.1215 - val_accuracy: 0.5811
Epoch 18/50
10/10 [==============================] - 14s 1s/step - loss: 0.6983 - accuracy: 0.7245 -
val_loss: 2.1121 - val_accuracy: 0.5135
Epoch 19/50
10/10 [==============================] - 15s 1s/step - loss: 0.6669 - accuracy: 0.7415 -
val_loss: 0.8061 - val_accuracy: 0.6216
Epoch 20/50
10/10 [==============================] - 14s 1s/step - loss: 0.3853 - accuracy: 0.8129 -
val_loss: 0.7368 - val_accuracy: 0.6757
Epoch 21/50
10/10 [==============================] - 13s 1s/step - loss: 0.5672 - accuracy: 0.7347 -
val_loss: 1.4207 - val_accuracy: 0.5270
Epoch 22/50
10/10 [==============================] - 14s 1s/step - loss: 0.4770 - accuracy: 0.7551 -
val_loss: 1.6060 - val_accuracy: 0.5135
Epoch 23/50
10/10 [==============================] - 14s 1s/step - loss: 0.7212 - accuracy: 0.7041 -
val_loss: 1.1835 - val_accuracy: 0.5811
Epoch 24/50
10/10 [==============================] - 14s 1s/step - loss: 0.5231 - accuracy: 0.7483 -
val_loss: 0.6802 - val_accuracy: 0.7027
Epoch 25/50
10/10 [==============================] - 13s 1s/step - loss: 0.3185 - accuracy: 0.8367 -
val_loss: 0.6644 - val_accuracy: 0.7027
Epoch 26/50
10/10 [==============================] - 14s 1s/step - loss: 0.2500 - accuracy: 0.8912 -
val_loss: 0.8569 - val_accuracy: 0.6486
Epoch 27/50
10/10 [==============================] - 14s 1s/step - loss: 0.2279 - accuracy: 0.9082 -
val_loss: 0.7515 - val_accuracy: 0.7162
Epoch 28/50
10/10 [==============================] - 14s 1s/step - loss: 0.2349 - accuracy: 0.9082 -
val_loss: 0.9439 - val_accuracy: 0.5811
Epoch 29/50
10/10 [==============================] - 13s 1s/step - loss: 0.2051 - accuracy: 0.9184 -
val_loss: 0.7895 - val_accuracy: 0.7027
Epoch 30/50
10/10 [==============================] - 14s 1s/step - loss: 0.1236 - accuracy: 0.9592 -
val_loss: 0.7387 - val_accuracy: 0.7297
Epoch 31/50
10/10 [==============================] - 14s 1s/step - loss: 0.1370 - accuracy: 0.9524 -
val_loss: 0.7387 - val_accuracy: 0.7297
Epoch 32/50
10/10 [==============================] - 14s 1s/step - loss: 0.0980 - accuracy: 0.9796 -
val_loss: 0.6901 - val_accuracy: 0.7162
Epoch 33/50
10/10 [==============================] - 14s 1s/step - loss: 0.0989 - accuracy: 0.9762 -
val_loss: 0.7754 - val_accuracy: 0.7162
Epoch 34/50
10/10 [==============================] - 14s 1s/step - loss: 0.1195 - accuracy: 0.9592 -
val_loss: 0.6639 - val_accuracy: 0.6622
Epoch 35/50
10/10 [==============================] - 14s 1s/step - loss: 0.0805 - accuracy: 0.9898 -
val_loss: 0.7666 - val_accuracy: 0.7162
Epoch 36/50
10/10 [==============================] - 14s 1s/step - loss: 0.0649 - accuracy: 0.9966 -
val_loss: 0.7543 - val_accuracy: 0.7162
Epoch 37/50
10/10 [==============================] - 14s 1s/step - loss: 0.0604 - accuracy: 0.9898 -
val_loss: 0.7472 - val_accuracy: 0.7297
Epoch 38/50
10/10 [==============================] - 14s 1s/step - loss: 0.0538 - accuracy: 1.0000 -
val_loss: 0.7287 - val_accuracy: 0.7432
Epoch 39/50
10/10 [==============================] - 13s 1s/step - loss: 0.0430 - accuracy: 0.9966 -
val_loss: 0.8989 - val_accuracy: 0.6622
Epoch 40/50
10/10 [==============================] - 14s 1s/step - loss: 0.0386 - accuracy: 1.0000 -
val_loss: 0.6951 - val_accuracy: 0.6892
Epoch 41/50
10/10 [==============================] - 13s 1s/step - loss: 0.0379 - accuracy: 1.0000 -
val_loss: 0.8485 - val_accuracy: 0.6892
Epoch 42/50
10/10 [==============================] - 14s 1s/step - loss: 0.0276 - accuracy: 1.0000 -
val_loss: 0.9726 - val_accuracy: 0.6486
Epoch 43/50
10/10 [==============================] - 13s 1s/step - loss: 0.0329 - accuracy: 1.0000 -
val_loss: 0.7336 - val_accuracy: 0.7568
Epoch 44/50
10/10 [==============================] - 14s 1s/step - loss: 0.0226 - accuracy: 1.0000 -
val_loss: 0.8846 - val_accuracy: 0.6892
Epoch 45/50
10/10 [==============================] - 13s 1s/step - loss: 0.0249 - accuracy: 1.0000 -
val_loss: 0.9542 - val_accuracy: 0.6892
Epoch 46/50
10/10 [==============================] - 14s 1s/step - loss: 0.0171 - accuracy: 1.0000 -
val_loss: 0.8792 - val_accuracy: 0.6892
Epoch 47/50
10/10 [==============================] - 15s 1s/step - loss: 0.0122 - accuracy: 1.0000 -
val_loss: 0.8564 - val_accuracy: 0.7162
Epoch 48/50
10/10 [==============================] - 13s 1s/step - loss: 0.0114 - accuracy: 1.0000 -
val_loss: 0.8900 - val_accuracy: 0.7027
Epoch 49/50
10/10 [==============================] - 13s 1s/step - loss: 0.0084 - accuracy: 1.0000 -
val_loss: 0.8981 - val_accuracy: 0.7027
I tried changing the parameters too, yet with no result. It would be helpful to know what's wrong with the val_accuracy. Thanks in advance.
You are using a small dataset, especially the test dataset used for validation. Try adding more data for training and validation; then you should see a difference in val_accuracy. You can also try adding more layers to the model.
There are other methods available, such as data augmentation, dropout, and regularizers, to increase the accuracy of the model by avoiding overfitting.
Please follow this reference to overcome the overfitting problem and to best train your model.
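As a concrete sketch of the dropout suggestion, one possible placement in the questioner's architecture; the 0.5 rate is an arbitrary starting point, not tuned for this dataset:

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(32, 32, 1)),
    layers.Conv2D(filters=6, kernel_size=(3, 3), activation='relu'),
    layers.AveragePooling2D(),
    layers.Conv2D(filters=16, kernel_size=(3, 3), activation='relu'),
    layers.AveragePooling2D(),
    layers.Flatten(),
    layers.Dense(units=120, activation='relu'),
    layers.Dropout(0.5),  # randomly zero half the activations during training
    layers.Dense(units=84, activation='relu'),
    layers.Dense(units=1, activation='sigmoid'),
])
print(model.output_shape)  # (None, 1)
```

Dropout rate and placement are hyperparameters; rates between 0.2 and 0.5 after dense layers are common starting points.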

high mse loss with autoencoder for tabular data

I am trying to run an autoencoder for dimensionality reduction on a Fraud Detection dataset (https://www.kaggle.com/kartik2112/fraud-detection?select=fraudTest.csv) and am receiving very high loss values for each iteration. Below is the autoencoder code.
nb_epoch = 100
batch_size = 128
input_dim = X_train.shape[1]
encoding_dim = 14
hidden_dim = int(encoding_dim / 2)
learning_rate = 1e-7

input_layer = Input(shape=(input_dim, ))
encoder = Dense(encoding_dim, activation="tanh",
                activity_regularizer=regularizers.l1(learning_rate))(input_layer)
encoder = Dense(hidden_dim, activation="relu")(encoder)
decoder = Dense(hidden_dim, activation='tanh')(encoder)
decoder = Dense(input_dim, activation='relu')(decoder)

autoencoder = Model(inputs=input_layer, outputs=decoder)
autoencoder.compile(metrics=['accuracy'],
                    loss='mean_squared_error',
                    optimizer='adam')

cp = ModelCheckpoint(filepath="autoencoder_fraud.h5",
                     save_best_only=True,
                     verbose=0)
tb = TensorBoard(log_dir='./logs',
                 histogram_freq=0,
                 write_graph=True,
                 write_images=True)

history = autoencoder.fit(X_train, X_train,
                          epochs=nb_epoch,
                          batch_size=batch_size,
                          shuffle=True,
                          validation_data=(X_test, X_test),
                          verbose=1,
                          callbacks=[cp, tb]).history
Here is a snippet of the loss values:
Epoch 1/100
10131/10131 [==============================] - 32s 3ms/step - loss: 52445827358.6230 - accuracy: 0.3389 - val_loss: 9625651200.0000 - val_accuracy: 0.5083
Epoch 2/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52393605025.8066 - accuracy: 0.5083 - val_loss: 9621398528.0000 - val_accuracy: 0.5083
Epoch 3/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52486496629.1354 - accuracy: 0.5082 - val_loss: 9617147904.0000 - val_accuracy: 0.5083
Epoch 4/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52514002255.9432 - accuracy: 0.5070 - val_loss: 9612887040.0000 - val_accuracy: 0.5083
Epoch 5/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52436489238.6388 - accuracy: 0.5076 - val_loss: 9608664064.0000 - val_accuracy: 0.5083
Epoch 6/100
10131/10131 [==============================] - 31s 3ms/step - loss: 52430005774.7556 - accuracy: 0.5081 - val_loss: 9604417536.0000 - val_accuracy: 0.5083
Epoch 7/100
10131/10131 [==============================] - 31s 3ms/step - loss: 52474495714.5898 - accuracy: 0.5079 - val_loss: 9600195584.0000 - val_accuracy: 0.5083
Epoch 8/100
10131/10131 [==============================] - 31s 3ms/step - loss: 52423052560.0695 - accuracy: 0.5076 - val_loss: 9595947008.0000 - val_accuracy: 0.5083
Epoch 9/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52442358260.0742 - accuracy: 0.5072 - val_loss: 9591708672.0000 - val_accuracy: 0.5083
Epoch 10/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52402494704.5369 - accuracy: 0.5089 - val_loss: 9587487744.0000 - val_accuracy: 0.5083
Epoch 11/100
10131/10131 [==============================] - 31s 3ms/step - loss: 52396583628.3553 - accuracy: 0.5081 - val_loss: 9583238144.0000 - val_accuracy: 0.5083
Epoch 12/100
10131/10131 [==============================] - 31s 3ms/step - loss: 52349824708.2700 - accuracy: 0.5076 - val_loss: 9579020288.0000 - val_accuracy: 0.5083
Epoch 13/100
10131/10131 [==============================] - 31s 3ms/step - loss: 52332072133.6850 - accuracy: 0.5083 - val_loss: 9574786048.0000 - val_accuracy: 0.5083
Epoch 14/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52353680011.6731 - accuracy: 0.5086 - val_loss: 9570555904.0000 - val_accuracy: 0.5083
Epoch 15/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52347432594.5456 - accuracy: 0.5088 - val_loss: 9566344192.0000 - val_accuracy: 0.5083
Epoch 16/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52327825554.3435 - accuracy: 0.5076 - val_loss: 9562103808.0000 - val_accuracy: 0.5083
Epoch 17/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52347251610.1255 - accuracy: 0.5080 - val_loss: 9557892096.0000 - val_accuracy: 0.5083
Epoch 18/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52292632667.3636 - accuracy: 0.5079 - val_loss: 9553654784.0000 - val_accuracy: 0.5083
Epoch 19/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52354135093.7671 - accuracy: 0.5083 - val_loss: 9549425664.0000 - val_accuracy: 0.5083
Epoch 20/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52295668148.2006 - accuracy: 0.5086 - val_loss: 9545219072.0000 - val_accuracy: 0.5083
Epoch 21/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52314219115.3320 - accuracy: 0.5079 - val_loss: 9540980736.0000 - val_accuracy: 0.5083
Epoch 22/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52328022934.0829 - accuracy: 0.5079 - val_loss: 9536788480.0000 - val_accuracy: 0.5083
Epoch 23/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52268139834.5172 - accuracy: 0.5074 - val_loss: 9532554240.0000 - val_accuracy: 0.5083
Epoch 24/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52308370726.3040 - accuracy: 0.5077 - val_loss: 9528341504.0000 - val_accuracy: 0.5083
Epoch 25/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52224468101.4070 - accuracy: 0.5081 - val_loss: 9524126720.0000 - val_accuracy: 0.5083
Epoch 26/100
10131/10131 [==============================] - 30s 3ms/step - loss: 52200100823.1694 - accuracy: 0.5080 - val_loss: 9519915008.0000 - val_accuracy: 0.5083
Any advice/solution would be highly appreciated. Thank you.
I have scaled the numerical data using StandardScaler and encoded the categorical data using LabelEncoder.
First of all, check which numerical data you scaled.
I think you wrongly scaled cc_num, because cc_num is a categorical column.
This should solve your problem with the high loss, but it doesn't mean your model will be good.
You should first take a good look at the features and try to find useful relationships between the label and the features (data preprocessing/featurization).
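To illustrate why treating a raw card number as a numeric feature inflates an MSE loss, consider the magnitudes involved; the card number below is made up:

```python
# Card numbers are 15-16 digit identifiers, not quantities.
cc_num = 4206763768917051             # hypothetical raw cc_num value
reconstruction_error = 0.01 * cc_num  # even a 1% reconstruction error...
squared = reconstruction_error ** 2   # ...squares to roughly 1.8e27
print(squared)
```

Identifier columns like this should be treated as categorical (or dropped), not scaled and reconstructed as real values.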

Training CNN on Matlab gives different results compared to training the same network on python using keras

I'm using Keras to train a network for a classification problem in Python. The model I am using is as follows:
filter_size = (2, 2)
maxpool_size = (2, 2)
dr = 0.5

inputs = Input((12, 8, 1), name='main_input')
main_branch = Conv2D(20, kernel_size=filter_size, padding="same",
                     kernel_regularizer=l2(0.0001), bias_regularizer=l2(0.0001))(inputs)
main_branch = BatchNormalization(momentum=0.9)(main_branch)
main_branch = Activation("relu")(main_branch)
main_branch = MaxPooling2D(pool_size=maxpool_size, strides=(1, 1))(main_branch)
main_branch = Conv2D(40, kernel_size=filter_size, padding="same",
                     kernel_regularizer=l2(0.0001), bias_regularizer=l2(0.0001))(main_branch)
main_branch = BatchNormalization(momentum=0.9)(main_branch)
main_branch = Activation("relu")(main_branch)
main_branch = Flatten()(main_branch)
main_branch = Dense(100, kernel_regularizer=l2(0.0001), bias_regularizer=l2(0.0001))(main_branch)
main_branch = Dense(100, kernel_regularizer=l2(0.0001), bias_regularizer=l2(0.0001))(main_branch)
SubArray_branch = Dense(496, activation='softmax', name='SubArray_output')(main_branch)

model = Model(inputs=inputs, outputs=SubArray_branch)

opt = keras.optimizers.Adam(lr=1e-3, epsilon=1e-08, clipnorm=1.0)
model.compile(optimizer=opt,
              loss={'SubArray_output': 'sparse_categorical_crossentropy'},
              metrics=['accuracy'])

# Note: both validation_data and validation_split are passed here;
# Keras gives validation_data precedence, so validation_split=0.2 is ignored.
history = model.fit({'main_input': Channel},
                    {'SubArray_output': array_indx},
                    validation_data=(test_Data, test_array),
                    epochs=100, batch_size=128,
                    verbose=1,
                    validation_split=0.2)
When I train this network on my training data, I get a high validation loss compared to the training loss, as you can see below:
471/471 [==============================] - 5s 10ms/step - loss: 0.5723 - accuracy: 0.9010 - val_loss: 20.2040 - val_accuracy: 0.0126
Epoch 33/100
471/471 [==============================] - 5s 10ms/step - loss: 0.5486 - accuracy: 0.9087 - val_loss: 35.2516 - val_accuracy: 0.0037
Epoch 34/100
471/471 [==============================] - 5s 10ms/step - loss: 0.5342 - accuracy: 0.9159 - val_loss: 50.2577 - val_accuracy: 0.0043
Epoch 35/100
471/471 [==============================] - 5s 10ms/step - loss: 0.5345 - accuracy: 0.9132 - val_loss: 26.0221 - val_accuracy: 0.0051
Epoch 36/100
471/471 [==============================] - 5s 10ms/step - loss: 0.5333 - accuracy: 0.9140 - val_loss: 71.2754 - val_accuracy: 0.0043
Epoch 37/100
471/471 [==============================] - 5s 11ms/step - loss: 0.5149 - accuracy: 0.9231 - val_loss: 67.2646 - val_accuracy: 3.3227e-04
Epoch 38/100
471/471 [==============================] - 5s 10ms/step - loss: 0.5269 - accuracy: 0.9162 - val_loss: 17.7448 - val_accuracy: 0.0206
Epoch 39/100
471/471 [==============================] - 5s 11ms/step - loss: 0.5198 - accuracy: 0.9201 - val_loss: 92.7240 - val_accuracy: 0.0015
Epoch 40/100
471/471 [==============================] - 5s 11ms/step - loss: 0.5157 - accuracy: 0.9247 - val_loss: 30.9589 - val_accuracy: 0.0082
Epoch 41/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4961 - accuracy: 0.9316 - val_loss: 20.0444 - val_accuracy: 0.0141
Epoch 42/100
471/471 [==============================] - 5s 11ms/step - loss: 0.5093 - accuracy: 0.9256 - val_loss: 16.7269 - val_accuracy: 0.0172
Epoch 43/100
471/471 [==============================] - 5s 10ms/step - loss: 0.5092 - accuracy: 0.9267 - val_loss: 15.6939 - val_accuracy: 0.0320
Epoch 44/100
471/471 [==============================] - 5s 10ms/step - loss: 0.5104 - accuracy: 0.9270 - val_loss: 103.2581 - val_accuracy: 0.0027
Epoch 45/100
471/471 [==============================] - 5s 10ms/step - loss: 0.5074 - accuracy: 0.9286 - val_loss: 28.3097 - val_accuracy: 0.0154
Epoch 46/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4977 - accuracy: 0.9303 - val_loss: 28.6676 - val_accuracy: 0.0167
Epoch 47/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4823 - accuracy: 0.9375 - val_loss: 47.4671 - val_accuracy: 0.0015
Epoch 48/100
471/471 [==============================] - 5s 11ms/step - loss: 0.5053 - accuracy: 0.9291 - val_loss: 39.3356 - val_accuracy: 0.0082
Epoch 49/100
471/471 [==============================] - 5s 11ms/step - loss: 0.5110 - accuracy: 0.9287 - val_loss: 42.8834 - val_accuracy: 0.0082
Epoch 50/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4895 - accuracy: 0.9366 - val_loss: 11.7254 - val_accuracy: 0.0700
Epoch 51/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4909 - accuracy: 0.9351 - val_loss: 14.5519 - val_accuracy: 0.0276
Epoch 52/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4846 - accuracy: 0.9380 - val_loss: 22.5101 - val_accuracy: 0.0122
Epoch 53/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4991 - accuracy: 0.9315 - val_loss: 16.1494 - val_accuracy: 0.0283
Epoch 54/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4782 - accuracy: 0.9423 - val_loss: 14.8626 - val_accuracy: 0.0551
Epoch 55/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4807 - accuracy: 0.9401 - val_loss: 100.8670 - val_accuracy: 9.9681e-04
Epoch 56/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4759 - accuracy: 0.9420 - val_loss: 34.8571 - val_accuracy: 0.0047
Epoch 57/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4802 - accuracy: 0.9406 - val_loss: 23.2134 - val_accuracy: 0.0524
Epoch 58/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4998 - accuracy: 0.9334 - val_loss: 20.9038 - val_accuracy: 0.0207
Epoch 59/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4813 - accuracy: 0.9400 - val_loss: 19.5474 - val_accuracy: 0.0393
Epoch 60/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4846 - accuracy: 0.9399 - val_loss: 15.1594 - val_accuracy: 0.0439
Epoch 61/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4718 - accuracy: 0.9436 - val_loss: 30.0164 - val_accuracy: 0.0078
Epoch 62/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4897 - accuracy: 0.9375 - val_loss: 60.0498 - val_accuracy: 0.0144
Epoch 63/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4668 - accuracy: 0.9461 - val_loss: 18.8190 - val_accuracy: 0.0298
Epoch 64/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4598 - accuracy: 0.9485 - val_loss: 26.1101 - val_accuracy: 0.0231
Epoch 65/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4672 - accuracy: 0.9442 - val_loss: 108.7207 - val_accuracy: 2.6582e-04
Epoch 66/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4910 - accuracy: 0.9378 - val_loss: 45.6070 - val_accuracy: 0.0052
Epoch 67/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4805 - accuracy: 0.9429 - val_loss: 39.3904 - val_accuracy: 0.0057
Epoch 68/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4682 - accuracy: 0.9451 - val_loss: 21.5525 - val_accuracy: 0.0328
Epoch 69/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4613 - accuracy: 0.9472 - val_loss: 46.7714 - val_accuracy: 0.0027
Epoch 70/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4786 - accuracy: 0.9417 - val_loss: 13.4834 - val_accuracy: 0.0708
Epoch 71/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4756 - accuracy: 0.9442 - val_loss: 41.8796 - val_accuracy: 0.0199
Epoch 72/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4655 - accuracy: 0.9464 - val_loss: 57.7453 - val_accuracy: 0.0017
Epoch 73/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4795 - accuracy: 0.9428 - val_loss: 16.1949 - val_accuracy: 0.0285
Epoch 74/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4755 - accuracy: 0.9440 - val_loss: 68.2349 - val_accuracy: 0.0139
Epoch 75/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4807 - accuracy: 0.9425 - val_loss: 43.4699 - val_accuracy: 0.0233
Epoch 76/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4515 - accuracy: 0.9524 - val_loss: 175.2205 - val_accuracy: 0.0019
Epoch 77/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4715 - accuracy: 0.9467 - val_loss: 92.2833 - val_accuracy: 0.0017
Epoch 78/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4736 - accuracy: 0.9447 - val_loss: 94.7209 - val_accuracy: 0.0059
Epoch 79/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4661 - accuracy: 0.9473 - val_loss: 17.8870 - val_accuracy: 0.0386
Epoch 80/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4614 - accuracy: 0.9492 - val_loss: 28.1883 - val_accuracy: 0.0042
Epoch 81/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4569 - accuracy: 0.9507 - val_loss: 49.2823 - val_accuracy: 0.0032
Epoch 82/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4623 - accuracy: 0.9485 - val_loss: 29.8972 - val_accuracy: 0.0100
Epoch 83/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4799 - accuracy: 0.9429 - val_loss: 109.5044 - val_accuracy: 0.0062
Epoch 84/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4810 - accuracy: 0.9444 - val_loss: 71.2103 - val_accuracy: 0.0051
Epoch 85/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4452 - accuracy: 0.9552 - val_loss: 30.7861 - val_accuracy: 0.0100
Epoch 86/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4805 - accuracy: 0.9423 - val_loss: 48.1887 - val_accuracy: 0.0031
Epoch 87/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4564 - accuracy: 0.9512 - val_loss: 189.6711 - val_accuracy: 1.3291e-04
Epoch 88/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4479 - accuracy: 0.9537 - val_loss: 58.6349 - val_accuracy: 0.0199
Epoch 89/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4667 - accuracy: 0.9476 - val_loss: 95.7323 - val_accuracy: 0.0041
Epoch 90/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4808 - accuracy: 0.9436 - val_loss: 28.7513 - val_accuracy: 0.0191
Epoch 91/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4583 - accuracy: 0.9511 - val_loss: 16.4281 - val_accuracy: 0.0431
Epoch 92/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4458 - accuracy: 0.9541 - val_loss: 15.3890 - val_accuracy: 0.0517
Epoch 93/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4628 - accuracy: 0.9491 - val_loss: 37.3123 - val_accuracy: 0.0024
Epoch 94/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4716 - accuracy: 0.9481 - val_loss: 24.8934 - val_accuracy: 0.0123
Epoch 95/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4646 - accuracy: 0.9469 - val_loss: 54.6682 - val_accuracy: 5.9809e-04
Epoch 96/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4665 - accuracy: 0.9492 - val_loss: 89.1835 - val_accuracy: 0.0064
Epoch 97/100
471/471 [==============================] - 5s 10ms/step - loss: 0.4533 - accuracy: 0.9527 - val_loss: 60.9850 - val_accuracy: 0.0035
Epoch 98/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4597 - accuracy: 0.9491 - val_loss: 41.6088 - val_accuracy: 0.0023
Epoch 99/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4511 - accuracy: 0.9537 - val_loss: 28.2131 - val_accuracy: 0.0025
Epoch 100/100
471/471 [==============================] - 5s 11ms/step - loss: 0.4568 - accuracy: 0.9509 - val_loss: 121.8944 - val_accuracy: 0.0041
I am well aware that the problem I am facing is due to overfitting, but when I train the same network with the same training data in MATLAB, the training loss and validation loss stay close to each other. The picture of the Training Progress in MATLAB is linked as:
Training Progress
I would appreciate it if anyone could explain why I can't get the same result in Python. What would you suggest to solve this problem?

Why is my model improving so slowly?

I have this CNN model with 3 blocks of VGG-style architecture:
import tensorflow as tf
from tensorflow.keras import datasets, layers, models
import matplotlib.pyplot as plt
from keras.preprocessing import image
from keras.preprocessing.image import ImageDataGenerator
from keras.regularizers import L2, L1, L1L2
from keras.optimizers import SGD, Adam, Adagrad, RMSprop
from keras.models import load_model, Model
import numpy as np
import keras as k
# Load and split the data
(train_images, train_labels), (test_images, test_labels) = datasets.cifar10.load_data()
# Normalize pixel values to [0, 1]
train_images = train_images / 255.0
test_images = test_images / 255.0
# Convert labels to one-hot encoding
num_classes = 10
train_labels = k.utils.to_categorical(train_labels, num_classes)
test_labels = k.utils.to_categorical(test_labels, num_classes)
# Data Augmentation
datagen = ImageDataGenerator(
    width_shift_range=0.1,
    height_shift_range=0.1,
    horizontal_flip=True)
datagen.fit(train_images)
reg = None
num_filters = 32
ac = 'relu'
# `lr` and `decay` are deprecated optimizer arguments; use `learning_rate`
adm = Adam(learning_rate=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
sgd = SGD(learning_rate=0.01, momentum=0.9)
rms = RMSprop(learning_rate=0.0001)
agr = Adagrad(learning_rate=0.0001, initial_accumulator_value=0.1, epsilon=1e-08)
opt = adm
drop_dense = 0.5
drop_conv = 0.2
model = models.Sequential()
model.add(layers.Conv2D(num_filters, (3, 3), activation=ac, kernel_regularizer=reg, input_shape=(32, 32, 3), padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.Conv2D(num_filters, (3, 3), activation=ac, kernel_regularizer=reg, padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.MaxPooling2D(pool_size=(2, 2)))
model.add(layers.Dropout(drop_conv))
model.add(layers.Conv2D(2 * num_filters, (3, 3), activation=ac, kernel_regularizer=reg, padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.Conv2D(2 * num_filters, (3, 3), activation=ac, kernel_regularizer=reg, padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.MaxPooling2D(pool_size=(2, 2)))
model.add(layers.Dropout(2 * drop_conv))
model.add(layers.Conv2D(4 * num_filters, (3, 3), activation=ac, kernel_regularizer=reg, padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.Conv2D(4 * num_filters, (3, 3), activation=ac, kernel_regularizer=reg, padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.MaxPooling2D(pool_size=(2, 2)))
model.add(layers.Dropout(3 * drop_conv))
model.add(layers.Flatten())
model.add(layers.Dense(512, activation=ac, kernel_regularizer=reg))
model.add(layers.BatchNormalization())
model.add(layers.Dropout(drop_dense))
model.add(layers.Dense(num_classes, activation='softmax'))
model.compile(loss='categorical_crossentropy', metrics=['accuracy'], optimizer=opt)
model.summary()
# fit_generator is deprecated; model.fit accepts generators directly
history = model.fit(datagen.flow(train_images, train_labels, batch_size=256),
                    steps_per_epoch=len(train_images) // 256, epochs=200,
                    validation_data=(test_images, test_labels))
loss, accuracy = model.evaluate(test_images, test_labels)
print("Accuracy is : ", accuracy * 100)
print("Loss is : ", loss)
N = 200
plt.style.use("ggplot")
plt.figure()
plt.plot(np.arange(0, N), history.history["loss"], label="train_loss")
plt.plot(np.arange(0, N), history.history["val_loss"], label="val_loss")
plt.plot(np.arange(0, N), history.history["accuracy"], label="train_acc")
plt.plot(np.arange(0, N), history.history["val_accuracy"], label="val_acc")
plt.title("Training Loss and Accuracy")
plt.xlabel("Epochs")
plt.ylabel("Loss/Accuracy")
plt.legend(loc="upper left")
plt.show()
model.save("model_test_9.h5")  # serialize weights to HDF5
from IPython.display import FileLink  # needed for FileLink (notebook only)
FileLink(r'model_test_9.h5')
# ADM Improve Dropout dataaugment
Output:
Epoch 40/200
195/195 [==============================] - 21s 107ms/step - loss: 0.4334 - accuracy: 0.8507 - val_loss: 0.5041 - val_accuracy: 0.8357
Epoch 41/200
195/195 [==============================] - 21s 107ms/step - loss: 0.4289 - accuracy: 0.8522 - val_loss: 0.5354 - val_accuracy: 0.8284
Epoch 42/200
195/195 [==============================] - 21s 110ms/step - loss: 0.4333 - accuracy: 0.8490 - val_loss: 0.4560 - val_accuracy: 0.8499
Epoch 43/200
195/195 [==============================] - 21s 110ms/step - loss: 0.4198 - accuracy: 0.8555 - val_loss: 0.4817 - val_accuracy: 0.8429
Epoch 44/200
195/195 [==============================] - 21s 107ms/step - loss: 0.4130 - accuracy: 0.8556 - val_loss: 0.4768 - val_accuracy: 0.8407
Epoch 45/200
195/195 [==============================] - 21s 109ms/step - loss: 0.4180 - accuracy: 0.8544 - val_loss: 0.4526 - val_accuracy: 0.8483
Epoch 46/200
195/195 [==============================] - 21s 108ms/step - loss: 0.4113 - accuracy: 0.8565 - val_loss: 0.4129 - val_accuracy: 0.8618
Epoch 47/200
195/195 [==============================] - 21s 108ms/step - loss: 0.4078 - accuracy: 0.8584 - val_loss: 0.4108 - val_accuracy: 0.8659
Epoch 48/200
195/195 [==============================] - 21s 109ms/step - loss: 0.4184 - accuracy: 0.8538 - val_loss: 0.4370 - val_accuracy: 0.8557
Epoch 49/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3926 - accuracy: 0.8641 - val_loss: 0.3817 - val_accuracy: 0.8685
Epoch 50/200
195/195 [==============================] - 21s 109ms/step - loss: 0.4044 - accuracy: 0.8587 - val_loss: 0.4225 - val_accuracy: 0.8571
Epoch 51/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3919 - accuracy: 0.8640 - val_loss: 0.4101 - val_accuracy: 0.8625
Epoch 52/200
195/195 [==============================] - 21s 106ms/step - loss: 0.4035 - accuracy: 0.8623 - val_loss: 0.4341 - val_accuracy: 0.8561
Epoch 53/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3963 - accuracy: 0.8619 - val_loss: 0.4180 - val_accuracy: 0.8576
Epoch 54/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3901 - accuracy: 0.8635 - val_loss: 0.3744 - val_accuracy: 0.8712
Epoch 55/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3917 - accuracy: 0.8640 - val_loss: 0.3751 - val_accuracy: 0.8736
Epoch 56/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3795 - accuracy: 0.8679 - val_loss: 0.4697 - val_accuracy: 0.8445
Epoch 57/200
195/195 [==============================] - 22s 111ms/step - loss: 0.3844 - accuracy: 0.8656 - val_loss: 0.4058 - val_accuracy: 0.8620
Epoch 58/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3864 - accuracy: 0.8656 - val_loss: 0.4226 - val_accuracy: 0.8588
Epoch 59/200
195/195 [==============================] - 22s 110ms/step - loss: 0.3821 - accuracy: 0.8684 - val_loss: 0.3986 - val_accuracy: 0.8666
Epoch 60/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3728 - accuracy: 0.8708 - val_loss: 0.4196 - val_accuracy: 0.8638
Epoch 61/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3724 - accuracy: 0.8699 - val_loss: 0.3928 - val_accuracy: 0.8654
Epoch 62/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3724 - accuracy: 0.8712 - val_loss: 0.3615 - val_accuracy: 0.8782
Epoch 63/200
195/195 [==============================] - 22s 110ms/step - loss: 0.3758 - accuracy: 0.8691 - val_loss: 0.3976 - val_accuracy: 0.8707
Epoch 64/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3698 - accuracy: 0.8714 - val_loss: 0.4429 - val_accuracy: 0.8554
Epoch 65/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3570 - accuracy: 0.8750 - val_loss: 0.3702 - val_accuracy: 0.8740
Epoch 66/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3588 - accuracy: 0.8751 - val_loss: 0.3885 - val_accuracy: 0.8717
Epoch 67/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3597 - accuracy: 0.8749 - val_loss: 0.3781 - val_accuracy: 0.8777
Epoch 68/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3590 - accuracy: 0.8756 - val_loss: 0.4230 - val_accuracy: 0.8613
Epoch 69/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3540 - accuracy: 0.8756 - val_loss: 0.3972 - val_accuracy: 0.8694
Epoch 70/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3588 - accuracy: 0.8729 - val_loss: 0.4242 - val_accuracy: 0.8598
Epoch 71/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3608 - accuracy: 0.8748 - val_loss: 0.3887 - val_accuracy: 0.8683
Epoch 72/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3511 - accuracy: 0.8783 - val_loss: 0.3912 - val_accuracy: 0.8716
Epoch 73/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3516 - accuracy: 0.8769 - val_loss: 0.4673 - val_accuracy: 0.8515
Epoch 74/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3484 - accuracy: 0.8787 - val_loss: 0.3990 - val_accuracy: 0.8664
Epoch 75/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3506 - accuracy: 0.8780 - val_loss: 0.3869 - val_accuracy: 0.8666
Epoch 76/200
195/195 [==============================] - 20s 105ms/step - loss: 0.3484 - accuracy: 0.8795 - val_loss: 0.3447 - val_accuracy: 0.8853
Epoch 77/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3493 - accuracy: 0.8774 - val_loss: 0.3644 - val_accuracy: 0.8794
Epoch 78/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3443 - accuracy: 0.8813 - val_loss: 0.4117 - val_accuracy: 0.8665
Epoch 79/200
195/195 [==============================] - 20s 104ms/step - loss: 0.3436 - accuracy: 0.8796 - val_loss: 0.3695 - val_accuracy: 0.8758
Epoch 80/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3487 - accuracy: 0.8788 - val_loss: 0.3583 - val_accuracy: 0.8789
Epoch 81/200
Epoch 92/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3320 - accuracy: 0.8834 - val_loss: 0.3658 - val_accuracy: 0.8794
Epoch 93/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3251 - accuracy: 0.8858 - val_loss: 0.4003 - val_accuracy: 0.8646
Epoch 94/200
195/195 [==============================] - 20s 103ms/step - loss: 0.3202 - accuracy: 0.8894 - val_loss: 0.3943 - val_accuracy: 0.8695
Epoch 95/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3238 - accuracy: 0.8887 - val_loss: 0.3232 - val_accuracy: 0.8931
Epoch 96/200
195/195 [==============================] - 21s 105ms/step - loss: 0.3236 - accuracy: 0.8881 - val_loss: 0.3659 - val_accuracy: 0.8777
Epoch 97/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3116 - accuracy: 0.8912 - val_loss: 0.4218 - val_accuracy: 0.8634
Epoch 98/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3189 - accuracy: 0.8893 - val_loss: 0.3783 - val_accuracy: 0.8740
Epoch 99/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3260 - accuracy: 0.8845 - val_loss: 0.3418 - val_accuracy: 0.8875
Epoch 100/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3143 - accuracy: 0.8893 - val_loss: 0.3974 - val_accuracy: 0.8671
Epoch 101/200
195/195 [==============================] - 20s 105ms/step - loss: 0.3209 - accuracy: 0.8898 - val_loss: 0.3688 - val_accuracy: 0.8780
Epoch 102/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3205 - accuracy: 0.8885 - val_loss: 0.3689 - val_accuracy: 0.8791
Epoch 103/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3157 - accuracy: 0.8884 - val_loss: 0.3420 - val_accuracy: 0.8857
Epoch 104/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3163 - accuracy: 0.8878 - val_loss: 0.3580 - val_accuracy: 0.8821
Epoch 105/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3105 - accuracy: 0.8915 - val_loss: 0.3696 - val_accuracy: 0.8800
Epoch 106/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3127 - accuracy: 0.8893 - val_loss: 0.3701 - val_accuracy: 0.8799
Epoch 107/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3087 - accuracy: 0.8917 - val_loss: 0.3604 - val_accuracy: 0.8831
Epoch 108/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3097 - accuracy: 0.8916 - val_loss: 0.3311 - val_accuracy: 0.8923
Epoch 109/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3096 - accuracy: 0.8907 - val_loss: 0.3421 - val_accuracy: 0.8880
Epoch 110/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3082 - accuracy: 0.8925 - val_loss: 0.3207 - val_accuracy: 0.8933
Epoch 111/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2997 - accuracy: 0.8967 - val_loss: 0.3400 - val_accuracy: 0.8858
Epoch 112/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3026 - accuracy: 0.8929 - val_loss: 0.3821 - val_accuracy: 0.8769
Epoch 113/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2996 - accuracy: 0.8940 - val_loss: 0.3453 - val_accuracy: 0.8861
Epoch 114/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3033 - accuracy: 0.8935 - val_loss: 0.3850 - val_accuracy: 0.8733
Epoch 115/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3046 - accuracy: 0.8942 - val_loss: 0.3396 - val_accuracy: 0.8880
Epoch 116/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2998 - accuracy: 0.8946 - val_loss: 0.3496 - val_accuracy: 0.8826
Epoch 117/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3100 - accuracy: 0.8914 - val_loss: 0.4213 - val_accuracy: 0.8632
Epoch 118/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3099 - accuracy: 0.8905 - val_loss: 0.3623 - val_accuracy: 0.8787
Epoch 119/200
195/195 [==============================] - 22s 110ms/step - loss: 0.3096 - accuracy: 0.8929 - val_loss: 0.3523 - val_accuracy: 0.8841
Epoch 120/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2990 - accuracy: 0.8952 - val_loss: 0.3645 - val_accuracy: 0.8803
Epoch 121/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2986 - accuracy: 0.8940 - val_loss: 0.3947 - val_accuracy: 0.8701
Epoch 122/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3002 - accuracy: 0.8934 - val_loss: 0.3854 - val_accuracy: 0.8746
Epoch 123/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2957 - accuracy: 0.8962 - val_loss: 0.3649 - val_accuracy: 0.8787
Epoch 124/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2926 - accuracy: 0.8967 - val_loss: 0.3245 - val_accuracy: 0.8948
Epoch 125/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3024 - accuracy: 0.8933 - val_loss: 0.3376 - val_accuracy: 0.8896
Epoch 126/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2904 - accuracy: 0.8984 - val_loss: 0.3394 - val_accuracy: 0.8867
Epoch 127/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2974 - accuracy: 0.8974 - val_loss: 0.3591 - val_accuracy: 0.8842
Epoch 128/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2942 - accuracy: 0.8978 - val_loss: 0.3455 - val_accuracy: 0.8848
Epoch 129/200
195/195 [==============================] - 21s 106ms/step - loss: 0.2940 - accuracy: 0.8970 - val_loss: 0.3400 - val_accuracy: 0.8883
Epoch 130/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2973 - accuracy: 0.8973 - val_loss: 0.3286 - val_accuracy: 0.8905
Epoch 131/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2903 - accuracy: 0.8948 - val_loss: 0.4064 - val_accuracy: 0.8707
Epoch 132/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2962 - accuracy: 0.8963 - val_loss: 0.3689 - val_accuracy: 0.8773
Epoch 133/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2918 - accuracy: 0.8971 - val_loss: 0.3666 - val_accuracy: 0.8808
Epoch 134/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2894 - accuracy: 0.8991 - val_loss: 0.3306 - val_accuracy: 0.8918
Epoch 135/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2809 - accuracy: 0.9020 - val_loss: 0.3157 - val_accuracy: 0.8940
Epoch 136/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2878 - accuracy: 0.8996 - val_loss: 0.3568 - val_accuracy: 0.8847
Epoch 137/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2903 - accuracy: 0.8981 - val_loss: 0.3422 - val_accuracy: 0.8914
Epoch 138/200
195/195 [==============================] - 20s 104ms/step - loss: 0.2841 - accuracy: 0.8986 - val_loss: 0.3276 - val_accuracy: 0.8910
Epoch 139/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2892 - accuracy: 0.8994 - val_loss: 0.3350 - val_accuracy: 0.8909
Epoch 140/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2863 - accuracy: 0.9000 - val_loss: 0.3634 - val_accuracy: 0.8817
Epoch 141/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2884 - accuracy: 0.8983 - val_loss: 0.3368 - val_accuracy: 0.8903
Epoch 142/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2903 - accuracy: 0.8988 - val_loss: 0.3643 - val_accuracy: 0.8820
Epoch 143/200
195/195 [==============================] - 21s 105ms/step - loss: 0.2818 - accuracy: 0.8997 - val_loss: 0.3178 - val_accuracy: 0.8933
Epoch 144/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2713 - accuracy: 0.9042 - val_loss: 0.3584 - val_accuracy: 0.8840
Epoch 145/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2907 - accuracy: 0.8990 - val_loss: 0.3286 - val_accuracy: 0.8921
Epoch 146/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2745 - accuracy: 0.9045 - val_loss: 0.3450 - val_accuracy: 0.8890
Epoch 147/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2816 - accuracy: 0.9028 - val_loss: 0.3895 - val_accuracy: 0.8715
Epoch 148/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2777 - accuracy: 0.9041 - val_loss: 0.3372 - val_accuracy: 0.8896
Epoch 149/200
195/195 [==============================] - 21s 105ms/step - loss: 0.2700 - accuracy: 0.9070 - val_loss: 0.3615 - val_accuracy: 0.8803
Epoch 150/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2741 - accuracy: 0.9033 - val_loss: 0.3605 - val_accuracy: 0.8813
Epoch 151/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2890 - accuracy: 0.8979 - val_loss: 0.3490 - val_accuracy: 0.8854
Epoch 152/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2784 - accuracy: 0.9008 - val_loss: 0.3543 - val_accuracy: 0.8838
Epoch 153/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2803 - accuracy: 0.9014 - val_loss: 0.3356 - val_accuracy: 0.8876
Epoch 154/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2719 - accuracy: 0.9031 - val_loss: 0.3338 - val_accuracy: 0.8894
Epoch 155/200
195/195 [==============================] - 21s 106ms/step - loss: 0.2830 - accuracy: 0.9019 - val_loss: 0.3505 - val_accuracy: 0.8893
Epoch 156/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2830 - accuracy: 0.9002 - val_loss: 0.3173 - val_accuracy: 0.8983
Epoch 157/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2764 - accuracy: 0.9015 - val_loss: 0.3789 - val_accuracy: 0.8765
Epoch 158/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2742 - accuracy: 0.9040 - val_loss: 0.3245 - val_accuracy: 0.8941
Epoch 159/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2801 - accuracy: 0.9014 - val_loss: 0.3342 - val_accuracy: 0.8905
Epoch 160/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2640 - accuracy: 0.9064 - val_loss: 0.3632 - val_accuracy: 0.8818
Epoch 161/200
195/195 [==============================] - 21s 106ms/step - loss: 0.2754 - accuracy: 0.9026 - val_loss: 0.3204 - val_accuracy: 0.8936
Epoch 162/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2745 - accuracy: 0.9040 - val_loss: 0.3921 - val_accuracy: 0.8769
Epoch 163/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2731 - accuracy: 0.9031 - val_loss: 0.3234 - val_accuracy: 0.8939
Epoch 164/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2699 - accuracy: 0.9062 - val_loss: 0.3466 - val_accuracy: 0.8873
Epoch 165/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2866 - accuracy: 0.9002 - val_loss: 0.3669 - val_accuracy: 0.8820
Epoch 166/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2657 - accuracy: 0.9058 - val_loss: 0.3208 - val_accuracy: 0.8930
Epoch 167/200
195/195 [==============================] - 20s 105ms/step - loss: 0.2769 - accuracy: 0.9014 - val_loss: 0.3339 - val_accuracy: 0.8912
Epoch 168/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2739 - accuracy: 0.9037 - val_loss: 0.3357 - val_accuracy: 0.8885
Epoch 169/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2739 - accuracy: 0.9059 - val_loss: 0.4047 - val_accuracy: 0.8727
Epoch 170/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2666 - accuracy: 0.9063 - val_loss: 0.3386 - val_accuracy: 0.8904
Epoch 171/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2660 - accuracy: 0.9073 - val_loss: 0.3169 - val_accuracy: 0.8945
Epoch 172/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2692 - accuracy: 0.9054 - val_loss: 0.3413 - val_accuracy: 0.8859
Epoch 173/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2672 - accuracy: 0.9050 - val_loss: 0.3230 - val_accuracy: 0.8930
Epoch 174/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2776 - accuracy: 0.9026 - val_loss: 0.3204 - val_accuracy: 0.8966
Epoch 175/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2646 - accuracy: 0.9073 - val_loss: 0.3433 - val_accuracy: 0.8937
Epoch 176/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2670 - accuracy: 0.9057 - val_loss: 0.3301 - val_accuracy: 0.8927
Epoch 177/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2697 - accuracy: 0.9046 - val_loss: 0.3110 - val_accuracy: 0.8979
Epoch 178/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2711 - accuracy: 0.9043 - val_loss: 0.3240 - val_accuracy: 0.8944
Epoch 179/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2628 - accuracy: 0.9072 - val_loss: 0.3265 - val_accuracy: 0.8931
Epoch 180/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2642 - accuracy: 0.9070 - val_loss: 0.3192 - val_accuracy: 0.8954
Epoch 181/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2626 - accuracy: 0.9067 - val_loss: 0.3404 - val_accuracy: 0.8875
Epoch 182/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2635 - accuracy: 0.9080 - val_loss: 0.3463 - val_accuracy: 0.8874
Epoch 183/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2630 - accuracy: 0.9075 - val_loss: 0.3342 - val_accuracy: 0.8909
Epoch 184/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2666 - accuracy: 0.9036 - val_loss: 0.2964 - val_accuracy: 0.9011
Epoch 185/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2671 - accuracy: 0.9067 - val_loss: 0.3400 - val_accuracy: 0.8905
Epoch 186/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2625 - accuracy: 0.9084 - val_loss: 0.3446 - val_accuracy: 0.8889
Epoch 187/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2606 - accuracy: 0.9097 - val_loss: 0.3242 - val_accuracy: 0.8955
Epoch 188/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2588 - accuracy: 0.9094 - val_loss: 0.3240 - val_accuracy: 0.8958
Epoch 189/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2649 - accuracy: 0.9070 - val_loss: 0.3216 - val_accuracy: 0.8980
Epoch 190/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2587 - accuracy: 0.9077 - val_loss: 0.3403 - val_accuracy: 0.8891
Epoch 191/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2678 - accuracy: 0.9033 - val_loss: 0.3099 - val_accuracy: 0.9008
Epoch 192/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2538 - accuracy: 0.9094 - val_loss: 0.3170 - val_accuracy: 0.8968
Epoch 193/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2613 - accuracy: 0.9076 - val_loss: 0.2916 - val_accuracy: 0.9046
Epoch 194/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2651 - accuracy: 0.9077 - val_loss: 0.3159 - val_accuracy: 0.8968
Epoch 195/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2576 - accuracy: 0.9097 - val_loss: 0.3446 - val_accuracy: 0.8901
Epoch 196/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2554 - accuracy: 0.9094 - val_loss: 0.3227 - val_accuracy: 0.8978
Epoch 197/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2620 - accuracy: 0.9090 - val_loss: 0.3174 - val_accuracy: 0.8958
Epoch 198/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2583 - accuracy: 0.9082 - val_loss: 0.3186 - val_accuracy: 0.8964
Epoch 199/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2546 - accuracy: 0.9103 - val_loss: 0.3183 - val_accuracy: 0.8968
Epoch 200/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2544 - accuracy: 0.9082 - val_loss: 0.3327 - val_accuracy: 0.8948
313/313 [==============================] - 1s 3ms/step - loss: 0.3327 - accuracy: 0.8948
Accuracy is : 89.48000073432922
Loss is : 0.3326900005340576
(I cut the first 40 epochs because the question body is limited to 30,000 characters; epochs 1-40 kept improving, but very slowly.)
I have tried 100 epochs, which gave ~88% validation accuracy.
In this code I added another 100 epochs and it gave only a 1% improvement (~89%).
My questions are:
Is my model overfitting?
Why is my model improving so slowly?
Can my model improve if I add more epochs?
How can I increase the accuracy and decrease the loss? It seems stagnant to me.
Model Performance plot here
No, because the validation loss is not increasing.
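If overfitting did show up later in a run, the usual guard is early stopping. The rule Keras's EarlyStopping callback applies can be sketched in plain Python (the helper name here is illustrative, not part of any API):

```python
def should_stop(val_losses, patience=5):
    """Return True when the lowest val_loss is more than `patience`
    epochs old -- the same rule early stopping applies."""
    if len(val_losses) <= patience:
        return False
    best_epoch = val_losses.index(min(val_losses))
    return (len(val_losses) - 1 - best_epoch) >= patience

# In Keras itself this is simply a callback passed to model.fit:
# tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=5,
#                                  restore_best_weights=True)
```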
Your plots look fine. It is expected that training slows down as it progresses.
Yes, but it isn't worth it: if you train almost any model for long enough, its metrics keep creeping up marginally; e.g. you might reach 89.5% accuracy (barely better than 89.48%) after training for a year.
Try decaying the learning rate with different schedules.
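To make the suggestion concrete, here is the shape of an exponential decay schedule, hand-rolled so the numbers are visible; the step counts are illustrative (195 steps/epoch as in the log above, halving roughly every 50 epochs), and the commented tf.keras equivalents are a sketch, not a drop-in fix:

```python
# Hand-rolled exponential decay, just to show the schedule's shape.
def exponential_decay(step, initial_lr=1e-3, decay_steps=50 * 195, decay_rate=0.5):
    # lr halves every `decay_steps` optimizer steps
    return initial_lr * decay_rate ** (step / decay_steps)

# The rough tf.keras equivalent:
# schedule = tf.keras.optimizers.schedules.ExponentialDecay(
#     initial_learning_rate=1e-3, decay_steps=50 * 195, decay_rate=0.5)
# opt = Adam(learning_rate=schedule)
#
# Or react to plateaus instead of the clock:
# reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
#     monitor="val_loss", factor=0.5, patience=5, min_lr=1e-6)
# model.fit(..., callbacks=[reduce_lr])
```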
