CNN conversion flow tutorial

This tutorial illustrates how to use the CNN2SNN toolkit to convert CNN networks to SNN networks compatible with the Akida NSoC in a few steps. You can refer to our CNN2SNN toolkit user guide for further explanation.

The CNN2SNN toolkit is based on Keras, TensorFlow's high-level API for building and training deep learning models.

Note

Please refer to the TensorFlow tf.keras.models module for model creation/import details and to the TensorFlow Guide for details of how TensorFlow works.

The MNIST example below is light enough that you do not need a GPU to run the CNN2SNN tool.

[Figure: CNN2SNN conversion flow]

1. Load and reshape MNIST dataset

After loading, we apply two transformations to the dataset:

  1. Reshape the sample content data (x values) into a num_samples x width x height x channels matrix.

Note

At this point, we’ll set aside the raw data for testing our converted model in the Akida Execution Engine later.

  2. Rescale the loaded 8-bit data to the 0-to-1 range for training.

Note

Input data normalization is a common step when dealing with CNNs (the rationale is to keep the data in a range that works well with the selected optimizer; some background reading can be found here).

This rescaling makes almost no difference in the current example, but for some datasets rescaling the absolute values (and also shifting to zero mean) can make a major difference.

Also note that we store the scaling values input_scaling for use when preparing the model for the Akida Execution Engine. The Akida neural network implementation allows us to skip the rescaling step entirely (i.e. the Akida model should be fed with the raw 8-bit values), but that requires knowing what scaling was applied prior to training - see below for more details.

import tensorflow as tf
from tensorflow import keras

# Load MNIST dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Reshape x-data
x_train = x_train.reshape(60000, 28, 28, 1)
x_test = x_test.reshape(10000, 28, 28, 1)

# Set aside raw test data for use with Akida Execution Engine later
raw_x_test = x_test.astype('uint8')
raw_y_test = y_test

# Rescale x-data
a = 255
b = 0
input_scaling = (a, b)
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train = (x_train - b) / a
x_test = (x_test - b) / a
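
As a quick sanity check (optional, not part of the original flow), you can confirm the shapes, dtypes and value ranges produced by the preprocessing above:

# Optional sanity check (illustrative only)
print("x_train:", x_train.shape, x_train.dtype,
      "range:", x_train.min(), "-", x_train.max())        # (60000, 28, 28, 1) float32, 0.0 - 1.0
print("raw_x_test:", raw_x_test.shape, raw_x_test.dtype,
      "range:", raw_x_test.min(), "-", raw_x_test.max())  # (10000, 28, 28, 1) uint8, 0 - 255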

2. Model definition

Note that at this stage, there is nothing specific to the Akida NSoC: the starting point is a completely standard CNN defined with Keras.

An appropriate model for MNIST (inspired by this example) might look something like the following:

model_keras = keras.models.Sequential([
    keras.layers.Conv2D(filters=32, kernel_size=3, input_shape=(28, 28, 1)),
    keras.layers.MaxPool2D(),
    keras.layers.BatchNormalization(),
    keras.layers.ReLU(),
    keras.layers.SeparableConv2D(filters=64, kernel_size=3, padding='same'),
    keras.layers.MaxPool2D(padding='same'),
    keras.layers.BatchNormalization(),
    keras.layers.ReLU(),
    keras.layers.Flatten(),
    keras.layers.Dense(10)
], 'mnistnet')

model_keras.summary()

Out:

Model: "mnistnet"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 26, 26, 32)        320
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 13, 13, 32)        0
_________________________________________________________________
batch_normalization (BatchNo (None, 13, 13, 32)        128
_________________________________________________________________
re_lu (ReLU)                 (None, 13, 13, 32)        0
_________________________________________________________________
separable_conv2d (SeparableC (None, 13, 13, 64)        2400
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 7, 7, 64)          0
_________________________________________________________________
batch_normalization_1 (Batch (None, 7, 7, 64)          256
_________________________________________________________________
re_lu_1 (ReLU)               (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
dense (Dense)                (None, 10)                31370
=================================================================
Total params: 34,474
Trainable params: 34,282
Non-trainable params: 192
_________________________________________________________________

The model defined above is compatible for conversion into an Akida model, i.e. it does not include any layers or operations that are not Akida-compatible (please refer to the CNN2SNN toolkit documentation for full details):

  • Standard Conv2D and Dense layers are supported.

  • Hidden layers must be followed by a ReLU layer.

  • BatchNormalization must always happen before activations.

  • Convolutional blocks can optionally be followed by a MaxPooling layer.

The CNN2SNN toolkit provides the check_model_compatibility function to ensure that the model can be converted into an Akida model. If the model is not fully compatible, substitutes will be needed for the relevant layers/operations (guidelines included in the documentation).

from cnn2snn import check_model_compatibility

print("Model compatible for Akida conversion:",
      check_model_compatibility(model_keras))

Out:

Model compatible for Akida conversion: True

3. Model training

Before going any further, train the model and check its performance. The model should reach a test accuracy of around 98-99% after 10 epochs.

model_keras.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

model_keras.fit(x_train, y_train, epochs=10, validation_split=0.1)

score = model_keras.evaluate(x_test, y_test, verbose=0)
print('Test score:', score[0])
print('Test accuracy:', score[1])

Out:

Epoch 1/10

1688/1688 [==============================] - 4s 2ms/step - loss: 0.2322 - accuracy: 0.9281 - val_loss: 0.1171 - val_accuracy: 0.9698
Epoch 2/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0489 - accuracy: 0.9848 - val_loss: 0.0632 - val_accuracy: 0.9830
Epoch 3/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0338 - accuracy: 0.9886 - val_loss: 0.0569 - val_accuracy: 0.9843
Epoch 4/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0250 - accuracy: 0.9920 - val_loss: 0.0395 - val_accuracy: 0.9890
Epoch 5/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0167 - accuracy: 0.9942 - val_loss: 0.0403 - val_accuracy: 0.9895
Epoch 6/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0143 - accuracy: 0.9955 - val_loss: 0.0554 - val_accuracy: 0.9880
Epoch 7/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0115 - accuracy: 0.9962 - val_loss: 0.0418 - val_accuracy: 0.9885
Epoch 8/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0106 - accuracy: 0.9964 - val_loss: 0.0437 - val_accuracy: 0.9900
Epoch 9/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0065 - accuracy: 0.9981 - val_loss: 0.0476 - val_accuracy: 0.9900
Epoch 10/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0077 - accuracy: 0.9974 - val_loss: 0.0703 - val_accuracy: 0.9877
Test score: 0.0639532282948494
Test accuracy: 0.9842000007629395

4. Model quantization

We can now turn to quantization to obtain a discretized version of the model, in which the weights and activations are quantized so that they are suitable for implementation in the Akida NSoC.

For this, we just have to quantize the Keras model using the quantize function. Here, we decide to quantize to the maximum allowed bitwidths for the first layer weights (8-bit), the subsequent layer weights (4-bit) and the activations (4-bit).

The quantized model is a Keras model where the neural layers (Conv2D, Dense) and the ReLU layers are replaced with custom CNN2SNN quantized layers (QuantizedConv2D, QuantizedDense, QuantizedReLU). All Keras API functions can be applied to this new model: summary(), compile(), fit(), etc.

Note

The quantize function folds the batch normalization layers into the corresponding neural layer. The new weights are computed according to this folding operation.
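
For intuition, here is a minimal NumPy sketch of that folding arithmetic for a convolution followed by batch normalization. It only illustrates the math and is not the CNN2SNN implementation; the parameter names (gamma, beta, moving_mean, moving_var, eps) simply mirror the standard Keras BatchNormalization attributes.

import numpy as np

# Hypothetical per-filter BatchNormalization parameters (one value per filter)
gamma, beta = np.ones(32), np.zeros(32)
moving_mean, moving_var, eps = np.zeros(32), np.ones(32), 1e-3

# Original convolution kernel (kh, kw, channels_in, filters) and bias
w = np.random.randn(3, 3, 1, 32).astype('float32')
b = np.zeros(32, dtype='float32')

# Folding: BN(conv(x)) is equivalent to a convolution with rescaled weights
# and a shifted bias, so the BN layer can be absorbed into the neural layer.
scale = gamma / np.sqrt(moving_var + eps)
w_folded = w * scale                       # broadcasts over the filter axis
b_folded = beta + (b - moving_mean) * scale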

from cnn2snn import quantize

model_quantized = quantize(model_keras,
                           input_weight_quantization=8,
                           weight_quantization=4,
                           activ_quantization=4)
model_quantized.summary()

Out:

Model: "mnistnet"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (QuantizedConv2D)     (None, 26, 26, 32)        320
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 13, 13, 32)        0
_________________________________________________________________
re_lu (ActivationDiscreteRel (None, 13, 13, 32)        0
_________________________________________________________________
separable_conv2d (QuantizedS (None, 13, 13, 64)        2400
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 7, 7, 64)          0
_________________________________________________________________
re_lu_1 (ActivationDiscreteR (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
dense (QuantizedDense)       (None, 10)                31370
=================================================================
Total params: 34,090
Trainable params: 34,090
Non-trainable params: 0
_________________________________________________________________

Check the quantized model accuracy.

model_quantized.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after 8-4-4 quantization:', score[1])

Out:

Test accuracy after 8-4-4 quantization: 0.9750999808311462

Since we used the maximum allowed bitwidths for weights and activations, the accuracy of the quantized model is equivalent to that of the base model; at lower bitwidths, however, quantization usually introduces a performance drop.

Let’s try to quantize specific layers to a lower bitwidth. The CNN2SNN toolkit provides the quantize_layer function, which lets each layer be quantized individually.

Here, we quantize the “re_lu_1” layer to binary activations (bitwidth=1) and the “dense” layer with 2-bit weights.

from cnn2snn import quantize_layer

model_quantized = quantize_layer(model_quantized, "re_lu_1", bitwidth=1)
model_quantized = quantize_layer(model_quantized, "dense", bitwidth=2)

model_quantized.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after low bitwidth quantization:', score[1])

# To recover the original model accuracy, a quantization-aware training phase
# is required.

Out:

Test accuracy after low bitwidth quantization: 0.7035999894142151

5. Model fine tuning (quantization-aware training)

This quantization-aware training (fine-tuning) allows us to recover the performance drop caused by the quantization step.

Note that since this step is only a fine-tuning, fewer epochs are needed than when training the standard model from scratch.

model_quantized.fit(x_train, y_train, epochs=5, validation_split=0.1)

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after fine tuning:', score[1])

Out:

Epoch 1/5

1688/1688 [==============================] - 5s 3ms/step - loss: 0.0568 - accuracy: 0.9810 - val_loss: 0.0543 - val_accuracy: 0.9862
Epoch 2/5

1688/1688 [==============================] - 4s 3ms/step - loss: 0.0283 - accuracy: 0.9904 - val_loss: 0.0524 - val_accuracy: 0.9870
Epoch 3/5

1688/1688 [==============================] - 4s 3ms/step - loss: 0.0238 - accuracy: 0.9918 - val_loss: 0.0577 - val_accuracy: 0.9875
Epoch 4/5

1688/1688 [==============================] - 4s 3ms/step - loss: 0.0199 - accuracy: 0.9930 - val_loss: 0.0561 - val_accuracy: 0.9873
Epoch 5/5

1688/1688 [==============================] - 4s 3ms/step - loss: 0.0225 - accuracy: 0.9924 - val_loss: 0.0535 - val_accuracy: 0.9875
Test accuracy after fine tuning: 0.9851999878883362

6. Model conversion

Once a quantized model with satisfactory performance has been obtained, it can be converted into a model suitable for inference on the Akida NSoC. The convert function returns a model in Akida format, ready for the Akida NSoC or the Akida Execution Engine.

Note

One needs to supply the coefficients used to rescale the input dataset before training - here, input_scaling.
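
As a quick sanity check of what these coefficients mean, the raw 8-bit images fed to the Akida model and the float values used for Keras training are related by the (a, b) pair stored in input_scaling earlier in this tutorial. A sketch, reusing the x_test, raw_x_test and input_scaling variables defined above:

import numpy as np

a, b = input_scaling  # (255, 0), as set when rescaling the dataset

# The Akida model consumes the raw 8-bit values directly; the float data used
# for Keras training is just a rescaled view of the same samples.
assert np.allclose(x_test, (raw_x_test.astype('float32') - b) / a)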

As with Keras, the summary() method provides a textual representation of the Akida model.

from cnn2snn import convert

model_akida = convert(model_quantized, input_scaling=input_scaling)
model_akida.summary()

results = model_akida.predict(raw_x_test)
accuracy = (raw_y_test == results).mean()

print('Test accuracy after conversion:', accuracy)

# For non-regression purpose
assert accuracy > 0.97

Out:

                                     Model Summary
________________________________________________________________________________________
Layer (type)                               Output shape  Kernel shape
========================================================================================
conv2d (InputConvolutional)                [13, 13, 32]  (3, 3, 1, 32)
________________________________________________________________________________________
separable_conv2d (SeparableConvolutional)  [7, 7, 64]    (3, 3, 32, 1), (1, 1, 32, 64)
________________________________________________________________________________________
dense (FullyConnected)                     [1, 1, 10]    (1, 1, 3136, 10)
________________________________________________________________________________________
Input shape: 28, 28, 1
Backend type: Software - 1.8.13


Test accuracy after conversion: 0.9847

Depending on the number of samples you run, you should find an accuracy of around 98% (better results can be achieved with more training epochs).

Total running time of the script: (1 minute 0.102 seconds)
