CNN conversion flow tutorial

This tutorial illustrates how to use the CNN2SNN toolkit to convert a CNN into an SNN compatible with the Akida NSoC in a few steps. You can refer to our CNN2SNN toolkit user guide for further explanation.

The CNN2SNN tool is based on Keras, TensorFlow's high-level API for building and training deep learning models.

Note

Please refer to the TensorFlow tf.keras.models module for model creation/import details and the TensorFlow Guide for details of how TensorFlow works.

The MNIST example below is light enough that you do not need a GPU to run the CNN2SNN tool.

[Figure: CNN2SNN conversion flow]

1. Load and reshape MNIST dataset

After loading, we apply two transformations to the dataset:

  1. Reshape the sample content data (x values) into a num_samples x width x height x channels matrix.

Note

At this point, we’ll set aside the raw data for testing our converted model in the Akida Execution Engine later.

  2. Rescale the 8-bit loaded data to the range 0-to-1 for training.

Note

Input data normalization is a common step when working with CNNs (the rationale is to keep the data in a range that works well with the selected optimizer).

Rescaling makes almost no difference in the current example, but for some datasets rescaling the absolute values (and also shifting to zero mean) can make a major difference to training.

Also note that we store the scaling values in input_scaling for use when preparing the model for the Akida Execution Engine. The Akida implementation allows us to skip the rescaling step entirely (i.e. the Akida model should be fed the raw 8-bit values), but that requires knowing what scaling was applied prior to training - see below for more details.

import tensorflow as tf
from tensorflow import keras

# Load MNIST dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Reshape x-data
x_train = x_train.reshape(60000, 28, 28, 1)
x_test = x_test.reshape(10000, 28, 28, 1)

# Set aside raw test data for use with Akida Execution Engine later
raw_x_test = x_test.astype('uint8')
raw_y_test = y_test

# Rescale x-data
a = 255
b = 0
input_scaling = (a, b)
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train = (x_train - b) / a
x_test = (x_test - b) / a
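To make the rescaling concrete, here is a minimal sketch (plain Python, independent of the TensorFlow code above) showing that (x - b) / a with (a, b) = (255, 0) maps raw 8-bit pixel values into the 0-to-1 range, and that the raw values remain recoverable by inverting the transform:

```python
# Rescaling sketch: uses the same (a, b) = (255, 0) pair stored in
# input_scaling above.
a, b = 255, 0

def rescale(raw_pixels):
    """Apply the (x - b) / a transform used before training."""
    return [(x - b) / a for x in raw_pixels]

def unscale(scaled_pixels):
    """Invert the transform to recover the raw 8-bit values."""
    return [round(x * a + b) for x in scaled_pixels]

raw = [0, 64, 128, 255]        # sample 8-bit intensities
scaled = rescale(raw)          # all values now lie in [0, 1]
assert all(0.0 <= x <= 1.0 for x in scaled)
assert unscale(scaled) == raw  # the raw values are recoverable
```

This invertibility is exactly what lets the Akida model consume the raw 8-bit data later, provided the (a, b) pair is known.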

2. Model definition

Note that at this stage, there is nothing specific to the Akida NSoC: the starting point is a completely standard CNN defined in Keras.

An appropriate model for MNIST (inspired by this example) might look something like the following:

model_keras = keras.models.Sequential([
    keras.layers.Conv2D(filters=32, kernel_size=3, input_shape=(28, 28, 1)),
    keras.layers.MaxPool2D(),
    keras.layers.BatchNormalization(),
    keras.layers.ReLU(),
    keras.layers.Conv2D(filters=64, kernel_size=3, padding='same'),
    keras.layers.MaxPool2D(padding='same'),
    keras.layers.BatchNormalization(),
    keras.layers.ReLU(),
    keras.layers.Flatten(),
    keras.layers.Dense(10)
], 'mnistnet')

model_keras.summary()

Out:

Model: "mnistnet"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 26, 26, 32)        320
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 13, 13, 32)        0
_________________________________________________________________
batch_normalization (BatchNo (None, 13, 13, 32)        128
_________________________________________________________________
re_lu (ReLU)                 (None, 13, 13, 32)        0
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 13, 13, 64)        18496
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 7, 7, 64)          0
_________________________________________________________________
batch_normalization_1 (Batch (None, 7, 7, 64)          256
_________________________________________________________________
re_lu_1 (ReLU)               (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
dense (Dense)                (None, 10)                31370
=================================================================
Total params: 50,570
Trainable params: 50,378
Non-trainable params: 192
_________________________________________________________________
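The parameter counts in the summary can be checked by hand using the standard Keras accounting: a Conv2D layer has (kernel_h * kernel_w * in_channels + 1) * filters parameters (the +1 is the per-filter bias), a Dense layer has (inputs + 1) * units, and BatchNormalization has 4 parameters per channel (gamma and beta are trainable; the moving mean and variance are not, which is where the 192 non-trainable parameters come from):

```python
def conv2d_params(kh, kw, in_ch, filters):
    # weights plus one bias per filter
    return (kh * kw * in_ch + 1) * filters

def dense_params(inputs, units):
    # weights plus one bias per unit
    return (inputs + 1) * units

def batchnorm_params(channels):
    # gamma, beta (trainable) + moving mean, moving variance (non-trainable)
    return 4 * channels

assert conv2d_params(3, 3, 1, 32) == 320       # conv2d
assert batchnorm_params(32) == 128             # batch_normalization
assert conv2d_params(3, 3, 32, 64) == 18496    # conv2d_1
assert batchnorm_params(64) == 256             # batch_normalization_1
assert dense_params(7 * 7 * 64, 10) == 31370   # dense (Flatten: 7*7*64 = 3136)
total = 320 + 128 + 18496 + 256 + 31370
assert total == 50570                          # matches "Total params: 50,570"
```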

The model defined above is compatible with conversion into an Akida model, i.e. it doesn't include any layers or operations that aren't Akida-compatible (please refer to the CNN2SNN toolkit documentation for full details):

  • Standard Conv2D and Dense layers are supported.

  • Hidden layers must be followed by a ReLU layer.

  • BatchNormalization must always happen before activations.

  • Convolutional blocks can optionally be followed by a MaxPooling layer.
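These ordering constraints can be expressed as a simple check over the sequence of layer types. The following is a sketch under a simplified reading of the rules above (pure Python over layer class names, not real Keras objects): each convolutional block must be Conv2D, optionally MaxPool2D, optionally BatchNormalization, then a ReLU.

```python
def valid_conv_block(seq, i):
    """Consume one compatible convolutional block starting at seq[i]:
    Conv2D, optional MaxPool2D, optional BatchNormalization, then ReLU.
    Returns the index just past the block, or None if the pattern breaks.
    (Simplified sketch -- see the CNN2SNN docs for the full rules.)"""
    if i >= len(seq) or seq[i] != "Conv2D":
        return None
    i += 1
    if i < len(seq) and seq[i] == "MaxPool2D":
        i += 1
    if i < len(seq) and seq[i] == "BatchNormalization":
        i += 1
    if i < len(seq) and seq[i] == "ReLU":
        return i + 1
    return None  # hidden conv layers must end with a ReLU

# Layer sequence of the 'mnistnet' model defined above
mnistnet = ["Conv2D", "MaxPool2D", "BatchNormalization", "ReLU",
            "Conv2D", "MaxPool2D", "BatchNormalization", "ReLU",
            "Flatten", "Dense"]
assert valid_conv_block(mnistnet, 0) == 4   # first block is compatible
assert valid_conv_block(mnistnet, 4) == 8   # second block is compatible
```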

3. Model training

Before going any further, train the model and check its performance. After 10 epochs, the model should achieve a test accuracy a little over 99%.

model_keras.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

model_keras.fit(x_train, y_train, epochs=10, validation_split=0.1)

score = model_keras.evaluate(x_test, y_test, verbose=0)
print('Test score:', score[0])
print('Test accuracy:', score[1])

Out:

Epoch 1/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.1029 - accuracy: 0.9679 - val_loss: 0.0509 - val_accuracy: 0.9855
Epoch 2/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0425 - accuracy: 0.9865 - val_loss: 0.0601 - val_accuracy: 0.9830
Epoch 3/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0293 - accuracy: 0.9908 - val_loss: 0.0449 - val_accuracy: 0.9892
Epoch 4/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0210 - accuracy: 0.9929 - val_loss: 0.0568 - val_accuracy: 0.9840
Epoch 5/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0172 - accuracy: 0.9944 - val_loss: 0.0965 - val_accuracy: 0.9778
Epoch 6/10
(per-batch progress output omitted; remaining epochs truncated in the source)
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0131 - accuracy: 0.9954 - val_loss: 0.0448 - val_accuracy: 0.9898
Epoch 7/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0112 - accuracy: 0.9961 - val_loss: 0.0428 - val_accuracy: 0.9898
Epoch 8/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0078 - accuracy: 0.9972 - val_loss: 0.0482 - val_accuracy: 0.9892
Epoch 9/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0076 - accuracy: 0.9973 - val_loss: 0.0597 - val_accuracy: 0.9882
Epoch 10/10

1688/1688 [==============================] - 3s 2ms/step - loss: 0.0047 - accuracy: 0.9983 - val_loss: 0.0560 - val_accuracy: 0.9892
Test score: 0.042816463857889175
Test accuracy: 0.9901000261306763

4. Model quantization

We can now turn to quantization to get a discretized version of the model, where the weights and activations are quantized so as to be suitable for implementation in the Akida NSoC.

For this, we just have to quantize the Keras model using the quantize function. Here, we decide to quantize to the maximum allowed bitwidths for the first layer weights (8-bit), the subsequent layer weights (4-bit) and the activations (4-bit).

The quantized model is a Keras model where the neural layers (Conv2D, Dense) and the ReLU layers are replaced with custom CNN2SNN quantized layers (QuantizedConv2D, QuantizedDense, QuantizedReLU). All Keras API functions can still be applied on this new model: summary(), compile(), fit(), etc.

Note

The quantize function folds the batch normalization layers into the corresponding neural layer. The new weights are computed according to this folding operation.
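As an illustration of what this folding computes (a minimal sketch of the standard batch-norm folding arithmetic, not the CNN2SNN toolkit's actual implementation), the BN parameters reduce to a per-channel rescaling of the preceding layer's weights and bias:

```python
import numpy as np

def fold_batch_norm(w, b, gamma, beta, mean, var, eps=1e-3):
    """Fold BN parameters into the preceding layer's weights and bias.

    w: kernel with output channels on the last axis; b: per-channel bias.
    BN(conv(x)) then equals conv_folded(x) with the returned parameters.
    """
    scale = gamma / np.sqrt(var + eps)  # per-output-channel factor
    w_folded = w * scale                # broadcasts over the channel axis
    b_folded = (b - mean) * scale + beta
    return w_folded, b_folded

# Toy check with a 3x3 kernel and 2 output channels
w = np.ones((3, 3, 1, 2))
b = np.zeros(2)
wf, bf = fold_batch_norm(w, b,
                         gamma=np.array([2.0, 1.0]),
                         beta=np.array([0.5, -0.5]),
                         mean=np.zeros(2),
                         var=np.ones(2), eps=0.0)
```

Here channel 0's weights are doubled (gamma=2, unit variance) and the BN shift beta becomes the new bias.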

Note

The CNN2SNN toolkit provides the check_model_compatibility function to ensure that the quantized model is compatible with the Akida NSoC. If the model is not fully compatible, substitutes will be needed for the relevant layers/operations (guidelines included in the documentation).

from cnn2snn import quantize, check_model_compatibility

model_quantized = quantize(model_keras,
                           input_weight_quantization=8,
                           weight_quantization=4,
                           activ_quantization=4)
model_quantized.summary()

print("Model compatible for Akida conversion:",
      check_model_compatibility(model_quantized, input_is_sparse=False))

Out:

Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_3 (InputLayer)         [(None, 28, 28, 1)]       0
_________________________________________________________________
conv2d (QuantizedConv2D)     (None, 26, 26, 32)        320
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 13, 13, 32)        0
_________________________________________________________________
re_lu (ActivationDiscreteRel (None, 13, 13, 32)        0
_________________________________________________________________
conv2d_1 (QuantizedConv2D)   (None, 13, 13, 64)        18496
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 7, 7, 64)          0
_________________________________________________________________
re_lu_1 (ActivationDiscreteR (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
dense (QuantizedDense)       (None, 10)                31370
=================================================================
Total params: 50,186
Trainable params: 50,186
Non-trainable params: 0
_________________________________________________________________
Model compatible for Akida conversion: True

Check the quantized model accuracy.

model_quantized.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after 8-4-4 quantization:', score[1])

Out:

Test accuracy after 8-4-4 quantization: 0.989300012588501

Since we used the maximum allowed bitwidths for weights and activations, the accuracy of the quantized model matches that of the base model. At lower bitwidths, however, quantization usually introduces a performance drop.
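To see why lower bitwidths hurt, here is a hypothetical standalone illustration (plain NumPy, unrelated to the CNN2SNN internals) of how the error of a symmetric uniform quantizer grows as the number of bits decreases:

```python
import numpy as np

def quantize_uniform(w, bits):
    """Symmetric uniform quantization of an array."""
    levels = 2 ** (bits - 1) - 1       # e.g. 7 positive levels for 4-bit
    step = np.abs(w).max() / levels    # quantization step size
    return np.round(w / step) * step

rng = np.random.default_rng(0)
w = rng.normal(size=10000)             # stand-in for a layer's weights

errs = {}
for bits in (8, 4, 2):
    errs[bits] = np.abs(w - quantize_uniform(w, bits)).mean()
    print(f"{bits}-bit mean abs error: {errs[bits]:.4f}")
```

Each bit removed roughly doubles the step size, so the mean quantization error grows accordingly, which is why aggressive 2-bit/1-bit settings degrade accuracy without further training.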

Let’s now try 2-bit weights and 1-bit activations.

model_quantized = quantize(model_keras,
                           weight_quantization=2,
                           activ_quantization=1)

model_quantized.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after 2-2-1 quantization:', score[1])

# To recover the original model accuracy, a quantization-aware training phase
# is required.

Out:

Test accuracy after 2-2-1 quantization: 0.38940000534057617

5. Model fine tuning (quantization-aware training)

This quantization-aware training (fine tuning) allows us to recover the performance drop caused by the quantization step.

Note that since this step is only a fine tuning, fewer epochs are needed than when training the standard model from scratch.
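Conceptually (a minimal toy sketch of the common straight-through-estimator scheme, not the CNN2SNN implementation), quantization-aware training keeps float "shadow" weights, quantizes them in the forward pass, and applies the gradient directly to the float weights:

```python
import numpy as np

def quantize(w, bits=4):
    """Symmetric uniform quantization (assumes w is not all zeros)."""
    levels = 2 ** (bits - 1) - 1
    step = np.abs(w).max() / levels
    return np.round(w / step) * step

# Toy linear model trained with quantized weights
rng = np.random.default_rng(42)
x = rng.normal(size=(256, 4))
w_true = np.array([1.0, -0.5, 0.25, 0.0])
y = x @ w_true

w = rng.normal(size=4) * 0.1                     # float shadow weights
loss0 = np.mean((x @ quantize(w) - y) ** 2)      # loss before fine tuning

lr = 0.05
for _ in range(200):
    wq = quantize(w)                             # forward pass uses quantized weights
    grad = 2 * x.T @ (x @ wq - y) / len(x)       # gradient w.r.t. the quantized output
    w -= lr * grad                               # straight-through: update float weights

loss_final = np.mean((x @ quantize(w) - y) ** 2)
```

The float weights drift until their quantized versions fit the data, which mirrors why a few fine-tuning epochs recover most of the accuracy lost at low bitwidths.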

model_quantized.fit(x_train, y_train, epochs=5, validation_split=0.1)

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after fine tuning:', score[1])

Out:

Epoch 1/5

1688/1688 [==============================] - 4s 2ms/step - loss: 0.0861 - accuracy: 0.9718 - val_loss: 0.0667 - val_accuracy: 0.9815
Epoch 2/5
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0459 - accuracy: 0.9849 - val_loss: 0.0659 - val_accuracy: 0.9812
Epoch 3/5
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0381 - accuracy: 0.9881 - val_loss: 0.0545 - val_accuracy: 0.9853
Epoch 4/5
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0393 - accuracy: 0.9871 - val_loss: 0.0669 - val_accuracy: 0.9840
Epoch 5/5
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0374 - accuracy: 0.9877 - val_loss: 0.0523 - val_accuracy: 0.9872
Test accuracy after fine tuning: 0.9840999841690063

6. Model conversion

Once a quantized model with satisfactory performance has been obtained, it can be converted to a format suitable for inference on the Akida NSoC. The convert function returns a model in Akida format, ready to run on the Akida NSoC or in the Akida Execution Engine.

Note

One must supply the coefficients used to rescale the input dataset before training - here, input_scaling.
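
To make the scaling convention concrete, here is a minimal sketch. It assumes input_scaling = (scale, shift) records the rescaling applied before training, i.e. the Keras model saw (raw - shift) / scale; with the x = raw / 255 rescaling from step 1, that gives scale = 255 and shift = 0. The variable names raw, keras_input and akida_input are illustrative, not part of the CNN2SNN API:

```python
import numpy as np

# Hypothetical illustration of the input_scaling bookkeeping.
# With the x = raw / 255 rescaling used for training: scale = 255, shift = 0.
scale, shift = 255.0, 0.0
input_scaling = (scale, shift)

raw = np.array([0, 51, 255], dtype=np.uint8)  # raw 8-bit pixel values

# What the Keras model was trained on: rescaled floats in [0, 1].
keras_input = (raw.astype(np.float32) - shift) / scale

# What the converted Akida model should be fed: the raw 8-bit values,
# unchanged - the rescaling is accounted for internally via input_scaling.
akida_input = raw
```

Passing input_scaling to convert lets the Akida model absorb this rescaling, which is why the raw 8-bit test data can be fed to it directly.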

As with Keras, the summary() method provides a textual representation of the Akida model.

from cnn2snn import convert

# Convert the quantized Keras model, supplying the rescaling coefficients
# that were applied to the input data before training
model_akida = convert(model_quantized, input_scaling=input_scaling)
model_akida.summary()

# Evaluate the Akida model on the raw (unscaled) 8-bit test data
results = model_akida.predict(raw_x_test)
accuracy = (raw_y_test == results).mean()

print('Test accuracy after conversion:', accuracy)

# For non-regression purposes
assert accuracy > 0.97

Out:

                        Model Summary
_____________________________________________________________
Layer (type)                 Output shape  Kernel shape
=============================================================
conv2d (InputConvolutional)  [13, 13, 32]  (3, 3, 1, 32)
_____________________________________________________________
conv2d_1 (Convolutional)     [7, 7, 64]    (3, 3, 32, 64)
_____________________________________________________________
dense (FullyConnected)       [1, 1, 10]    (1, 1, 3136, 10)
_____________________________________________________________
Input shape: 28, 28, 1
Backend type: Software - 1.8.9


Test accuracy after conversion: 0.9838

Depending on the number of samples you run, you should see an accuracy of around 98% (better results can be achieved by training for more epochs).

Total running time of the script: ( 0 minutes 52.518 seconds)

Gallery generated by Sphinx-Gallery