CNN conversion flow tutorial

This tutorial illustrates, in a few steps, how to use the CNN2SNN toolkit to convert a CNN into an SNN compatible with the Akida NSoC. You can refer to our CNN2SNN toolkit user guide for further explanation.

The CNN2SNN tool is based on Keras, TensorFlow's high-level API for building and training deep learning models.

Note

Please refer to the TensorFlow tf.keras.models module for details on model creation and import, and to the TensorFlow Guide for details on how TensorFlow works.

The MNIST example below is light enough that you do not need a GPU to run the CNN2SNN tool.

[Figure: CNN2SNN conversion flow]

1. Load and reshape MNIST dataset

After loading, we apply two transformations to the dataset:

  1. Reshape the sample data (x values) into a num_samples x width x height x channels tensor.

Note

At this point, we’ll set aside the raw data for testing our converted model in the Akida Execution Engine later.

  2. Rescale the loaded 8-bit data to the range 0 to 1 for training.

Note

Input data normalization is a common step when dealing with CNNs (the rationale is to keep the data in a range that works well with the selected optimizer).

This rescaling makes almost no difference in the current example, but for some datasets rescaling the absolute values (and also shifting to zero mean) can make a major difference to training.

Also note that we store the scaling values in input_scaling for use when preparing the model for the Akida Execution Engine. The Akida implementation allows us to skip the rescaling step entirely (i.e. the Akida model should be fed with the raw 8-bit values), but that does require knowing what scaling was applied prior to training - see below for more details.

import tensorflow as tf
from tensorflow import keras

# Load MNIST dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Reshape x-data
x_train = x_train.reshape(60000, 28, 28, 1)
x_test = x_test.reshape(10000, 28, 28, 1)

# Set aside raw test data for use with Akida Execution Engine later
raw_x_test = x_test.astype('uint8')
raw_y_test = y_test

# Rescale x-data
a = 255
b = 0
input_scaling = (a, b)
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train = (x_train - b) / a
x_test = (x_test - b) / a
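
As an optional sanity check (this snippet is not part of the original tutorial), the stored (a, b) scaling values can be inverted to recover the raw 8-bit samples that will later be fed directly to the Akida Execution Engine:

import numpy as np

# Hypothetical sanity check: undo the rescaling and compare with the raw data.
a, b = input_scaling
reconstructed = np.round(x_test * a + b).astype('uint8')
assert np.array_equal(reconstructed, raw_x_test)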

2. Model definition

Note that at this stage there is nothing specific to the Akida NSoC: the starting point is a completely standard CNN defined with Keras.

An appropriate model for MNIST (inspired by a standard Keras example) might look something like the following:

model_keras = keras.models.Sequential([
    keras.layers.Conv2D(filters=32, kernel_size=3, input_shape=(28, 28, 1)),
    keras.layers.MaxPool2D(),
    keras.layers.BatchNormalization(),
    keras.layers.ReLU(),
    keras.layers.SeparableConv2D(filters=64, kernel_size=3, padding='same'),
    keras.layers.MaxPool2D(padding='same'),
    keras.layers.BatchNormalization(),
    keras.layers.ReLU(),
    keras.layers.Flatten(),
    keras.layers.Dense(10)
], 'mnistnet')

model_keras.summary()

Out:

Model: "mnistnet"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 26, 26, 32)        320
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 13, 13, 32)        0
_________________________________________________________________
batch_normalization (BatchNo (None, 13, 13, 32)        128
_________________________________________________________________
re_lu (ReLU)                 (None, 13, 13, 32)        0
_________________________________________________________________
separable_conv2d (SeparableC (None, 13, 13, 64)        2400
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 7, 7, 64)          0
_________________________________________________________________
batch_normalization_1 (Batch (None, 7, 7, 64)          256
_________________________________________________________________
re_lu_1 (ReLU)               (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
dense (Dense)                (None, 10)                31370
=================================================================
Total params: 34,474
Trainable params: 34,282
Non-trainable params: 192
_________________________________________________________________

The model defined above is compatible with conversion into an Akida model, i.e. it doesn’t include any layers or operations that aren’t Akida-compatible (please refer to the CNN2SNN toolkit documentation for full details):

  • Standard Conv2D and Dense layers are supported.

  • Hidden layers must be followed by a ReLU layer.

  • BatchNormalization must always happen before activations.

  • Convolutional blocks can optionally be followed by a MaxPooling layer.

The CNN2SNN toolkit provides the check_model_compatibility function to ensure that a model can be converted into an Akida model. If the model is not fully compatible, the offending layers/operations must be replaced with suitable substitutes (guidelines are included in the documentation).

from cnn2snn import check_model_compatibility

print("Model compatible for Akida conversion:",
      check_model_compatibility(model_keras))

Out:

Model compatible for Akida conversion: True

3. Model training

Before going any further, let's train the model and check its performance. After 10 epochs, the model should reach a test accuracy close to 99%.

model_keras.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

model_keras.fit(x_train, y_train, epochs=10, validation_split=0.1)

score = model_keras.evaluate(x_test, y_test, verbose=0)
print('Test score:', score[0])
print('Test accuracy:', score[1])

Out:

Epoch 1/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.1143 - accuracy: 0.9647 - val_loss: 0.0426 - val_accuracy: 0.9880
Epoch 2/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0481 - accuracy: 0.9849 - val_loss: 0.0564 - val_accuracy: 0.9847
Epoch 3/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0351 - accuracy: 0.9889 - val_loss: 0.0517 - val_accuracy: 0.9852
Epoch 4/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0249 - accuracy: 0.9915 - val_loss: 0.0616 - val_accuracy: 0.9850
Epoch 5/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0187 - accuracy: 0.9938 - val_loss: 0.0661 - val_accuracy: 0.9835
Epoch 6/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0166 - accuracy: 0.9944 - val_loss: 0.0449 - val_accuracy: 0.9887
Epoch 7/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0124 - accuracy: 0.9958 - val_loss: 0.0589 - val_accuracy: 0.9870
Epoch 8/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0115 - accuracy: 0.9963 - val_loss: 0.0501 - val_accuracy: 0.9893
Epoch 9/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0093 - accuracy: 0.9968 - val_loss: 0.0605 - val_accuracy: 0.9860
Epoch 10/10
1688/1688 [==============================] - 3s 2ms/step - loss: 0.0089 - accuracy: 0.9970 - val_loss: 0.0555 - val_accuracy: 0.9877
Test score: 0.04424111545085907
Test accuracy: 0.9882000088691711

4. Model quantization

We can now turn to quantization to get a discretized version of the model, where the weights and activations are quantized so as to be suitable for implementation in the Akida NSoC.

For this, we just have to quantize the Keras model using the quantize function. Here, we decide to quantize to the maximum allowed bitwidths for the first layer weights (8-bit), the subsequent layer weights (4-bit) and the activations (4-bit).

The quantized model is a Keras model where the neural layers (Conv2D, Dense) and the ReLU layers are replaced with custom CNN2SNN quantized layers (QuantizedConv2D, QuantizedDense, QuantizedReLU). All Keras API functions can be applied to this new model: summary(), compile(), fit(), etc.

Note

The quantize function folds the batch normalization layers into the corresponding neural layer. The new weights are computed according to this folding operation.
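
For reference, batch normalization folding follows the standard arithmetic sketched below (using Keras' default epsilon of 1e-3); this is a hedged illustration of the folding equations, not the toolkit's actual implementation:

import numpy as np

def fold_batch_norm(w, b, gamma, beta, moving_mean, moving_var, eps=1e-3):
    # y = gamma * (layer(x, w) + b - mean) / sqrt(var + eps) + beta
    # is equivalent to layer(x, w_folded) + b_folded with:
    scale = gamma / np.sqrt(moving_var + eps)
    w_folded = w * scale                    # scale broadcasts over the last (output-channel) axis
    b_folded = (b - moving_mean) * scale + beta
    return w_folded, b_folded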

from cnn2snn import quantize

model_quantized = quantize(model_keras,
                           input_weight_quantization=8,
                           weight_quantization=4,
                           activ_quantization=4)
model_quantized.summary()

Out:

Model: "mnistnet"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_10 (InputLayer)        [(None, 28, 28, 1)]       0
_________________________________________________________________
conv2d (QuantizedConv2D)     (None, 26, 26, 32)        320
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 13, 13, 32)        0
_________________________________________________________________
re_lu (ActivationDiscreteRel (None, 13, 13, 32)        0
_________________________________________________________________
separable_conv2d (QuantizedS (None, 13, 13, 64)        2400
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 7, 7, 64)          0
_________________________________________________________________
re_lu_1 (ActivationDiscreteR (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
dense (QuantizedDense)       (None, 10)                31370
=================================================================
Total params: 34,090
Trainable params: 34,090
Non-trainable params: 0
_________________________________________________________________
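
Since the quantized model is still an ordinary Keras model, its layers can also be inspected programmatically. A quick, optional check (a sketch using only standard Keras attributes) lists the substituted layer types:

for layer in model_quantized.layers:
    # Print each layer's name together with its (possibly quantized) class name
    print(f'{layer.name:20s} {type(layer).__name__}')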

Check the quantized model accuracy.

model_quantized.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after 8-4-4 quantization:', score[1])

Out:

Test accuracy after 8-4-4 quantization: 0.9876999855041504

Since we used the maximum allowed bitwidths for weights and activations, the accuracy of the quantized model is equivalent to that of the base model. At lower bitwidths, however, quantization usually introduces a performance drop.

Let’s try quantizing specific layers to a lower bitwidth. The CNN2SNN toolkit provides the quantize_layer function, which allows each layer to be quantized individually.

Here, we quantize the “re_lu_1” layer to binary activations (bitwidth=1) and the “dense” layer with 2-bit weights.

from cnn2snn import quantize_layer

model_quantized = quantize_layer(model_quantized, "re_lu_1", bitwidth=1)
model_quantized = quantize_layer(model_quantized, "dense", bitwidth=2)

model_quantized.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer='adam',
    metrics=['accuracy'])

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after low bitwidth quantization:', score[1])

# To recover the original model accuracy, a quantization-aware training phase
# is required.

Out:

Test accuracy after low bitwidth quantization: 0.6855999827384949

5. Model fine tuning (quantization-aware training)

This quantization-aware training (fine tuning) allows us to recover the performance drop caused by the quantization step.

Note that since this step is only a fine tuning, the number of epochs can be reduced compared to training the standard model from scratch (an optional lower-learning-rate variant is sketched after the training output below).

model_quantized.fit(x_train, y_train, epochs=5, validation_split=0.1)

score = model_quantized.evaluate(x_test, y_test, verbose=0)
print('Test accuracy after fine tuning:', score[1])

Out:

Epoch 1/5

1688/1688 [==============================] - 4s 2ms/step - loss: 0.0550 - accuracy: 0.9814 - val_loss: 0.0567 - val_accuracy: 0.9857
Epoch 2/5

1688/1688 [==============================] - 4s 2ms/step - loss: 0.0268 - accuracy: 0.9910 - val_loss: 0.0623 - val_accuracy: 0.9843
Epoch 3/5

1688/1688 [==============================] - 4s 2ms/step - loss: 0.0212 - accuracy: 0.9925 - val_loss: 0.0492 - val_accuracy: 0.9873
Epoch 4/5

1688/1688 [==============================] - 4s 2ms/step - loss: 0.0218 - accuracy: 0.9922 - val_loss: 0.0674 - val_accuracy: 0.9853
Epoch 5/5

1688/1688 [==============================] - 4s 2ms/step - loss: 0.0213 - accuracy: 0.9929 - val_loss: 0.0541 - val_accuracy: 0.9880
Test accuracy after fine tuning: 0.9855999946594238
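
If the fine-tuned accuracy still lagged behind the floating-point baseline, a common refinement, only sketched here and not part of the run above, is to fine tune again with a smaller learning rate:

# Hedged variant, not part of the original run: a lower learning rate often
# stabilises quantization-aware fine tuning when an accuracy gap persists.
model_quantized.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer=keras.optimizers.Adam(learning_rate=1e-4),
    metrics=['accuracy'])
model_quantized.fit(x_train, y_train, epochs=2, validation_split=0.1)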

6. Model conversion

Having obtained a quantized model with satisfactory performance, we can now convert it to a model suitable for use in the Akida NSoC in inference mode. The convert function returns a model in Akida format, ready for the Akida NSoC or the Akida Execution Engine.

Note

One needs to supply the coefficients that were used to rescale the input dataset before training - here, input_scaling.

As with Keras, the summary() method provides a textual representation of the Akida model.

from cnn2snn import convert

model_akida = convert(model_quantized, input_scaling=input_scaling)
model_akida.summary()

results = model_akida.predict(raw_x_test)
accuracy = (raw_y_test == results).mean()

print('Test accuracy after conversion:', accuracy)

# For non-regression purpose
assert accuracy > 0.97

Out:

                                     Model Summary
________________________________________________________________________________________
Layer (type)                               Output shape  Kernel shape
========================================================================================
conv2d (InputConvolutional)                [13, 13, 32]  (3, 3, 1, 32)
________________________________________________________________________________________
separable_conv2d (SeparableConvolutional)  [7, 7, 64]    (3, 3, 32, 1), (1, 1, 32, 64)
________________________________________________________________________________________
dense (FullyConnected)                     [1, 1, 10]    (1, 1, 3136, 10)
________________________________________________________________________________________
Input shape: 28, 28, 1
Backend type: Software - 1.8.10


Test accuracy after conversion: 0.9857

Depending on the number of samples you run, you should find an accuracy of around 98% (better results can be achieved by training for more epochs).
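
As an optional sanity check (not part of the original tutorial), predictions from the quantized Keras model and the converted Akida model can be compared on a few samples. The exact output shape of model_akida.predict may vary with the Akida version, hence the squeeze:

import numpy as np

# Compare a handful of predictions between the two models.
n = 10
keras_labels = np.argmax(model_quantized.predict(x_test[:n]), axis=1)
akida_labels = np.squeeze(model_akida.predict(raw_x_test[:n]))
print('Keras labels:', keras_labels)
print('Akida labels:', akida_labels)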

Total running time of the script: ( 0 minutes 55.646 seconds)
