DS-CNN CIFAR10 inference

This tutorial uses the CIFAR-10 dataset (60,000 32x32 color images evenly split across 10 object classes, with 50,000 training and 10,000 test images) for a classic object classification task, with a network built around the Depthwise Separable Convolutional Neural Network (DS-CNN) architecture introduced by Zhang et al. (2018).

The goal of the tutorial is to provide an example of a complex model that can be converted to an Akida model and run on the Akida NSoC with an accuracy similar to that of a standard Keras floating-point model.
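
A depthwise separable convolution factors a standard convolution into a depthwise (per-channel) 3x3 convolution followed by a pointwise 1x1 convolution, which drastically reduces parameters and operations. Below is a minimal Keras sketch of this pattern; the function name and sizes are illustrative, not the exact layers of the model used in this tutorial:

from tensorflow.keras import layers

def separable_block(x, filters):
    # Depthwise 3x3 then pointwise 1x1 convolution, followed by the
    # BatchNorm/ReLU pair used throughout DS-CNN-style architectures
    x = layers.SeparableConv2D(filters, (3, 3), padding='same', use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)
    return x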

1. Dataset preparation

from tensorflow.keras.datasets import cifar10

# Load CIFAR10 dataset
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Reshape x-data
x_train = x_train.reshape(50000, 32, 32, 3)
x_test = x_test.reshape(10000, 32, 32, 3)
input_shape = (32, 32, 3)

# Set aside raw test data for use with the Akida Execution Engine later
# (Akida models take integer inputs; see the conversion in section 5)
raw_x_test = x_test.astype('uint8')

# Rescale x-data; the same scaling (a, b) is passed to the Akida
# converter in section 5, i.e. scaled_input = (raw_input - b) / a
a = 255
b = 0

x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train = (x_train - b) / a
x_test = (x_test - b) / a

Out:

Downloading data from https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz

170500096/170498071 [==============================] - 10s 0us/step

2. Create a Keras DS-CNN model

The DS-CNN architecture is available in the Akida models zoo along with pretrained weights.

Note

The pre-trained weights were obtained after training the model with unconstrained float weights and activations for 1000 epochs.

from tensorflow.keras.utils import get_file
from tensorflow.keras.models import load_model

# Retrieve the float model with pretrained weights and load it
model_file = get_file(
    "ds_cnn_cifar10.h5",
    "http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10.h5",
    cache_subdir='models/ds_cnn_cifar10')
model_keras = load_model(model_file)
model_keras.summary()

Out:

Downloading data from http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10.h5

10838016/10836232 [==============================] - 3s 0us/step
Model: "ds_cnn_cifar10"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 32, 32, 3)]       0
_________________________________________________________________
conv_0 (Conv2D)              (None, 32, 32, 128)       3456
_________________________________________________________________
conv_0_BN (BatchNormalizatio (None, 32, 32, 128)       512
_________________________________________________________________
conv_0_relu (ReLU)           (None, 32, 32, 128)       0
_________________________________________________________________
separable_1 (SeparableConv2D (None, 32, 32, 128)       17536
_________________________________________________________________
separable_1_BN (BatchNormali (None, 32, 32, 128)       512
_________________________________________________________________
separable_1_relu (ReLU)      (None, 32, 32, 128)       0
_________________________________________________________________
separable_2 (SeparableConv2D (None, 32, 32, 256)       33920
_________________________________________________________________
separable_2_BN (BatchNormali (None, 32, 32, 256)       1024
_________________________________________________________________
separable_2_relu (ReLU)      (None, 32, 32, 256)       0
_________________________________________________________________
separable_3 (SeparableConv2D (None, 32, 32, 256)       67840
_________________________________________________________________
separable_3_maxpool (MaxPool (None, 16, 16, 256)       0
_________________________________________________________________
separable_3_BN (BatchNormali (None, 16, 16, 256)       1024
_________________________________________________________________
separable_3_relu (ReLU)      (None, 16, 16, 256)       0
_________________________________________________________________
separable_4 (SeparableConv2D (None, 16, 16, 512)       133376
_________________________________________________________________
separable_4_BN (BatchNormali (None, 16, 16, 512)       2048
_________________________________________________________________
separable_4_relu (ReLU)      (None, 16, 16, 512)       0
_________________________________________________________________
separable_5 (SeparableConv2D (None, 16, 16, 512)       266752
_________________________________________________________________
separable_5_maxpool (MaxPool (None, 8, 8, 512)         0
_________________________________________________________________
separable_5_BN (BatchNormali (None, 8, 8, 512)         2048
_________________________________________________________________
separable_5_relu (ReLU)      (None, 8, 8, 512)         0
_________________________________________________________________
separable_6 (SeparableConv2D (None, 8, 8, 512)         266752
_________________________________________________________________
separable_6_BN (BatchNormali (None, 8, 8, 512)         2048
_________________________________________________________________
separable_6_relu (ReLU)      (None, 8, 8, 512)         0
_________________________________________________________________
separable_7 (SeparableConv2D (None, 8, 8, 512)         266752
_________________________________________________________________
separable_7_maxpool (MaxPool (None, 4, 4, 512)         0
_________________________________________________________________
separable_7_BN (BatchNormali (None, 4, 4, 512)         2048
_________________________________________________________________
separable_7_relu (ReLU)      (None, 4, 4, 512)         0
_________________________________________________________________
separable_8 (SeparableConv2D (None, 4, 4, 1024)        528896
_________________________________________________________________
separable_8_BN (BatchNormali (None, 4, 4, 1024)        4096
_________________________________________________________________
separable_8_relu (ReLU)      (None, 4, 4, 1024)        0
_________________________________________________________________
separable_9 (SeparableConv2D (None, 4, 4, 1024)        1057792
_________________________________________________________________
separable_9_BN (BatchNormali (None, 4, 4, 1024)        4096
_________________________________________________________________
separable_9_relu (ReLU)      (None, 4, 4, 1024)        0
_________________________________________________________________
separable_10 (SeparableConv2 (None, 4, 4, 10)          19456
_________________________________________________________________
separable_10_global_avg (Glo (None, 10)                0
=================================================================
Total params: 2,681,984
Trainable params: 2,672,256
Non-trainable params: 9,728
_________________________________________________________________

Keras model accuracy is checked against the first n images of the test set.

The table below summarizes the expected results:

#Images    Accuracy
100        96.00 %
1000       94.30 %
10000      93.60 %

Note

Depending on your hardware setup, the processing time may vary.

import numpy as np

from sklearn.metrics import accuracy_score
from timeit import default_timer as timer


# Check model performance on the first num_images of the test set
# (labels come from the y_test array loaded in section 1)
def check_model_performances(model, x_test, num_images=1000):
    start = timer()
    potentials_keras = model.predict(x_test[:num_images])
    end = timer()

    preds_keras = np.squeeze(np.argmax(potentials_keras, 1))
    accuracy = accuracy_score(y_test[:num_images], preds_keras)
    print(f"Accuracy: {100 * accuracy:.2f}%")
    print(f'Keras inference on {num_images} images took {end - start:.2f} s.\n')


check_model_performances(model_keras, x_test)

Out:

Accuracy: 94.30%
Keras inference on 1000 images took 2.19 s.

3. Quantized model

Quantizing a model is done using cnn2snn.quantize. After the call, all layers have 4-bit weights and 4-bit activations.

This model therefore satisfies the Akida NSoC requirements, but suffers a drop in accuracy due to quantization, as shown in the table below:

#Images    Float accuracy    Quantized accuracy
100        96.00 %           96.00 %
1000       94.30 %           92.60 %
10000      93.66 %           92.58 %

from cnn2snn import quantize

# Quantize the model to 4-bit weights and activations
model_keras_quantized = quantize(model_keras, 4, 4)

# Check Model performance
check_model_performances(model_keras_quantized, x_test)

Out:

Accuracy: 92.60%
Keras inference on 1000 images took 0.88 s.

4. Pretrained quantized model

The Akida models zoo also contains a helper that loads a pretrained quantized model, obtained by running the tune action of the akida_models CLI on the quantized model for 100 epochs.

Tuning the model, that is, training it again with a lowered learning rate, recovers performance up to the initial floating-point accuracy (see the sketch after the table below).

#Images    Float accuracy    Quantized accuracy    After tuning
100        96.00 %           96.00 %               97.00 %
1000       94.30 %           92.60 %               94.20 %
10000      93.66 %           92.58 %               93.08 %
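
For reference, "tuning" boils down to continuing the training of the quantized model with a small learning rate. A minimal Keras sketch of the idea follows; the optimizer, learning rate and epoch count are illustrative assumptions, not the settings used to produce the pretrained weights, and from_logits=True reflects that the model outputs raw potentials rather than softmax probabilities.

import tensorflow as tf

# Sketch only: fine-tune the quantized model with a lowered learning rate
# (hyper-parameters are illustrative assumptions)
model_keras_quantized.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'])
model_keras_quantized.fit(x_train, y_train, epochs=5,
                          validation_data=(x_test, y_test))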

from akida_models import ds_cnn_cifar10_pretrained

# Use a quantized model with pretrained quantized weights
model_keras_quantized_pretrained = ds_cnn_cifar10_pretrained()

# Check Model performance
check_model_performances(model_keras_quantized_pretrained, x_test)

Out:

Downloading data from http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10_iq4_wq4_aq4.h5

10747904/10741016 [==============================] - 3s 0us/step
Accuracy: 94.20%
Keras inference on 1000 images took 0.81 s.

5. Conversion to Akida

5.1 Convert to Akida model

When converting to an Akida model, we just pass the Keras model and the input scaling used during training to cnn2snn.convert. Here, input_scaling=(a, b) = (255, 0) tells the converter that the raw uint8 inputs map to the training domain through (raw_input - b) / a, matching the rescaling applied in section 1.

from cnn2snn import convert

model_akida = convert(model_keras_quantized_pretrained, input_scaling=(a, b))

5.2 Check hardware compliancy

The Model.summary method provides a detailed description of the model's layers.

It also indicates hardware incompatibilities, if there are any. Hardware compatibility can also be checked programmatically using model_hardware_incompatibilities, as sketched after the summary below.

model_akida.summary()

Out:

                                       Model Summary
___________________________________________________________________________________________
Layer (type)                           Output shape   Kernel shape
===========================================================================================
conv_0 (InputConvolutional)            [32, 32, 128]  (3, 3, 3, 128)
___________________________________________________________________________________________
separable_1 (SeparableConvolutional)   [32, 32, 128]  (3, 3, 128, 1), (1, 1, 128, 128)
___________________________________________________________________________________________
separable_2 (SeparableConvolutional)   [32, 32, 256]  (3, 3, 128, 1), (1, 1, 128, 256)
___________________________________________________________________________________________
separable_3 (SeparableConvolutional)   [16, 16, 256]  (3, 3, 256, 1), (1, 1, 256, 256)
___________________________________________________________________________________________
separable_4 (SeparableConvolutional)   [16, 16, 512]  (3, 3, 256, 1), (1, 1, 256, 512)
___________________________________________________________________________________________
separable_5 (SeparableConvolutional)   [8, 8, 512]    (3, 3, 512, 1), (1, 1, 512, 512)
___________________________________________________________________________________________
separable_6 (SeparableConvolutional)   [8, 8, 512]    (3, 3, 512, 1), (1, 1, 512, 512)
___________________________________________________________________________________________
separable_7 (SeparableConvolutional)   [4, 4, 512]    (3, 3, 512, 1), (1, 1, 512, 512)
___________________________________________________________________________________________
separable_8 (SeparableConvolutional)   [4, 4, 1024]   (3, 3, 512, 1), (1, 1, 512, 1024)
___________________________________________________________________________________________
separable_9 (SeparableConvolutional)   [4, 4, 1024]   (3, 3, 1024, 1), (1, 1, 1024, 1024)
___________________________________________________________________________________________
separable_10 (SeparableConvolutional)  [1, 1, 10]     (3, 3, 1024, 1), (1, 1, 1024, 10)
___________________________________________________________________________________________
Input shape: 32, 32, 3
Backend type: Software - 1.8.10
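
The summary above reports no incompatibilities. A programmatic check could look like the sketch below; the import path is an assumption based on Akida 1.8-era releases and may differ in your installed version, so refer to the API documentation of your installation.

# Sketch only: manual hardware-compatibility check
# (import path assumed; check your Akida version's API reference)
from akida.compatibility import model_hardware_incompatibilities

incompatibilities = model_hardware_incompatibilities(model_akida)
if not incompatibilities:
    print("Model is hardware compatible.")
else:
    for message in incompatibilities:
        print(message)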

5.3 Check performance

We check the Akida model accuracy on the first n images of the test set.

The table below summarizes the expected results:

#Images    Keras accuracy    Akida accuracy
100        96.00 %           97.00 %
1000       94.30 %           94.00 %
10000      93.66 %           93.04 %

Due to the conversion process, predictions may differ slightly between the original Keras model and the Akida model on some specific images.

This explains why, when testing on a limited number of images, the accuracy figures for Keras and Akida can differ noticeably. On the full test set, however, the two models' accuracies are very close.

num_images = 1000

# Check Model performance
start = timer()
results = model_akida.predict(raw_x_test[:num_images])
end = timer()

accuracy = accuracy_score(y_test[:num_images], results)
print(f"Accuracy: {100 * accuracy:.2f}%")
print(f'Akida inference on {num_images} images took {end - start:.2f} s.\n')

# For non-regression purposes
if num_images == 1000:
    assert accuracy == 0.94

Out:

Accuracy: 94.00%
Akida inference on 1000 images took 20.22 s.
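
To see exactly where the two models disagree, per-image predictions can be compared directly. A minimal sketch, assuming model_akida.predict returns class indices as in the code above:

# Sketch only: list test images where Keras and Akida disagree
potentials_keras = model_keras_quantized_pretrained.predict(x_test[:num_images])
preds_keras = np.argmax(potentials_keras, axis=1)
preds_akida = np.squeeze(model_akida.predict(raw_x_test[:num_images]))

mismatches = np.nonzero(preds_keras != preds_akida)[0]
print(f"{len(mismatches)} of {num_images} images are classified differently.")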

Activation sparsity has a great impact on Akida inference time. The average input and output sparsity of each layer can be inspected using Model.get_statistics(). For convenience, it is called here on a subset of the dataset.

# Print model statistics
print("Model statistics")
stats = model_akida.get_statistics()
# Run inference on a small subset so that the statistics are populated
model_akida.predict(raw_x_test[:20])
for _, stat in stats.items():
    print(stat)

Out:

Model statistics
Layer (type)                  output sparsity
conv_0 (InputConvolutional)   0.59
Layer (type)                  input sparsity      output sparsity     ops
separable_1 (SeparableConvolu 0.59                0.52                62696438
Layer (type)                  input sparsity      output sparsity     ops
separable_2 (SeparableConvolu 0.52                0.55                146144939
Layer (type)                  input sparsity      output sparsity     ops
separable_3 (SeparableConvolu 0.55                0.61                273823580
Layer (type)                  input sparsity      output sparsity     ops
separable_4 (SeparableConvolu 0.61                0.65                119144917
Layer (type)                  input sparsity      output sparsity     ops
separable_5 (SeparableConvolu 0.65                0.69                212892409
Layer (type)                  input sparsity      output sparsity     ops
separable_6 (SeparableConvolu 0.69                0.68                46301354
Layer (type)                  input sparsity      output sparsity     ops
separable_7 (SeparableConvolu 0.68                0.74                49046160
Layer (type)                  input sparsity      output sparsity     ops
separable_8 (SeparableConvolu 0.74                0.84                19555155
Layer (type)                  input sparsity      output sparsity     ops
separable_9 (SeparableConvolu 0.84                0.83                24714698
Layer (type)                  input sparsity      output sparsity     ops
separable_10 (SeparableConvol 0.83                0.00                269800

5.4 Show predictions for a random image

import matplotlib.pyplot as plt

label_names = [
    'airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse',
    'ship', 'truck'
]

# Prepare plot
barWidth = 0.75

fig = plt.figure(num='CIFAR10 Classification by Akida Execution Engine',
                 figsize=(8, 4))
ax0 = plt.subplot(1, 3, 1)
imgobj = ax0.imshow(np.zeros((32, 32, 3), dtype=np.uint8))
ax0.set_axis_off()
# Results subplots
ax1 = plt.subplot(1, 2, 2)
ax1.xaxis.set_visible(False)
ax0.text(0, 34, 'Actual class:')
actual_class = ax0.text(16, 34, 'None')
ax0.text(0, 37, 'Predicted class:')
predicted_class = ax0.text(20, 37, 'None')

# Take a random test image
i = np.random.randint(y_test.shape[0])

true_idx = int(y_test[i])
pot = model_akida.evaluate(np.expand_dims(raw_x_test[i], axis=0)).squeeze()

rpot = np.arange(len(pot))
ax1.barh(rpot, pot, height=barWidth)
ax1.set_yticks(rpot - 0.07 * barWidth)
ax1.set_yticklabels(label_names)
predicted_idx = pot.argmax()
imgobj.set_data(raw_x_test[i])
if predicted_idx == true_idx:
    ax1.get_children()[predicted_idx].set_color('g')
else:
    ax1.get_children()[predicted_idx].set_color('r')
actual_class.set_text(label_names[true_idx])
predicted_class.set_text(label_names[predicted_idx])
ax1.set_title('Akida\'s predictions')
plt.show()

Total running time of the script: ( 0 minutes 47.656 seconds)
