DS-CNN CIFAR10 inference

This tutorial uses the CIFAR-10 dataset (60,000 32x32 color images in 10 object classes) for a classic object classification task, with a network built on the Depthwise Separable Convolutional Neural Network (DS-CNN) architecture introduced by Zhang et al. (2018).

The goal of this tutorial is to provide an example of a complex model that can be converted to an Akida model and run on the Akida NSoC with an accuracy similar to that of a standard Keras floating-point model.

1. Dataset preparation

from tensorflow.keras.datasets import cifar10

# Load CIFAR10 dataset
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Reshape x-data
x_train = x_train.reshape(50000, 32, 32, 3)
x_test = x_test.reshape(10000, 32, 32, 3)
input_shape = (32, 32, 3)

# Set aside raw test data for use with Akida Execution Engine later
raw_x_test = x_test.astype('uint8')

# Rescale x-data
a = 255
b = 0

x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train = (x_train - b) / a
x_test = (x_test - b) / a
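
The raw and rescaled arrays are linked by the affine relation x_float = (x_raw - b) / a; the same (a, b) pair is passed to cnn2snn.convert in section 5.1 so that the Akida model can consume the raw uint8 images directly. As a quick illustrative sanity check (not part of the original script):

import numpy as np

# Check the scaling relation between the raw uint8 images kept for Akida
# and the float images used by Keras: x_float = (x_raw - b) / a
assert np.allclose(x_test, (raw_x_test.astype('float32') - b) / a)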

Out:

Downloading data from https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz

170500096/170498071 [==============================] - 13s 0us/step

2. Create a Keras DS-CNN model

The DS-CNN architecture is available in the Akida models zoo along with pretrained weights.

Note

The pre-trained weights were obtained by training the model with unconstrained float weights and activations for 1000 epochs.

from tensorflow.keras.utils import get_file
from tensorflow.keras.models import load_model

# Retrieve the float model with pretrained weights and load it
model_file = get_file(
    "ds_cnn_cifar10.h5",
    "http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10.h5",
    cache_subdir='models/ds_cnn_cifar10')
model_keras = load_model(model_file)
model_keras.summary()

Out:

Downloading data from http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10.h5

10838016/10836232 [==============================] - 3s 0us/step
Model: "ds_cnn_cifar10"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 32, 32, 3)]       0
_________________________________________________________________
conv_0 (Conv2D)              (None, 32, 32, 128)       3456
_________________________________________________________________
conv_0_BN (BatchNormalizatio (None, 32, 32, 128)       512
_________________________________________________________________
conv_0_relu (ReLU)           (None, 32, 32, 128)       0
_________________________________________________________________
separable_1 (SeparableConv2D (None, 32, 32, 128)       17536
_________________________________________________________________
separable_1_BN (BatchNormali (None, 32, 32, 128)       512
_________________________________________________________________
separable_1_relu (ReLU)      (None, 32, 32, 128)       0
_________________________________________________________________
separable_2 (SeparableConv2D (None, 32, 32, 256)       33920
_________________________________________________________________
separable_2_BN (BatchNormali (None, 32, 32, 256)       1024
_________________________________________________________________
separable_2_relu (ReLU)      (None, 32, 32, 256)       0
_________________________________________________________________
separable_3 (SeparableConv2D (None, 32, 32, 256)       67840
_________________________________________________________________
separable_3_maxpool (MaxPool (None, 16, 16, 256)       0
_________________________________________________________________
separable_3_BN (BatchNormali (None, 16, 16, 256)       1024
_________________________________________________________________
separable_3_relu (ReLU)      (None, 16, 16, 256)       0
_________________________________________________________________
separable_4 (SeparableConv2D (None, 16, 16, 512)       133376
_________________________________________________________________
separable_4_BN (BatchNormali (None, 16, 16, 512)       2048
_________________________________________________________________
separable_4_relu (ReLU)      (None, 16, 16, 512)       0
_________________________________________________________________
separable_5 (SeparableConv2D (None, 16, 16, 512)       266752
_________________________________________________________________
separable_5_maxpool (MaxPool (None, 8, 8, 512)         0
_________________________________________________________________
separable_5_BN (BatchNormali (None, 8, 8, 512)         2048
_________________________________________________________________
separable_5_relu (ReLU)      (None, 8, 8, 512)         0
_________________________________________________________________
separable_6 (SeparableConv2D (None, 8, 8, 512)         266752
_________________________________________________________________
separable_6_BN (BatchNormali (None, 8, 8, 512)         2048
_________________________________________________________________
separable_6_relu (ReLU)      (None, 8, 8, 512)         0
_________________________________________________________________
separable_7 (SeparableConv2D (None, 8, 8, 512)         266752
_________________________________________________________________
separable_7_maxpool (MaxPool (None, 4, 4, 512)         0
_________________________________________________________________
separable_7_BN (BatchNormali (None, 4, 4, 512)         2048
_________________________________________________________________
separable_7_relu (ReLU)      (None, 4, 4, 512)         0
_________________________________________________________________
separable_8 (SeparableConv2D (None, 4, 4, 1024)        528896
_________________________________________________________________
separable_8_BN (BatchNormali (None, 4, 4, 1024)        4096
_________________________________________________________________
separable_8_relu (ReLU)      (None, 4, 4, 1024)        0
_________________________________________________________________
separable_9 (SeparableConv2D (None, 4, 4, 1024)        1057792
_________________________________________________________________
separable_9_BN (BatchNormali (None, 4, 4, 1024)        4096
_________________________________________________________________
separable_9_relu (ReLU)      (None, 4, 4, 1024)        0
_________________________________________________________________
separable_10 (SeparableConv2 (None, 4, 4, 10)          19456
_________________________________________________________________
separable_10_global_avg (Glo (None, 10)                0
=================================================================
Total params: 2,681,984
Trainable params: 2,672,256
Non-trainable params: 9,728
_________________________________________________________________

Keras model accuracy is checked against the first n images of the test set.

The table below summarizes the expected results:

#Images    Accuracy
100        96.00 %
1000       94.30 %
10000      93.60 %

Note

Depending on your hardware setup, the processing time may vary.

import numpy as np

from sklearn.metrics import accuracy_score
from timeit import default_timer as timer


# Check Model performance
def check_model_performances(model, x_test, num_images=1000):
    start = timer()
    potentials_keras = model.predict(x_test[:num_images])
    preds_keras = np.squeeze(np.argmax(potentials_keras, 1))

    accuracy = accuracy_score(y_test[:num_images], preds_keras)
    print("Accuracy: " + "{0:.2f}".format(100 * accuracy) + "%")
    end = timer()
    print(f'Keras inference on {num_images} images took {end-start:.2f} s.\n')


check_model_performances(model_keras, x_test)

Out:

Accuracy: 94.30%
Keras inference on 1000 images took 2.48 s.
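
The other rows of the expected-results table can be reproduced by simply changing num_images (the 10000-image run is naturally slower):

# Evaluate on other test subset sizes to reproduce the table above
check_model_performances(model_keras, x_test, num_images=100)
check_model_performances(model_keras, x_test, num_images=10000)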

3. Quantized model

Quantizing a model is done using cnn2snn.quantize. After the call, all the layers will have 4-bit weights and 4-bit activations.

This model will therefore satisfy the Akida NSoC requirements but will suffer from a drop in accuracy due to quantization as shown in the table below:

#Images    Float accuracy    Quantized accuracy
100        96.00 %           96.00 %
1000       94.30 %           92.60 %
10000      93.66 %           92.58 %

from cnn2snn import quantize

# Quantize the model to 4-bit weights and activations
model_keras_quantized = quantize(model_keras, 4, 4)

# Check Model performance
check_model_performances(model_keras_quantized, x_test)

Out:

Accuracy: 92.60%
Keras inference on 1000 images took 0.77 s.
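
The object returned by quantize is a regular Keras model with quantized weights and activations, so it can be evaluated, saved, or retrained like any other. If your cnn2snn version exposes the optional input_weight_quantization argument, a common variation (a hedged sketch, not the configuration used in this tutorial) keeps the input layer at 8-bit weights to soften the quantization drop:

# Hedged variation: 8-bit weights in the input layer, 4-bit elsewhere.
# input_weight_quantization is assumed to exist in your cnn2snn version.
model_keras_quantized_w8 = quantize(model_keras, 4, 4,
                                    input_weight_quantization=8)
check_model_performances(model_keras_quantized_w8, x_test)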

4. Pretrained quantized model

The Akida models zoo also contains a helper that loads a pretrained quantized model, obtained by running the tune action of the akida_models CLI on the quantized model for 100 epochs.

Tuning the model, that is, retraining it with a lowered learning rate, recovers performance close to the initial floating-point accuracy:

#Images    Float accuracy    Quantized accuracy    After tuning
100        96.00 %           96.00 %               97.00 %
1000       94.30 %           92.60 %               94.20 %
10000      93.66 %           92.58 %               93.08 %

from akida_models import ds_cnn_cifar10_pretrained

# Use a quantized model with pretrained quantized weights
model_keras_quantized_pretrained = ds_cnn_cifar10_pretrained()

# Check Model performance
check_model_performances(model_keras_quantized_pretrained, x_test)

Out:

Downloading data from http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10_iq4_wq4_aq4.h5

10747904/10741016 [==============================] - 3s 0us/step
Accuracy: 94.20%
Keras inference on 1000 images took 0.80 s.
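
The tune action boils down to ordinary Keras training of the quantized model at a reduced learning rate. A minimal sketch of doing this manually, with assumed (not the original) optimizer settings and a shortened epoch count:

from tensorflow.keras.losses import SparseCategoricalCrossentropy
from tensorflow.keras.optimizers import Adam

# Fine-tune the 4-bit quantized model with a lowered learning rate
# (illustrative values; the pretrained weights used 100 epochs).
# The model outputs raw potentials, hence from_logits=True.
model_keras_quantized.compile(
    optimizer=Adam(learning_rate=1e-4),
    loss=SparseCategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'])
model_keras_quantized.fit(x_train, y_train,
                          epochs=10,
                          validation_data=(x_test, y_test))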

5. Conversion to Akida

5.1 Convert to Akida model

When converting to an Akida model, we just need to pass the Keras model and the input scaling that was used during training to cnn2snn.convert. The (a, b) pair tells the converter that the float inputs were obtained as x_float = (x_raw - b) / a, so the resulting Akida model can be fed the raw uint8 images directly.

from cnn2snn import convert

model_akida = convert(model_keras_quantized_pretrained, input_scaling=(a, b))

5.2 Check hardware compliance

The Model.summary method provides a detailed description of the model's layers.

It also flags any hardware incompatibilities. Hardware compatibility can also be checked manually using model_hardware_incompatibilities.
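
A hedged sketch of that manual check (the import path below is an assumption and may vary across akida versions; adjust to your installation):

# List hardware incompatibilities explicitly; an empty result means the
# model maps onto the Akida NSoC as-is. Import path is an assumption.
from akida.compatibility import model_hardware_incompatibilities

incompatibilities = model_hardware_incompatibilities(model_akida)
print(incompatibilities if incompatibilities
      else "Model is fully compatible with the Akida NSoC.")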

model_akida.summary()

Out:

                                       Model Summary
___________________________________________________________________________________________
Layer (type)                           Output shape   Kernel shape
===========================================================================================
conv_0 (InputConvolutional)            [32, 32, 128]  (3, 3, 3, 128)
___________________________________________________________________________________________
separable_1 (SeparableConvolutional)   [32, 32, 128]  (3, 3, 128, 1), (1, 1, 128, 128)
___________________________________________________________________________________________
separable_2 (SeparableConvolutional)   [32, 32, 256]  (3, 3, 128, 1), (1, 1, 128, 256)
___________________________________________________________________________________________
separable_3 (SeparableConvolutional)   [16, 16, 256]  (3, 3, 256, 1), (1, 1, 256, 256)
___________________________________________________________________________________________
separable_4 (SeparableConvolutional)   [16, 16, 512]  (3, 3, 256, 1), (1, 1, 256, 512)
___________________________________________________________________________________________
separable_5 (SeparableConvolutional)   [8, 8, 512]    (3, 3, 512, 1), (1, 1, 512, 512)
___________________________________________________________________________________________
separable_6 (SeparableConvolutional)   [8, 8, 512]    (3, 3, 512, 1), (1, 1, 512, 512)
___________________________________________________________________________________________
separable_7 (SeparableConvolutional)   [4, 4, 512]    (3, 3, 512, 1), (1, 1, 512, 512)
___________________________________________________________________________________________
separable_8 (SeparableConvolutional)   [4, 4, 1024]   (3, 3, 512, 1), (1, 1, 512, 1024)
___________________________________________________________________________________________
separable_9 (SeparableConvolutional)   [4, 4, 1024]   (3, 3, 1024, 1), (1, 1, 1024, 1024)
___________________________________________________________________________________________
separable_10 (SeparableConvolutional)  [1, 1, 10]     (3, 3, 1024, 1), (1, 1, 1024, 10)
___________________________________________________________________________________________
Input shape: 32, 32, 3
Backend type: Software - 1.8.13

5.3 Check performance

We check the Akida model accuracy on the first n images of the test set.

The table below summarizes the expected results:

#Images    Keras accuracy    Akida accuracy
100        96.00 %           97.00 %
1000       94.30 %           94.00 %
10000      93.66 %           93.04 %

Due to the conversion process, predictions may differ slightly between the original Keras model and the Akida model on some specific images.

This explains why the Keras and Akida accuracy numbers can look quite different when testing on a small number of images: on 100 images, a single differing prediction shifts accuracy by a full percentage point. On the full test set, however, the two models' accuracies are very close.

num_images = 1000

# Check Model performance
start = timer()
results = model_akida.predict(raw_x_test[:num_images])
accuracy = accuracy_score(y_test[:num_images], results)

print("Accuracy: " + "{0:.2f}".format(100 * accuracy) + "%")
end = timer()
print(f'Akida inference on {num_images} images took {end-start:.2f} s.\n')

# For non-regression purpose
if num_images == 1000:
    assert accuracy == 0.94

Out:

Accuracy: 94.00%
Akida inference on 1000 images took 24.64 s.

Activation sparsity has a great impact on Akida inference time. One can look at the average input and output sparsity of each layer using Model.get_statistics(). For convenience, it is called here on a subset of the dataset.

# Print model statistics
print("Model statistics")
stats = model_akida.get_statistics()
model_akida.predict(raw_x_test[:20])
for _, stat in stats.items():
    print(stat)

Out:

Model statistics
Layer (type)                  output sparsity
conv_0 (InputConvolutional)   0.59
Layer (type)                  output sparsity
separable_1 (SeparableConvolu 0.51
Layer (type)                  output sparsity
separable_2 (SeparableConvolu 0.54
Layer (type)                  output sparsity
separable_3 (SeparableConvolu 0.63
Layer (type)                  output sparsity
separable_4 (SeparableConvolu 0.64
Layer (type)                  output sparsity
separable_5 (SeparableConvolu 0.71
Layer (type)                  output sparsity
separable_6 (SeparableConvolu 0.68
Layer (type)                  output sparsity
separable_7 (SeparableConvolu 0.75
Layer (type)                  output sparsity
separable_8 (SeparableConvolu 0.84
Layer (type)                  output sparsity
separable_9 (SeparableConvolu 0.84
Layer (type)                  output sparsity
separable_10 (SeparableConvol N/A

5.4 Show predictions for a random image

import matplotlib.pyplot as plt
import matplotlib.lines as lines
import matplotlib.patches as patches

label_names = [
    'airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse',
    'ship', 'truck'
]

# prepare plot
barWidth = 0.75
pause_time = 1

fig = plt.figure(num='CIFAR10 Classification by Akida Execution Engine',
                 figsize=(8, 4))
ax0 = plt.subplot(1, 3, 1)
imgobj = ax0.imshow(np.zeros((32, 32, 3), dtype=np.uint8))
ax0.set_axis_off()
# Results subplots
ax1 = plt.subplot(1, 2, 2)
ax1.xaxis.set_visible(False)
ax0.text(0, 34, 'Actual class:')
actual_class = ax0.text(16, 34, 'None')
ax0.text(0, 37, 'Predicted class:')
predicted_class = ax0.text(20, 37, 'None')

# Take a random test image
i = np.random.randint(y_test.shape[0])

true_idx = int(y_test[i])
pot = model_akida.evaluate(np.expand_dims(raw_x_test[i], axis=0)).squeeze()

rpot = np.arange(len(pot))
ax1.barh(rpot, pot, height=barWidth)
ax1.set_yticks(rpot - 0.07 * barWidth)
ax1.set_yticklabels(label_names)
predicted_idx = pot.argmax()
imgobj.set_data(raw_x_test[i])
if predicted_idx == true_idx:
    ax1.get_children()[predicted_idx].set_color('g')
else:
    ax1.get_children()[predicted_idx].set_color('r')
actual_class.set_text(label_names[true_idx])
predicted_class.set_text(label_names[predicted_idx])
ax1.set_title('Akida\'s predictions')
plt.show()
(Figure: the selected test image with its actual and predicted classes, next to a bar chart of Akida's per-class predictions.)

Total running time of the script: ( 0 minutes 53.239 seconds)
