DS-CNN CIFAR10 inference

This tutorial uses the CIFAR-10 dataset (60,000 images distributed across 10 object classes: 50,000 for training and 10,000 for testing) for a classic object classification task, with a network built around the Depthwise Separable Convolutional Neural Network (DS-CNN) architecture originating from Zhang et al. (2018).

The goal of this tutorial is to provide an example of a complex model that can be converted to an Akida model and run on the Akida NSoC with an accuracy similar to that of a standard Keras floating-point model.

1. Dataset preparation

from tensorflow.keras.datasets import cifar10

# Load CIFAR10 dataset
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Reshape x-data
x_train = x_train.reshape(50000, 32, 32, 3)
x_test = x_test.reshape(10000, 32, 32, 3)
input_shape = (32, 32, 3)

# Set aside raw test data for use with Akida Execution Engine later
raw_x_test = x_test.astype('uint8')

# Rescale x-data to [0, 1]: x_float = (x_uint8 - b) / a.
# The same (a, b) pair is later passed to cnn2snn.convert as input_scaling.
a = 255
b = 0

x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train = (x_train - b) / a
x_test = (x_test - b) / a

Out:

Downloading data from https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz

170500096/170498071 [==============================] - 12s 0us/step
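
As a quick sanity check, the rescaled float images and the raw uint8 images kept for Akida should satisfy x_float = (x_uint8 - b) / a. A minimal sketch using only the arrays defined above (the asserts are illustrative additions, not part of the original script):

import numpy as np

# The raw uint8 images and the rescaled float images are related by
# x_float = (x_uint8 - b) / a; the same (a, b) pair is reused as
# input_scaling when converting to Akida in section 5.
assert np.allclose(x_test * a + b, raw_x_test)
assert x_train.min() >= 0.0 and x_train.max() <= 1.0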

2. Create a Keras DS-CNN model

The DS-CNN architecture is available in the Akida models zoo along with pretrained weights.

Note

The pre-trained weights were obtained by training the model with unconstrained float weights and activations for 1000 epochs.
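
For reference, the same architecture can also be instantiated from scratch, without pretrained weights. This is a hedged sketch: the ds_cnn_cifar10 helper name follows the zoo's naming convention but is an assumption here, and its exact signature may differ between akida_models releases:

from akida_models import ds_cnn_cifar10

# Assumed zoo helper returning the float DS-CNN Keras model for CIFAR10;
# check the installed akida_models reference for the exact signature.
model_scratch = ds_cnn_cifar10()
model_scratch.summary()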

from tensorflow.keras.utils import get_file
from tensorflow.keras.models import load_model

# Retrieve the float model with pretrained weights and load it
model_file = get_file(
    "ds_cnn_cifar10.h5",
    "http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10.h5",
    cache_subdir='models/ds_cnn_cifar10')
model_keras = load_model(model_file)
model_keras.summary()

Out:

Downloading data from http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10.h5

10838016/10836232 [==============================] - 3s 0us/step
Model: "ds_cnn_cifar10"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 32, 32, 3)]       0
_________________________________________________________________
conv_0 (Conv2D)              (None, 32, 32, 128)       3456
_________________________________________________________________
conv_0_BN (BatchNormalizatio (None, 32, 32, 128)       512
_________________________________________________________________
conv_0_relu (ReLU)           (None, 32, 32, 128)       0
_________________________________________________________________
separable_1 (SeparableConv2D (None, 32, 32, 128)       17536
_________________________________________________________________
separable_1_BN (BatchNormali (None, 32, 32, 128)       512
_________________________________________________________________
separable_1_relu (ReLU)      (None, 32, 32, 128)       0
_________________________________________________________________
separable_2 (SeparableConv2D (None, 32, 32, 256)       33920
_________________________________________________________________
separable_2_BN (BatchNormali (None, 32, 32, 256)       1024
_________________________________________________________________
separable_2_relu (ReLU)      (None, 32, 32, 256)       0
_________________________________________________________________
separable_3 (SeparableConv2D (None, 32, 32, 256)       67840
_________________________________________________________________
separable_3_maxpool (MaxPool (None, 16, 16, 256)       0
_________________________________________________________________
separable_3_BN (BatchNormali (None, 16, 16, 256)       1024
_________________________________________________________________
separable_3_relu (ReLU)      (None, 16, 16, 256)       0
_________________________________________________________________
separable_4 (SeparableConv2D (None, 16, 16, 512)       133376
_________________________________________________________________
separable_4_BN (BatchNormali (None, 16, 16, 512)       2048
_________________________________________________________________
separable_4_relu (ReLU)      (None, 16, 16, 512)       0
_________________________________________________________________
separable_5 (SeparableConv2D (None, 16, 16, 512)       266752
_________________________________________________________________
separable_5_maxpool (MaxPool (None, 8, 8, 512)         0
_________________________________________________________________
separable_5_BN (BatchNormali (None, 8, 8, 512)         2048
_________________________________________________________________
separable_5_relu (ReLU)      (None, 8, 8, 512)         0
_________________________________________________________________
separable_6 (SeparableConv2D (None, 8, 8, 512)         266752
_________________________________________________________________
separable_6_BN (BatchNormali (None, 8, 8, 512)         2048
_________________________________________________________________
separable_6_relu (ReLU)      (None, 8, 8, 512)         0
_________________________________________________________________
separable_7 (SeparableConv2D (None, 8, 8, 512)         266752
_________________________________________________________________
separable_7_maxpool (MaxPool (None, 4, 4, 512)         0
_________________________________________________________________
separable_7_BN (BatchNormali (None, 4, 4, 512)         2048
_________________________________________________________________
separable_7_relu (ReLU)      (None, 4, 4, 512)         0
_________________________________________________________________
separable_8 (SeparableConv2D (None, 4, 4, 1024)        528896
_________________________________________________________________
separable_8_BN (BatchNormali (None, 4, 4, 1024)        4096
_________________________________________________________________
separable_8_relu (ReLU)      (None, 4, 4, 1024)        0
_________________________________________________________________
separable_9 (SeparableConv2D (None, 4, 4, 1024)        1057792
_________________________________________________________________
separable_9_BN (BatchNormali (None, 4, 4, 1024)        4096
_________________________________________________________________
separable_9_relu (ReLU)      (None, 4, 4, 1024)        0
_________________________________________________________________
separable_10 (SeparableConv2 (None, 4, 4, 10)          19456
_________________________________________________________________
separable_10_global_avg (Glo (None, 10)                0
=================================================================
Total params: 2,681,984
Trainable params: 2,672,256
Non-trainable params: 9,728
_________________________________________________________________

Keras model accuracy is checked against the first n images of the test set.

The table below summarizes the expected results:

#Images   Accuracy
100       96.00 %
1000      94.30 %
10000     93.60 %

Note

Depending on your hardware setup, the processing time may vary.

import numpy as np

from sklearn.metrics import accuracy_score
from timeit import default_timer as timer


# Check model performance (note: uses the global y_test loaded in section 1)
def check_model_performances(model, x_test, num_images=1000):
    start = timer()
    potentials_keras = model.predict(x_test[:num_images])
    preds_keras = np.squeeze(np.argmax(potentials_keras, 1))

    accuracy = accuracy_score(y_test[:num_images], preds_keras)
    print("Accuracy: " + "{0:.2f}".format(100 * accuracy) + "%")
    end = timer()
    print(f'Keras inference on {num_images} images took {end-start:.2f} s.\n')


check_model_performances(model_keras, x_test)

Out:

Accuracy: 94.30%
Keras inference on 1000 images took 3.02 s.
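
The remaining rows of the expected-results table can be reproduced by varying num_images; a short sketch (the 10,000-image run simply takes proportionally longer):

# Reproduce the other rows of the expected-results table
for n in (100, 10000):
    check_model_performances(model_keras, x_test, num_images=n)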

3. Quantized model

Quantizing a model is done with cnn2snn.quantize. After the call, all layers have 4-bit weights and 4-bit activations.

This model therefore satisfies the Akida NSoC requirements, but it suffers a drop in accuracy due to quantization, as shown in the table below:

#Images   Float accuracy   Quantized accuracy
100       96.00 %          96.00 %
1000      94.30 %          92.60 %
10000     93.66 %          92.58 %

from cnn2snn import quantize

# Quantize the model to 4-bit weights and activations
model_keras_quantized = quantize(model_keras, 4, 4)

# Check Model performance
check_model_performances(model_keras_quantized, x_test)

Out:

Accuracy: 92.60%
Keras inference on 1000 images took 0.82 s.

4. Pretrained quantized model

The Akida models zoo also contains a pretrained quantized model, available through the ds_cnn_cifar10_pretrained helper, that was obtained by applying the tune action of the akida_models CLI to the quantized model for 100 epochs.

Tuning the model, that is, retraining it with a lowered learning rate, recovers performance close to the initial floating-point accuracy, as summarized in the table below (a sketch of the tuning step follows the table).

#Images   Float accuracy   Quantized accuracy   After tuning
100       96.00 %          96.00 %              97.00 %
1000      94.30 %          92.60 %              94.20 %
10000     93.66 %          92.58 %              93.08 %
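
Although the zoo weights were produced with the akida_models CLI, tuning is essentially ordinary Keras training at a reduced learning rate on the quantized model. A minimal sketch (the 1e-4 learning rate and 5 epochs are illustrative assumptions; the actual zoo weights were tuned for 100 epochs):

from tensorflow.keras.losses import SparseCategoricalCrossentropy
from tensorflow.keras.optimizers import Adam

# Quantization-aware fine-tuning: retrain the quantized model with a
# lowered learning rate to recover the accuracy lost to quantization.
# from_logits=True because the model outputs raw potentials, not softmax.
model_keras_quantized.compile(
    optimizer=Adam(learning_rate=1e-4),
    loss=SparseCategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'])
model_keras_quantized.fit(x_train, y_train,
                          epochs=5,
                          validation_data=(x_test, y_test))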

from akida_models import ds_cnn_cifar10_pretrained

# Use a quantized model with pretrained quantized weights
model_keras_quantized_pretrained = ds_cnn_cifar10_pretrained()

# Check Model performance
check_model_performances(model_keras_quantized_pretrained, x_test)

Out:

Downloading data from http://data.brainchip.com/models/ds_cnn/ds_cnn_cifar10_iq4_wq4_aq4.h5

10747904/10741016 [==============================] - 3s 0us/step
Accuracy: 94.20%
Keras inference on 1000 images took 0.83 s.

5. Conversion to Akida

5.1 Convert to Akida model

When converting to an Akida model, we just need to pass the Keras model and the input scaling that was used during training to cnn2snn.convert.

from cnn2snn import convert

model_akida = convert(model_keras_quantized_pretrained, input_scaling=(a, b))
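
The converted model can optionally be serialized for later reuse on hardware. A hedged sketch (the Model.save method and the .fbz extension are assumptions based on Akida package conventions; check the installed API reference):

# Save the Akida model to disk (the path is illustrative)
model_akida.save('ds_cnn_cifar10.fbz')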

5.2 Check hardware compliance

The Model.summary method provides a detailed description of the Model's layers.

It also flags any hardware incompatibilities. Hardware compatibility can also be checked programmatically using model_hardware_incompatibilities (see the sketch after the summary output below).

model_akida.summary()

Out:

                Model Summary
______________________________________________
Input shape  Output shape  Sequences  Layers
==============================================
[32, 32, 3]  [1, 1, 10]    1          11
______________________________________________

              SW/conv_0-separable_10 (Software)
_____________________________________________________________
Layer (type)              Output shape   Kernel shape
=============================================================
conv_0 (InputConv.)       [32, 32, 128]  (3, 3, 3, 128)
_____________________________________________________________
separable_1 (Sep.Conv.)   [32, 32, 128]  (3, 3, 128, 1)
_____________________________________________________________
                                         (1, 1, 128, 128)
_____________________________________________________________
separable_2 (Sep.Conv.)   [32, 32, 256]  (3, 3, 128, 1)
_____________________________________________________________
                                         (1, 1, 128, 256)
_____________________________________________________________
separable_3 (Sep.Conv.)   [16, 16, 256]  (3, 3, 256, 1)
_____________________________________________________________
                                         (1, 1, 256, 256)
_____________________________________________________________
separable_4 (Sep.Conv.)   [16, 16, 512]  (3, 3, 256, 1)
_____________________________________________________________
                                         (1, 1, 256, 512)
_____________________________________________________________
separable_5 (Sep.Conv.)   [8, 8, 512]    (3, 3, 512, 1)
_____________________________________________________________
                                         (1, 1, 512, 512)
_____________________________________________________________
separable_6 (Sep.Conv.)   [8, 8, 512]    (3, 3, 512, 1)
_____________________________________________________________
                                         (1, 1, 512, 512)
_____________________________________________________________
separable_7 (Sep.Conv.)   [4, 4, 512]    (3, 3, 512, 1)
_____________________________________________________________
                                         (1, 1, 512, 512)
_____________________________________________________________
separable_8 (Sep.Conv.)   [4, 4, 1024]   (3, 3, 512, 1)
_____________________________________________________________
                                         (1, 1, 512, 1024)
_____________________________________________________________
separable_9 (Sep.Conv.)   [4, 4, 1024]   (3, 3, 1024, 1)
_____________________________________________________________
                                         (1, 1, 1024, 1024)
_____________________________________________________________
separable_10 (Sep.Conv.)  [1, 1, 10]     (3, 3, 1024, 1)
_____________________________________________________________
                                         (1, 1, 1024, 10)
_____________________________________________________________
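
As mentioned above, incompatibilities can also be listed programmatically via model_hardware_incompatibilities. A hedged sketch: the function name is quoted from the text above, but the import path below is an assumption and may vary between Akida releases:

# List hardware incompatibilities, if any (import path is an assumption).
try:
    from akida.compatibility import model_hardware_incompatibilities
    issues = model_hardware_incompatibilities(model_akida)
    print(issues if issues else "No hardware incompatibilities found.")
except ImportError:
    pass  # the check may live in a different module in other releases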

5.3 Check performance

We check the Akida model accuracy on the first n images of the test set.

The table below summarizes the expected results:

#Images   Keras accuracy   Akida accuracy
100       96.00 %          97.00 %
1000      94.30 %          94.00 %
10000     93.66 %          93.04 %

Due to the conversion process, predictions may differ slightly between the original Keras model and Akida on some specific images.

This explains why, when testing on a limited number of images, the accuracy numbers for Keras and Akida can be quite different. On the full test set, however, the two models' accuracies are very close.
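
To see this effect directly, the per-image disagreements between the two models can be counted. A short sketch using only objects already defined in this tutorial (the variable names are illustrative):

# Count images where the Keras and Akida predictions differ.
n = 1000
keras_preds = np.argmax(model_keras_quantized_pretrained.predict(x_test[:n]), 1)
akida_preds = np.squeeze(model_akida.predict(raw_x_test[:n]))
print(f"Keras and Akida disagree on {np.sum(keras_preds != akida_preds)}/{n} images.")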

num_images = 1000

# Check Model performance
start = timer()
results = model_akida.predict(raw_x_test[:num_images])
accuracy = accuracy_score(y_test[:num_images], results)

print("Accuracy: " + "{0:.2f}".format(100 * accuracy) + "%")
end = timer()
print(f'Akida inference on {num_images} images took {end-start:.2f} s.\n')

# For non-regression purposes
if num_images == 1000:
    assert accuracy == 0.94

Out:

Accuracy: 94.00%
Akida inference on 1000 images took 22.69 s.

Activation sparsity has a great impact on Akida inference time. The average input and output sparsity of each layer can be inspected using Model.statistics.

# Print model statistics
print("Model statistics")
print(model_akida.statistics)

Out:

Model statistics

Sequence SW/conv_0-separable_10
Average framerate = 44.09 fps
Layer (type)                  output sparsity
conv_0 (InputConv.)           0.59
Layer (type)                  output sparsity
separable_1 (Sep.Conv.)       0.51
Layer (type)                  output sparsity
separable_2 (Sep.Conv.)       0.54
Layer (type)                  output sparsity
separable_3 (Sep.Conv.)       0.63
Layer (type)                  output sparsity
separable_4 (Sep.Conv.)       0.64
Layer (type)                  output sparsity
separable_5 (Sep.Conv.)       0.71
Layer (type)                  output sparsity
separable_6 (Sep.Conv.)       0.68
Layer (type)                  output sparsity
separable_7 (Sep.Conv.)       0.75
Layer (type)                  output sparsity
separable_8 (Sep.Conv.)       0.84
Layer (type)                  output sparsity
separable_9 (Sep.Conv.)       0.84
Layer (type)                  output sparsity
separable_10 (Sep.Conv.)      N/A

5.4 Show predictions for a random image

import matplotlib.pyplot as plt

label_names = [
    'airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse',
    'ship', 'truck'
]

# Prepare the plot
barWidth = 0.75

fig = plt.figure(num='CIFAR10 Classification by Akida Execution Engine',
                 figsize=(8, 4))
ax0 = plt.subplot(1, 3, 1)
imgobj = ax0.imshow(np.zeros((32, 32, 3), dtype=np.uint8))
ax0.set_axis_off()
# Results subplots
ax1 = plt.subplot(1, 2, 2)
ax1.xaxis.set_visible(False)
ax0.text(0, 34, 'Actual class:')
actual_class = ax0.text(16, 34, 'None')
ax0.text(0, 37, 'Predicted class:')
predicted_class = ax0.text(20, 37, 'None')

# Take a random test image
i = np.random.randint(y_test.shape[0])

true_idx = int(y_test[i])
pot = model_akida.evaluate(np.expand_dims(raw_x_test[i], axis=0)).squeeze()

rpot = np.arange(len(pot))
ax1.barh(rpot, pot, height=barWidth)
ax1.set_yticks(rpot - 0.07 * barWidth)
ax1.set_yticklabels(label_names)
predicted_idx = pot.argmax()
imgobj.set_data(raw_x_test[i])
if predicted_idx == true_idx:
    ax1.get_children()[predicted_idx].set_color('g')
else:
    ax1.get_children()[predicted_idx].set_color('r')
actual_class.set_text(label_names[true_idx])
predicted_class.set_text(label_names[predicted_idx])
ax1.set_title('Akida\'s predictions')
plt.show()
[Figure: the random test image shown next to a bar chart of Akida's predictions]

Total running time of the script: (0 minutes 55.106 seconds)
