Regression tutorial

This tutorial demonstrates that hardware-compatible Akida models can perform regression tasks at the same accuracy level as a native CNN.

This is illustrated through an age estimation problem using the UTKFace dataset.

1. Load the dataset

from akida_models.utk_face.preprocessing import load_data

# Load the dataset using akida_models preprocessing tool
x_train, y_train, x_test, y_test = load_data()

# For Akida inference, use uint8 raw data
x_test_akida = x_test.astype('uint8')

Out:

Downloading data from http://data.brainchip.com/dataset-mirror/utk_face/UTKFace_preprocessed.tar.gz

48742400/48742400 [==============================] - 10s 0us/step
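
A quick sanity check on the loaded arrays can help confirm everything is in place. The shapes in the comments below are assumptions based on the 32x32 RGB inputs used by the model in the next section.

# Optional sanity check: inspect shapes and dtypes after loading.
# Expected shapes are assumptions based on the model's 32x32 RGB input.
print(x_train.shape, x_test.shape)    # e.g. (N, 32, 32, 3)
print(y_train.shape, y_test.shape)    # target ages
print(x_test_akida.dtype)             # uint8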

2. Load a pre-trained native Keras model

The model is a simplified version inspired by the VGG architecture. It consists of a succession of convolutional and pooling layers and ends with two fully connected layers that output a single value: the estimated age. This architecture already satisfies the Akida design constraints before quantization, making it the starting point for a model runnable on the Akida NSoC.

The pre-trained native Keras model loaded below was trained for 300 epochs. The model file is available on the BrainChip data server.

The performance of the model is evaluated using the Mean Absolute Error (MAE). The MAE, a common metric for regression problems, is the average of the absolute differences between the target values and the predictions. It is a linear score: all individual differences are weighted equally in the average.
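
As a minimal illustration with made-up values, the MAE of a set of predictions can be computed directly with NumPy:

import numpy as np

# MAE = mean(|y_true - y_pred|), illustrated with made-up ages
y_true = np.array([25.0, 40.0, 60.0])
y_pred = np.array([27.0, 38.0, 63.0])
print(np.mean(np.abs(y_true - y_pred)))  # (2 + 2 + 3) / 3 = 2.33...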

from tensorflow.keras.utils import get_file
from tensorflow.keras.models import load_model

# Retrieve the model file from the BrainChip data server
model_file = get_file("vgg_utk_face.h5",
                      "http://data.brainchip.com/models/vgg/vgg_utk_face.h5",
                      cache_subdir='models')

# Load the native Keras pre-trained model
model_keras = load_model(model_file)
model_keras.summary()

Out:

Downloading data from http://data.brainchip.com/models/vgg/vgg_utk_face.h5

1907648/1907648 [==============================] - 0s 0us/step
Model: "vgg_utk_face"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 input_37 (InputLayer)       [(None, 32, 32, 3)]       0

 rescaling_36 (Rescaling)    (None, 32, 32, 3)         0

 conv_0 (Conv2D)             (None, 30, 30, 32)        864

 conv_0_BN (BatchNormalizati  (None, 30, 30, 32)       128
 on)

 conv_0_relu (ReLU)          (None, 30, 30, 32)        0

 conv_1 (Conv2D)             (None, 30, 30, 32)        9216

 conv_1_maxpool (MaxPooling2  (None, 15, 15, 32)       0
 D)

 conv_1_BN (BatchNormalizati  (None, 15, 15, 32)       128
 on)

 conv_1_relu (ReLU)          (None, 15, 15, 32)        0

 dropout (Dropout)           (None, 15, 15, 32)        0

 conv_2 (Conv2D)             (None, 15, 15, 64)        18432

 conv_2_BN (BatchNormalizati  (None, 15, 15, 64)       256
 on)

 conv_2_relu (ReLU)          (None, 15, 15, 64)        0

 conv_3 (Conv2D)             (None, 15, 15, 64)        36864

 conv_3_maxpool (MaxPooling2  (None, 8, 8, 64)         0
 D)

 conv_3_BN (BatchNormalizati  (None, 8, 8, 64)         256
 on)

 conv_3_relu (ReLU)          (None, 8, 8, 64)          0

 dropout_1 (Dropout)         (None, 8, 8, 64)          0

 conv_4 (Conv2D)             (None, 8, 8, 84)          48384

 conv_4_BN (BatchNormalizati  (None, 8, 8, 84)         336
 on)

 conv_4_relu (ReLU)          (None, 8, 8, 84)          0

 dropout_2 (Dropout)         (None, 8, 8, 84)          0

 flatten_3 (Flatten)         (None, 5376)              0

 dense_1 (Dense)             (None, 64)                344064

 dense_1_BN (BatchNormalizat  (None, 64)               256
 ion)

 dense_1_relu (ReLU)         (None, 64)                0

 dense_2 (Dense)             (None, 1)                 65

=================================================================
Total params: 459,249
Trainable params: 458,569
Non-trainable params: 680
_________________________________________________________________
# Compile the native Keras model (required to evaluate the MAE)
model_keras.compile(optimizer='Adam', loss='mae')

# Check Keras model performance
mae_keras = model_keras.evaluate(x_test, y_test, verbose=0)

print("Keras MAE: {0:.4f}".format(mae_keras))

Out:

Keras MAE: 5.8023

3. Load a pre-trained quantized Keras model satisfying Akida NSoC requirements

The native Keras model above is quantized and fine-tuned to obtain a quantized Keras model that satisfies the Akida NSoC requirements. The first convolutional layer of the model uses 8-bit weights, the other layers are quantized with 2-bit weights, and all activations are 2 bits.

The pre-trained model was obtained after two fine-tuning episodes (a sketch of the corresponding quantization call follows the list):

  • the model is first quantized and fine-tuned with 4-bit weights and activations (the first convolutional weights are 8 bits);

  • the model is then quantized and fine-tuned with 2-bit weights and activations (the first convolutional weights are still 8 bits).
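
As a rough sketch, such a quantization step might look as follows with the cnn2snn quantize helper. The keyword names (weight_quantization, activ_quantization, input_weight_quantization) are assumptions based on the cnn2snn 2.x API and should be checked against the installed version.

from cnn2snn import quantize

# Hedged sketch of episode 2: 4-bit weights/activations with 8-bit
# weights in the first convolution. Keyword names are assumptions
# based on the cnn2snn 2.x API; verify with your installed version.
model_4bit = quantize(model_keras,
                      weight_quantization=4,
                      activ_quantization=4,
                      input_weight_quantization=8)
# Fine-tune (compile + fit) for ~30 epochs, then repeat with
# weight_quantization=2 and activ_quantization=2 for episode 3.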

The table below summarizes the “Mean Absolute Error” (MAE) results obtained after every training episode.

Episode  Weights Quant.  Activ. Quant.  MAE   Epochs
====================================================
1        N/A             N/A            5.80  300
2        8/4 bits        4 bits         5.79  30
3        8/2 bits        2 bits         6.15  30

Here, we directly load the pre-trained quantized Keras model using the akida_models helper.

from akida_models import vgg_utk_face_pretrained

# Load the pre-trained quantized model
model_quantized_keras = vgg_utk_face_pretrained()
model_quantized_keras.summary()

Out:

Downloading data from http://data.brainchip.com/models/vgg/vgg_utk_face_iq8_wq2_aq2.h5

1877128/1877128 [==============================] - 0s 0us/step
Model: "vgg_utk_face"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 input_38 (InputLayer)       [(None, 32, 32, 3)]       0

 rescaling_37 (Rescaling)    (None, 32, 32, 3)         0

 conv_0 (QuantizedConv2D)    (None, 30, 30, 32)        896

 conv_0_relu (ActivationDisc  (None, 30, 30, 32)       0
 reteRelu)

 conv_1 (QuantizedConv2D)    (None, 30, 30, 32)        9248

 conv_1_maxpool (MaxPooling2  (None, 15, 15, 32)       0
 D)

 conv_1_relu (ActivationDisc  (None, 15, 15, 32)       0
 reteRelu)

 dropout_3 (Dropout)         (None, 15, 15, 32)        0

 conv_2 (QuantizedConv2D)    (None, 15, 15, 64)        18496

 conv_2_relu (ActivationDisc  (None, 15, 15, 64)       0
 reteRelu)

 conv_3 (QuantizedConv2D)    (None, 15, 15, 64)        36928

 conv_3_maxpool (MaxPooling2  (None, 8, 8, 64)         0
 D)

 conv_3_relu (ActivationDisc  (None, 8, 8, 64)         0
 reteRelu)

 dropout_4 (Dropout)         (None, 8, 8, 64)          0

 conv_4 (QuantizedConv2D)    (None, 8, 8, 84)          48468

 conv_4_relu (ActivationDisc  (None, 8, 8, 84)         0
 reteRelu)

 dropout_5 (Dropout)         (None, 8, 8, 84)          0

 flatten_4 (Flatten)         (None, 5376)              0

 dense_1 (QuantizedDense)    (None, 64)                344128

 dense_1_relu (ActivationDis  (None, 64)               0
 creteRelu)

 dense_2 (QuantizedDense)    (None, 1)                 65

=================================================================
Total params: 458,229
Trainable params: 458,229
Non-trainable params: 0
_________________________________________________________________
# Compile the quantized Keras model (required to evaluate the MAE)
model_quantized_keras.compile(optimizer='Adam', loss='mae')

# Check Keras model performance
mae_quant = model_quantized_keras.evaluate(x_test, y_test, verbose=0)

print("Keras MAE: {0:.4f}".format(mae_quant))

Out:

Quantized Keras MAE: 6.1465

4. Conversion to Akida

The quantized Keras model is now converted into an Akida model, and its performance is evaluated on the UTKFace dataset.

Since activation sparsity has a great impact on Akida inference time, we also have a look at the average input and output sparsity of each layer on a subset of the dataset (see the sketch after the model summary below).

from cnn2snn import convert

# Convert the model
model_akida = convert(model_quantized_keras)
model_akida.summary()

Out:

                Model Summary
______________________________________________
Input shape  Output shape  Sequences  Layers
==============================================
[32, 32, 3]  [1, 1, 1]     1          7
______________________________________________

            SW/conv_0-dense_2 (Software)
_____________________________________________________
Layer (type)         Output shape  Kernel shape
=====================================================
conv_0 (InputConv.)  [30, 30, 32]  (3, 3, 3, 32)
_____________________________________________________
conv_1 (Conv.)       [15, 15, 32]  (3, 3, 32, 32)
_____________________________________________________
conv_2 (Conv.)       [15, 15, 64]  (3, 3, 32, 64)
_____________________________________________________
conv_3 (Conv.)       [8, 8, 64]    (3, 3, 64, 64)
_____________________________________________________
conv_4 (Conv.)       [8, 8, 84]    (3, 3, 64, 84)
_____________________________________________________
dense_1 (Fully.)     [1, 1, 64]    (1, 1, 5376, 64)
_____________________________________________________
dense_2 (Fully.)     [1, 1, 1]     (1, 1, 64, 1)
_____________________________________________________
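To have the look at sparsity mentioned above, one possible approach is sketched below. It assumes the statistics property exposed by the akida package (check the installed version); inference on a small subset populates the counters first.

# Hedged sketch: run inference on a subset, then print per-layer
# statistics (including activation sparsity). The `statistics`
# property is an assumption based on the akida package API.
_ = model_akida.predict(x_test_akida[:100])
print(model_akida.statistics)
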
import numpy as np

# Check Akida model performance
y_akida = model_akida.predict(x_test_akida)

# Compute and display the MAE
mae_akida = np.mean(np.abs(y_test.squeeze() - y_akida.squeeze()))
print("Akida MAE: {0:.4f}".format(mae_akida))

# For non-regression purpose
assert abs(mae_keras - mae_akida) < 0.5

Out:

Akida MAE: 6.1791

Let’s summarize the MAE performance for the native Keras, quantized Keras and Akida models.

Model            MAE
====================
native Keras     5.80
quantized Keras  6.15
Akida            6.21
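
Note that the inference above runs in the Akida software backend, as shown in the model summary. As a hedged sketch, mapping the converted model onto physical Akida hardware could look as follows; akida.devices() and Model.map() are assumptions based on the akida package API.

import akida

# Hedged sketch: map onto an Akida device when one is detected.
# `devices()` and `Model.map()` are assumptions based on the akida
# package API; without hardware, inference stays in software.
devices = akida.devices()
if devices:
    model_akida.map(devices[0])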

5. Estimate age on a single image

import matplotlib.pyplot as plt

# Estimate age on a random test image and display Keras and Akida outputs
id = np.random.randint(len(y_test))
age_keras = model_keras.predict(x_test[id:id + 1])

plt.imshow(x_test_akida[id], interpolation='bicubic')
plt.xticks([]), plt.yticks([])
plt.show()

print("Keras estimated age: {0:.1f}".format(age_keras.squeeze()))
print("Akida estimated age: {0:.1f}".format(y_akida[id].squeeze()))
print(f"Actual age: {y_test[id].squeeze()}")
[Figure: the randomly selected test image]

Out:

Keras estimated age: 1.6
Akida estimated age: 1.0
Actual age: 2

Total running time of the script: (0 minutes 28.142 seconds)
