Segmentation tutorial

This example demonstrates image segmentation with an Akida-compatible model, illustrated through person segmentation on the Portrait128 dataset.

Using pre-trained models for quick runtime, this example shows the evolution of model performance from a trained Keras floating point model, to a quantized and Quantization Aware Trained (QAT) Keras model, and finally to an Akida-converted model. Notice that the performance of the original Keras floating point model is maintained throughout the conversion flow.

1. Load the dataset

import os
import numpy as np
from akida_models import fetch_file

# Download the validation set from the BrainChip data server; it contains 10% of the original dataset
data_path = fetch_file(fname="val.tar.gz",
                       origin="https://data.brainchip.com/dataset-mirror/portrait128/val.tar.gz",
                       cache_subdir=os.path.join("datasets", "portrait128"),
                       extract=True)

data_dir = os.path.join(os.path.dirname(data_path), "val")
x_val = np.load(os.path.join(data_dir, "val_img.npy"))
y_val = np.load(os.path.join(data_dir, "val_msk.npy")).astype('uint8')
batch_size = 32
steps = x_val.shape[0] // batch_size

# Visualize some data
import matplotlib.pyplot as plt

# Pick a random index, leaving room for the two following images shown alongside it
id = np.random.randint(0, x_val.shape[0] - 2)

fig, axs = plt.subplots(3, 3, constrained_layout=True)
for col in range(3):
    axs[0, col].imshow(x_val[id + col] / 255.)
    axs[0, col].axis('off')
    axs[1, col].imshow(1 - y_val[id + col], cmap='Greys')
    axs[1, col].axis('off')
    axs[2, col].imshow(x_val[id + col] / 255. * y_val[id + col])
    axs[2, col].axis('off')

fig.suptitle('Image, mask and masked image', fontsize=10)
plt.show()
Image, mask and masked image
Downloading data from https://data.brainchip.com/dataset-mirror/portrait128/val.tar.gz.
Download complete.

2. Load a pre-trained native Keras model

The model used in this example is AkidaUNet. It has an AkidaNet (0.5) backbone to extract features, combined with a succession of separable transposed convolutional blocks that build the image segmentation map. A pre-trained floating point Keras model is downloaded to save training time.

Note

  • The “transposed” convolutional feature is new in Akida 2.0.

  • The “separable transposed” operation is realized through the combination of a QuantizeML custom DepthwiseConv2DTranspose layer with a standard pointwise convolution, as sketched below.
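
As a minimal sketch (not the akida_models implementation), one such upsampling block could be assembled as follows, assuming the custom layer is importable as quantizeml.layers.DepthwiseConv2DTranspose with a Keras-style signature (an assumption; check the QuantizeML API reference for your version):

from keras import layers
from quantizeml.layers import DepthwiseConv2DTranspose  # assumed import path

def sepconv_transpose_block(inputs, filters, name):
    # Depthwise transposed convolution: upsamples each channel independently
    # (strides of 2 double the spatial resolution, as in the summary below)
    x = DepthwiseConv2DTranspose(kernel_size=(3, 3), strides=(2, 2),
                                 padding='same', name=f'dw_{name}')(inputs)
    # Standard pointwise (1x1) convolution mixes channels and sets the output depth
    x = layers.Conv2D(filters, kernel_size=(1, 1), name=f'pw_{name}')(x)
    x = layers.BatchNormalization(name=f'pw_{name}/BN')(x)
    return layers.ReLU(name=f'pw_{name}/relu')(x)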

The performance of the model is evaluated using both pixel accuracy and Binary IoU. Pixel accuracy describes how well the model predicts the segmentation mask pixel by pixel, while Binary IoU measures how close the predicted mask is to the ground truth.
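
To make the two metrics concrete, here is a small toy computation on 2×2 masks (the values are illustrative only, not drawn from Portrait128):

import numpy as np
import tensorflow as tf

y_true = np.array([[1, 1], [0, 0]], dtype='float32')
y_pred = np.array([[1, 0], [0, 0]], dtype='float32')

# Pixel accuracy: 3 of the 4 pixels match -> 0.75
acc = tf.keras.metrics.Accuracy()
acc.update_state(y_true, y_pred)

# Binary IoU averages the IoU of both classes:
#   class 1: intersection 1, union 2 -> 0.50
#   class 0: intersection 2, union 3 -> 0.67
biou = tf.keras.metrics.BinaryIoU(target_class_ids=[0, 1], threshold=0.5)
biou.update_state(y_true, y_pred)

print(acc.result().numpy(), biou.result().numpy())  # 0.75, ~0.58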

from akida_models.model_io import load_model

# Retrieve the model file from Brainchip data server
model_file = fetch_file(fname="akida_unet_portrait128.h5",
                        origin="https://data.brainchip.com/models/AkidaV2/akida_unet/akida_unet_portrait128.h5",
                        cache_subdir='models')

# Load the native Keras pre-trained model
model_keras = load_model(model_file)
model_keras.summary()
Downloading data from https://data.brainchip.com/models/AkidaV2/akida_unet/akida_unet_portrait128.h5.
Download complete.
Model: "akida_unet"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 input (InputLayer)          [(None, 128, 128, 3)]     0

 rescaling (Rescaling)       (None, 128, 128, 3)       0

 conv_0 (Conv2D)             (None, 64, 64, 16)        432

 conv_0/BN (BatchNormalizat  (None, 64, 64, 16)        64
 ion)

 conv_0/relu (ReLU)          (None, 64, 64, 16)        0

 conv_1 (Conv2D)             (None, 64, 64, 32)        4608

 conv_1/BN (BatchNormalizat  (None, 64, 64, 32)        128
 ion)

 conv_1/relu (ReLU)          (None, 64, 64, 32)        0

 conv_2 (Conv2D)             (None, 32, 32, 64)        18432

 conv_2/BN (BatchNormalizat  (None, 32, 32, 64)        256
 ion)

 conv_2/relu (ReLU)          (None, 32, 32, 64)        0

 conv_3 (Conv2D)             (None, 32, 32, 64)        36864

 conv_3/BN (BatchNormalizat  (None, 32, 32, 64)        256
 ion)

 conv_3/relu (ReLU)          (None, 32, 32, 64)        0

 dw_separable_4 (DepthwiseC  (None, 16, 16, 64)        576
 onv2D)

 pw_separable_4 (Conv2D)     (None, 16, 16, 128)       8192

 pw_separable_4/BN (BatchNo  (None, 16, 16, 128)       512
 rmalization)

 pw_separable_4/relu (ReLU)  (None, 16, 16, 128)       0

 dw_separable_5 (DepthwiseC  (None, 16, 16, 128)       1152
 onv2D)

 pw_separable_5 (Conv2D)     (None, 16, 16, 128)       16384

 pw_separable_5/BN (BatchNo  (None, 16, 16, 128)       512
 rmalization)

 pw_separable_5/relu (ReLU)  (None, 16, 16, 128)       0

 dw_separable_6 (DepthwiseC  (None, 8, 8, 128)         1152
 onv2D)

 pw_separable_6 (Conv2D)     (None, 8, 8, 256)         32768

 pw_separable_6/BN (BatchNo  (None, 8, 8, 256)         1024
 rmalization)

 pw_separable_6/relu (ReLU)  (None, 8, 8, 256)         0

 dw_separable_7 (DepthwiseC  (None, 8, 8, 256)         2304
 onv2D)

 pw_separable_7 (Conv2D)     (None, 8, 8, 256)         65536

 pw_separable_7/BN (BatchNo  (None, 8, 8, 256)         1024
 rmalization)

 pw_separable_7/relu (ReLU)  (None, 8, 8, 256)         0

 dw_separable_8 (DepthwiseC  (None, 8, 8, 256)         2304
 onv2D)

 pw_separable_8 (Conv2D)     (None, 8, 8, 256)         65536

 pw_separable_8/BN (BatchNo  (None, 8, 8, 256)         1024
 rmalization)

 pw_separable_8/relu (ReLU)  (None, 8, 8, 256)         0

 dw_separable_9 (DepthwiseC  (None, 8, 8, 256)         2304
 onv2D)

 pw_separable_9 (Conv2D)     (None, 8, 8, 256)         65536

 pw_separable_9/BN (BatchNo  (None, 8, 8, 256)         1024
 rmalization)

 pw_separable_9/relu (ReLU)  (None, 8, 8, 256)         0

 dw_separable_10 (Depthwise  (None, 8, 8, 256)         2304
 Conv2D)

 pw_separable_10 (Conv2D)    (None, 8, 8, 256)         65536

 pw_separable_10/BN (BatchN  (None, 8, 8, 256)         1024
 ormalization)

 pw_separable_10/relu (ReLU  (None, 8, 8, 256)         0
 )

 dw_separable_11 (Depthwise  (None, 8, 8, 256)         2304
 Conv2D)

 pw_separable_11 (Conv2D)    (None, 8, 8, 256)         65536

 pw_separable_11/BN (BatchN  (None, 8, 8, 256)         1024
 ormalization)

 pw_separable_11/relu (ReLU  (None, 8, 8, 256)         0
 )

 dw_separable_12 (Depthwise  (None, 4, 4, 256)         2304
 Conv2D)

 pw_separable_12 (Conv2D)    (None, 4, 4, 512)         131072

 pw_separable_12/BN (BatchN  (None, 4, 4, 512)         2048
 ormalization)

 pw_separable_12/relu (ReLU  (None, 4, 4, 512)         0
 )

 dw_separable_13 (Depthwise  (None, 4, 4, 512)         4608
 Conv2D)

 pw_separable_13 (Conv2D)    (None, 4, 4, 512)         262144

 pw_separable_13/BN (BatchN  (None, 4, 4, 512)         2048
 ormalization)

 pw_separable_13/relu (ReLU  (None, 4, 4, 512)         0
 )

 dw_sepconv_t_0 (DepthwiseC  (None, 8, 8, 512)         5120
 onv2DTranspose)

 pw_sepconv_t_0 (Conv2D)     (None, 8, 8, 256)         131328

 pw_sepconv_t_0/BN (BatchNo  (None, 8, 8, 256)         1024
 rmalization)

 pw_sepconv_t_0/relu (ReLU)  (None, 8, 8, 256)         0

 dropout (Dropout)           (None, 8, 8, 256)         0

 dw_sepconv_t_1 (DepthwiseC  (None, 16, 16, 256)       2560
 onv2DTranspose)

 pw_sepconv_t_1 (Conv2D)     (None, 16, 16, 128)       32896

 pw_sepconv_t_1/BN (BatchNo  (None, 16, 16, 128)       512
 rmalization)

 pw_sepconv_t_1/relu (ReLU)  (None, 16, 16, 128)       0

 dropout_1 (Dropout)         (None, 16, 16, 128)       0

 dw_sepconv_t_2 (DepthwiseC  (None, 32, 32, 128)       1280
 onv2DTranspose)

 pw_sepconv_t_2 (Conv2D)     (None, 32, 32, 64)        8256

 pw_sepconv_t_2/BN (BatchNo  (None, 32, 32, 64)        256
 rmalization)

 pw_sepconv_t_2/relu (ReLU)  (None, 32, 32, 64)        0

 dropout_2 (Dropout)         (None, 32, 32, 64)        0

 dw_sepconv_t_3 (DepthwiseC  (None, 64, 64, 64)        640
 onv2DTranspose)

 pw_sepconv_t_3 (Conv2D)     (None, 64, 64, 32)        2080

 pw_sepconv_t_3/BN (BatchNo  (None, 64, 64, 32)        128
 rmalization)

 pw_sepconv_t_3/relu (ReLU)  (None, 64, 64, 32)        0

 dropout_3 (Dropout)         (None, 64, 64, 32)        0

 dw_sepconv_t_4 (DepthwiseC  (None, 128, 128, 32)      320
 onv2DTranspose)

 pw_sepconv_t_4 (Conv2D)     (None, 128, 128, 16)      528

 pw_sepconv_t_4/BN (BatchNo  (None, 128, 128, 16)      64
 rmalization)

 pw_sepconv_t_4/relu (ReLU)  (None, 128, 128, 16)      0

 dropout_4 (Dropout)         (None, 128, 128, 16)      0

 head (Conv2D)               (None, 128, 128, 1)       17

 sigmoid_act (Activation)    (None, 128, 128, 1)       0

=================================================================
Total params: 1058865 (4.04 MB)
Trainable params: 1051889 (4.01 MB)
Non-trainable params: 6976 (27.25 KB)
_________________________________________________________________
from keras.metrics import BinaryIoU

# Compile the native Keras model (required to evaluate the metrics)
model_keras.compile(loss='binary_crossentropy', metrics=[BinaryIoU(), 'accuracy'])

# Check Keras model performance
_, biou, acc = model_keras.evaluate(x_val, y_val, steps=steps, verbose=0)

print(f"Keras binary IoU / pixel accuracy: {biou:.4f} / {100*acc:.2f}%")
Keras binary IoU / pixel accuracy: 0.9324 / 96.62%

3. Load a pre-trained quantized Keras model

The next step is to quantize the Keras model from the previous step and potentially perform Quantization Aware Training (QAT) on it. After the Keras model is quantized to 8 bits for all weights and activations, QAT is used to maintain the performance of the quantized model. Again, a pre-trained model is downloaded to save runtime.
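
For reference, a minimal sketch of that quantization step is shown below, assuming the quantize / QuantizationParams API from the QuantizeML package; the fine-tuning settings are illustrative and not those used to produce the pretrained model:

from quantizeml.models import quantize, QuantizationParams

# 8-bit inputs, weights and activations, as described above
qparams = QuantizationParams(input_weight_bits=8, weight_bits=8,
                             activation_bits=8)
model_quantized = quantize(model_keras, qparams=qparams)

# QAT: briefly fine-tune the quantized model to recover float performance
model_quantized.compile(loss='binary_crossentropy', optimizer='adam')
# model_quantized.fit(x_train, y_train, epochs=..., batch_size=...)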

from akida_models import akida_unet_portrait128_pretrained

# Load the pre-trained quantized model
model_quantized_keras = akida_unet_portrait128_pretrained()
model_quantized_keras.summary()
Downloading data from https://data.brainchip.com/models/AkidaV2/akida_unet/akida_unet_portrait128_i8_w8_a8.h5.
Download complete.
Model: "akida_unet"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 input (InputLayer)          [(None, 128, 128, 3)]     0

 rescaling (QuantizedRescal  (None, 128, 128, 3)       0
 ing)

 conv_0 (QuantizedConv2D)    (None, 64, 64, 16)        448

 conv_0/relu (QuantizedReLU  (None, 64, 64, 16)        32
 )

 conv_1 (QuantizedConv2D)    (None, 64, 64, 32)        4640

 conv_1/relu (QuantizedReLU  (None, 64, 64, 32)        64
 )

 conv_2 (QuantizedConv2D)    (None, 32, 32, 64)        18496

 conv_2/relu (QuantizedReLU  (None, 32, 32, 64)        128
 )

 conv_3 (QuantizedConv2D)    (None, 32, 32, 64)        36928

 conv_3/relu (QuantizedReLU  (None, 32, 32, 64)        128
 )

 dw_separable_4 (QuantizedD  (None, 16, 16, 64)        704
 epthwiseConv2D)

 pw_separable_4 (QuantizedC  (None, 16, 16, 128)       8320
 onv2D)

 pw_separable_4/relu (Quant  (None, 16, 16, 128)       256
 izedReLU)

 dw_separable_5 (QuantizedD  (None, 16, 16, 128)       1408
 epthwiseConv2D)

 pw_separable_5 (QuantizedC  (None, 16, 16, 128)       16512
 onv2D)

 pw_separable_5/relu (Quant  (None, 16, 16, 128)       256
 izedReLU)

 dw_separable_6 (QuantizedD  (None, 8, 8, 128)         1408
 epthwiseConv2D)

 pw_separable_6 (QuantizedC  (None, 8, 8, 256)         33024
 onv2D)

 pw_separable_6/relu (Quant  (None, 8, 8, 256)         512
 izedReLU)

 dw_separable_7 (QuantizedD  (None, 8, 8, 256)         2816
 epthwiseConv2D)

 pw_separable_7 (QuantizedC  (None, 8, 8, 256)         65792
 onv2D)

 pw_separable_7/relu (Quant  (None, 8, 8, 256)         512
 izedReLU)

 dw_separable_8 (QuantizedD  (None, 8, 8, 256)         2816
 epthwiseConv2D)

 pw_separable_8 (QuantizedC  (None, 8, 8, 256)         65792
 onv2D)

 pw_separable_8/relu (Quant  (None, 8, 8, 256)         512
 izedReLU)

 dw_separable_9 (QuantizedD  (None, 8, 8, 256)         2816
 epthwiseConv2D)

 pw_separable_9 (QuantizedC  (None, 8, 8, 256)         65792
 onv2D)

 pw_separable_9/relu (Quant  (None, 8, 8, 256)         512
 izedReLU)

 dw_separable_10 (Quantized  (None, 8, 8, 256)         2816
 DepthwiseConv2D)

 pw_separable_10 (Quantized  (None, 8, 8, 256)         65792
 Conv2D)

 pw_separable_10/relu (Quan  (None, 8, 8, 256)         512
 tizedReLU)

 dw_separable_11 (Quantized  (None, 8, 8, 256)         2816
 DepthwiseConv2D)

 pw_separable_11 (Quantized  (None, 8, 8, 256)         65792
 Conv2D)

 pw_separable_11/relu (Quan  (None, 8, 8, 256)         512
 tizedReLU)

 dw_separable_12 (Quantized  (None, 4, 4, 256)         2816
 DepthwiseConv2D)

 pw_separable_12 (Quantized  (None, 4, 4, 512)         131584
 Conv2D)

 pw_separable_12/relu (Quan  (None, 4, 4, 512)         1024
 tizedReLU)

 dw_separable_13 (Quantized  (None, 4, 4, 512)         5632
 DepthwiseConv2D)

 pw_separable_13 (Quantized  (None, 4, 4, 512)         262656
 Conv2D)

 pw_separable_13/relu (Quan  (None, 4, 4, 512)         1024
 tizedReLU)

 dw_sepconv_t_0 (QuantizedD  (None, 8, 8, 512)         6144
 epthwiseConv2DTranspose)

 pw_sepconv_t_0 (QuantizedC  (None, 8, 8, 256)         131328
 onv2D)

 pw_sepconv_t_0/relu (Quant  (None, 8, 8, 256)         512
 izedReLU)

 dropout (QuantizedDropout)  (None, 8, 8, 256)         0

 dw_sepconv_t_1 (QuantizedD  (None, 16, 16, 256)       3072
 epthwiseConv2DTranspose)

 pw_sepconv_t_1 (QuantizedC  (None, 16, 16, 128)       32896
 onv2D)

 pw_sepconv_t_1/relu (Quant  (None, 16, 16, 128)       256
 izedReLU)

 dropout_1 (QuantizedDropou  (None, 16, 16, 128)       0
 t)

 dw_sepconv_t_2 (QuantizedD  (None, 32, 32, 128)       1536
 epthwiseConv2DTranspose)

 pw_sepconv_t_2 (QuantizedC  (None, 32, 32, 64)        8256
 onv2D)

 pw_sepconv_t_2/relu (Quant  (None, 32, 32, 64)        128
 izedReLU)

 dropout_2 (QuantizedDropou  (None, 32, 32, 64)        0
 t)

 dw_sepconv_t_3 (QuantizedD  (None, 64, 64, 64)        768
 epthwiseConv2DTranspose)

 pw_sepconv_t_3 (QuantizedC  (None, 64, 64, 32)        2080
 onv2D)

 pw_sepconv_t_3/relu (Quant  (None, 64, 64, 32)        64
 izedReLU)

 dropout_3 (QuantizedDropou  (None, 64, 64, 32)        0
 t)

 dw_sepconv_t_4 (QuantizedD  (None, 128, 128, 32)      384
 epthwiseConv2DTranspose)

 pw_sepconv_t_4 (QuantizedC  (None, 128, 128, 16)      528
 onv2D)

 pw_sepconv_t_4/relu (Quant  (None, 128, 128, 16)      32
 izedReLU)

 dropout_4 (QuantizedDropou  (None, 128, 128, 16)      0
 t)

 head (QuantizedConv2D)      (None, 128, 128, 1)       17

 head/dequantizer (Dequanti  (None, 128, 128, 1)       0
 zer)

 sigmoid_act (Activation)    (None, 128, 128, 1)       0

=================================================================
Total params: 1061601 (4.05 MB)
Trainable params: 1047905 (4.00 MB)
Non-trainable params: 13696 (53.50 KB)
_________________________________________________________________
# Compile the quantized Keras model (required to evaluate the metrics)
model_quantized_keras.compile(loss='binary_crossentropy', metrics=[BinaryIoU(), 'accuracy'])

# Check Keras model performance
_, biou, acc = model_quantized_keras.evaluate(x_val, y_val, steps=steps, verbose=0)

print(f"Keras quantized binary IoU / pixel accuracy: {biou:.4f} / {100*acc:.2f}%")
Keras quantized binary IoU / pixel accuracy: 0.9319 / 96.59%

4. Conversion to Akida

Finally, the quantized Keras model from the previous step is converted into an Akida model and its performance is evaluated. Note that the original performance of the Keras floating point model is maintained throughout the conversion process in this example.

from cnn2snn import convert

# Convert the model
model_akida = convert(model_quantized_keras)
model_akida.summary()
/usr/local/lib/python3.11/dist-packages/cnn2snn/quantizeml/blocks.py:160: UserWarning: Conversion stops at layer head because of a dequantizer. The end of the model is ignored:
___________________________________________________
Layer (type)
===================================================
sigmoid_act (Activation)
===================================================

  warnings.warn("Conversion stops" + stop_layer_msg + " because of a dequantizer. "
                  Model Summary
_________________________________________________
Input shape    Output shape   Sequences  Layers
=================================================
[128, 128, 3]  [128, 128, 1]  1          36
_________________________________________________

_____________________________________________________________________________
Layer (type)                               Output shape    Kernel shape

=================== SW/conv_0-head/dequantizer (Software) ===================

conv_0 (InputConv2D)                       [64, 64, 16]    (3, 3, 3, 16)
_____________________________________________________________________________
conv_1 (Conv2D)                            [64, 64, 32]    (3, 3, 16, 32)
_____________________________________________________________________________
conv_2 (Conv2D)                            [32, 32, 64]    (3, 3, 32, 64)
_____________________________________________________________________________
conv_3 (Conv2D)                            [32, 32, 64]    (3, 3, 64, 64)
_____________________________________________________________________________
dw_separable_4 (DepthwiseConv2D)           [16, 16, 64]    (3, 3, 64, 1)
_____________________________________________________________________________
pw_separable_4 (Conv2D)                    [16, 16, 128]   (1, 1, 64, 128)
_____________________________________________________________________________
dw_separable_5 (DepthwiseConv2D)           [16, 16, 128]   (3, 3, 128, 1)
_____________________________________________________________________________
pw_separable_5 (Conv2D)                    [16, 16, 128]   (1, 1, 128, 128)
_____________________________________________________________________________
dw_separable_6 (DepthwiseConv2D)           [8, 8, 128]     (3, 3, 128, 1)
_____________________________________________________________________________
pw_separable_6 (Conv2D)                    [8, 8, 256]     (1, 1, 128, 256)
_____________________________________________________________________________
dw_separable_7 (DepthwiseConv2D)           [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_7 (Conv2D)                    [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_8 (DepthwiseConv2D)           [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_8 (Conv2D)                    [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_9 (DepthwiseConv2D)           [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_9 (Conv2D)                    [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_10 (DepthwiseConv2D)          [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_10 (Conv2D)                   [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_11 (DepthwiseConv2D)          [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_11 (Conv2D)                   [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_12 (DepthwiseConv2D)          [4, 4, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_12 (Conv2D)                   [4, 4, 512]     (1, 1, 256, 512)
_____________________________________________________________________________
dw_separable_13 (DepthwiseConv2D)          [4, 4, 512]     (3, 3, 512, 1)
_____________________________________________________________________________
pw_separable_13 (Conv2D)                   [4, 4, 512]     (1, 1, 512, 512)
_____________________________________________________________________________
dw_sepconv_t_0 (DepthwiseConv2DTranspose)  [8, 8, 512]     (3, 3, 512, 1)
_____________________________________________________________________________
pw_sepconv_t_0 (Conv2D)                    [8, 8, 256]     (1, 1, 512, 256)
_____________________________________________________________________________
dw_sepconv_t_1 (DepthwiseConv2DTranspose)  [16, 16, 256]   (3, 3, 256, 1)
_____________________________________________________________________________
pw_sepconv_t_1 (Conv2D)                    [16, 16, 128]   (1, 1, 256, 128)
_____________________________________________________________________________
dw_sepconv_t_2 (DepthwiseConv2DTranspose)  [32, 32, 128]   (3, 3, 128, 1)
_____________________________________________________________________________
pw_sepconv_t_2 (Conv2D)                    [32, 32, 64]    (1, 1, 128, 64)
_____________________________________________________________________________
dw_sepconv_t_3 (DepthwiseConv2DTranspose)  [64, 64, 64]    (3, 3, 64, 1)
_____________________________________________________________________________
pw_sepconv_t_3 (Conv2D)                    [64, 64, 32]    (1, 1, 64, 32)
_____________________________________________________________________________
dw_sepconv_t_4 (DepthwiseConv2DTranspose)  [128, 128, 32]  (3, 3, 32, 1)
_____________________________________________________________________________
pw_sepconv_t_4 (Conv2D)                    [128, 128, 16]  (1, 1, 32, 16)
_____________________________________________________________________________
head (Conv2D)                              [128, 128, 1]   (1, 1, 16, 1)
_____________________________________________________________________________
head/dequantizer (Dequantizer)             [128, 128, 1]   N/A
_____________________________________________________________________________
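
As the warning above indicates, conversion stops at the head/dequantizer layer, so the final sigmoid activation is not part of the Akida model: it is applied in software to the model potentials in the evaluation code below.
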
import tensorflow as tf

# Check Akida model performance batch by batch, accumulating labels and potentials
labels, pots = [], []

for s in range(steps):
    batch = x_val[s * batch_size: (s + 1) * batch_size, :]
    label_batch = y_val[s * batch_size: (s + 1) * batch_size, :]
    pots_batch = model_akida.predict(batch.astype('uint8'))
    labels.append(label_batch)
    pots.append(pots_batch)

labels = np.concatenate(labels)
pots = np.concatenate(pots)

# Apply the sigmoid dropped at conversion to turn potentials into probabilities
preds = tf.keras.activations.sigmoid(pots)

m_binary_iou = tf.keras.metrics.BinaryIoU(target_class_ids=[0, 1], threshold=0.5)
m_binary_iou.update_state(labels, preds)
binary_iou = m_binary_iou.result().numpy()

m_accuracy = tf.keras.metrics.Accuracy()
m_accuracy.update_state(labels, preds > 0.5)
accuracy = m_accuracy.result().numpy()
print(f"Akida binary IoU / pixel accuracy: {binary_iou:.4f} / {100*accuracy:.2f}%")

# For non-regression purposes
assert binary_iou > 0.9
Akida binary IoU / pixel accuracy: 0.9308 / 96.59%

5. Segment a single image

To visualize the person segmentation performed by the Akida model, display a single image together with the segmentation produced by the original floating point model, the one produced by the Akida model, and the ground truth segmentation.

import matplotlib.pyplot as plt

# Segment a random single image and display the Keras and Akida outputs
sample = np.expand_dims(x_val[id, :], 0)
keras_out = model_keras(sample)
akida_out = tf.keras.activations.sigmoid(model_akida.forward(sample.astype('uint8')))

fig, axs = plt.subplots(1, 3, constrained_layout=True)
axs[0].imshow(keras_out[0] * sample[0] / 255.)
axs[0].set_title('Keras segmentation', fontsize=10)
axs[0].axis('off')

axs[1].imshow(akida_out[0] * sample[0] / 255.)
axs[1].set_title('Akida segmentation', fontsize=10)
axs[1].axis('off')

axs[2].imshow(y_val[id] * sample[0] / 255.)
axs[2].set_title('Expected segmentation', fontsize=10)
axs[2].axis('off')

plt.show()
Keras segmentation, Akida segmentation, Expected segmentation

Total running time of the script: (1 minute 59.387 seconds)
