Segmentation tutorial

This example demonstrates image segmentation with an Akida-compatible model, illustrated through person segmentation on the Portrait128 dataset.

To keep the runtime short, pre-trained models are used at each stage. The example tracks model performance across three variants: the trained floating point Keras model, the quantized and Quantization Aware Trained (QAT) Keras model, and the converted Akida model. Notice that the performance of the original floating point Keras model is maintained throughout the conversion flow.

1. Load the dataset

import os
import numpy as np
from akida_models import fetch_file

# Download the validation set from the Brainchip data server; it contains 10% of the original dataset
data_path = fetch_file(fname="val.tar.gz",
                       origin="https://data.brainchip.com/dataset-mirror/portrait128/val.tar.gz",
                       cache_subdir=os.path.join("datasets", "portrait128"),
                       extract=True)

data_dir = os.path.join(os.path.dirname(data_path), "val")
x_val = np.load(os.path.join(data_dir, "val_img.npy"))
y_val = np.load(os.path.join(data_dir, "val_msk.npy")).astype('uint8')
batch_size = 32
steps = x_val.shape[0] // batch_size

# Visualize some data
import matplotlib.pyplot as plt

# Pick a random index, leaving room for the two following images displayed below
id = np.random.randint(0, x_val.shape[0] - 2)

fig, axs = plt.subplots(3, 3, constrained_layout=True)
for col in range(3):
    axs[0, col].imshow(x_val[id + col] / 255.)
    axs[0, col].axis('off')
    axs[1, col].imshow(1 - y_val[id + col], cmap='Greys')
    axs[1, col].axis('off')
    axs[2, col].imshow(x_val[id + col] / 255. * y_val[id + col])
    axs[2, col].axis('off')

fig.suptitle('Image, mask and masked image', fontsize=10)
plt.show()
[Figure: Image, mask and masked image]
Downloading data from https://data.brainchip.com/dataset-mirror/portrait128/val.tar.gz.

Download complete.

2. Load a pre-trained native Keras model

The model used in this example is AkidaUNet. It combines an AkidaNet (alpha=0.5) backbone for feature extraction with a succession of separable transposed convolutional blocks that build the segmentation map. A pre-trained floating point Keras model is downloaded to save training time.
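For reference, the same architecture can also be built from scratch with the akida_models zoo. The sketch below assumes an akida_unet_portrait128 builder that mirrors the naming of the akida_unet_portrait128_pretrained helper used later in this tutorial; check the akida_models API reference for the exact name and arguments.

from akida_models import akida_unet_portrait128

# Build an untrained AkidaUNet (default 128x128x3 input); the builder name
# and its defaults are assumed from the model zoo naming convention
model_scratch = akida_unet_portrait128()
model_scratch.summary()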

Note

  • The “transposed” convolutional feature is new in Akida 2.0.

  • The “separable transposed” operation is realized through the combination of a QuantizeML custom DepthwiseConv2DTranspose layer with a standard pointwise convolution.

The performance of the model is evaluated using both pixel accuracy and Binary IoU. Pixel accuracy measures how well the model predicts the segmentation mask pixel by pixel, while Binary IoU measures how closely the predicted mask overlaps the ground truth.
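As a quick illustration of what these metrics compute (not used in the evaluation below), both can be written in a few lines of NumPy from a predicted mask and its ground truth:

import numpy as np

def pixel_accuracy(y_true, y_pred, threshold=0.5):
    # Fraction of pixels where the binarized prediction matches the mask
    return np.mean((y_pred > threshold) == y_true.astype(bool))

def binary_iou(y_true, y_pred, threshold=0.5):
    # Mean IoU over the two classes (background and person), as computed
    # by keras.metrics.BinaryIoU with target_class_ids=[0, 1]
    pred = y_pred > threshold
    true = y_true.astype(bool)
    ious = []
    for cls in (False, True):
        intersection = np.logical_and(pred == cls, true == cls).sum()
        union = np.logical_or(pred == cls, true == cls).sum()
        ious.append(intersection / union if union else 1.0)
    return float(np.mean(ious))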

from akida_models.model_io import load_model

# Retrieve the model file from Brainchip data server
model_file = fetch_file(fname="akida_unet_portrait128.h5",
                        origin="https://data.brainchip.com/models/AkidaV2/akida_unet/akida_unet_portrait128.h5",
                        cache_subdir='models')

# Load the native Keras pre-trained model
model_keras = load_model(model_file)
model_keras.summary()
Downloading data from https://data.brainchip.com/models/AkidaV2/akida_unet/akida_unet_portrait128.h5.

Download complete.
Model: "akida_unet"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 input (InputLayer)          [(None, 128, 128, 3)]     0

 rescaling (Rescaling)       (None, 128, 128, 3)       0

 conv_0 (Conv2D)             (None, 64, 64, 16)        432

 conv_0/BN (BatchNormalizati  (None, 64, 64, 16)       64
 on)

 conv_0/relu (ReLU)          (None, 64, 64, 16)        0

 conv_1 (Conv2D)             (None, 64, 64, 32)        4608

 conv_1/BN (BatchNormalizati  (None, 64, 64, 32)       128
 on)

 conv_1/relu (ReLU)          (None, 64, 64, 32)        0

 conv_2 (Conv2D)             (None, 32, 32, 64)        18432

 conv_2/BN (BatchNormalizati  (None, 32, 32, 64)       256
 on)

 conv_2/relu (ReLU)          (None, 32, 32, 64)        0

 conv_3 (Conv2D)             (None, 32, 32, 64)        36864

 conv_3/BN (BatchNormalizati  (None, 32, 32, 64)       256
 on)

 conv_3/relu (ReLU)          (None, 32, 32, 64)        0

 dw_separable_4 (DepthwiseCo  (None, 16, 16, 64)       576
 nv2D)

 pw_separable_4 (Conv2D)     (None, 16, 16, 128)       8192

 pw_separable_4/BN (BatchNor  (None, 16, 16, 128)      512
 malization)

 pw_separable_4/relu (ReLU)  (None, 16, 16, 128)       0

 dw_separable_5 (DepthwiseCo  (None, 16, 16, 128)      1152
 nv2D)

 pw_separable_5 (Conv2D)     (None, 16, 16, 128)       16384

 pw_separable_5/BN (BatchNor  (None, 16, 16, 128)      512
 malization)

 pw_separable_5/relu (ReLU)  (None, 16, 16, 128)       0

 dw_separable_6 (DepthwiseCo  (None, 8, 8, 128)        1152
 nv2D)

 pw_separable_6 (Conv2D)     (None, 8, 8, 256)         32768

 pw_separable_6/BN (BatchNor  (None, 8, 8, 256)        1024
 malization)

 pw_separable_6/relu (ReLU)  (None, 8, 8, 256)         0

 dw_separable_7 (DepthwiseCo  (None, 8, 8, 256)        2304
 nv2D)

 pw_separable_7 (Conv2D)     (None, 8, 8, 256)         65536

 pw_separable_7/BN (BatchNor  (None, 8, 8, 256)        1024
 malization)

 pw_separable_7/relu (ReLU)  (None, 8, 8, 256)         0

 dw_separable_8 (DepthwiseCo  (None, 8, 8, 256)        2304
 nv2D)

 pw_separable_8 (Conv2D)     (None, 8, 8, 256)         65536

 pw_separable_8/BN (BatchNor  (None, 8, 8, 256)        1024
 malization)

 pw_separable_8/relu (ReLU)  (None, 8, 8, 256)         0

 dw_separable_9 (DepthwiseCo  (None, 8, 8, 256)        2304
 nv2D)

 pw_separable_9 (Conv2D)     (None, 8, 8, 256)         65536

 pw_separable_9/BN (BatchNor  (None, 8, 8, 256)        1024
 malization)

 pw_separable_9/relu (ReLU)  (None, 8, 8, 256)         0

 dw_separable_10 (DepthwiseC  (None, 8, 8, 256)        2304
 onv2D)

 pw_separable_10 (Conv2D)    (None, 8, 8, 256)         65536

 pw_separable_10/BN (BatchNo  (None, 8, 8, 256)        1024
 rmalization)

 pw_separable_10/relu (ReLU)  (None, 8, 8, 256)        0

 dw_separable_11 (DepthwiseC  (None, 8, 8, 256)        2304
 onv2D)

 pw_separable_11 (Conv2D)    (None, 8, 8, 256)         65536

 pw_separable_11/BN (BatchNo  (None, 8, 8, 256)        1024
 rmalization)

 pw_separable_11/relu (ReLU)  (None, 8, 8, 256)        0

 dw_separable_12 (DepthwiseC  (None, 4, 4, 256)        2304
 onv2D)

 pw_separable_12 (Conv2D)    (None, 4, 4, 512)         131072

 pw_separable_12/BN (BatchNo  (None, 4, 4, 512)        2048
 rmalization)

 pw_separable_12/relu (ReLU)  (None, 4, 4, 512)        0

 dw_separable_13 (DepthwiseC  (None, 4, 4, 512)        4608
 onv2D)

 pw_separable_13 (Conv2D)    (None, 4, 4, 512)         262144

 pw_separable_13/BN (BatchNo  (None, 4, 4, 512)        2048
 rmalization)

 pw_separable_13/relu (ReLU)  (None, 4, 4, 512)        0

 dw_sepconv_t_0 (DepthwiseCo  (None, 8, 8, 512)        5120
 nv2DTranspose)

 pw_sepconv_t_0 (Conv2D)     (None, 8, 8, 256)         131328

 pw_sepconv_t_0/BN (BatchNor  (None, 8, 8, 256)        1024
 malization)

 pw_sepconv_t_0/relu (ReLU)  (None, 8, 8, 256)         0

 dropout (Dropout)           (None, 8, 8, 256)         0

 dw_sepconv_t_1 (DepthwiseCo  (None, 16, 16, 256)      2560
 nv2DTranspose)

 pw_sepconv_t_1 (Conv2D)     (None, 16, 16, 128)       32896

 pw_sepconv_t_1/BN (BatchNor  (None, 16, 16, 128)      512
 malization)

 pw_sepconv_t_1/relu (ReLU)  (None, 16, 16, 128)       0

 dropout_1 (Dropout)         (None, 16, 16, 128)       0

 dw_sepconv_t_2 (DepthwiseCo  (None, 32, 32, 128)      1280
 nv2DTranspose)

 pw_sepconv_t_2 (Conv2D)     (None, 32, 32, 64)        8256

 pw_sepconv_t_2/BN (BatchNor  (None, 32, 32, 64)       256
 malization)

 pw_sepconv_t_2/relu (ReLU)  (None, 32, 32, 64)        0

 dropout_2 (Dropout)         (None, 32, 32, 64)        0

 dw_sepconv_t_3 (DepthwiseCo  (None, 64, 64, 64)       640
 nv2DTranspose)

 pw_sepconv_t_3 (Conv2D)     (None, 64, 64, 32)        2080

 pw_sepconv_t_3/BN (BatchNor  (None, 64, 64, 32)       128
 malization)

 pw_sepconv_t_3/relu (ReLU)  (None, 64, 64, 32)        0

 dropout_3 (Dropout)         (None, 64, 64, 32)        0

 dw_sepconv_t_4 (DepthwiseCo  (None, 128, 128, 32)     320
 nv2DTranspose)

 pw_sepconv_t_4 (Conv2D)     (None, 128, 128, 16)      528

 pw_sepconv_t_4/BN (BatchNor  (None, 128, 128, 16)     64
 malization)

 pw_sepconv_t_4/relu (ReLU)  (None, 128, 128, 16)      0

 dropout_4 (Dropout)         (None, 128, 128, 16)      0

 head (Conv2D)               (None, 128, 128, 1)       17

 sigmoid_act (Activation)    (None, 128, 128, 1)       0

=================================================================
Total params: 1,058,865
Trainable params: 1,051,889
Non-trainable params: 6,976
_________________________________________________________________
from keras.metrics import BinaryIoU

# Compile the native Keras model (required to evaluate the metrics)
model_keras.compile(loss='binary_crossentropy', metrics=[BinaryIoU(), 'accuracy'])

# Check Keras model performance
_, biou, acc = model_keras.evaluate(x_val, y_val, steps=steps, verbose=0)

print(f"Keras binary IoU / pixel accuracy: {biou:.4f} / {100*acc:.2f}%")
Keras binary IoU / pixel accuracy: 0.9455 / 97.28%

3. Load a pre-trained quantized Keras model

The next step is to quantize and potentially perform Quantization Aware Training (QAT) on the Keras model from the previous step. After the Keras model is quantized to 8 bits for all weights and activations, QAT is used to recover the performance of the floating point model. Again, a pre-trained model is downloaded to save runtime.
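For readers who prefer to reproduce the quantization step rather than download the result, the following sketch shows how it would typically be done with QuantizeML. The 8-bit settings are assumed to match the pre-trained i8_w8_a8 model loaded below, and a QAT fine-tuning pass on the training set would still be needed afterwards.

from quantizeml.models import quantize, QuantizationParams

# 8-bit inputs, weights and activations (assumed to match the pre-trained
# akida_unet_portrait128_i8_w8_a8 model used in this tutorial)
qparams = QuantizationParams(input_weight_bits=8, weight_bits=8, activation_bits=8)

# Post-training quantization of the floating point Keras model; calibration
# samples can optionally be passed through the "samples" argument
model_quantized = quantize(model_keras, qparams=qparams)

# QAT would then fine-tune model_quantized with a standard compile/fit cycle
# on the Portrait128 training set to close the gap with the float model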

from akida_models import akida_unet_portrait128_pretrained

# Load the pre-trained quantized model
model_quantized_keras = akida_unet_portrait128_pretrained()
model_quantized_keras.summary()
Downloading data from https://data.brainchip.com/models/AkidaV2/akida_unet/akida_unet_portrait128_i8_w8_a8.h5.

Download complete.
Model: "akida_unet"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 input (InputLayer)          [(None, 128, 128, 3)]     0

 rescaling (QuantizedRescali  (None, 128, 128, 3)      0
 ng)

 conv_0 (QuantizedConv2D)    (None, 64, 64, 16)        448

 conv_0/relu (QuantizedReLU)  (None, 64, 64, 16)       32

 conv_1 (QuantizedConv2D)    (None, 64, 64, 32)        4640

 conv_1/relu (QuantizedReLU)  (None, 64, 64, 32)       64

 conv_2 (QuantizedConv2D)    (None, 32, 32, 64)        18496

 conv_2/relu (QuantizedReLU)  (None, 32, 32, 64)       128

 conv_3 (QuantizedConv2D)    (None, 32, 32, 64)        36928

 conv_3/relu (QuantizedReLU)  (None, 32, 32, 64)       128

 dw_separable_4 (QuantizedDe  (None, 16, 16, 64)       704
 pthwiseConv2D)

 pw_separable_4 (QuantizedCo  (None, 16, 16, 128)      8320
 nv2D)

 pw_separable_4/relu (Quanti  (None, 16, 16, 128)      256
 zedReLU)

 dw_separable_5 (QuantizedDe  (None, 16, 16, 128)      1408
 pthwiseConv2D)

 pw_separable_5 (QuantizedCo  (None, 16, 16, 128)      16512
 nv2D)

 pw_separable_5/relu (Quanti  (None, 16, 16, 128)      256
 zedReLU)

 dw_separable_6 (QuantizedDe  (None, 8, 8, 128)        1408
 pthwiseConv2D)

 pw_separable_6 (QuantizedCo  (None, 8, 8, 256)        33024
 nv2D)

 pw_separable_6/relu (Quanti  (None, 8, 8, 256)        512
 zedReLU)

 dw_separable_7 (QuantizedDe  (None, 8, 8, 256)        2816
 pthwiseConv2D)

 pw_separable_7 (QuantizedCo  (None, 8, 8, 256)        65792
 nv2D)

 pw_separable_7/relu (Quanti  (None, 8, 8, 256)        512
 zedReLU)

 dw_separable_8 (QuantizedDe  (None, 8, 8, 256)        2816
 pthwiseConv2D)

 pw_separable_8 (QuantizedCo  (None, 8, 8, 256)        65792
 nv2D)

 pw_separable_8/relu (Quanti  (None, 8, 8, 256)        512
 zedReLU)

 dw_separable_9 (QuantizedDe  (None, 8, 8, 256)        2816
 pthwiseConv2D)

 pw_separable_9 (QuantizedCo  (None, 8, 8, 256)        65792
 nv2D)

 pw_separable_9/relu (Quanti  (None, 8, 8, 256)        512
 zedReLU)

 dw_separable_10 (QuantizedD  (None, 8, 8, 256)        2816
 epthwiseConv2D)

 pw_separable_10 (QuantizedC  (None, 8, 8, 256)        65792
 onv2D)

 pw_separable_10/relu (Quant  (None, 8, 8, 256)        512
 izedReLU)

 dw_separable_11 (QuantizedD  (None, 8, 8, 256)        2816
 epthwiseConv2D)

 pw_separable_11 (QuantizedC  (None, 8, 8, 256)        65792
 onv2D)

 pw_separable_11/relu (Quant  (None, 8, 8, 256)        512
 izedReLU)

 dw_separable_12 (QuantizedD  (None, 4, 4, 256)        2816
 epthwiseConv2D)

 pw_separable_12 (QuantizedC  (None, 4, 4, 512)        131584
 onv2D)

 pw_separable_12/relu (Quant  (None, 4, 4, 512)        1024
 izedReLU)

 dw_separable_13 (QuantizedD  (None, 4, 4, 512)        5632
 epthwiseConv2D)

 pw_separable_13 (QuantizedC  (None, 4, 4, 512)        262656
 onv2D)

 pw_separable_13/relu (Quant  (None, 4, 4, 512)        1024
 izedReLU)

 dw_sepconv_t_0 (QuantizedDe  (None, 8, 8, 512)        6144
 pthwiseConv2DTranspose)

 pw_sepconv_t_0 (QuantizedCo  (None, 8, 8, 256)        131328
 nv2D)

 pw_sepconv_t_0/relu (Quanti  (None, 8, 8, 256)        512
 zedReLU)

 dropout (QuantizedDropout)  (None, 8, 8, 256)         0

 dw_sepconv_t_1 (QuantizedDe  (None, 16, 16, 256)      3072
 pthwiseConv2DTranspose)

 pw_sepconv_t_1 (QuantizedCo  (None, 16, 16, 128)      32896
 nv2D)

 pw_sepconv_t_1/relu (Quanti  (None, 16, 16, 128)      256
 zedReLU)

 dropout_1 (QuantizedDropout  (None, 16, 16, 128)      0
 )

 dw_sepconv_t_2 (QuantizedDe  (None, 32, 32, 128)      1536
 pthwiseConv2DTranspose)

 pw_sepconv_t_2 (QuantizedCo  (None, 32, 32, 64)       8256
 nv2D)

 pw_sepconv_t_2/relu (Quanti  (None, 32, 32, 64)       128
 zedReLU)

 dropout_2 (QuantizedDropout  (None, 32, 32, 64)       0
 )

 dw_sepconv_t_3 (QuantizedDe  (None, 64, 64, 64)       768
 pthwiseConv2DTranspose)

 pw_sepconv_t_3 (QuantizedCo  (None, 64, 64, 32)       2080
 nv2D)

 pw_sepconv_t_3/relu (Quanti  (None, 64, 64, 32)       64
 zedReLU)

 dropout_3 (QuantizedDropout  (None, 64, 64, 32)       0
 )

 dw_sepconv_t_4 (QuantizedDe  (None, 128, 128, 32)     384
 pthwiseConv2DTranspose)

 pw_sepconv_t_4 (QuantizedCo  (None, 128, 128, 16)     528
 nv2D)

 pw_sepconv_t_4/relu (Quanti  (None, 128, 128, 16)     32
 zedReLU)

 dropout_4 (QuantizedDropout  (None, 128, 128, 16)     0
 )

 head (QuantizedConv2D)      (None, 128, 128, 1)       17

 dequantizer (Dequantizer)   (None, 128, 128, 1)       0

 sigmoid_act (Activation)    (None, 128, 128, 1)       0

=================================================================
Total params: 1,061,601
Trainable params: 1,047,905
Non-trainable params: 13,696
_________________________________________________________________
# Compile the quantized Keras model (required to evaluate the metrics)
model_quantized_keras.compile(loss='binary_crossentropy', metrics=[BinaryIoU(), 'accuracy'])

# Check Keras model performance
_, biou, acc = model_quantized_keras.evaluate(x_val, y_val, steps=steps, verbose=0)

print(f"Keras quantized binary IoU / pixel accuracy: {biou:.4f} / {100*acc:.2f}%")
Keras quantized binary IoU / pixel accuracy: 0.9399 / 97.01%

4. Conversion to Akida

Finally, the quantized Keras model from the previous step is converted into an Akida model and its performance is evaluated. Note that the original performance of the Keras floating point model is maintained throughout the conversion process in this example.

from cnn2snn import convert

# Convert the model
model_akida = convert(model_quantized_keras)
model_akida.summary()
/usr/local/lib/python3.8/dist-packages/cnn2snn/quantizeml/blocks.py:160: UserWarning: Conversion stops at layer head because of a dequantizer. The end of the model is ignored:
___________________________________________________
Layer (type)
===================================================
sigmoid_act (Activation)
===================================================

  warnings.warn("Conversion stops" + stop_layer_msg + " because of a dequantizer. "
                  Model Summary
_________________________________________________
Input shape    Output shape   Sequences  Layers
=================================================
[128, 128, 3]  [128, 128, 1]  1          36
_________________________________________________

_____________________________________________________________________________
Layer (type)                               Output shape    Kernel shape

====================== SW/conv_0-dequantizer (Software) =====================

conv_0 (InputConv2D)                       [64, 64, 16]    (3, 3, 3, 16)
_____________________________________________________________________________
conv_1 (Conv2D)                            [64, 64, 32]    (3, 3, 16, 32)
_____________________________________________________________________________
conv_2 (Conv2D)                            [32, 32, 64]    (3, 3, 32, 64)
_____________________________________________________________________________
conv_3 (Conv2D)                            [32, 32, 64]    (3, 3, 64, 64)
_____________________________________________________________________________
dw_separable_4 (DepthwiseConv2D)           [16, 16, 64]    (3, 3, 64, 1)
_____________________________________________________________________________
pw_separable_4 (Conv2D)                    [16, 16, 128]   (1, 1, 64, 128)
_____________________________________________________________________________
dw_separable_5 (DepthwiseConv2D)           [16, 16, 128]   (3, 3, 128, 1)
_____________________________________________________________________________
pw_separable_5 (Conv2D)                    [16, 16, 128]   (1, 1, 128, 128)
_____________________________________________________________________________
dw_separable_6 (DepthwiseConv2D)           [8, 8, 128]     (3, 3, 128, 1)
_____________________________________________________________________________
pw_separable_6 (Conv2D)                    [8, 8, 256]     (1, 1, 128, 256)
_____________________________________________________________________________
dw_separable_7 (DepthwiseConv2D)           [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_7 (Conv2D)                    [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_8 (DepthwiseConv2D)           [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_8 (Conv2D)                    [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_9 (DepthwiseConv2D)           [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_9 (Conv2D)                    [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_10 (DepthwiseConv2D)          [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_10 (Conv2D)                   [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_11 (DepthwiseConv2D)          [8, 8, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_11 (Conv2D)                   [8, 8, 256]     (1, 1, 256, 256)
_____________________________________________________________________________
dw_separable_12 (DepthwiseConv2D)          [4, 4, 256]     (3, 3, 256, 1)
_____________________________________________________________________________
pw_separable_12 (Conv2D)                   [4, 4, 512]     (1, 1, 256, 512)
_____________________________________________________________________________
dw_separable_13 (DepthwiseConv2D)          [4, 4, 512]     (3, 3, 512, 1)
_____________________________________________________________________________
pw_separable_13 (Conv2D)                   [4, 4, 512]     (1, 1, 512, 512)
_____________________________________________________________________________
dw_sepconv_t_0 (DepthwiseConv2DTranspose)  [8, 8, 512]     (3, 3, 512, 1)
_____________________________________________________________________________
pw_sepconv_t_0 (Conv2D)                    [8, 8, 256]     (1, 1, 512, 256)
_____________________________________________________________________________
dw_sepconv_t_1 (DepthwiseConv2DTranspose)  [16, 16, 256]   (3, 3, 256, 1)
_____________________________________________________________________________
pw_sepconv_t_1 (Conv2D)                    [16, 16, 128]   (1, 1, 256, 128)
_____________________________________________________________________________
dw_sepconv_t_2 (DepthwiseConv2DTranspose)  [32, 32, 128]   (3, 3, 128, 1)
_____________________________________________________________________________
pw_sepconv_t_2 (Conv2D)                    [32, 32, 64]    (1, 1, 128, 64)
_____________________________________________________________________________
dw_sepconv_t_3 (DepthwiseConv2DTranspose)  [64, 64, 64]    (3, 3, 64, 1)
_____________________________________________________________________________
pw_sepconv_t_3 (Conv2D)                    [64, 64, 32]    (1, 1, 64, 32)
_____________________________________________________________________________
dw_sepconv_t_4 (DepthwiseConv2DTranspose)  [128, 128, 32]  (3, 3, 32, 1)
_____________________________________________________________________________
pw_sepconv_t_4 (Conv2D)                    [128, 128, 16]  (1, 1, 32, 16)
_____________________________________________________________________________
head (Conv2D)                              [128, 128, 1]   (1, 1, 16, 1)
_____________________________________________________________________________
dequantizer (Dequantizer)                  [128, 128, 1]   N/A
_____________________________________________________________________________
import tensorflow as tf

# Check Akida model performance
labels, pots = None, None

for s in range(steps):
    batch = x_val[s * batch_size: (s + 1) * batch_size, :]
    label_batch = y_val[s * batch_size: (s + 1) * batch_size, :]
    pots_batch = model_akida.predict(batch.astype('uint8'))

    if labels is None:
        labels = label_batch
        pots = pots_batch
    else:
        labels = np.concatenate((labels, label_batch))
        pots = np.concatenate((pots, pots_batch))
preds = tf.keras.activations.sigmoid(pots)

m_binary_iou = tf.keras.metrics.BinaryIoU(target_class_ids=[0, 1], threshold=0.5)
m_binary_iou.update_state(labels, preds)
binary_iou = m_binary_iou.result().numpy()

m_accuracy = tf.keras.metrics.Accuracy()
m_accuracy.update_state(labels, preds > 0.5)
accuracy = m_accuracy.result().numpy()
print(f"Akida binary IoU / pixel accuracy: {binary_iou:.4f} / {100*accuracy:.2f}%")

# For non-regression purpose
assert binary_iou > 0.9
Akida binary IoU / pixel accuracy: 0.9388 / 97.01%

5. Segment a single image

To visualize the person segmentation performed by the Akida model, a single image is segmented and the Akida output is displayed next to the segmentation produced by the original floating point Keras model and the ground truth mask.

import matplotlib.pyplot as plt

# Segment the randomly selected image and display Keras and Akida outputs
sample = np.expand_dims(x_val[id, :], 0)
keras_out = model_keras(sample)
akida_out = tf.keras.activations.sigmoid(model_akida.forward(sample.astype('uint8')))

fig, axs = plt.subplots(1, 3, constrained_layout=True)
axs[0].imshow(keras_out[0] * sample[0] / 255.)
axs[0].set_title('Keras segmentation', fontsize=10)
axs[0].axis('off')

axs[1].imshow(akida_out[0] * sample[0] / 255.)
axs[1].set_title('Akida segmentation', fontsize=10)
axs[1].axis('off')

axs[2].imshow(y_val[id] * sample[0] / 255.)
axs[2].set_title('Expected segmentation', fontsize=10)
axs[2].axis('off')

plt.show()
[Figure: Keras segmentation, Akida segmentation, Expected segmentation]

Total running time of the script: 1 minute 58.104 seconds
