MetaTF 2.2.2
  • Overview
  • Installation
    • Requirements
    • Quick installation
    • Running examples
  • User guide
    • Getting started
      • For beginners
      • For users familiar with deep learning
    • Akida user guide
      • Introduction
        • Akida layers
        • Input Format
        • A versatile machine learning framework
      • The Sequential model
        • Specifying the model
        • Accessing layer parameters and weights
        • Inference
        • Saving and loading
        • Input layer types
        • Data-Processing layer types
      • Model Hardware Mapping
        • Devices
        • Model mapping
        • Advanced Mapping Details and Hardware Devices Usage
        • Performance measurement
      • Using Akida Edge learning
        • Learning constraints
        • Compiling a layer
    • CNN2SNN toolkit
      • Overview
        • Conversion workflow
        • Typical training scenario
        • Design compatibility constraints
        • Quantization compatibility constraints
        • Command-line interface
      • Layer Considerations
        • Supported layer types
        • CNN2SNN Quantization-aware layers
        • Training-Only Layers
        • First Layers
        • Final Layers
      • Tips and Tricks
    • Akida models zoo
      • Overview
      • Command-line interface for model creation
      • Command-line interface for model training
        • UTK Face training
        • KWS training
        • YOLO training
        • AkidaNet training
      • Command-line interface for model evaluation
      • Layer Blocks
        • conv_block
        • dense_block
        • separable_conv_block
    • Hardware constraints
      • InputConvolutional
      • Convolutional
      • SeparableConvolutional
      • FullyConnected
    • Akida versions compatibility
      • Upgrading models with legacy quantizers
  • API reference
    • Akida runtime
      • Model
      • Layer
        • Layer
        • Mapping
      • InputData
      • InputConvolutional
      • FullyConnected
      • Convolutional
      • SeparableConvolutional
      • Layer parameters
        • LayerType
        • Padding
        • PoolType
        • LearningType
      • Sequence
        • Sequence
        • BackendType
        • Pass
      • Device
        • Device
        • HwVersion
      • HWDevice
        • HWDevice
        • SocDriver
        • ClockMode
      • PowerMeter
      • NP
      • Tools
        • Sparsity
        • Compatibility
    • CNN2SNN
      • Tool functions
        • quantize
        • quantize_layer
        • convert
        • check_model_compatibility
        • load_quantized_model
        • Transforms
        • Calibration
      • Quantizers
        • WeightQuantizer
        • LinearWeightQuantizer
        • StdWeightQuantizer
        • StdPerAxisQuantizer
        • MaxQuantizer
        • MaxPerAxisQuantizer
      • Quantized layers
        • QuantizedConv2D
        • QuantizedDense
        • QuantizedSeparableConv2D
        • QuantizedActivation
        • ActivationDiscreteRelu
        • QuantizedReLU
    • Akida models
      • Layer blocks
        • conv_block
        • separable_conv_block
        • dense_block
      • Helpers
        • BatchNormalization gamma constraint
      • Knowledge distillation
      • Pruning
      • Training
      • Model zoo
        • AkidaNet
        • Mobilenet
        • DS-CNN
        • VGG
        • YOLO
        • ConvTiny
        • PointNet++
        • GXNOR
  • Examples
    • General examples
      • GXNOR/MNIST inference
        • 1. Dataset preparation
        • 2. Create a Keras GXNOR model
        • 3. Conversion to Akida
      • AkidaNet/ImageNet inference
        • 1. Dataset preparation
        • 2. Create a Keras AkidaNet model
        • 3. Quantized model
        • 4. Pretrained quantized model
        • 5. Conversion to Akida
        • 6. Hardware mapping and performance
      • DS-CNN/KWS inference
        • 1. Load the preprocessed dataset
        • 2. Load a pre-trained native Keras model
        • 3. Load a pre-trained quantized Keras model satisfying Akida NSoC requirements
        • 4. Conversion to Akida
        • 5. Confusion matrix
      • Regression tutorial
        • 1. Load the dataset
        • 2. Load a pre-trained native Keras model
        • 3. Load a pre-trained quantized Keras model satisfying Akida NSoC requirements
        • 4. Conversion to Akida
        • 5. Estimate age on a single image
      • Transfer learning with AkidaNet for PlantVillage
        • Transfer learning process
        • 1. Dataset preparation
        • 2. Get a trained AkidaNet base model
        • 3. Add a float classification head to the model
        • 4. Freeze the base model
        • 5. Train for a few epochs
        • 6. Quantize the classification head
        • 7. Compute accuracy
      • YOLO/PASCAL-VOC detection tutorial
        • 1. Introduction
        • 2. Preprocessing tools
        • 3. Model architecture
        • 4. Training
        • 5. Performance
        • 6. Conversion to Akida
    • CNN2SNN tutorials
      • CNN conversion flow tutorial
        • 1. Load and reshape MNIST dataset
        • 2. Model definition
        • 3. Model training
        • 4. Model quantization
        • 5. Model fine tuning (quantization-aware training)
        • 6. Model conversion
      • Advanced CNN2SNN tutorial
        • 1. Design a CNN2SNN quantized model
        • 2. Weight Quantizer Details
        • 3. Understanding quantized activation
        • 4. How to deal with too high scale factors
    • Edge examples
      • Akida vision edge learning
        • 1. Dataset preparation
        • 2. Prepare Akida model for learning
        • 3. Edge learning with Akida
      • Akida edge learning for keyword spotting
        • 1. Edge learning process
        • 2. Dataset preparation
        • 3. Prepare Akida model for learning
        • 4. Learn with Akida using the training set
        • 5. Edge learning
      • Tips to set Akida learning parameters
        • 1. Akida learning parameters
        • 2. Create Akida model
        • 3. Estimate the required number of weights of the trainable layer
        • 4. Estimate the number of neurons per class
  • Model zoo performances
    • Image domain
      • Classification
      • Object detection
      • Regression
      • Face recognition
    • Audio domain
      • Keyword spotting
    • Time domain
      • Fault detection
      • Classification
    • Point cloud
      • Classification
  • Changelog
  • Support
  • License

Akida examples

To learn how to use the Akida runtime and the CNN2SNN toolkit, and to measure Akida processor performance on the MNIST, ImageNet and Google Speech Commands (KWS) datasets, please refer to the sections below.

General examples

GXNOR/MNIST inference

AkidaNet/ImageNet inference

DS-CNN/KWS inference

Regression tutorial

Transfer learning with AkidaNet for PlantVillage

YOLO/PASCAL-VOC detection tutorial

CNN2SNN tutorials

CNN conversion flow tutorial

Advanced CNN2SNN tutorial

Edge examples

Akida vision edge learning

Akida edge learning for keyword spotting

Tips to set Akida learning parameters

Download all examples in Python source code: examples_python.zip

Download all examples in Jupyter notebooks: examples_jupyter.zip

Gallery generated by Sphinx-Gallery

© Copyright 2022, BrainChip Holdings Ltd. All Rights Reserved.