PlumX Metrics

Efficient Spiking Neural Architecture Search with Mixed Neuron Models and Variable Thresholds

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN: 1611-3349, Vol: 14448 LNCS, Pages: 466-481
2024
  • Citations: 0
  • Usage: 0
  • Captures: 1
  • Mentions: 0
  • Social Media: 0


Conference Paper Description

Spiking Neural Networks (SNNs) are emerging as energy-efficient alternatives to artificial neural networks (ANNs) due to their event-driven computation and effective processing of temporal information. While Neural Architecture Search (NAS) has been extensively used to optimize neural network structures, its application to SNNs remains limited. Existing studies often overlook the temporal differences in information propagation between ANNs and SNNs. Instead, they focus on shared structures such as convolutional, recurrent, or pooling modules. This work introduces a novel neural architecture search framework, MixedSNN, explicitly designed for SNNs. Inspired by the human brain, MixedSNN incorporates a novel search space called SSP, which explores the impact of utilizing Mixed spiking neurons and Variable thresholds on SNN performance. Additionally, we propose a training-free evaluation strategy called Period-Based Spike Evaluation (PBSE), which leverages spike activation patterns to incorporate temporal features in SNNs. The performance of SNN architectures obtained through MixedSNN is evaluated on three datasets, including CIFAR-10, CIFAR-100, and CIFAR-10-DVS. Results demonstrate that MixedSNN can achieve state-of-the-art performance with significantly lower timesteps.
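To make the "variable thresholds" idea concrete, the sketch below simulates a leaky integrate-and-fire (LIF) neuron, the common spiking neuron model, with the firing threshold exposed as a searchable parameter. This is an illustrative assumption, not the paper's implementation; the function name, decay constant, and reset rule are all hypothetical choices.

```python
def lif_spike_train(inputs, threshold=1.0, decay=0.5):
    """Simulate a LIF neuron over discrete timesteps (illustrative sketch).

    inputs: input current at each timestep
    threshold: firing threshold -- the quantity a search like MixedSNN's
               SSP space could vary per layer or neuron (assumption)
    decay: membrane-potential leak factor
    Returns the binary spike train.
    """
    v = 0.0
    spikes = []
    for x in inputs:
        v = decay * v + x          # leaky integration of input current
        if v >= threshold:         # fire once potential crosses threshold
            spikes.append(1)
            v = 0.0                # hard reset after a spike
        else:
            spikes.append(0)
    return spikes

# For the same input, a lower threshold yields a denser spike pattern,
# changing the temporal activation statistics that a spike-pattern-based
# evaluation (such as PBSE) would observe.
inputs = [0.6, 0.6, 0.6, 0.6]
low = lif_spike_train(inputs, threshold=0.8)    # [0, 1, 0, 1]
high = lif_spike_train(inputs, threshold=1.5)   # [0, 0, 0, 0]
```

Mixing neuron models would amount to searching over different update rules (e.g. different decay or reset behavior) alongside the threshold, per the abstract's description of the SSP search space.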
