
Training Deep Spiking Neural Networks Using Backpropagation

Overview of attention for article published in Frontiers in Neuroscience, November 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (89th percentile)
  • High Attention Score compared to outputs of the same age and source (93rd percentile)

Mentioned by

  • 1 blog
  • 6 X users
  • 2 patents
  • 2 Google+ users
  • 1 Q&A thread
  • 1 YouTube creator

Readers on

  • 776 Mendeley readers
  • 1 CiteULike reader
Title
Training Deep Spiking Neural Networks Using Backpropagation
Published in
Frontiers in Neuroscience, November 2016
DOI 10.3389/fnins.2016.00508
Authors

Jun Haeng Lee, Tobi Delbruck, Michael Pfeiffer

Abstract

Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique that treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, on which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN, and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that, thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent to that of conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations.
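The abstract's core idea, treating the membrane potential as the differentiable signal and regarding spike/reset discontinuities as noise, can be sketched as follows. This is a minimal illustration of that general approach, not the authors' exact implementation: the leaky integrate-and-fire (LIF) dynamics, the soft reset, and the straight-through-style gradient below are common simplifications chosen for clarity.

```python
import numpy as np

def lif_forward(inputs, w, tau=0.9, v_th=1.0):
    """Run a single LIF neuron over T time steps.

    inputs: (T, n_in) input currents per time step
    w:      (n_in,)   input weights
    Returns the binary spike train (T,) and membrane trace (T,).
    """
    T = inputs.shape[0]
    v = 0.0
    spikes = np.zeros(T)
    v_trace = np.zeros(T)
    for t in range(T):
        v = tau * v + inputs[t] @ w   # leaky integration of input current
        if v >= v_th:
            spikes[t] = 1.0
            v -= v_th                 # soft reset: subtract the threshold
        v_trace[t] = v
    return spikes, v_trace

def lif_backward(inputs, grad_spikes, tau=0.9):
    """Approximate dL/dw by backpropagating through the membrane
    potential while ignoring the non-differentiable spike and reset
    events (a straight-through-style estimator)."""
    T, n_in = inputs.shape
    grad_w = np.zeros(n_in)
    dv_dw = np.zeros(n_in)           # dV_t/dw, a leaky sum of past inputs
    for t in range(T):
        dv_dw = tau * dv_dw + inputs[t]
        # Treat d(spike_t)/dV_t as 1, so the upstream gradient flows
        # through V as if no discontinuity had occurred.
        grad_w += grad_spikes[t] * dv_dw
    return grad_w
```

Because the reset is ignored in the backward pass, the gradient is biased, which is exactly why the paper frames the discontinuities as noise superimposed on an otherwise differentiable membrane signal.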

X Demographics

The data shown below were collected from the profiles of the 6 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 776 Mendeley readers of this research output.

Geographical breakdown

Country          Count  As %
United Kingdom       2   <1%
Turkey               1   <1%
Germany              1   <1%
Canada               1   <1%
United States        1   <1%
Unknown            770   99%

Demographic breakdown

Readers by professional status  Count  As %
Student > Ph. D. Student          181   23%
Student > Master                  137   18%
Researcher                         92   12%
Student > Bachelor                 68    9%
Student > Doctoral Student         30    4%
Other                              78   10%
Unknown                           190   24%

Readers by discipline                 Count  As %
Computer Science                        214   28%
Engineering                             195   25%
Neuroscience                             57    7%
Agricultural and Biological Sciences     23    3%
Physics and Astronomy                    23    3%
Other                                    59    8%
Unknown                                 205   26%
Attention Score in Context

This research output has an Altmetric Attention Score of 20. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 14 October 2021.
All research outputs
#1,850,612
of 25,457,858 outputs
Outputs from Frontiers in Neuroscience
#977
of 11,568 outputs
Outputs of similar age
#32,219
of 319,320 outputs
Outputs of similar age from Frontiers in Neuroscience
#10
of 139 outputs
Altmetric has tracked 25,457,858 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 92nd percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 11,568 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 11.0. This one has done particularly well, scoring higher than 91% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 319,320 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 89% of its contemporaries.
We're also able to compare this research output to 139 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 93% of its contemporaries.