
Event-Driven Random Back-Propagation: Enabling Neuromorphic Deep Learning Machines

Overview of attention for article published in Frontiers in Neuroscience, June 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (92nd percentile)
  • High Attention Score compared to outputs of the same age and source (94th percentile)

Mentioned by

  • 2 blogs
  • 29 X users
  • 1 Facebook page
  • 1 Redditor

Citations

  • 207 citations (Dimensions)

Readers

  • 188 readers (Mendeley)
Title
Event-Driven Random Back-Propagation: Enabling Neuromorphic Deep Learning Machines
Published in
Frontiers in Neuroscience, June 2017
DOI 10.3389/fnins.2017.00324
Authors

Emre O. Neftci, Charles Augustine, Somnath Paul, Georgios Detorakis

Abstract

An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. One increasingly popular and successful approach is to take inspiration from inference and learning algorithms used in deep neural networks. However, the workhorse of deep learning, the gradient-descent-based Gradient Back-Propagation (BP) rule, often relies on the immediate availability of network-wide information stored in high-precision memory during learning, and on precise operations that are difficult to realize in neuromorphic hardware. Remarkably, recent work showed that exact backpropagated gradients are not essential for learning deep representations. Building on these results, we demonstrate an event-driven random BP (eRBP) rule that uses error-modulated synaptic plasticity for learning deep representations. Using a two-compartment Leaky Integrate & Fire (I&F) neuron, the rule requires only one addition and two comparisons for each synaptic weight, making it very suitable for implementation in digital or mixed-signal neuromorphic hardware. Our results show that using eRBP, deep representations are rapidly learned, achieving classification accuracies on permutation-invariant datasets comparable to those obtained in artificial neural network simulations on GPUs, while being robust to neural and synaptic state quantizations during learning.
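The core idea in the abstract — an error signal projected through *fixed random* feedback weights, gating a weight update that costs one addition and two comparisons per synapse — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sizes, bounds (`b_min`, `b_max`), learning rate, and helper names are hypothetical, and the spiking dynamics of the two-compartment I&F neuron are abstracted away into plain vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical network sizes (for illustration only).
n_in, n_hid, n_out = 784, 100, 10

W = rng.normal(0.0, 0.1, (n_hid, n_in))        # feedforward weights
G = rng.choice([-1.0, 1.0], (n_hid, n_out))    # fixed random feedback weights

lr = 1e-4
b_min, b_max = -1.0, 1.0   # assumed "boxcar" bounds on the membrane potential


def erbp_update(W, pre_spikes, V_mem, error, G):
    """One event-driven random-BP weight update (sketch).

    pre_spikes : bool vector (n_in,) of presynaptic spike events
    V_mem      : postsynaptic membrane potentials (n_hid,)
    error      : output error, e.g. target minus prediction (n_out,)
    """
    # Error reaches each neuron through the fixed random projection G,
    # standing in for the neuron's second (dendritic) compartment.
    dendrite = G @ error

    # Two comparisons per synapse: update only while the postsynaptic
    # membrane potential lies inside the boxcar window.
    gate = (V_mem >= b_min) & (V_mem <= b_max)

    # One addition per synapse, applied only on presynaptic spike events.
    dW = np.outer(dendrite * gate, pre_spikes.astype(float))
    return W - lr * dW
```

The random feedback matrix `G` is what makes the rule "random BP": it replaces the transpose of the forward weights used by exact backpropagation, so no symmetric weight transport is needed, which is what makes the rule hardware-friendly.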

X Demographics


The data shown below were collected from the profiles of the 29 X users who shared this research output.
Mendeley readers


The data shown below were compiled from readership statistics for the 188 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown    188     100%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph.D. Student          44      23%
Student > Master                 31      16%
Researcher                       19      10%
Student > Bachelor               14      7%
Student > Doctoral Student       10      5%
Other                            20      11%
Unknown                          50      27%
Readers by discipline    Count   As %
Engineering              49      26%
Computer Science         42      22%
Neuroscience             20      11%
Physics and Astronomy    8       4%
Social Sciences          3       2%
Other                    13      7%
Unknown                  53      28%
Attention Score in Context


This research output has an Altmetric Attention Score of 33. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 02 December 2019.
  • All research outputs: #1,237,412 of 25,853,983 outputs
  • Outputs from Frontiers in Neuroscience: #541 of 11,720 outputs
  • Outputs of similar age: #24,423 of 334,498 outputs
  • Outputs of similar age from Frontiers in Neuroscience: #10 of 196 outputs
Altmetric has tracked 25,853,983 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 95th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 11,720 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 11.0. This one has done particularly well, scoring higher than 95% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 334,498 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 92% of its contemporaries.
We're also able to compare this research output to 196 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 94% of its contemporaries.