
Reducing the computational footprint for real-time BCPNN learning

Overview of attention for article published in Frontiers in Neuroscience, January 2015

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (80th percentile)
  • Good Attention Score compared to outputs of the same age and source (65th percentile)

Mentioned by

  • 1 X user
  • 1 patent
  • 1 Wikipedia page

Citations

  • 14 Dimensions

Readers on

  • 32 Mendeley
Title
Reducing the computational footprint for real-time BCPNN learning
Published in
Frontiers in Neuroscience, January 2015
DOI 10.3389/fnins.2015.00002
Authors

Bernhard Vogginger, René Schüffny, Anders Lansner, Love Cederström, Johannes Partzsch, Sebastian Höppner

Abstract

The implementation of synaptic plasticity in neural simulation or neuromorphic hardware is usually very resource-intensive, often requiring a compromise between efficiency and flexibility. A versatile but computationally expensive plasticity mechanism is provided by the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm. Building upon Bayesian statistics, and having clear links to biological plasticity processes, the BCPNN learning rule has been applied in many fields, ranging from data classification, associative memory, reward-based learning, and probabilistic inference to cortical attractor memory networks. In the spike-based version of this learning rule, the pre- and postsynaptic as well as the coincident activity is traced in three low-pass-filtering stages, requiring a total of eight state variables, whose dynamics are typically simulated with the fixed-step-size Euler method. We derive analytic solutions that allow an efficient event-driven implementation of this learning rule. Further speedup is achieved, first, by rewriting the model so that the number of basic arithmetic operations per update is halved, and, second, by using look-up tables for the frequently computed exponential decay. Ultimately, in a typical use case, the simulation with our approach is more than one order of magnitude faster than with the fixed-step-size Euler method. Aiming for a small memory footprint per BCPNN synapse, we also evaluate the use of fixed-point numbers for the state variables, and assess the number of bits required to achieve the same or better accuracy than with the conventional explicit Euler method. All of this will allow real-time simulation of a reduced cortex model based on BCPNN in high-performance computing. More importantly, with the analytic solution at hand and due to the reduced memory bandwidth, the learning rule can be efficiently implemented in dedicated or existing digital neuromorphic hardware.
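The core idea the abstract describes can be sketched as follows: instead of advancing a low-pass-filtered spike trace with many small Euler steps, the analytic solution z(t) = z(t0) * exp(-(t - t0)/tau) is applied once per spike event, with the exponential decay served from a precomputed look-up table. This is a minimal illustration under assumed names, not the paper's implementation: the class `EventDrivenTrace` and its parameters are invented for the example, and only one of the eight BCPNN state variables is modeled.

```python
import math

class EventDrivenTrace:
    """One low-pass-filtered spike trace, updated only at spike events
    (illustrative sketch; names and defaults are not from the paper)."""

    def __init__(self, tau, dt=0.01, table_size=4096):
        self.tau = tau
        self.z = 0.0        # current trace value
        self.t_last = 0.0   # time of the last update
        self.dt = dt
        # Look-up table for the frequently needed exponential decay,
        # tabulated on a dt grid; intervals beyond the table are
        # treated as fully decayed.
        self.table = [math.exp(-k * dt / tau) for k in range(table_size)]

    def decay_to(self, t):
        """Apply the analytic solution z(t) = z(t_last)*exp(-(t-t_last)/tau)
        in a single multiplication, however long the elapsed interval."""
        k = int(round((t - self.t_last) / self.dt))
        self.z *= self.table[k] if k < len(self.table) else 0.0
        self.t_last = t

    def on_spike(self, t, weight=1.0):
        """Decay the trace up to the spike time, then add the spike's jump."""
        self.decay_to(t)
        self.z += weight

# Usage: one event-driven update replaces thousands of Euler steps of
# dz/dt = -z/tau over the same interval.
tau = 10.0
tr = EventDrivenTrace(tau)
tr.on_spike(0.0)     # trace jumps to 1.0
tr.decay_to(5.0)     # single table look-up and multiplication
assert abs(tr.z - math.exp(-5.0 / tau)) < 1e-9
```

The table-index rounding introduces a quantization error of at most half a grid step dt, which is the kind of accuracy trade-off the paper weighs when it compares bit widths and fixed-point representations against the explicit Euler baseline.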

X Demographics

The data shown below were collected from the profile of the 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 32 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United Kingdom 1 3%
Japan 1 3%
Unknown 30 94%

Demographic breakdown

Readers by professional status Count As %
Student > Master 8 25%
Student > Ph. D. Student 5 16%
Professor 3 9%
Student > Bachelor 2 6%
Student > Postgraduate 2 6%
Other 7 22%
Unknown 5 16%
Readers by discipline Count As %
Computer Science 12 38%
Engineering 6 19%
Agricultural and Biological Sciences 2 6%
Psychology 2 6%
Neuroscience 2 6%
Other 2 6%
Unknown 6 19%
Attention Score in Context

This research output has an Altmetric Attention Score of 7. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 04 April 2021.
All research outputs: #5,339,559 of 25,374,917 outputs
Outputs from Frontiers in Neuroscience: #4,041 of 11,541 outputs
Outputs of similar age: #70,518 of 359,658 outputs
Outputs of similar age from Frontiers in Neuroscience: #41 of 125 outputs
Altmetric has tracked 25,374,917 research outputs across all sources so far. Compared to these, this one has done well and is in the 78th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 11,541 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.9. This one has gotten more attention than average, scoring higher than 64% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 359,658 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 80% of its contemporaries.
We're also able to compare this research output to 125 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 65% of its contemporaries.