
Robustness of spiking Deep Belief Networks to noise and reduced bit precision of neuro-inspired hardware platforms

Overview of attention for article published in Frontiers in Neuroscience, July 2015

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (87th percentile)
  • High Attention Score compared to outputs of the same age and source (88th percentile)

Mentioned by

  • 1 blog
  • 4 X users
  • 2 patents
  • 1 Facebook page

Citations

  • 76 Dimensions

Readers on

  • 125 Mendeley
Title
Robustness of spiking Deep Belief Networks to noise and reduced bit precision of neuro-inspired hardware platforms
Published in
Frontiers in Neuroscience, July 2015
DOI 10.3389/fnins.2015.00222
Authors

Evangelos Stromatias, Daniel Neil, Michael Pfeiffer, Francesco Galluppi, Steve B. Furber, Shih-Chii Liu

Abstract

Increasingly large deep learning architectures, such as Deep Belief Networks (DBNs), are the focus of current machine learning research and achieve state-of-the-art results in different domains. However, both training and execution of large-scale Deep Networks require vast computing resources, leading to high power requirements and communication overheads. Ongoing work on the design and construction of spike-based hardware platforms offers an alternative for running deep neural networks with significantly lower power consumption, but has to overcome hardware limitations in terms of noise and limited weight precision, as well as noise inherent in the sensor signal. This article investigates how such hardware constraints impact the performance of spiking neural network implementations of DBNs. In particular, the influence of limited bit precision during execution and training, and the impact of silicon mismatch in the synaptic weight parameters of custom hybrid VLSI implementations, are studied. Furthermore, the network performance of spiking DBNs is characterized with regard to noise in the spiking input signal. Our results demonstrate that spiking DBNs can tolerate very low levels of hardware bit precision, down to almost two bits, and show that their performance can be improved by at least 30% through an adapted training mechanism that takes the bit precision of the target platform into account. Spiking DBNs thus present an important use case for large-scale hybrid analog-digital or digital neuromorphic platforms such as SpiNNaker, which can execute large but precision-constrained deep networks in real time.
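To illustrate the kind of precision constraint the abstract describes, the sketch below applies a generic symmetric uniform quantizer to a weight vector, reducing it to an n-bit signed representation. This is an assumption-laden illustration, not the paper's method: the function name `quantize_weights`, the weight range `w_max`, and the rounding scheme are all hypothetical choices for demonstration, whereas the paper evaluates its own fixed-point formats on platforms such as SpiNNaker.

```python
import numpy as np

def quantize_weights(w, n_bits, w_max=1.0):
    """Quantize weights to a symmetric n-bit signed grid in [-w_max, w_max].

    Illustrative only (hypothetical helper): with n_bits = 3 there are
    2**(3-1) - 1 = 3 positive levels, so only 7 distinct values remain.
    """
    levels = 2 ** (n_bits - 1) - 1   # number of positive quantization levels
    step = w_max / levels            # spacing between adjacent levels
    # Round to the nearest level, clip to the representable range, rescale.
    return np.clip(np.round(w / step), -levels, levels) * step

rng = np.random.default_rng(0)
w = rng.uniform(-1, 1, size=5)
print(quantize_weights(w, n_bits=3))
```

A precision-aware training loop in the spirit of the abstract would apply such a quantizer to the weights seen by the network during training, so that learning compensates for the coarse grid of the target platform rather than encountering it only at deployment.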

X Demographics

The data shown below were collected from the profiles of 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 125 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United Kingdom 2 2%
United States 2 2%
India 1 <1%
Germany 1 <1%
Korea, Republic of 1 <1%
Switzerland 1 <1%
Unknown 117 94%

Demographic breakdown

Readers by professional status Count As %
Student > Ph. D. Student 40 32%
Researcher 18 14%
Student > Master 14 11%
Student > Bachelor 9 7%
Student > Doctoral Student 8 6%
Other 11 9%
Unknown 25 20%
Readers by discipline Count As %
Computer Science 44 35%
Engineering 28 22%
Neuroscience 7 6%
Physics and Astronomy 4 3%
Biochemistry, Genetics and Molecular Biology 2 2%
Other 10 8%
Unknown 30 24%
Attention Score in Context

This research output has an Altmetric Attention Score of 13. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 02 December 2019.
All research outputs
#2,714,887
of 25,374,917 outputs
Outputs from Frontiers in Neuroscience
#1,709
of 11,541 outputs
Outputs of similar age
#33,362
of 275,994 outputs
Outputs of similar age from Frontiers in Neuroscience
#12
of 106 outputs
Altmetric has tracked 25,374,917 research outputs across all sources so far. Compared to these, this one has done well and is in the 89th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 11,541 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.9. This one has done well, scoring higher than 85% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 275,994 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 87% of its contemporaries.
We're also able to compare this research output to 106 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 88% of its contemporaries.