Stochasticity and robustness in spiking neural networks
Neurocomputing, ISSN: 0925-2312, Vol: 419, Page: 23-36
2021
- Citations: 8
- Captures: 45
Article Description
Despite drawing inspiration from biological systems, which are inherently noisy and variable, artificial neural networks have been shown to require precise weights to carry out the tasks they are trained to accomplish. This creates a challenge when adapting these artificial networks to specialized execution platforms, which may encode weights in a manner that restricts their accuracy and/or precision. Reflecting on the non-idealities observed in biological systems, we investigated the effect these properties have on the robustness of spiking neural networks under perturbations to weights. First, we examined techniques from conventional neural networks that resemble noisy processes, and postulated that they may produce similar beneficial effects in spiking neural networks. Second, we evolved a set of spiking neural networks utilizing biological non-idealities to solve a pole-balancing task, and estimated their robustness. We showed that robustness is higher in networks using noisy neurons, and demonstrated that one of these networks can perform well under the variance expected when a hafnium-oxide-based resistive memory is used to encode synaptic weights. Lastly, we trained a series of networks using a surrogate gradient method on the MNIST classification task, and confirmed that these networks exhibit trends in robustness similar to those of the evolved networks. We discuss these results and argue that they provide empirical evidence supporting the role of noise as a regularizer that can increase network robustness.
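The robustness evaluation the abstract describes, perturbing the weights of a trained network and measuring how far its outputs drift, can be sketched in a few lines. This is an illustrative toy (not the authors' code): the two-layer network, its weights, and the perturbation sweep below are all placeholder assumptions, with a hard-threshold hidden layer standing in crudely for spiking activations.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, w1, w2):
    """Tiny two-layer network with a binary hidden layer (crude spike stand-in)."""
    hidden = (x @ w1 > 0.5).astype(float)
    return hidden @ w2

# A fixed "trained" network and evaluation inputs (random placeholders).
w1 = rng.normal(size=(10, 20))
w2 = rng.normal(size=(20, 2))
x = rng.normal(size=(100, 10))
baseline = forward(x, w1, w2)

# Sweep the perturbation strength sigma; a network's robustness can be read
# off as the largest sigma at which the output deviation stays tolerable.
results = {}
for sigma in [0.0, 0.05, 0.1, 0.2]:
    deviations = []
    for _ in range(20):  # average over independent perturbation draws
        p1 = w1 + rng.normal(scale=sigma, size=w1.shape)
        p2 = w2 + rng.normal(scale=sigma, size=w2.shape)
        deviations.append(np.abs(forward(x, p1, p2) - baseline).mean())
    results[sigma] = float(np.mean(deviations))
    print(f"sigma={sigma:.2f}  mean output deviation={results[sigma]:.3f}")
```

The deviation is exactly zero at sigma = 0 and grows with sigma; the paper applies the same idea with perturbation statistics matched to a hafnium-oxide resistive memory rather than a generic Gaussian.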
Bibliographic Details
http://www.sciencedirect.com/science/article/pii/S0925231220313035
http://dx.doi.org/10.1016/j.neucom.2020.07.105
http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=85091049174&origin=inward
https://linkinghub.elsevier.com/retrieve/pii/S0925231220313035
https://api.elsevier.com/content/article/PII:S0925231220313035?httpAccept=text/xml
https://api.elsevier.com/content/article/PII:S0925231220313035?httpAccept=text/plain
https://dul.usage.elsevier.com/doi/
https://dx.doi.org/10.1016/j.neucom.2020.07.105
Elsevier BV