
by Stanisław Woźniak, Thomas Bohnstingl and Angeliki Pantazi (IBM Research – Europe)

Deep learning has achieved outstanding success in several artificial intelligence (AI) tasks, reaching human-like performance, albeit at a much higher power consumption than the ~20 watts required by the human brain. We have developed an approach that incorporates biologically inspired neural dynamics into deep learning using a novel construct called the spiking neural unit (SNU). Remarkably, these biological insights enabled SNU-based deep learning to surpass the state-of-the-art performance while simultaneously enhancing the energy efficiency of AI hardware implementations.

State-of-the-art deep learning is based on artificial neural networks (ANNs) that take inspiration from biology only to a very limited extent – primarily from its networked structure. This has several drawbacks, especially in terms of power consumption and energy efficiency [1]. Our group at IBM Research – Europe has been studying more realistic models of neural dynamics in the brain, known as spiking neural networks (SNNs). Since the human brain is still much more capable than modern deep learning systems, SNNs have been widely considered the most promising contender for the next generation of neural networks [1]. However, modelling and training complex SNNs has remained a challenge, which has led many researchers to the conviction that the performance of ANNs is in practice superior to that of SNNs.

We have developed a novel spiking neural unit (SNU) that unifies the biologically inspired dynamics of spiking neurons with the research advances of ANNs and deep learning [2, L1]. In particular, we demonstrated that the leaky integrate-and-fire (LIF) evolution of the membrane potential of a biological neuron (Figure 1a) can be modelled, with exact equivalence, through a specific combination of recurrent artificial neurons that jointly form an SNU (Figure 1b). Therefore, SNUs can be flexibly incorporated into deep learning architectures and trained to high accuracy using supervised methods developed for ANNs, such as the backpropagation through time algorithm. We have compared SNUs with common deep learning units, such as LSTMs and GRUs, on tasks including image classification (Figure 1c, upper part), language modelling, music prediction and weather prediction. Importantly, we demonstrated that although SNUs have the lowest number of parameters, they can surpass the accuracy of these units and provide a significant increase in training speed (Figure 1c, lower part). Our results established the SNN state of the art with the best accuracy of 99.53% +/- 0.03% for the handwritten digit classification task based on the rate-coded MNIST dataset using a convolutional neural network. Moreover, we even demonstrated a first-of-its-kind temporal generative adversarial network (GAN) based on an SNN.
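
In essence, an SNU maintains an internal state that mimics the LIF membrane potential: weighted inputs are accumulated, the state decays over time, and it is reset whenever an output spike is emitted. The following minimal NumPy sketch illustrates these dynamics for a single SNU layer; the function name, the fixed decay constant and the hard threshold are illustrative assumptions, whereas in practice the decay and bias can be trained and a soft (sigmoid) output variant is used so that backpropagation through time can be applied.

```python
import numpy as np

def snu_forward(x_seq, W, decay=0.8, threshold=1.0):
    """Minimal forward pass of one SNU layer (illustrative sketch).

    x_seq:     input spike sequence, shape (T, n_in)
    W:         synaptic weight matrix, shape (n_out, n_in)
    decay:     membrane-potential decay factor in (0, 1)
    threshold: firing threshold
    """
    n_out = W.shape[0]
    s = np.zeros(n_out)   # membrane potential (accumulating neuron N1)
    y = np.zeros(n_out)   # output spikes (spike-emitting neuron N2)
    outputs = []
    for x in x_seq:
        # N1: accumulate weighted input; the (1 - y) term resets the
        # potential of neurons that spiked at the previous time step.
        s = np.maximum(0.0, W @ x + decay * s * (1.0 - y))
        # N2: emit a spike whenever the potential crosses the threshold.
        y = (s >= threshold).astype(float)
        outputs.append(y.copy())
    return np.stack(outputs)

# Example: 3 output neurons driven by random input spikes over 10 time steps.
rng = np.random.default_rng(0)
spikes = snu_forward(rng.integers(0, 2, size=(10, 5)).astype(float),
                     W=0.5 * rng.random((3, 5)))
```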

Figure 1: Incorporating biologically inspired dynamics into deep learning: a. Biological neurons receive input spikes that are modulated by synaptic weights at the dendrites and accumulated into the membrane potential Vm in the cell soma. This is typically modelled as a resistor-capacitor (RC) circuit. The output spikes are emitted through axons to downstream neurons. b. The SNU models the spiking neural dynamics through two recurrent artificial neurons: N1 performs the accumulation and corresponds to Vm; N2 controls the spike emission and resets Vm. c. Digit classification for the rate-coded MNIST dataset. The upper part of the pane compares feed-forward networks of common deep learning units; higher accuracy indicates improvement. The lower part of the pane illustrates the training time of the SNU vs. the LSTM. d. The synaptic operations are accelerated in-memory through the physical properties of the crossbar structure, illustrated in the upper part of the pane; two phase-change memory devices are used per synapse. The lower part of the pane compares the average negative log-likelihood of the software simulation vs. the hardware experiment for the music prediction task using the JSB dataset; lower values correspond to higher-quality predictions. Figure adapted from [2].

Neural circuits in the human brain exhibit additional functionalities, both at the level of neural dynamics, where adaptive spiking thresholds play an important role, and at the architectural level, where lateral inhibition between neurons is a commonly observed theme. Another advantage of the SNU is that the developed framework can easily be extended to incorporate such neural functionalities. For example, we demonstrated that the LI-SNU variant, which implements lateral inhibition, achieves digit classification performance on par with the original SNU while significantly reducing the required number of spikes.
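
The exact LI-SNU formulation is described in [2]; as a rough illustration, lateral inhibition can be approximated by a winner-take-all step in which only the most strongly driven neuron above threshold is allowed to spike at each time step, which sparsifies the activity. The sketch below reuses the plain SNU update from the previous example; the winner-take-all rule itself is an assumption made for illustration, not the exact mechanism of [2].

```python
import numpy as np

def li_snu_step(x, W, s, y, decay=0.8, threshold=1.0):
    """One time step of an SNU layer with a simple winner-take-all form
    of lateral inhibition (illustrative approximation)."""
    # Membrane update, identical to the plain SNU step shown earlier.
    s = np.maximum(0.0, W @ x + decay * s * (1.0 - y))
    # Lateral inhibition: only the neuron with the highest potential
    # above threshold is allowed to spike, reducing the spike count.
    y = np.zeros_like(s)
    winner = np.argmax(s)
    if s[winner] >= threshold:
        y[winner] = 1.0
    return s, y
```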

The increasing adoption of ANNs has sparked the development of hardware accelerators to speed up the required computations. Because SNUs cast the dynamics of spiking neurons into the deep learning framework, they provide a systematic methodology for training SNNs using such AI accelerators. We demonstrated this with a highly efficient in-memory computing concept based on nanoscale phase-change memory devices (Figure 1d), where the spiking nature of SNNs leads to further energy savings [3]. Furthermore, we showed that SNNs are robust to operation with low-precision synaptic weights, down to 4 bits of precision, and can also cope with hardware imperfections such as noise and drift.
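
To give a rough feel for what low-precision operation entails, one can quantize trained weights onto a small number of levels and perturb them with noise before evaluating the network in software. The sketch below is a purely illustrative stand-in for a PCM-based crossbar; the symmetric 4-bit quantization scheme and the noise level are assumptions, not the actual device model used in [3].

```python
import numpy as np

def quantize_weights(W, n_bits=4, noise_std=0.02, rng=None):
    """Map trained weights onto a limited set of levels and add read
    noise, loosely mimicking low-precision analogue synapses."""
    rng = rng if rng is not None else np.random.default_rng()
    w_max = np.abs(W).max()
    n_levels = 2 ** (n_bits - 1) - 1           # symmetric signed range
    W_q = np.round(W / w_max * n_levels) / n_levels * w_max
    # Additive Gaussian noise stands in for device variability and drift.
    return W_q + noise_std * w_max * rng.standard_normal(W.shape)
```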

Our work on the SNU has bridged the ANN and SNN worlds by incorporating biologically inspired neural dynamics into deep learning, making it possible to benefit from both. The SNU allows SNNs to take direct advantage of recent deep learning advances, enabling deep SNNs to be easily scaled up and trained to high accuracy. From the ANN perspective, the SNU introduces novel temporal dynamics for machine learning applications, with fewer parameters and potentially higher performance than state-of-the-art units. Finally, hardware accelerators can also benefit from SNUs, as they allow for a highly efficient implementation and unlock the potential of neuromorphic hardware by training deep SNNs to high accuracy. Our team is continuing this work towards a biologically inspired deep learning paradigm by also exploring ways of enhancing the learning algorithms themselves with neuroscientific insights [4].

Link:
[L1] https://www.ibm.com/blogs/research/2020/06/biologically-inspired-deep-learning-predicts-chords-of-bach/

References:
[1] W. Maass: “Networks of spiking neurons: The third generation of neural network models,” Neural Networks, vol. 10, no. 9, pp. 1659–1671, 1997.
[2] S. Woźniak, et al.: “Deep learning incorporating biologically inspired neural dynamics and in-memory computing,” Nat Mach Intell, vol. 2, no. 6, pp. 325–336, Jun. 2020.
[3] A. Pantazi, et al.: “All-memristive neuromorphic computing with level-tuned neurons,” Nanotechnology, vol. 27, no. 35, p. 355205, 2016.
[4] T. Bohnstingl, et al.: “Online Spatio-Temporal Learning in Deep Neural Networks,” arXiv preprint, https://arxiv.org/abs/2007.12723

Please contact:
Stanisław Woźniak, IBM Research – Europe, Switzerland
