Thanks to a mathematical breakthrough, artificial-intelligence applications such as speech recognition, gesture recognition and ECG classification can become 100 to 1000 times more energy efficient. This makes it possible to put much more elaborate AI into chips, enabling applications that previously ran in the cloud to run on a smartphone or smartwatch. Because a network connection is no longer necessary, the applications become more robust; and because data can be stored and processed locally, they are more privacy-friendly. The breakthrough was achieved by researchers at Centrum Wiskunde & Informatica (CWI), the Dutch national research center for mathematics and computer science, together with Holst Centre.
Under the supervision of CWI researcher and UvA professor of cognitive neurobiology Sander Bohté, the scientists developed a learning algorithm for so-called spiking neural networks. Such networks have existed for some time but are very difficult to handle mathematically, which has so far made them hard to put into practice. The new algorithm, which has been released as open source, improves on previous approaches in two ways: the neurons in the network need to communicate far less frequently, and each individual neuron has to perform fewer calculations. “The combination of these two breakthroughs makes AI algorithms a thousand times more energy efficient than standard neural networks, and a factor of a hundred more energy efficient than current state-of-the-art neural networks,” says principal investigator Bohté.
The spiking neural networks developed by Bohté and his team differ from those already used in AI applications. “The communication between neurons in classical neural networks is continuous, which makes it easy to handle mathematically. Spiking neurons behave more like those in the human brain: they communicate only sparingly, and with short pulses. This, however, means that the signals are discontinuous and much harder to handle mathematically.” Running spiking neural networks efficiently in the real world also requires a new type of chip; Bohté says that prototypes are already being developed. “All kinds of companies are working hard to make this happen, like our project partner Holst Centre.”
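The sparse, pulse-based communication described above can be illustrated with the classic leaky integrate-and-fire (LIF) neuron model. The sketch below is a generic textbook illustration, not the CWI algorithm, and all parameter values (time constant, threshold, input current) are arbitrary choices for the demo:

```python
def simulate_lif(input_current, tau=20.0, threshold=1.0, dt=1.0):
    """Simulate one leaky integrate-and-fire neuron.

    Returns the list of time steps at which the neuron spiked.
    """
    v = 0.0        # membrane potential
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: the potential decays toward zero
        # while accumulating the input current.
        v += dt * (-v / tau + i_in)
        if v >= threshold:     # emit a discrete spike ...
            spikes.append(t)
            v = 0.0            # ... and reset the potential
    return spikes

# A constant input produces only occasional spikes: the output is a
# sparse train of discrete events, unlike the continuous activation
# a classical artificial neuron would emit at every step.
print(simulate_lif([0.12] * 100))
```

Note that the spike condition is a hard, discontinuous threshold, which is exactly what makes gradient-based training of such networks mathematically awkward compared with the smooth activations of classical networks.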
Bohté’s methods can train spiking neural networks of up to a few thousand neurons, fewer than in typical classical neural networks but sufficient for many applications. The next challenge is therefore to scale these networks up to 100,000 or a million neurons, which would expand the range of possible applications even further.