Take inspiration from the brain and make AI more energy efficient
Thanks to a mathematical breakthrough achieved at CWI, AI applications like speech recognition, gesture recognition and electrocardiogram (ECG) classification can become a hundred to a thousand times more energy efficient. This means it will be possible to put much more elaborate AI in chips, enabling applications to run locally on a smartphone or smartwatch instead of in the cloud.
Since 2012, the field of artificial intelligence (AI) has taken great strides thanks to a technique called deep learning. Deep learning has led to numerous practical applications: Apple uses it for Siri's voice recognition, Facebook uses it to tag photos automatically, and Google Translate uses it to translate texts, to name just a few.
Deep learning is based on information processing by large artificial neural networks that have dozens or even hundreds of layers. It doesn't come for free, however. Since 2012, training the largest deep neural networks has become 300,000 times more computationally intensive, with training costs doubling every few months. Training the text generator GPT-3, for example, which amazed the world in 2020 by producing human-like texts in all kinds of styles, consumed as much electricity as 300 Dutch households use in a year (1 GWh, a 4.1-million-euro electricity bill).
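As a rough sanity check on these figures, the sketch below works out what they imply: a 300,000-fold increase corresponds to roughly 18 doublings, and 1 GWh spread over 300 households matches a typical Dutch annual consumption of about 3,300 kWh (that typical-consumption figure is an assumption, not from the article).

```python
import math

# A 300,000-fold increase in training compute corresponds to
# log2(300000) ~= 18.2 doublings, consistent with "doubling every
# few months" over a span of several years.
doublings = math.log2(300_000)

# 1 GWh shared by 300 households implies each household's annual
# consumption, in kWh.
gwh_in_kwh = 1_000_000  # 1 GWh = 1,000,000 kWh
per_household_kwh = gwh_in_kwh / 300  # ~3,333 kWh per household per year

print(f"{doublings:.1f} doublings")
print(f"{per_household_kwh:.0f} kWh per household per year")
```

The per-household result of roughly 3,300 kWh is in line with commonly cited averages for Dutch households, which is why 1 GWh is described as a year's consumption for 300 of them.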