New technologies could save AI
There is certainly potential for greater efficiency. Two developments in different fields give us reason to hope that we will be able to keep using AI systems in the future without having to build new nuclear power plants.
Development of new hardware
Put simply, AI hardware consumes a great deal of energy because data is constantly shuttled between processing and storage components: it is transferred from memory to the processing unit and then back again.
Researchers at the University of Minnesota Twin Cities have now developed a new technology that makes this exchange unnecessary. In the hardware called Computational Random-Access Memory (CRAM), the previously separate components merge into a single one. In other words, data processing takes place directly in the memory.
In scientific tests, CRAM technology proved to be 2,500 times more energy-efficient and 1,700 times faster than a conventional system with separate components.
The current state of development: the researchers are already in contact with leading companies in the semiconductor industry to scale up CRAM technology and bring it to market. The technology is already at an advanced stage, as it is the result of more than 20 years of research and interdisciplinary collaboration.
Development of new software
In language models and other artificial intelligence applications, algorithms rely on floating-point operations, which are computationally expensive. The same operations also serve as a measure of supercomputer performance, expressed in FLOPS (floating-point operations per second). The researchers at BitEnergy AI instead use simpler integer additions in their algorithm 'Linear-complexity multiplication' (L-Mul). Using this mathematical method, the researchers report an energy saving of 95 per cent in their experiments.
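The underlying idea can be illustrated with a small sketch. Note that this is not the L-Mul algorithm itself (the preprint adds further correction terms); it demonstrates the general principle of logarithmic approximate multiplication, long known as Mitchell's method: because an IEEE-754 float stores its exponent and mantissa side by side, adding two bit patterns as integers (and subtracting the exponent bias once) approximates a multiplication.

```python
import struct

def float_to_bits(x: float) -> int:
    """Reinterpret a float32 value as its raw 32-bit integer pattern."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def bits_to_float(b: int) -> float:
    """Reinterpret a 32-bit integer pattern as a float32 value."""
    return struct.unpack("<f", struct.pack("<I", b & 0xFFFFFFFF))[0]

# Exponent bias of float32 (127), shifted into the exponent field.
BIAS = 127 << 23

def approx_mul(a: float, b: float) -> float:
    """Approximate a * b for positive floats with one integer addition.

    This is Mitchell's logarithmic approximation, the principle behind
    addition-based multipliers -- not BitEnergy AI's exact L-Mul.
    """
    return bits_to_float(float_to_bits(a) + float_to_bits(b) - BIAS)

print(approx_mul(2.0, 4.0))  # exact for powers of two: 8.0
print(approx_mul(3.0, 5.0))  # approximately 15, within ~11 % relative error
```

Powers of two multiply exactly; in the worst case the relative error of this simple variant is around 11 per cent, which is why L-Mul adds a correction term to tighten the approximation for neural-network workloads.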
The current state of development: the research on the L-Mul algorithm has so far only been published as a preprint. A regular publication in a scientific journal, which requires peer review of the results, is still pending. In addition, special hardware that can fully exploit the potential of L-Mul is still needed before the method is market-ready.
Conclusion: AI providers should invest in efficiency
The global trend towards greater sustainability will continue in the face of climate change. Even the enormous hype surrounding artificial intelligence will not change this. On the contrary: the best AI systems are useless if the climate system tips.
Energy is already in short supply in many countries, and energy-intensive industries are under scrutiny. It is still within their power to become both future-proof and sustainable by investing more in efficiency. This raises the question:
What could be achieved with the resources earmarked for the construction of nuclear power plants if they were instead invested in AI efficiency technologies?
Text: Falk Hedemann