Simple because it matters.
Digitalisation & Technology, 01 August 2024
Artificial intelligence (AI) has quickly developed into a new technological cornerstone. AI is accelerating digitalisation, improving and expanding existing technologies and opening up almost unlimited possibilities for new developments. The catch: energy requirements are also almost unlimited. How can AI's hunger for energy be satisfied in the most climate-friendly way in times of global energy transition?
We have already reported on the high energy requirements of AI models. A lot has happened since then, but little has changed: while AI models are developing rapidly and constantly opening up new possibilities, their hunger for energy is still very high. And as new AI applications continue to come onto the market, the energy problem will continue to worsen.
Did you know that generating a single image with an AI tool consumes about as much energy as fully charging a smartphone battery?
According to the World Economic Forum, the computing power required to operate AI systems doubles approximately every 100 days. More computing power also means more energy. There are now various studies on the energy consumption of AI, but such estimates are difficult and vary accordingly. For example, AI's current global electricity consumption could already be roughly equivalent to the annual consumption of Switzerland.
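To get a feel for what a 100-day doubling period means, here is a rough back-of-the-envelope calculation. It is a simplification that assumes the pace is sustained, which is by no means guaranteed:

```python
# Back-of-the-envelope: if the computing power needed for AI doubles
# every 100 days, how quickly does it grow per year?
doubling_period_days = 100
days_per_year = 365

growth_per_year = 2 ** (days_per_year / doubling_period_days)
print(f"Growth per year: roughly {growth_per_year:.1f}x")              # ~12.6x

# Compounded over three years (purely illustrative):
print(f"Growth over three years: roughly {growth_per_year ** 3:.0f}x")  # just under 2,000x
```

Even if efficiency gains offset part of this, growth of that order of magnitude makes the scale of the challenge clear.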
More decisive, however, is the rate of growth: experts consider a tenfold increase in energy requirements by 2030 to be realistic. It remains unclear to what extent these estimates already account for the progressive integration of AI functions into digital systems such as smartphones, office applications, search engines and operating systems. In any case, the experts are urgently calling for the development of sustainable solutions to counteract the considerable ecological impact.
There are several possible solutions. They range from bioprocessors and underwater data centres to integrated eco-power plants and thermodynamic computing, and they fall into two groups: measures that can take effect immediately and long-term developments that should pave the way to a sustainable AI future.
The number of data centres has been growing for years. In an analysis, the Borderstep Institute for Innovation and Sustainability puts the figure at around 80 million worldwide. Due to increasing digitalisation and the hype surrounding AI, this figure is likely to rise further in the coming years. AI expert Ralf Herbrich from the Hasso Plattner Institute estimates that the share of data centres in global energy consumption could skyrocket from four to five percent today to up to 30 percent in the next few years.
In view of these figures, it quickly becomes clear that improving the energy efficiency of data centres would be a powerful lever. Data centres could save energy quickly and directly by throttling their performance, but that would significantly slow down the response times of AI applications. A different saving is more realistic, because data centres consume a great deal of energy for cooling.
Cooling is usually provided by air circulation systems, which absorb the heat generated and transfer it to a cooling medium, usually water. There have therefore already been attempts to build data centres directly under water. A huge underwater data centre is currently being planned in China that will have the computing power of 6 million conventional computers. It is due to go into operation in 2025 and will save up to 122 megawatt hours of electricity per year thanks to natural water cooling.
Forecast: How successful the project will be certainly also depends on the construction and maintenance costs, which are likely to differ from those of conventional data centres on land. It is more likely that underwater data centres will remain the exception, primarily suited to coastal conurbations where space is already at a premium.
The technological basis of today's computer systems is surprisingly old. The so-called ‘Von Neumann Architecture’ (VNA) goes back to the work of mathematician John von Neumann and was first published in 1945. In this computer architecture, the basic elements such as processors and memory work separately from each other, so they must constantly exchange data. They also process information digitally in the form of bits (0 and 1), perform calculations using logical operations and work deterministically.
The energy consumption of this classic architecture is extremely high, especially for AI applications that utilise large amounts of data.
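The core problem can be made tangible with a deliberately simplified sketch. The Python below is purely illustrative, not a real hardware simulation; the variable names and the way transfers are counted are our own assumptions for the example:

```python
# Toy illustration of the Von Neumann bottleneck: processor and memory
# are separate, so every operation means traffic over the memory bus.
# Here we simply count those transfers for an elementwise addition.
memory = {"a": [1, 2, 3, 4], "b": [5, 6, 7, 8], "result": [0, 0, 0, 0]}
bus_transfers = 0

for i in range(4):
    x = memory["a"][i]           # fetch first operand from memory
    y = memory["b"][i]           # fetch second operand from memory
    bus_transfers += 2
    memory["result"][i] = x + y  # write the result back to memory
    bus_transfers += 1

print(memory["result"])   # [6, 8, 10, 12]
print(bus_transfers)      # 12 bus transfers for just 4 additions
```

In real AI workloads, billions of such transfers happen every second, and moving the data often costs more energy than the arithmetic itself.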
Thermodynamic computing offers an alternative. Here, processors and memory work as a unit and utilise the principles of thermodynamics, i.e. the theory of heat and energy. Instead of purely logical operations, thermodynamic computing is based on physical processes and fluctuations at molecular level. It works with probabilities and not with deterministic states.
This computer architecture has several advantages in terms of energy efficiency: above all, the energy-intensive shuttling of data between processor and memory is largely eliminated, because both work as a unit.
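To make the contrast with deterministic bits tangible, here is a heavily simplified software sketch of a probabilistic bit, often called a "p-bit". Real thermodynamic hardware would exploit physical thermal fluctuations; the software random number generator here merely stands in for them:

```python
import random

def p_bit(bias: float) -> int:
    """Return 1 with probability `bias`, otherwise 0 (a software stand-in for a p-bit)."""
    return 1 if random.random() < bias else 0

# A deterministic bit always returns the value it was set to.
# Reading the same p-bit many times yields a distribution instead.
samples = [p_bit(bias=0.8) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.8, but never exactly fixed
```

Many AI methods work with probabilities anyway, which is one reason they are considered a good match for such hardware.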
Forecast: Thermodynamic computing has the potential to drastically improve the energy efficiency of computer systems, especially for AI applications.
Another exciting approach is neuromorphic computing. In contrast to thermodynamic computing, the source of inspiration here is not physics, but the human brain and nervous system. The basic principle:
Neuromorphic computing attempts to replicate the structure and functioning of the human brain in hardware. The aim is to develop computers that work as efficiently and adaptably as biological neural networks. Here too, the aim is to merge computing processes and information storage in order to overcome the energy-intensive data transfer of conventional computer architectures.
Neuromorphic chips with artificial synapses are modelled on the human brain in two respects. On the one hand, the brain is a biological supercomputer with unimaginable capacity. On the other hand, it gets by on only around 20 watts, an almost unimaginable level of energy efficiency.
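What "working like the brain" means in practice can be hinted at with a minimal sketch of a leaky integrate-and-fire neuron, the kind of simple model that neuromorphic chips implement directly in hardware. This is a conceptual Python illustration, not the programming model of any specific chip:

```python
# Minimal leaky integrate-and-fire neuron: it only "fires" when enough
# input has accumulated; between spikes it stays silent.
def run_neuron(inputs, threshold=1.0, leak=0.9):
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate input, with leakage
        if potential >= threshold:              # threshold reached: emit a spike
            spikes.append(1)
            potential = 0.0                     # reset after firing
        else:
            spikes.append(0)                    # no spike, no activity
    return spikes

print(run_neuron([0.3, 0.4, 0.5, 0.0, 0.0, 0.9, 0.3]))  # [0, 0, 1, 0, 0, 0, 1]
```

Because such a neuron only becomes active when a spike actually occurs, large parts of the chip can stay idle most of the time, which is where much of the energy saving comes from.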
Forecast: Neuromorphic computing is already being used in experimental chips and is being intensively developed further in a research project at the Jülich Research Centre.
The sustainable energy supply of data centres has so far been a secondary concern at best. As demand increases and the energy transition progresses, the requirements profile for new data centres, which are urgently needed for AI systems, is changing.
A sustainable data centre is therefore currently being built in Mainz, very close to the important internet hub of Frankfurt am Main. The operator, Kraftwerke Mainz Wiesbaden (KMW), is not an Internet company, but an energy supplier. The data centre is supplied with 100 percent green electricity from KMW's own wind turbines. In addition, the waste heat is not released into the environment as usual, but is fed into the local district heating network.
This one data centre alone will produce up to 60 MW of waste heat, which can be used to heat numerous households in the surrounding area. By way of comparison: the world's largest large-scale heat pump, currently being built in Esbjerg, Denmark, will also deliver 60 MW of heating output and supply up to 25,000 households.
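A quick cross-check of these figures, as a rough average that ignores seasonal peaks and network losses:

```python
# Rough plausibility check: 60 MW of heating output for up to 25,000
# households implies the following average output per household.
heat_output_mw = 60
households = 25_000

per_household_kw = heat_output_mw * 1_000 / households
print(f"about {per_household_kw:.1f} kW per household on average")  # ~2.4 kW
```

An average of around 2.4 kW per household is a plausible order of magnitude for district heating, so the comparison holds up.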
This is a great opportunity for the future: if all data centres fed their waste heat into district heating networks, the carbon footprint of AI systems would improve as well, and the energy transition would receive a significant boost at the same time. A smart solution, isn't it?
Text: Falk Hedemann
Your opinion
If you would like to share your opinion on this topic with us, please send us a message to: next@ergo.de