by Francesco Gattei
We have discovered that even creative thinking can be “externalized.” But artificial intelligence brings a large and unpredictable increase in energy and water consumption
The history of humanity is a progressive sequence of “externalization” of biological functions. With fire, we have delegated part of the digestive process to the flame; this has allowed us to free up hours of the day that would otherwise be spent slowly absorbing raw food—look at the resting times of big cats. Later modifications of the jaw and skull favored the development of new areas of the brain dedicated to creativity.
With the industrial revolution’s machines, we performed many activities using tools that replaced raw human exertion. We also learned to do new things, like flying and traveling over land and sea at speeds unmatched by any living being. At each step, we took effortful action out of the realm of the human body to better satisfy our daily needs.
Humans are inherently lazy creatures who have spent thousands of years working out ways to increase our leisure time. We exert ourselves less and consume fewer calories, but we increase the energy consumption of the objects that replace us, which is why we increasingly exploit the environment around us.
Today, we’re taking another step in replacing our efforts: we’ve discovered that even thinking can be outsourced. At first, this applied only to tedious calculations — we outsourced them to computers decades ago. More recently, we’ve begun to externalize even parts of our creative thinking.
The new intelligences are still in their infancy, but they’re already leaders in a wide range of activities. They can pass exams for neurosurgeons and sommeliers (without the practical test, of course!), and excel at stock trading and logistics. They write articles, compose songs (though unlistenable), and create films (albeit unwatchable).
In chess, they’ve been defeating us for twenty-five years, but only recently have they begun to display the hallmarks of creative thought.
They’re now practicing to beat us at Diplomacy, a board game requiring dialog and persuasion between players to form alliances. These AIs learn on their own, developing neural networks through continuous simulations and progressively focusing on key content.
GPT (Generative Pre-trained Transformer) is the most popular family of artificial intelligence models: it can compose text, answer questions, and generate images. The underlying transformer architecture was introduced in 2017, the first GPT model followed in 2018, and the family has already reached its fourth iteration.
These systems develop progressively through self-learning (pre-training) and subsequent fine-tuning with human support. However, this process comes at a huge energy cost: new thinking machines must study and refine themselves to be useful. They then develop their logical paths during the usage (inference) phase. The two phases have different energy intensities: early learning is effortful and energy intensive. The later phase is less so.
During pre-training, the load falls on CPUs (central processing units) and TPUs (tensor processing units). The energy consumed to train a new model in this phase can be enormous, potentially running to a few hundred megawatt-hours, equivalent to the power usage of a small city for several weeks.
During inference, when the artificial intelligence answers user questions, energy consumption is lower and falls mainly on CPUs and GPUs (graphics processing units, which accelerate algorithm processing). It remains quite high, however, because billions of bits of information must be analyzed and filtered. Today this amounts to roughly a million requests per day, with an estimated consumption of around 200-300 kWh daily. That number is destined to increase dramatically as the technology spreads and is applied to more diverse uses.
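Taking the round numbers above at face value, the implied energy per answered request is a few tenths of a watt-hour. A minimal back-of-envelope sketch, assuming the quoted figures of one million requests and 200-300 kWh per day (estimates from the text, not measurements):

```python
# Back-of-envelope: per-request inference energy from the figures above.
# Assumed inputs (quoted in the text): ~1 million requests/day, 200-300 kWh/day.
requests_per_day = 1_000_000
daily_kwh_low, daily_kwh_high = 200, 300

# Convert kWh to Wh, then divide by the daily request volume.
wh_low = daily_kwh_low * 1000 / requests_per_day    # 0.2 Wh per request
wh_high = daily_kwh_high * 1000 / requests_per_day  # 0.3 Wh per request
print(f"{wh_low:.1f}-{wh_high:.1f} Wh per request")
```

A few tenths of a watt-hour sounds tiny per answer; the problem is the multiplication by billions of answers.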
Artificial intelligence is not very efficient compared to the human brain, which needs just 0.4 kWh per day while engaged in millions of intellectual and muscular decisions, including unconscious ones. However, the most surprising data emerges from comparing AI to other traditional software applications. During the learning process, AI consumes up to 100,000 times more energy than traditional software. Even in the inference phase, AI uses between 10 and 100 times more energy, depending on the tasks required. According to the IEA, data centers consume 2 percent of global power each year—460 TWh, equivalent to the electricity consumption of a country like Germany. This is expected to double in two to three years due to the rise of artificial intelligence. And, contrary to the narrative that digital equals immateriality and environmental friendliness, these machines also contribute to global warming.
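The brain figure cited above can be checked with the same arithmetic: a brain drawing about 20 watts (a common physiological estimate, not a figure from this article) uses roughly 0.48 kWh per day, in line with the 0.4 kWh quoted. A sketch comparing that with the IEA's 460 TWh data-center figure:

```python
# The human brain's continuous power draw is commonly estimated at ~20 W.
brain_watts = 20
brain_kwh_per_day = brain_watts * 24 / 1000  # ~0.48 kWh/day, near the 0.4 kWh cited

# IEA figure quoted above: 460 TWh/year across all data centers.
datacenter_kwh_per_day = 460e9 / 365  # 460 TWh = 460e9 kWh

# How many always-on human brains would match that power budget?
equivalent_brains = datacenter_kwh_per_day / brain_kwh_per_day
print(f"{equivalent_brains:.2e} brain-equivalents")
```

By this rough measure, today's data centers already draw as much power as a few billion human brains, before the AI-driven doubling the IEA expects.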
Today, the IT sector already has a significant carbon footprint, equal to 2 percent of global emissions, about the same as air travel. This impact is set to increase, especially since computing tolerates intermittency poorly: servers must run around the clock, which limits how much of the load variable power sources can carry. Moreover, the heat generated has to be dissipated, which is why, according to some estimates, data centers consume up to 500-700 billion liters of water per year, an amount expected to increase by the end of the decade.
We’ll surely find ways to make things work more efficiently, but the power these systems will need is growing faster than we could have guessed. Humankind is creating new (non-living) things that need food (power) and drink (water). If we think about a world where 8 billion brains will grow to 10 billion by mid-century, all hoping for a better life, we’re only seeing part of the picture. On top of those billions, we’ll have millions of man-made brains learning to do everything: bargain, sell, help us, take pictures, write, and drive. And perhaps even pretend to love us.
Anyone hoping to cut energy use in the coming decades through heroic gains in efficiency is likely missing some pieces of the puzzle. They’ll probably need to rethink the energy sources that can support these ever-growing electronic and biological populations.
P.S.: This article was composed over three days with the help of artificial intelligence, thereby amplifying the writer’s energy consumption.