How much does intelligence cost?

There is too much hype about artificial intelligence (AI), which is not really intelligence at all. Intelligence is behavioral flexibility in the pursuit of values and preferences; the algorithms out there learn, but once trained they are not flexible, much less do they have values or preferences, because they feel nothing.

They may even act, but they don't feel the consequences of their actions in their flesh, or in their transistors. They don't suffer or rejoice in the results of their actions, much less in anticipation. And if they don't suffer or rejoice, they don't care about the future. And there is nothing less human than not caring about the future.

But the less-talked-about problem is that artificial intelligence is stupidly expensive. Yes, all new technology tends to be expensive, and that currently pays off in the case of AI because there is enormous value in advancing our knowledge of the circumstances in which neural networks can learn to find meaning in mountains of unlabeled data.

But the cost in this case is not measured solely in the dollars that pay for salaries, infrastructure, and electricity. The energy cost of AI is, in itself, exorbitant, especially next to the negligible cost of running our natural intelligence. AI may be effective, but it is not efficient.

Simple arithmetic is enough to make a meaningful comparison. For starters, the human brain operates at a power (that is, a rate of energy use) of 22 watts, or 22 joules per second, which translates to about 450 kcal per day in more familiar units. That is very, very little: a laptop operates at around 50 watts, which is also impressive considering that a washing machine operates at ten times that power, around 500 watts.
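For readers who want to check the conversion, here is a minimal back-of-envelope sketch in Python, using only the rounded figures cited above:

```python
# Check: 22 watts of brain power expressed as kcal per day.
BRAIN_POWER_W = 22            # joules per second, as cited above
SECONDS_PER_DAY = 24 * 3600   # 86,400 seconds
JOULES_PER_KCAL = 4184        # 1 kcal = 4,184 joules

joules_per_day = BRAIN_POWER_W * SECONDS_PER_DAY   # ~1.9 million joules
kcal_per_day = joules_per_day / JOULES_PER_KCAL    # ~454 kcal

print(f"{kcal_per_day:.0f} kcal/day")  # roughly the 450 kcal/day cited above
```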

But this arithmetic is fallacious, because it ignores that an adult human brain only does what it does at a measly 22 watts because it has spent literally years being trained by trial and error. Let's say that by the time you can read a written test question and answer it with a written paragraph, for example, that's 15 years of training for the neural networks in the human brain, or a total of about 3,000 kWh of energy used over 5,475 days of running around the clock at 22 W.
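The same rough arithmetic recovers the 15-year figure; the 15 years and the constant 22 W are the column's own assumptions:

```python
# Energy used by a brain running nonstop at 22 W for 15 years.
BRAIN_POWER_KW = 0.022        # 22 W expressed in kilowatts
HOURS_PER_DAY = 24
DAYS_IN_15_YEARS = 15 * 365   # 5,475 days

kwh_total = BRAIN_POWER_KW * HOURS_PER_DAY * DAYS_IN_15_YEARS
print(f"{kwh_total:.0f} kWh")  # ~2,900 kWh, i.e. the "about 3,000 kWh" above
```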

In comparison, in the mere two months it took Google to train PaLM, its language-generating AI, the undertaking used no less than 3,400,000 kWh of energy, enough to keep 300 homes running for a year. In other words, it is a thousand times cheaper to put questions to a human being than to an artificial intelligence that is not even that intelligent, that does not care about your reaction to the answer, and that will generate several completely wrong answers, leaving you to separate the wheat from the chaff anyway.
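And the final comparison, as a sketch: the 3,400,000 kWh is the figure cited above for PaLM, the human figure comes from the previous calculation, and the 11,000 kWh per household per year is an assumed typical value used only to recover the "300 homes" estimate:

```python
# Ratio of PaLM's reported training energy to a brain's 15-year "training".
PALM_TRAINING_KWH = 3_400_000    # figure cited above for PaLM
HUMAN_TRAINING_KWH = 2_900       # from the previous calculation
KWH_PER_HOME_PER_YEAR = 11_000   # assumption: a typical household's annual use

print(f"ratio: {PALM_TRAINING_KWH / HUMAN_TRAINING_KWH:.0f}x")
# ~1,200x, i.e. on the order of a thousand times more energy

print(f"homes powered for a year: {PALM_TRAINING_KWH / KWH_PER_HOME_PER_YEAR:.0f}")
# ~300 homes
```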

I think I'll stick to asking people questions.

Excerpt from Suzana Herculano-Houzel (2025) Neuroscience of Everyday Life, originally published in Folha de São Paulo in July 2023.
