The human brain is a marvel of nature: an extraordinarily complex organ that runs on very little energy. Generative AI models, by contrast, are highly energy-intensive. This gap in energy consumption is a serious problem, and one we need to address if we want a sustainable future for AI.
The human brain comprises billions of neurons yet operates on just 20 watts of power, less than a standard lightbulb. It manages this because it processes information extremely efficiently: electrical signals pass between neurons in a sparse, asynchronous manner, so the brain expends energy only when needed and wastes none on unnecessary computation.
Generative AI models, on the other hand, are very energy-intensive. Training a single generative model can require as much energy as several flights around the globe. And even after training is finished, these models continue to draw substantial power during inference, because they run constantly on powerful computers that must also be kept cool.
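To make the scale of this gap concrete, here is a rough back-of-envelope comparison. The 20-watt figure for the brain comes from the text above; the GPU power draw, cluster size, and training duration are purely illustrative assumptions chosen for the sketch, not measurements of any real training run.

```python
# Back-of-envelope energy comparison: human brain vs. one large training run.
# Only BRAIN_POWER_W is taken from the text; the rest are illustrative
# assumptions, not measured values for any particular model.

BRAIN_POWER_W = 20      # ~20 W, as cited above
GPU_POWER_W = 400       # assumed draw of one data-center GPU
NUM_GPUS = 1000         # assumed cluster size for a large training run
TRAINING_DAYS = 30      # assumed length of the run

HOURS_PER_DAY = 24

# Daily energy budget of a brain, in kilowatt-hours.
brain_kwh_per_day = BRAIN_POWER_W * HOURS_PER_DAY / 1000

# Total energy of the hypothetical training run, in kilowatt-hours.
training_kwh = GPU_POWER_W * NUM_GPUS * TRAINING_DAYS * HOURS_PER_DAY / 1000

# How many days a brain could run on that same energy budget.
equivalent_brain_days = training_kwh / brain_kwh_per_day

print(f"Brain: {brain_kwh_per_day:.2f} kWh/day")
print(f"Training run: {training_kwh:,.0f} kWh")
print(f"Equivalent: {equivalent_brain_days:,.0f} brain-days of energy")
```

Under these assumptions, a single month-long training run consumes the energy a brain would use over hundreds of thousands of days, which illustrates the disparity even if the exact inputs vary by orders of magnitude between real models.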