AI has a terrible energy problem. It’s about to hit crisis point

Every ‘foundation’ model – GPT-4, Claude 2, LLaMA 2 and the like – burns through massive computational and energy resources in its training. And those models are already multiplying. From COSMOS:

Unless something knocks us off this path, it’s reasonable to expect that by around 2030 there will be more than a billion people using AI day-to-day in their work, and perhaps another 3 or 4 billion using it via their smartphones (or smartwatches, or smart glasses) for more quotidian assistance.

That’s a lot of requests flowing into these AI systems, a lot of data – and a lot of power.

Artificial intelligence is both mathematically and computationally intense. The number of operations an AI chatbot must perform to generate even a single word of its response to a user ‘prompt’ can run into the trillions.
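To put a rough number on that claim: a common rule of thumb holds that a transformer needs about 2 floating-point operations per model parameter to generate each token. The minimal sketch below applies that rule to a few model sizes – GPT-4’s parameter count is not public, so a hypothetical 1-trillion-parameter model stands in.

```python
# Back-of-envelope estimate of inference cost, assuming the common
# ~2 FLOPs-per-parameter-per-token rule of thumb for a transformer
# forward pass (roughly one multiply and one add per weight).

def flops_per_token(n_params: float) -> float:
    """Approximate floating-point operations to generate one token."""
    return 2 * n_params

models = [
    ("LLaMA 2 70B", 70e9),
    ("GPT-3 175B", 175e9),
    ("Hypothetical 1T-parameter model", 1e12),  # GPT-4's size is not public
]

for name, params in models:
    print(f"{name}: ~{flops_per_token(params):.1e} FLOPs per token")
```

On that arithmetic, a trillion-parameter model crosses into trillions of operations for every single token it emits – before counting the energy cost of moving all those weights through memory.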

Read my analysis here.
