
Cracking the Code on AI’s Energy: GPT-5’s Consumption Raises Alarm

by admin477351
Picture Credit: www.heute.at

As OpenAI’s GPT-5 makes waves for its advanced capabilities, a parallel story is unfolding about its voracious energy appetite. With the company offering no official data, independent researchers are providing the first glimpses into the model’s massive power demands. Their findings reveal that the new model’s enhanced intelligence comes at a significant and unprecedented environmental cost, challenging the industry’s narrative of clean, innovative progress.
The numbers are difficult to ignore. According to a team at the University of Rhode Island’s AI lab, a medium-length response from GPT-5 consumes an average of 18 watt-hours, a dramatic increase over previous models and, in the researchers’ words, “significantly more energy than GPT-4o.” To put the scale in perspective, 18 watt-hours is enough to keep a standard 60-watt incandescent bulb lit for 18 minutes. And given that ChatGPT handles billions of requests daily, GPT-5’s total daily energy consumption could reach the electricity demand of 1.5 million US homes, an astonishing figure that highlights the growing environmental footprint of AI.
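For readers who want to check the arithmetic, the rough sketch below reproduces those comparisons. Only the 18 watt-hours per response comes from the URI estimate quoted above; the 60-watt bulb rating, the 2.5 billion daily requests, and the 30 kWh average daily household consumption are illustrative assumptions, since the article states only “billions of requests daily.”

```python
# Back-of-envelope check of the figures above. The per-response energy (18 Wh)
# is the URI estimate quoted in the article; the bulb wattage, daily request
# volume, and average US household consumption are assumptions added here
# purely for illustration.

WH_PER_RESPONSE = 18        # watt-hours per medium-length GPT-5 response (URI estimate)
BULB_WATTS = 60             # assumed: a standard incandescent bulb
REQUESTS_PER_DAY = 2.5e9    # assumed: "billions of requests daily"
HOME_KWH_PER_DAY = 30       # assumed: average US household daily electricity use

# 18 Wh keeps a 60 W bulb lit for 18/60 of an hour, i.e. 18 minutes.
bulb_minutes = WH_PER_RESPONSE / BULB_WATTS * 60

# Total daily draw, and the number of average homes that energy could supply.
total_gwh_per_day = WH_PER_RESPONSE * REQUESTS_PER_DAY / 1e9          # Wh -> GWh
homes_equivalent = (WH_PER_RESPONSE * REQUESTS_PER_DAY) / (HOME_KWH_PER_DAY * 1e3)

print(f"One response ≈ a {BULB_WATTS} W bulb for {bulb_minutes:.0f} minutes")
print(f"Daily total ≈ {total_gwh_per_day:.0f} GWh, "
      f"or about {homes_equivalent / 1e6:.1f} million average US homes")
```

Under those assumptions the totals land at roughly 45 GWh per day, which is where the 1.5-million-homes comparison comes from.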
The main reason for this jump in energy usage is the model’s size. While OpenAI has not disclosed GPT-5’s parameter count, experts believe it is “several times larger than GPT-4.” This is consistent with a study from the French AI company Mistral, which found a “strong correlation” between a model’s size and its energy consumption: the larger the model, the bigger its environmental impact. Applied to a model believed to be several times larger than its predecessors, that principle suggests the pursuit of ever-more-powerful AI will come at an escalating environmental price.
The architectural design of GPT-5 also plays a significant role. Its new reasoning mode and multimodal capabilities (handling video and images) demand more computation and longer processing times. A professor studying the resource footprint of AI models noted that using the reasoning mode could increase resource usage by a factor of “five to 10” for the same answer. The model does employ a “mixture-of-experts” architecture, which activates only a subset of its parameters for each query, to improve efficiency, but these new features likely cancel out those gains, making the model a major consumer of power. This raises serious questions about the long-term sustainability of the AI industry and the need for greater transparency.
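As a purely illustrative extension of the earlier arithmetic, the sketch below applies the quoted “five to 10” reasoning-mode multiplier to a per-response baseline. The 18 Wh baseline reuses the URI estimate; the article does not say whether that measurement already includes reasoning-mode queries, so this is only a sense of scale, not a measured figure.

```python
# Illustrative only: apply the quoted "five to 10x" reasoning-mode multiplier
# to a baseline per-response energy figure. Whether the 18 Wh URI estimate
# already reflects reasoning-mode traffic is not stated in the article, so the
# baseline here is an assumption used purely to convey scale.

BASELINE_WH = 18                # assumed non-reasoning baseline per response
REASONING_MULTIPLIER = (5, 10)  # range quoted for the reasoning mode

low, high = (BASELINE_WH * m for m in REASONING_MULTIPLIER)
print(f"Reasoning-mode response: roughly {low}-{high} Wh per answer")
# -> roughly 90-180 Wh, i.e. a 60 W bulb burning for 1.5 to 3 hours
```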
