OpenAI’s latest model, GPT-5, is generating headlines for its remarkable abilities, but a growing chorus of experts is raising a critical concern: its potentially massive energy consumption. While the company has been notably quiet on the issue, experts argue that the model’s enhanced features, such as its ability to build websites and solve PhD-level problems, come with a steep and unprecedented environmental cost. This lack of transparency from a leading AI firm is prompting tough questions about the industry’s commitment to sustainability.
A key piece of evidence for these concerns comes from a study at the University of Rhode Island’s AI lab, which found that generating a medium-length response of around 1,000 tokens with GPT-5 consumes an average of 18 watt-hours, a significant jump from prior models. For context, 18 watt-hours is roughly what a 60-watt incandescent light bulb consumes in 18 minutes. With a service like ChatGPT fielding billions of requests daily, the total energy consumption could be staggering, potentially rivaling the daily electricity demand of millions of homes.
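The scale of that claim is easy to check with rough arithmetic. The sketch below uses the study’s 18 Wh-per-response figure; the daily request volume and the average household consumption are illustrative assumptions, not numbers from OpenAI or the study:

```python
# Back-of-envelope estimate of a GPT-5-scale service's daily energy use.
# Assumptions (illustrative):
#   - 18 Wh per medium-length (~1,000-token) response, per the
#     University of Rhode Island figure cited above
#   - 2.5 billion queries per day (hypothetical request volume)
#   - 30 kWh/day average household electricity use (rough figure)

WH_PER_QUERY = 18
QUERIES_PER_DAY = 2.5e9
HOUSEHOLD_WH_PER_DAY = 30_000

total_wh = WH_PER_QUERY * QUERIES_PER_DAY          # 4.5e10 Wh = 45 GWh
equivalent_homes = total_wh / HOUSEHOLD_WH_PER_DAY  # ~1.5 million homes

print(f"Total daily energy: {total_wh / 1e9:.1f} GWh")
print(f"Equivalent households: {equivalent_homes:,.0f}")
```

Under these assumptions the service draws about 45 GWh per day, on the order of 1.5 million homes, which is consistent with the “millions of homes” framing once the request volume climbs into the billions.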
The surge in energy use is directly linked to the model’s increased size and complexity. Experts believe GPT-5 is substantially larger than its predecessors, with far more parameters. This aligns with research from the French AI company Mistral, which found a strong correlation between a model’s size and its energy consumption: a model ten times larger has an environmental impact roughly an order of magnitude greater. That principle appears to hold for GPT-5, with some specialists suggesting its resource use could be “orders of magnitude higher” than even GPT-3’s.
The issue is compounded by the new model’s architecture. While it does utilize a “mixture-of-experts” system to improve efficiency, its advanced reasoning capabilities and its ability to process video and images likely cancel out those gains. The “reasoning mode,” which has the model compute for longer before generating a response, could make its energy footprint several times that of text-only operations. This combination of size, complexity, and advanced features paints a clear picture of an AI system with an immense appetite for power, leading to urgent calls for greater transparency from OpenAI and the wider AI industry.
