ChatGPT, OpenAI's chatbot platform, may not be as power-hungry as once assumed. But its appetite depends largely on how ChatGPT is being used and on the AI models answering the queries, according to a new study.
A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited stat is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.
Epoch believes that's an overestimate.
Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes around 0.3 watt-hours, less than many household appliances.
"The energy use is really not a big deal compared to using normal appliances, heating or cooling your home, or driving a car," Joshua You, the data analyst at Epoch who conducted the analysis, told TechCrunch.
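To put those figures side by side, here's a quick back-of-the-envelope sketch in Python. The per-query numbers come from the article itself; the appliance wattages (a 10 W LED bulb, a 1 kW microwave) are illustrative assumptions, not part of Epoch's analysis.

```python
# Back-of-the-envelope energy comparison (figures in watt-hours).
# Per-query numbers are from the article; the appliance wattages
# below are rough illustrative assumptions, not Epoch's figures.

COMMON_CLAIM_WH = 3.0    # widely cited estimate per ChatGPT query
EPOCH_ESTIMATE_WH = 0.3  # Epoch's GPT-4o-based estimate per query
GOOGLE_SEARCH_WH = COMMON_CLAIM_WH / 10  # implied by the "10x a Google search" claim

LED_BULB_W = 10     # assumed LED bulb draw, watts
MICROWAVE_W = 1000  # assumed microwave draw, watts

queries_per_led_hour = (LED_BULB_W * 1) / EPOCH_ESTIMATE_WH
queries_per_microwave_minute = (MICROWAVE_W / 60) / EPOCH_ESTIMATE_WH

print(f"Implied energy per Google search: {GOOGLE_SEARCH_WH:.1f} Wh")
print(f"Queries equal to 1 hour of a 10 W LED bulb: {queries_per_led_hour:.0f}")
print(f"Queries equal to 1 minute of a 1 kW microwave: {queries_per_microwave_minute:.0f}")
```

One detail the arithmetic surfaces: Epoch's 0.3 Wh estimate for a ChatGPT query lands on the same figure the "10 times a Google search" claim implies for a single Google search.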
AI's energy usage, and its environmental impact more broadly, is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don't deplete natural resources or force utilities to rely on nonrenewable sources of energy.
You told TechCrunch that his analysis was spurred by what he characterized as outdated previous research. He pointed out, for example, that the author of the report that arrived at the 3 watt-hours estimate assumed OpenAI used older, less efficient chips to run its models.
"I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy that was going to AI today," You said. "Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high."
Granted, Epoch's 0.3 watt-hours figure is an approximation as well; OpenAI hasn't published the details needed to make a precise calculation.
The analysis also doesn't consider the additional energy costs incurred by ChatGPT features like image generation or input processing. You acknowledged that "long input" ChatGPT queries, such as queries with long files attached, likely consume more electricity upfront than a typical question.
You said he does expect baseline ChatGPT power consumption to rise, however.
"[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely, handling much more tasks, and more complex tasks, than how people use ChatGPT today," he said.
While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive an enormous expansion of infrastructure. In the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a RAND report. By 2030, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
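For a sense of what those power figures mean as energy, here's a rough conversion sketch. Note the RAND numbers cited above are power (GW), not energy; the 90-day run length is a hypothetical assumption chosen purely for illustration.

```python
# Rough scale check on the RAND report's figures.
CALIFORNIA_2022_GW = 68   # total 2022 power capacity the report compares against
FRONTIER_TRAINING_GW = 8  # projected draw of a 2030 frontier training run

# Hypothetical assumption: a training run lasting 90 days at full draw.
TRAINING_DAYS = 90
energy_twh = FRONTIER_TRAINING_GW * TRAINING_DAYS * 24 / 1000  # GW * hours -> TWh

print(f"Share of California's 2022 capacity: {FRONTIER_TRAINING_GW / CALIFORNIA_2022_GW:.0%}")
print(f"Energy over a {TRAINING_DAYS}-day run: {energy_twh:.1f} TWh")
```

Under those assumptions, a single frontier training run would draw roughly 12% of California's 2022 capacity continuously and consume on the order of 17 TWh.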
ChatGPT alone reaches an enormous, and expanding, number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects in the coming years.
OpenAI's attention, along with the rest of the AI industry's, is also shifting to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. As opposed to models like GPT-4o, which respond to queries nearly instantaneously, reasoning models "think" for seconds to minutes before answering, a process that soaks up more computing, and thus power.
"Reasoning models will increasingly take on tasks that older models can't, and generate more [data] to do so, and both require more data centers," You said.
OpenAI has begun to release more power-efficient reasoning models like o3-mini. But it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power demands from reasoning models' "thinking" process and growing AI usage around the world.
You suggested that people worried about their AI energy footprint use apps such as ChatGPT infrequently, or select models that minimize the computing necessary, to the extent that's realistic.
"You could try using smaller AI models like [OpenAI's] GPT-4o mini," You said, "and sparingly use them in a way that requires processing or generating a ton of data."
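For developers, that advice amounts to a one-line model choice. Below is a minimal sketch using the OpenAI Python SDK; the prompt, the `max_tokens` cap, and the assumption that an `OPENAI_API_KEY` environment variable is set are all illustrative, not anything Epoch or You prescribes.

```python
# Minimal sketch: pick a smaller model and cap output length.
# Assumes the openai package (v1+) is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # smaller model, less compute per query than gpt-4o
    messages=[{"role": "user", "content": "Summarize this in two sentences: ..."}],
    max_tokens=100,  # cap the output so the model doesn't generate a ton of data
)

print(response.choices[0].message.content)
```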