ChatGPT may not be as power-hungry as once assumed

ChatGPT, OpenAI’s chatbot platform, may not be as power-hungry as once assumed. But its appetite largely depends on how ChatGPT is used and on the AI models answering the queries, according to a new study.

A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited statistic is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.

Epoch believes that this is an overestimation.

Using OpenAI’s latest default model for ChatGPT, GPT-4o, as a reference, Epoch found the average ChatGPT query consumes around 0.3 watt-hours, less than many household appliances.

“The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car,” Joshua You, the data analyst at Epoch who conducted the analysis, told WAN.
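
For a rough sense of scale, here is a minimal back-of-envelope sketch in Python. The appliance wattages are typical values assumed for illustration; only the 0.3 watt-hour and 3 watt-hour figures come from the article.

```python
# Illustrative arithmetic only: compares Epoch's ~0.3 Wh-per-query estimate
# with rough, typical household energy uses (assumed values, not Epoch data).

WH_PER_QUERY = 0.3  # Epoch AI's estimate for an average GPT-4o query

comparisons_wh = {
    "LED bulb running 1 hour (10 W)": 10.0,
    "Laptop running 1 hour (~50 W)": 50.0,
    "Microwave running 5 minutes (~1,000 W)": 1000 * 5 / 60,
    "Older 3 Wh-per-query estimate": 3.0,
}

for name, wh in comparisons_wh.items():
    queries = wh / WH_PER_QUERY
    print(f"{name}: {wh:.1f} Wh, roughly {queries:.0f} ChatGPT queries")
```

By this arithmetic, a single hour of laptop use costs about as much energy as roughly 167 average queries, which is the sense in which You calls per-query consumption “not a big deal.”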

AI’s energy consumption, and its environmental impact more broadly, is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of more than 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don’t deplete natural resources and force utilities to rely on nonrenewable sources of energy.

You told WAN that his analysis was spurred by what he characterized as outdated previous research. He pointed out, for example, that the author of the report that arrived at the 3 watt-hours estimate assumed OpenAI used older, less efficient chips to run its models.

[Image: Epoch AI’s ChatGPT energy consumption analysis. Image Credits: AI era]

“I’ve seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn’t really accurately describe the energy that was going to AI today,” You said. “Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high.”

Admittedly, Epoch’s 0.3 watt-hours figure is an approximation as well; OpenAI hasn’t published the details needed to make a precise calculation.

The analysis also doesn’t consider the additional energy costs incurred by ChatGPT features like image generation or input processing. You acknowledged that “long input” ChatGPT queries (queries with long files attached, for instance) likely consume more electricity upfront than a typical question.

That said, You expects baseline ChatGPT power consumption to rise.

“[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensively, handling many more tasks, and more complex tasks, than how people use ChatGPT today,” You said.

Although there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive an enormous, power-hungry infrastructure expansion. In the next two years, AI data centers may need close to all of California’s power capacity (68 GW), according to a Rand report. By 2030, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
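
To make the gap between today’s per-query cost and that projected capacity concrete, here is a hedged sketch: the daily query volume below is a hypothetical assumption, and only the 0.3 watt-hour and 68 GW figures come from the article.

```python
# Illustrative arithmetic only. QUERIES_PER_DAY is a made-up round number;
# the 0.3 Wh/query and 68 GW figures are the ones cited in the article.

WH_PER_QUERY = 0.3      # Epoch AI's per-query estimate
QUERIES_PER_DAY = 1e9   # hypothetical daily query volume (assumption)

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh per day
avg_power_mw = daily_mwh / 24                     # continuous draw in MW

print(f"Inference at this volume: {daily_mwh:,.0f} MWh/day, "
      f"about {avg_power_mw:,.1f} MW of continuous power")
print(f"That is {avg_power_mw / 68_000:.4%} of the 68 GW capacity "
      f"the Rand report projects AI data centers may need")
```

Even at a billion queries a day, chat inference alone would account for a tiny fraction of the projected buildout, which is consistent with the article’s point that training and far heavier future usage, not today’s queries, drive the infrastructure demand.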

ChatGPT alone reaches an enormous, and expanding, number of people, making its server demands correspondingly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects in the coming years.

OpenAI’s attention, along with the rest of the AI industry’s, is also shifting to reasoning models, which are generally more capable in terms of the tasks they can perform but require more computing to run. Unlike models such as GPT-4o, which respond to queries almost instantaneously, reasoning models “think” for seconds to minutes before answering, a process that consumes more computing, and thus more power.

“Reasoning models will increasingly take on tasks that older models can’t, and generate more [data] to do so, and both require more data centers,” You said.

OpenAI has begun releasing more power-efficient reasoning models like o3-mini. But it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power requirements of reasoning models’ “thinking” process and growing AI usage around the world.

For people worried about their AI energy footprint, You suggested using apps such as ChatGPT infrequently, or selecting models that minimize the computing required, to the extent that’s realistic.

“You could try using smaller AI models like [OpenAI’s] GPT-4o mini,” You said, “and sparingly use them in a way that requires processing or generating a ton of data.”
