- cross-posted to:
- technology@lemmy.zip
For one month beginning on October 5, I ran an experiment: Every day, I asked ChatGPT 5 (more precisely, its “Extended Thinking” version) to find an error in “Today’s featured article”. In 28 of these 31 featured articles (90%), ChatGPT identified what I considered a valid error, often several. I have so far corrected 35 such errors.


Had to look up ChatGPT’s energy usage because you made me curious.
Seems like OpenAI claims GPT-4o uses about 0.34 Wh per “query.” This is apparently consistent with third-party estimates. The average Google search is about 0.03 Wh, for reference.
The issue is that “query” isn’t defined, and it’s possible this figure is the energy consumption of the GPUs alone, omitting the other sources that make up the full picture (power conversion losses, cooling, infrastructure, etc.). It’s also unclear whether the figure covers inference only or also amortizes the energy spent on training the model.
I also briefly saw that GPT-5 uses between 18-40 Wh per query, so roughly 50-120x more than GPT-4o. The OP used GPT-5.
It sounds like the energy consumption is relatively bad no matter how it’s spun, but consider that it replaces other forms of compute and reduces workload for people, so the net energy tradeoff may not be that bad. Consider the task from the OP: how much longer, or how many more people, would it take to accomplish what GPT-5 and the lone author accomplished? I bet the net energy difference isn’t far from zero.
Here’s the article I found: https://towardsdatascience.com/lets-analyze-openais-claims-about-chatgpt-energy-use/
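The ratios above can be checked with a quick sketch (all figures are the thread’s estimates, not measurements; the GPT-5 range in particular is unverified):

```python
# Rough per-query energy comparison using the figures cited in this thread.
GOOGLE_SEARCH_WH = 0.03      # average Google search (estimate)
GPT4O_QUERY_WH = 0.34        # OpenAI's claimed GPT-4o figure
GPT5_QUERY_WH = (18, 40)     # reported range for GPT-5

print(f"GPT-4o vs Google search: {GPT4O_QUERY_WH / GOOGLE_SEARCH_WH:.0f}x")
lo, hi = (w / GPT4O_QUERY_WH for w in GPT5_QUERY_WH)
print(f"GPT-5 vs GPT-4o: {lo:.0f}x to {hi:.0f}x")
```

which puts GPT-4o at around 11x a Google search, and GPT-5 at roughly 53-118x GPT-4o.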
How would this compare to one person with 5090 gaming for a week?
A setup with one monitor and a computer with a 5090 will draw about 1 kW under load. That’s 7 kWh per week if the average is 1 hour a day.
So that’s about 20,000 GPT-4o queries, or roughly 175-390 GPT-5 queries.
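As a back-of-envelope sketch, assuming the figures above (~1 kW system draw, 1 hour a day for a week, and the thread’s per-query estimates):

```python
# A week of 5090 gaming, expressed in "queries".
SYSTEM_DRAW_W = 1000              # whole setup under load, per the estimate above
HOURS_PER_WEEK = 7                # 1 hour/day
week_wh = SYSTEM_DRAW_W * HOURS_PER_WEEK   # 7,000 Wh = 7 kWh

gpt4o_queries = week_wh / 0.34    # OpenAI's claimed GPT-4o figure
gpt5_lo = week_wh / 40            # high end of the GPT-5 range
gpt5_hi = week_wh / 18            # low end of the GPT-5 range

print(f"~{gpt4o_queries:,.0f} GPT-4o queries")
print(f"~{gpt5_lo:.0f} to {gpt5_hi:.0f} GPT-5 queries")
```

So a week of gaming lands around 20,600 GPT-4o queries, or only about 175-390 GPT-5 queries.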
Lol good question