TECH NEWS – The CEO of OpenAI predicts that the cost of artificial intelligence (AI) will drop to the level of electricity. However, he didn’t go into much detail.
The AI craze isn’t going away anytime soon, as companies continue to invest in the technology and integrate it into their products and services. Since launching ChatGPT, OpenAI has pushed AI even further into the mainstream. As AI becomes more widely used, concerns about its rapid expansion and environmental impact are growing, and Sam Altman recently weighed in on how much these AI models actually consume.
He recently shared how much water an average ChatGPT query consumes, and the amount may surprise you. In a blog post titled “The Gentle Singularity,” Altman discussed how AI is destined to shape the world economically, socially, and environmentally. However, he did not share the methodology used to arrive at his figures or the factors taken into account in the calculation:
“An average ChatGPT query uses about 0.000085 gallons of water, or roughly one-fifteenth of a teaspoon. People are often curious about how much energy a ChatGPT query uses. The average query uses about 0.34 watt-hours—about what an oven would use in a little over one second or a high-efficiency light bulb would use in a couple of minutes,” he wrote.
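Altman’s comparisons can be sanity-checked with a bit of arithmetic. The sketch below assumes typical appliance ratings not stated in the article (a roughly 1,200 W oven and a 10 W high-efficiency bulb) and converts his quoted figures into the same units he uses:

```python
# Sanity-check Altman's stated per-query figures.
# Assumptions (not from the article): a ~1,200 W oven and a ~10 W LED bulb.

QUERY_WH = 0.34            # watt-hours per query, per Altman
QUERY_GALLONS = 0.000085   # gallons of water per query, per Altman

OVEN_W = 1200.0            # assumed oven power draw
BULB_W = 10.0              # assumed high-efficiency bulb draw

# Time each appliance takes to use 0.34 Wh: t = energy / power.
oven_seconds = QUERY_WH / OVEN_W * 3600   # just over one second
bulb_minutes = QUERY_WH / BULB_W * 60     # about two minutes

# Water: convert gallons to teaspoons (1 US gallon = 768 teaspoons).
teaspoons = QUERY_GALLONS * 768           # roughly one-fifteenth of a teaspoon

print(f"Oven: {oven_seconds:.2f} s, bulb: {bulb_minutes:.2f} min, "
      f"water: about 1/{1 / teaspoons:.0f} tsp")
```

Under these assumed wattages, the numbers line up with Altman’s analogies; the open question remains how the underlying per-query figures were measured in the first place.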
Predicting the future of AI, Altman hypothesized that as AI becomes more advanced and efficient, producing intelligence will cost only as much as the electricity needed to power the hardware. Although he believes mass scaling will drive costs down, critics and environmentalists continue to highlight AI’s current resource demands and argue that the figures Altman cites may be underestimates, since independent estimates of AI hardware consumption tend to run higher. Without a published methodology, his numbers are difficult to verify.
Source: WCCFTech, Sam Altman