
It's no secret that top AI research labs are breaking the bank to keep the lights on while chasing the ever-elusive intelligence bag. Multiple reports have suggested that investor interest in artificial intelligence is rapidly waning as these companies struggle to establish a clear path to profitability despite plowing billions into the sector.
In January, a damning report suggested OpenAI could be on track for a $14 billion loss in 2026, potentially leading to bankruptcy by mid-2027. While OpenAI generates approximately $13 billion in annual revenue from ChatGPT and LLM access fees, it reportedly spends up to $1.4 billion on infrastructure expansion, model training, research hiring, and computing.
However, OpenAI CEO Sam Altman might have a master plan in the works that could permanently solve the AI firm's money problem. Speaking at the BlackRock Infrastructure Summit in Washington, DC, on Wednesday, the executive suggested that artificial intelligence will eventually be traded as a basic utility like electricity or water, metered by usage (via Business Insider).
Altman's idea of paying for intelligence like an electricity or water bill doesn't seem far-fetched. After all, we're already paying for AI in one form or another...
Of course, there's a free tier for services like ChatGPT, but OpenAI has found a creative way to make money off it: integrating ads into the user experience.
Reports suggest that advanced AI scaling may have already hit a wall, citing shortages of both high-quality training data and computing power.
Would utility pricing make you use AI more or less? Share your thoughts with me in the comments.
💬 Is paying for AI as a utility a better option than monthly subscription plans?
If we get to the point where we pay for AI like a utility bill, I'd argue that this approach is actually better for casual users, since you'd only be billed for what you use. I'm not sure I can say the same for power users who've heavily integrated the technology into their workflows.
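To see where the break-even point sits between a flat subscription and metered billing, a quick back-of-the-envelope comparison helps. Every figure below is hypothetical and purely illustrative; these are not OpenAI's actual prices or rates.

```python
# Hypothetical comparison of flat-rate vs. usage-based (metered) AI pricing.
# All numbers are made up for illustration; they are not real OpenAI prices.

SUBSCRIPTION_PER_MONTH = 20.00    # hypothetical flat monthly plan, in dollars
PRICE_PER_MILLION_TOKENS = 2.50   # hypothetical metered rate, in dollars

def metered_cost(tokens_used: int) -> float:
    """Monthly cost under usage-based billing for a given token consumption."""
    return tokens_used / 1_000_000 * PRICE_PER_MILLION_TOKENS

for label, tokens in [("casual user", 1_000_000), ("power user", 20_000_000)]:
    cost = metered_cost(tokens)
    cheaper = "metered" if cost < SUBSCRIPTION_PER_MONTH else "subscription"
    print(f"{label}: metered ${cost:.2f} vs. flat ${SUBSCRIPTION_PER_MONTH:.2f} -> {cheaper} wins")
```

Under these made-up rates, the casual user pays $2.50 metered versus $20 flat, while the power user pays $50 metered — which is exactly why light users would likely welcome utility pricing and heavy users might not.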
Sam Altman indicated that demand for AI is skyrocketing and that research labs are building toward a future where intelligence is delivered on demand. "One of the most important things in the future is that we make intelligence, to borrow an old phrase from the energy industry that didn't quite work: 'Too cheap to meter,'" the executive added.
At the end of the day, as demand for AI surges, computing power will likely become scarcer. AI firms will then be forced to charge more per token or simply fail to meet demand.
