How much does it cost for artificial intelligence to chat with you?
China Youth Daily reporter Yuan Ye
Every time an artificial intelligence chats with you, the company behind it loses money.
The Washington Post reported that the operating and maintenance costs of large language models such as ChatGPT are so high that the companies behind them are unwilling to offer their best versions to the public. Tom Goldstein, a professor of computer science at the University of Maryland, said: "The models currently being deployed look impressive, but they are not the best." He believes that if cost were not a factor, many of artificial intelligence's widely criticized shortcomings, such as its tendency to give biased results or even to lie, could be avoided.
Artificial intelligence requires intensive computing power, which is why OpenAI, the developer of ChatGPT, runs only the less powerful GPT-3.5 model in its free version. Even subscribers who pay $20 per month for access to the more capable GPT-4 can send only 25 messages every 3 hours. The reason is that operating costs are simply too high.
Last December, shortly after ChatGPT was released, OpenAI CEO Sam Altman estimated that it would "probably only cost a few cents per chat." Although that sounds inexpensive, the total adds up quickly given ChatGPT's more than 10 million daily active users. In February this year, a research firm estimated that even running only GPT-3.5, ChatGPT's daily computing cost is as high as US$700,000.
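As a rough illustration of how "a few cents per chat" compounds at this scale, the back-of-envelope sketch below multiplies an assumed per-chat cost by an assumed volume of daily chats. The per-chat cost (2 cents) and the average chats per user (3.5) are illustrative assumptions, not figures reported in the article; only the roughly 10 million daily users and the $700,000-per-day estimate come from the sources cited above.

```python
# Back-of-envelope sketch: how "a few cents per chat" scales to a daily bill.
# The per-chat cost and chats-per-user values are illustrative assumptions;
# only the ~10M daily users and the ~$700K/day estimate are from the article.

cost_per_chat_usd = 0.02         # assumed: "a few cents" per chat
daily_active_users = 10_000_000  # reported: more than 10 million
chats_per_user_per_day = 3.5     # assumed average usage

daily_chats = daily_active_users * chats_per_user_per_day
daily_cost_usd = daily_chats * cost_per_chat_usd

print(f"Estimated chats per day: {daily_chats:,.0f}")
print(f"Estimated daily compute cost: ${daily_cost_usd:,.0f}")
# With these assumptions: 35,000,000 chats/day at $0.02 each is about
# $700,000/day, in the same ballpark as the research firm's estimate.
```

Under these assumptions the arithmetic lands close to the reported $700,000 per day, which is why "a few cents per chat" is anything but cheap at ChatGPT's scale.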
The Washington Post said that cost may also be one reason Google has not yet added an artificial intelligence chatbot to its search engine, which handles tens of billions of queries every day. Dylan Patel, chief analyst at an American industry research company, estimates that for the operator, a single chat with ChatGPT may cost as much as 1,000 times more than a Google search.
The cost of artificial intelligence has even caught the attention of the White House. Reuters reported that in a recently released report on artificial intelligence, the Biden administration described the computational cost of generative artificial intelligence as a "national problem." According to the White House, the technology will significantly increase computing requirements and the associated environmental impact, creating an urgent need to design more sustainable systems.
According to Reuters, generative artificial intelligence in particular, compared with other forms of machine learning, relies on staggering amounts of computing power and specialized computer chips that only deep-pocketed companies can afford. In May, Sam Altman told a U.S. Senate hearing: "In fact, we are so short on chips (GPUs) that the fewer people who use our products, the better."
Speaking at a Wall Street Journal summit on May 23, Elon Musk said that GPUs are currently harder to obtain than drugs. Musk's own artificial intelligence startup recently purchased about 10,000 GPUs for its research projects.
Burning money on eye-catching new technology is nothing new in the tech industry. The Washington Post noted that part of the reason Silicon Valley came to dominate the Internet economy was that it offered services such as online search, email, and social media to the world for free, recouping the cost through enormous advertising profits. While the AI industry may adopt the same strategy, analysts note that relying on advertising revenue alone is unlikely to make high-end AI tools profitable anytime soon.
So companies that offer AI models to consumers must balance their desire to win market share against the financial losses piling up.
The Washington Post pointed out that even if artificial intelligence does make money, the profits will likely flow to the cloud computing giants and chip makers that supply the hardware needed to run the models. It is no coincidence that the companies currently developing leading artificial intelligence language models are either the largest cloud computing providers (such as Google and Microsoft) or work closely with them (such as OpenAI).
For consumers, the days of unrestricted access to powerful artificial intelligence models may be numbered.
Reuters said that Microsoft has begun experimenting with embedding advertisements in artificial intelligence search results. Although OpenAI's Altman prefers a paid subscription model, he said during the hearing that he would not rule out doing the same. Both companies say they believe artificial intelligence can one day be profitable. "It's worth so much, I can't imagine... ringing the cash register on it," Altman said in an interview in February.