Industry says ChatGPT-style search will increase Google's costs tenfold, spending billions more every year
February 23 news: as the artificial intelligence chat tool ChatGPT continues to surge in popularity, companies that provide search services, such as Google and Microsoft, have begun to integrate AI chatbot functionality into their search interfaces. But for these technology giants, chatbot-style search could multiply operating costs tenfold, adding up to billions of dollars in expenditures.
Executives across the tech industry are discussing the high cost of operating artificial intelligence like ChatGPT. Sam Altman, CEO of OpenAI, has said on Twitter that ChatGPT's computing cost runs to a few cents or more per conversation with a user, which he called "unbearable."
John Hennessy, chairman of Google parent company Alphabet, said in an interview that while fine-tuning can help reduce costs quickly, "the cost of an exchange with artificial intelligence such as a large language model is likely more than 10 times that of a standard keyword search."
Alphabet's net profit in 2022 was close to US$60 billion. Even if chatbot-based search generates advertising revenue, analysts said the technology could weigh on Alphabet's bottom line, incurring billions of dollars in additional costs.
Investment bank Morgan Stanley estimates that Google's 3.3 trillion search queries last year cost about 0.55 cents each, a figure that will rise or fall with the amount of text the AI has to generate. Analysts predict that if "an AI chatbot like ChatGPT were to handle half of today's search queries with 50-word answers," it would cost Google an extra $6 billion per year by 2024.
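The analysts' scenario can be sanity-checked with simple arithmetic. A minimal back-of-envelope sketch (the variable names and rounding are illustrative, not from the Morgan Stanley report):

```python
# Morgan Stanley scenario: 3.3 trillion annual queries, half of them
# served by an AI chatbot, adding $6 billion per year in cost.
queries_per_year = 3.3e12
ai_share = 0.5
extra_annual_cost_usd = 6e9

# Implied *extra* cost per AI-served query, converted to cents.
extra_cents_per_ai_query = extra_annual_cost_usd / (queries_per_year * ai_share) * 100
print(f"{extra_cents_per_ai_query:.2f} cents per AI query")  # 0.36 cents per AI query
```

At roughly 0.36 extra cents per AI-served query, the increment is comparable in magnitude to the estimated per-query cost of ordinary search, which is why the total bill grows so quickly at Google's scale.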
Other analysts hold similar views. SemiAnalysis, a research and consulting firm focused on chip technology, said adding ChatGPT-style artificial intelligence to search could cost Alphabet an extra $3 billion, a figure that takes into account Google's in-house Tensor Processing Units and other optimizations.
The reason this kind of artificial intelligence is more expensive than traditional search is the greater computing power involved. Analysts say it relies on chips that cost billions of dollars, costs that must be amortized over several years of service life. The electricity consumed also adds pressure on operating costs and carbon-emission targets.
The process of handling an artificial intelligence search query is called "inference": each search fires up a huge neural network modeled loosely on the human brain, which generates a string of text and may also query a large search index for factual information.
Alphabet's Hennessy said, "You have to reduce the cost of inference," which he said "will take several years in the worst-case scenario."
Despite the high operating expense, Alphabet still needs to respond to challenges from other technology companies. Earlier this month, rival Microsoft showed off plans to embed artificial intelligence chatbots into its Bing search engine. Microsoft executives are taking aim at Google's search market share, which Similarweb estimates at 91%.
Microsoft Chief Financial Officer Amy Hood told analysts that as the improved Bing rolls out, the benefits of added users and advertising revenue outweigh the costs. "For us, gross margin increases even at the cost to serve that we are discussing," Hood said.
Richard Socher, CEO of search engine You.com, another Google competitor, said adding an artificial intelligence chat experience, along with applications for graphics, video and other generative technology, will raise operating expenses by 30% to 50%. But, he said, "technology gets cheaper and cheaper over time."
A source close to Google cautioned that it is too early to pin down the chatbot's exact cost, because operating efficiency and usage vary greatly depending on the technology involved, and search products have long been powered by artificial intelligence.
Accenture Chief Technology Officer Paul Daugherty said cost considerations were one of the two main reasons why search giants with billions of users have not immediately launched AI chatbots.
"One is accuracy, and two is you have to scale it in the right way," he said. For years, researchers at Alphabet and other companies have studied how to train and run large language models at lower cost.
Larger models typically require more chips for inference and therefore cost more to run. The models behind the AI that consumers are flocking to are quite large: the model behind ChatGPT, for example, has 175 billion parameters, and the cost of each run varies with the length of the user's query.
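Why both model size and query length drive cost can be made concrete with a rough rule of thumb often used for dense transformer models (an outside approximation, not a figure from the article): inference takes roughly 2 floating-point operations per parameter per generated token. A hedged sketch:

```python
def inference_flops(n_params: float, n_tokens: int) -> float:
    """Rough estimate of compute for generating n_tokens of output:
    ~2 FLOPs per parameter per token, a common approximation for
    dense transformer decoding. Real costs also depend on hardware
    utilization, batching, and prompt length."""
    return 2 * n_params * n_tokens

# A 175-billion-parameter model producing a 50-token answer
# (assuming roughly one token per word for simplicity):
flops = inference_flops(175e9, 50)
print(f"{flops:.2e} FLOPs")  # 1.75e+13 FLOPs
```

Under this approximation, cost scales linearly in both parameter count and answer length, which is why a 10x smaller model or a shorter reply each translate directly into cheaper queries.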
A senior technology executive said it is still too expensive to put this kind of artificial intelligence in front of millions of consumers. "These models are very expensive, so the next phase of advancement will be reducing the cost of both training these models and inference, so that it can be used in every application," said the executive, who asked not to be named.

A person familiar with the matter said OpenAI's computer scientists have since figured out how to optimize inference costs through sophisticated code, improving the efficiency with which the chips run.
A long-standing question is how to reduce the number of parameters in an artificial intelligence model by 10 times or even 100 times without affecting accuracy.
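The article does not name a specific technique, but one widely studied approach to shrinking a model is magnitude pruning: zero out the weights with the smallest absolute values and keep only the largest fraction. A minimal illustrative sketch of the idea on a toy weight list:

```python
def magnitude_prune(weights, keep_ratio):
    """Keep only the largest-magnitude fraction of weights; zero the rest.
    A 10x parameter reduction corresponds to keep_ratio = 0.1."""
    k = max(1, int(len(weights) * keep_ratio))
    # Threshold is the k-th largest absolute value.
    threshold = sorted(abs(w) for w in weights)[-k]
    return [w if abs(w) >= threshold else 0.0 for w in weights]

toy_weights = [0.9, -0.02, 0.5, 0.01, -0.7, 0.03, 0.1, -0.4, 0.05, 0.2]
print(magnitude_prune(toy_weights, 0.1))  # keeps only 0.9, zeros the other nine
```

The open research question is precisely the part this toy sketch ignores: which weights can be removed at 10x or 100x reduction while the model's accuracy on real tasks survives.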
“How to most effectively eliminate (parameters) is still an open question,” said Naveen Rao, who once led Intel’s artificial intelligence chip project.
At the same time, some companies are weighing charging for artificial intelligence search: OpenAI's upgraded ChatGPT service, for example, carries a $20 monthly subscription fee. Technology experts also say one workaround is to apply smaller AI models to simpler tasks, an approach Alphabet is exploring.
Alphabet said this month that a "smaller version" of the LaMDA artificial intelligence model will power the chatbot Bard, "requiring significantly less computing power, allowing us to scale to more users."
When asked about chatbots like ChatGPT and Bard at a conference last week, Hennessy said more targeted models, rather than do-everything systems, would help "reduce costs."