After Bing Chat opened for testing, GPU capacity cannot keep up with usage; the chat history feature will go live in a few days
According to a May 9 report, Mikhail Parakhin, Microsoft's head of Advertising and Web Services, recently responded to users, apologizing for Bing Chat's long response times and explaining that GPUs are not being added fast enough to keep up with the growth in usage.
IT Home's translation of Parakhin's tweet reads as follows: "Sorry you are experiencing latency issues. Due to growing usage, we are not adding GPUs fast enough. We will do everything we can to fix this."
Microsoft has not disclosed how many GPUs are used to support Bing Chat. However, a report from market research firm TrendForce estimated that, based on the processing power of the Nvidia A100 graphics card, running ChatGPT would require about 30,000 Nvidia GPUs.
Parakhin also mentioned in the same thread that the chat history feature will go live within the next few days.