


Latest news! Baidu Wenxin Large Model 4.0: the largest-parameter model ever trained on a Wanka (10,000-GPU) cluster, possibly arriving as soon as next week
Yesterday, Cailian News exclusively reported that Baidu's Wenxin Large Model 4.0 is in intensive training and close to ready for release. Interest in Wenxin Yiyan has been running high for some time, and today we obtained further details about Wenxin 4.0, covering key information such as the underlying architecture, infrastructure, training data, and costs. The source appears highly credible.
Let's start with the core conclusions:
1. Yesterday's report is essentially accurate. As far as we understand, Wenxin Large Model 4.0 is already being tested with a small amount of live traffic.
2. Wenxin 4.0 has more parameters than any LLM whose parameter count has been publicly disclosed. It is also the first large model in China trained on a Wanka (10,000-GPU) cluster.
3. Its inference cost is said to be far higher than Wenxin 3.5's, reportedly around 8-10 times. (Large models really are expensive!)
If these claims hold up, this would mark a major milestone for Baidu, and for domestic large models in general, in catching up with GPT-4.
Next, let's go through the details.
The largest-parameter model ever trained on a Wanka cluster?
According to the information we have received, the parameter count of Wenxin Large Model 4.0 is larger than that of any LLM with publicly disclosed parameters, which suggests it likely exceeds the trillion-parameter level.
On its own, that figure may not sound remarkable; after all, current leaks put GPT-4 at roughly 1.8 trillion parameters. However, the source further stated that Wenxin Large Model 4.0 is a single dense model and does not adopt the mixture-of-experts (MoE) architecture used by GPT-4 and many other large language models.
Earlier, "genius hacker" George Hotz claimed that GPT-4 went with a mixture-of-experts design because a single dense model could not reasonably scale beyond about 220 billion parameters: OpenAI wanted the model to keep improving, but simply training longer was already yielding diminishing returns.
So if Baidu has achieved a breakthrough with a single dense model, whether that translates into a significant jump in capability is something we can only judge after the actual release.
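For readers unfamiliar with the distinction, here is a minimal, self-contained sketch (plain NumPy, toy dimensions invented for illustration; it does not reflect Baidu's or OpenAI's actual architectures) contrasting a dense feed-forward block, where every parameter is used for every token, with a mixture-of-experts block, where a router activates only a few experts per token.

```python
import numpy as np

# Toy dimensions -- purely illustrative, not any real model's configuration.
D_MODEL, D_FF, N_EXPERTS, TOP_K = 64, 256, 8, 2
rng = np.random.default_rng(0)

def dense_ffn(x, w_in, w_out):
    """Dense feed-forward block: every weight participates for every token."""
    return np.maximum(x @ w_in, 0) @ w_out  # simple ReLU MLP

def moe_ffn(x, router_w, experts):
    """Mixture-of-experts block: a router picks TOP_K experts per token,
    so only a fraction of the total expert parameters is used per token."""
    logits = x @ router_w                              # (tokens, N_EXPERTS)
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]      # chosen expert indices
    gates = np.exp(logits - logits.max(-1, keepdims=True))
    gates /= gates.sum(-1, keepdims=True)              # softmax routing weights
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                        # naive per-token dispatch
        for e in top[t]:
            w_in, w_out = experts[e]
            out[t] += gates[t, e] * dense_ffn(x[t:t+1], w_in, w_out)[0]
    return out

x = rng.standard_normal((4, D_MODEL))
dense_w = (rng.standard_normal((D_MODEL, D_FF)), rng.standard_normal((D_FF, D_MODEL)))
experts = [(rng.standard_normal((D_MODEL, D_FF)), rng.standard_normal((D_FF, D_MODEL)))
           for _ in range(N_EXPERTS)]
router_w = rng.standard_normal((D_MODEL, N_EXPERTS))

print(dense_ffn(x, *dense_w).shape)    # all dense weights touched per token
print(moe_ffn(x, router_w, experts).shape)  # only TOP_K of N_EXPERTS touched per token
```

The point of the comparison: a trillion-parameter MoE only computes with a slice of its weights per token, while a trillion-parameter dense model has to use all of them, which is why a dense model of that size is a much heavier engineering and compute commitment.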
A model with this many parameters inevitably demands enormous compute. The current word is that Wenxin 4.0 was trained on a Wanka AI cluster, which would make it the first large language model in China trained at 10,000-GPU scale.
How rare is a Wanka cluster? In China, only Huawei and Alibaba have disclosed that they have built 10,000-GPU AI clusters, and we have yet to see a specific model trained on either of them.
That shows how hard such a cluster is to build, and harder still to use at full efficiency. According to one analysis, it is the deep integration with PaddlePaddle (Fei Paddle) that makes it possible to train a model of this scale efficiently on a Wanka cluster.
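To get a feel for why roughly 10,000 accelerators are needed at all, a common back-of-envelope estimate puts training compute at about 6 x parameters x training tokens FLOPs. Every number below is an assumption chosen only to illustrate the arithmetic (a hypothetical 1-trillion-parameter model, 2 trillion training tokens, placeholder per-GPU throughput and MFU); none of it comes from the leaks.

```python
# Back-of-envelope training-compute estimate (all inputs are illustrative assumptions).
N_PARAMS      = 1.0e12     # hypothetical 1T-parameter dense model
N_TOKENS      = 2.0e12     # hypothetical 2T training tokens
FLOPS_PER_GPU = 300e12     # assumed sustained ~300 TFLOPS per accelerator
MFU           = 0.40       # assumed model FLOPs utilization
N_GPUS        = 10_000     # "Wanka" = ten-thousand-card cluster

total_flops   = 6 * N_PARAMS * N_TOKENS          # ~6ND rule of thumb
cluster_flops = N_GPUS * FLOPS_PER_GPU * MFU     # effective cluster throughput
days          = total_flops / cluster_flops / 86_400

print(f"total training compute : {total_flops:.2e} FLOPs")
print(f"estimated wall-clock   : {days:.0f} days on {N_GPUS:,} GPUs at {MFU:.0%} MFU")
```

Under these made-up inputs the run takes on the order of a few months even with 10,000 GPUs, which is why both the cluster itself and the framework-level efficiency (the MFU term) matter so much.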
Costs have surged, and quiet small-traffic public testing is underway
It is not just training that has become more expensive: the inference cost of Wenxin 4.0 is also said to be much higher than that of 3.5. We have not obtained the exact cost per thousand tokens, but it is rumored to be roughly 8-10 times higher, and that is under high utilization (MFU, model FLOPs utilization). At lower utilization, the cost would climb further.
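The caveat that the 8-10x figure already assumes high MFU matters because per-token serving cost scales roughly inversely with utilization. A rough sketch of that relationship, with entirely made-up numbers (none of them are Baidu's):

```python
# Illustrative only: how per-token inference cost scales with utilization.
PARAMS          = 1.0e12       # hypothetical parameter count
FLOPS_PER_TOKEN = 2 * PARAMS   # ~2 FLOPs per parameter per generated token (rule of thumb)
GPU_FLOPS       = 300e12       # assumed peak per-GPU throughput
GPU_COST_PER_S  = 0.001        # assumed cost of one GPU-second (arbitrary currency unit)

def cost_per_1k_tokens(mfu: float) -> float:
    """Serving cost for 1,000 tokens at a given model-FLOPs-utilization."""
    seconds = 1_000 * FLOPS_PER_TOKEN / (GPU_FLOPS * mfu)
    return seconds * GPU_COST_PER_S

for mfu in (0.5, 0.3, 0.1):
    print(f"MFU {mfu:.0%}: {cost_per_1k_tokens(mfu):.4f} per 1k tokens")
```

Halving utilization doubles the cost per token, so the quoted multiple is effectively a best case.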
Large models really are expensive; building a leading foundation model is a game for giants.
Finally, according to internal employees, Baidu has quietly begun small-traffic testing of Wenxin Large Model 4.0, and a small number of Wenxin Yiyan users are already served by the latest model version.
Many consider this claim fairly credible, and recent hints from the tech community point the same way.
So when you ask Wenxin Yiyan a question right now, you may already be talking to Wenxin Large Model 4.0. Whether its output can compete with GPT-4 remains to be seen.
To emphasize once more: none of the above has been officially confirmed, so readers should judge its accuracy for themselves.
