


Latest news! Baidu Wenxin Large Model 4.0: the largest-parameter model ever trained on a 10,000-GPU ("Wanka") cluster, arriving as soon as next week
Yesterday, Cailian News exclusively reported that Baidu's Wenxin Large Model 4.0 is in intensive training and is close to being ready for release. People have long been curious about Wenxin Yiyan (ERNIE Bot), and today we obtained more news about Wenxin 4.0, covering key details such as the underlying architecture, infrastructure, training data, and costs. It is said to have a very high degree of credibility!
Let’s talk about the core conclusions first:
1. Yesterday's revelations are essentially accurate. As far as we understand, Wenxin Large Model 4.0 has already been tested with a small amount of traffic.
2. Wenxin 4.0 has more parameters than any LLM whose parameter count has been publicly disclosed, and it is also the first large model in China to be trained on a 10,000-GPU ("Wanka") cluster.
3. Its inference cost is far higher than Wenxin 3.5's, reportedly about 8-10 times higher. (Large models really are expensive!)
If these revelations hold up, this will be a major milestone for Baidu, and for domestic large models in general, in catching up with GPT-4.
Next, let’s take a look at the details of the revelations.
The largest-parameter model ever trained on a 10,000-GPU cluster?
According to the information we have received, the parameter scale of Wenxin Large Model 4.0 exceeds that of every LLM whose parameter count has been publicly released, which means it is expected to surpass the trillion-parameter level.
Looking at the parameter count alone, many people may not be impressed; after all, according to currently leaked information, GPT-4 already has around 1.8 trillion parameters. However, the source further stated that Wenxin Large Model 4.0 is a single dense model and does not adopt the mixture-of-experts (MoE) architecture used by GPT-4 and many other large language models.
Previously, the "genius hacker" George Hotz claimed that GPT-4 uses a mixture-of-experts design because a single model's parameter count could not usefully exceed roughly 220 billion: OpenAI wanted the model to keep improving, but simply training a single model for longer was already yielding diminishing returns.
So if Baidu has truly achieved a breakthrough with a single dense model, whether its capabilities improve accordingly is something we can only judge once it is actually released.
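To make the dense-versus-MoE distinction concrete, here is a minimal Python sketch comparing total parameters with parameters active per token. The MoE figures (8 experts of ~220B, 2 routed per token) are the rumored GPT-4 numbers cited above, and the 1-trillion dense figure simply stands in for the leak's claim about Wenxin 4.0; none of these are confirmed specifications.

```python
# Illustrative comparison of dense vs. mixture-of-experts (MoE) parameter counts.
# All numbers are rumored or assumed figures, not confirmed specs.

def dense_params(total_params: float) -> dict:
    """In a dense model every parameter participates in every forward pass."""
    return {"total": total_params, "active_per_token": total_params}

def moe_params(expert_params: float, num_experts: int, experts_per_token: int,
               shared_params: float = 0.0) -> dict:
    """In an MoE model only the routed experts (plus any shared layers) are active per token."""
    total = shared_params + expert_params * num_experts
    active = shared_params + expert_params * experts_per_token
    return {"total": total, "active_per_token": active}

# Rumored GPT-4-style setup: ~8 experts of ~220B parameters each, 2 routed per token.
gpt4_like = moe_params(expert_params=220e9, num_experts=8, experts_per_token=2)

# Hypothetical dense model at the trillion scale, standing in for the leak's claim.
wenxin_like = dense_params(total_params=1.0e12)

for name, m in [("MoE (GPT-4 rumor)", gpt4_like), ("Dense (leak's claim)", wenxin_like)]:
    print(f"{name}: total {m['total'] / 1e12:.2f}T params, "
          f"active per token {m['active_per_token'] / 1e12:.2f}T")
```

The point of the comparison: an MoE model can have a huge total parameter count while only a fraction of it runs per token, whereas a dense model of similar total size pays the full compute cost on every forward pass.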
A model with this many parameters inevitably demands enormous computing power. The current news is that Wenxin 4.0 was trained on a 10,000-GPU ("Wanka") AI cluster, which would make it the first large language model in China trained at that scale.
What does a 10,000-GPU cluster mean? In China, only Huawei and Alibaba have disclosed that they have built AI clusters of this size, and we have not yet seen a specific model trained on one.
This shows that a 10,000-GPU cluster is not easy to build, and harder still to use to full effect. According to the analysis, it is precisely the deep integration with PaddlePaddle that makes it possible to train such a large model efficiently on a cluster of this scale.
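As a rough illustration of why a cluster of this size is needed at trillion-parameter scale, here is a back-of-the-envelope estimate using the common ≈6·N·D FLOPs approximation for transformer training. The parameter count, token count, per-GPU throughput, and utilization below are all assumptions chosen for illustration; none are disclosed figures for Wenxin 4.0 or Baidu's cluster.

```python
# Back-of-the-envelope training-compute estimate for a ~trillion-parameter model
# on a 10,000-GPU ("Wanka") cluster, using the common ~6 * N * D FLOPs rule of thumb.
# Every number below is an assumption for illustration, not a disclosed spec.

N = 1.0e12                    # assumed parameter count (the leak says "over a trillion")
D = 3.0e12                    # assumed number of training tokens
flops_total = 6 * N * D       # total training FLOPs under the 6*N*D approximation

gpus = 10_000                 # "Wanka" = 10,000 accelerator cards
peak_flops_per_gpu = 300e12   # assumed ~300 TFLOPS peak per card (BF16-class accelerator)
mfu = 0.4                     # assumed model FLOPs utilization

effective_flops = gpus * peak_flops_per_gpu * mfu
seconds = flops_total / effective_flops
print(f"Total training compute: {flops_total:.2e} FLOPs")
print(f"Estimated wall-clock time: {seconds / 86400:.0f} days at {mfu:.0%} MFU")
```

Under these assumed numbers the run takes on the order of half a year even with 10,000 cards, which is why both the cluster itself and the framework's ability to keep it busy matter so much.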
Costs have surged, and low-profile small-traffic public testing is already under way
Not only has the training cost risen, but the inference cost of Wenxin 4.0 is also said to be far higher than that of 3.5. We have not yet obtained the exact inference cost per thousand tokens, but it is rumored to be roughly 8-10 times higher, and that is under high model FLOPs utilization (MFU). If utilization is lower, costs are expected to climb further.
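Here is a small sketch of how that cost claim interacts with utilization: at a fixed hardware spend, cost per token scales inversely with MFU. The 8-10x ratio comes from the leak; the baseline cost and the utilization figures are assumptions purely for illustration, not Baidu's numbers.

```python
# Rough illustration of the leaked inference-cost claim: Wenxin 4.0 is said to cost
# roughly 8-10x more per thousand tokens than 3.5, and that is under high utilization.
# The baseline cost and utilization values below are assumptions for illustration only.

baseline_cost_per_1k_tokens = 0.002   # assumed cost for Wenxin 3.5, arbitrary units
cost_ratio = 9                        # midpoint of the rumored 8-10x range
reference_mfu = 0.5                   # assumed "high utilization" the rumor refers to

def inference_cost(mfu: float) -> float:
    """Cost per 1k tokens: a fixed hardware spend is spread over fewer useful FLOPs
    as utilization drops, so cost scales inversely with MFU."""
    return baseline_cost_per_1k_tokens * cost_ratio * (reference_mfu / mfu)

for mfu in (0.5, 0.3, 0.15):
    print(f"MFU {mfu:.0%}: ~{inference_cost(mfu):.4f} per 1k tokens")
```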
I have to say, large models really are expensive. Building a leading foundation model is a game for giants!
Finally, according to internal employees, Baidu has quietly begun low-traffic testing of Wenxin Large Model 4.0, and a small number of Wenxin Yiyan users are already using the latest model version.
Many people consider this claim fairly reliable, and recent chatter in the technology community offers some corroborating clues.
Perhaps when you ask a question on Wenxin Yiyan now, you are already talking to Wenxin Large Model 4.0. Whether its output can rival GPT-4's remains to be seen.
To emphasize once more: none of the above has been officially confirmed, so judge its accuracy for yourselves.