


With 3.6 trillion tokens and 340 billion parameters, details of Google's large model PaLM 2 have been exposed
Last Thursday, at the 2023 Google I/O conference, Google CEO Sundar Pichai announced PaLM 2, a large model positioned against GPT-4, and officially released a preview version with improved mathematics, code, reasoning, multilingual translation, and natural language generation capabilities.
The PaLM 2 model comes in four sizes, from smallest to largest: Gecko, Otter, Bison, and Unicorn, making it easier to deploy for a variety of use cases. Among them, the lightweight Gecko model can run quickly on mobile devices, enabling responsive interactive applications on-device without an Internet connection.
However, at the conference Google gave no specific technical details about PaLM 2, stating only that it is built on Google's latest JAX and TPU v4 infrastructure.
Yesterday, according to internal documents seen by CNBC, PaLM 2 was trained on 3.6 trillion tokens. For comparison, the previous-generation PaLM was trained on 780 billion tokens.
In addition, Google has previously stated that PaLM 2 is smaller than earlier LLMs, meaning it can be more efficient while handling more complex tasks. The internal documents bear this out: PaLM 2 has 340 billion training parameters, far fewer than PaLM's 540 billion.
How do PaLM 2's training tokens and parameters compare with other LLMs? LLaMA, released by Meta in February, was trained on 1.4 trillion tokens. OpenAI's 175-billion-parameter GPT-3 was trained on 300 billion tokens.
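The comparison is easier to see as a tokens-per-parameter ratio. A minimal sketch using only the figures reported in this article (PaLM 2's numbers come from the leaked document and are unconfirmed by Google; LLaMA's 65B parameter count is its largest published variant, which the article does not state):

```python
# Training tokens vs. parameters for the models mentioned above.
# Figures as reported in this article; LLaMA's 65B parameter count
# is an assumption (its largest released size), not from the article.
models = {
    "PaLM":   {"params_b": 540, "tokens_b": 780},
    "PaLM 2": {"params_b": 340, "tokens_b": 3600},  # leaked, unconfirmed
    "LLaMA":  {"params_b": 65,  "tokens_b": 1400},
    "GPT-3":  {"params_b": 175, "tokens_b": 300},
}

for name, m in models.items():
    ratio = m["tokens_b"] / m["params_b"]
    print(f"{name:7s} {m['params_b']:>4d}B params, "
          f"{m['tokens_b']:>5d}B tokens, {ratio:5.1f} tokens/param")
```

By this rough measure, PaLM 2 (about 10.6 tokens per parameter) sits much closer to LLaMA's data-heavy regime than to PaLM or GPT-3, which used fewer than two tokens per parameter.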
While Google has been eager to demonstrate the power of its AI technology and how it can be embedded in search, email, document processing, and spreadsheets, it has been reluctant to disclose the size of its training data or other details. Google is not alone in this: OpenAI is likewise silent on the details of its latest multimodal large model, GPT-4. Both companies say the non-disclosure stems from the competitive nature of the business.
However, as the AI arms race continues to heat up, the research community is increasingly demanding transparency. And in a Google internal document leaked some time ago, Google researchers expressed this view: although on the surface OpenAI and Google appear to be chasing each other on large AI models, the real winner may come from neither, because a third force, open source, is quietly rising.
At present, the authenticity of this internal document has not been verified, and Google has not commented on the relevant content.
Netizen Comments

When PaLM 2 was officially announced, some netizens predicted its parameter count based on the Chinchilla scaling law, estimating that the PaLM 2 model family would land at 80B / 90B / 100B parameters, far from the 340B reported this time.
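For context, the Chinchilla scaling law those netizens referenced suggests, as a commonly cited rule of thumb, roughly 20 training tokens per parameter for compute-optimal training. A back-of-envelope sketch of that check (the 20:1 ratio is an approximation, not an exact law):

```python
# Chinchilla rule of thumb: compute-optimal training uses roughly
# 20 tokens per parameter (a common approximation, not an exact law).
TOKENS_PER_PARAM = 20

def chinchilla_optimal_params(tokens: float) -> float:
    """Parameter count a compute-optimal model would have for a token budget."""
    return tokens / TOKENS_PER_PARAM

def chinchilla_optimal_tokens(params: float) -> float:
    """Token budget a compute-optimal run would use for a parameter count."""
    return params * TOKENS_PER_PARAM

# For PaLM 2's reported 3.6T tokens, a Chinchilla-optimal model would
# have about 180B parameters -- above the netizens' 80B-100B guesses,
# and well below the reported 340B.
print(chinchilla_optimal_params(3.6e12) / 1e9, "B params")

# Conversely, 340B parameters would call for about 6.8T tokens.
print(chinchilla_optimal_tokens(340e9) / 1e12, "T tokens")
```

Either way, the leaked figures do not fit the 20:1 rule cleanly, which may explain why the community's Chinchilla-based guesses missed.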
Others made predictions about PaLM 2's training cost. Based on how large models have developed so far, one netizen estimated that building PaLM 2 would cost about US$100 million.
Now that PaLM 2's parameters have been leaked, one netizen suggested trying to guess Bard's as well.
With PaLM 2's token count leaked, netizens can't help but wonder: how many tokens will it take to reach the major turning point before AGI arrives?