


The LLaMA family of large models evolves collectively! 32k context matches GPT-4, from Tian Yuandong's team
The open-source LLaMA large model now matches GPT-4's context length, with just one simple change!
A paper just submitted by Meta AI shows that expanding the LLaMA context window from 2k to 32k requires fewer than 1,000 steps of fine-tuning.
That cost is negligible compared to pre-training.
Expanding the context window increases the AI's "working memory" capacity. Specifically, it can:
- Support more rounds of dialogue and reduce forgetting, for example more stable role-playing
- Take in more information to complete more complex tasks, such as processing longer documents, or multiple documents at once
More importantly: can all the large models in the LLaMA family adopt this method at low cost and evolve collectively?
LLaMA is currently the most capable open-source foundation model, and it has spawned many fully open-source, commercially usable models as well as vertical-industry models.
Tian Yuandong, the corresponding author of the paper, also excitedly shared the new development on his WeChat Moments.
Applicable to all large models that use RoPE
The new method, called Position Interpolation, applies to all large models that use RoPE (Rotary Position Embedding).
RoPE was proposed by the Zhuiyi Technology team as early as 2021, and has since become one of the most common position-encoding schemes for large models.
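As a rough sketch (not the paper's or any library's actual implementation), RoPE rotates each pair of query/key dimensions by an angle proportional to the token's position, so relative offsets between tokens show up directly in query-key dot products:

```python
import numpy as np

def rope(x, positions, base=10000):
    """Apply Rotary Position Embedding to x of shape (seq_len, dim).

    Each dimension pair (2i, 2i+1) is rotated by the angle
    position * base**(-2i/dim). Rotation preserves vector norms;
    it only encodes position into the phase.
    """
    seq_len, dim = x.shape
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)   # (dim/2,)
    angles = np.outer(positions, inv_freq)             # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Note that position 0 is left unchanged (rotation by angle 0), which is one way to sanity-check an implementation.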
But under this architecture, directly expanding the context window by extrapolation completely breaks the self-attention mechanism.
Specifically, positions beyond the pre-trained context length cause the model's perplexity to soar to the level of an untrained model.
Instead, the new method linearly downscales the position indices, aligning the ranges of position indices and relative distances before and after the extension.
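In this minimal sketch (an illustration of the idea, not the paper's code), position interpolation amounts to rescaling position indices before computing the RoPE angles, so every position of the extended sequence falls inside the pre-trained range:

```python
def interpolated_positions(seq_len, pretrained_len=2048):
    """Linearly downscale position indices so a sequence of length
    seq_len maps into the pre-trained range [0, pretrained_len).

    E.g. with seq_len=32768 and pretrained_len=2048, neighboring
    tokens are 0.0625 apart instead of 1, and no index exceeds 2048.
    """
    scale = pretrained_len / seq_len if seq_len > pretrained_len else 1.0
    return [i * scale for i in range(seq_len)]
```

The scaled indices would then be fed to the rotary-embedding computation in place of the integer positions; because all angles stay within the range seen during pre-training, attention scores stay well-behaved, at the cost of a short fine-tuning run to adapt to the denser spacing.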
The difference between the two is easier to see in a figure.
Experimental results show that the new method is effective for LLaMA models from 7B to 65B.
There is no significant performance degradation on long-sequence language modeling, passkey retrieval, or long-document summarization.
In addition to the experiments, the appendix of the paper also gives a detailed proof of the new method.
Three More Things
The context window used to be an important gap between open source large models and commercial large models.
For example, OpenAI's GPT-3.5 supports up to 16k, GPT-4 supports 32k, and Anthropic's Claude supports up to 100k.
At the same time, many open-source large models such as LLaMA and Falcon are still stuck at 2k.
Now, Meta AI’s new results have directly bridged this gap.
Expanding the context window is also a focus of recent large-model research. Besides position interpolation, many other attempts have attracted industry attention.
1. Developer kaiokendev explored a method to extend the LLaMA context window to 8k in a technical blog post.
2. Galina Alperovich, head of machine learning at data security company Soveren, summarized 6 tips for expanding the context window in an article.
3. Teams from Mila, IBM and other institutions also tried to completely remove positional encoding in Transformer in a paper.
Those who need them can view the links below~
Meta paper: https://www.php.cn/link/0bdf2c1f053650715e1f0c725d754b96
Extending Context is Hard…but not Impossible: https://www.php.cn/link/9659078925b57e621eb3f9ef19773ac3
The Secret Sauce behind 100K context window in LLMs: https://www.php.cn/link/09a630e07af043e4cae879dd60db1cac
Paper on removing positional encoding: https://www.php.cn/link/fb6c84779f12283a81d739d8f088fc12
