


HuggingFace: Two alpacas are spliced together after removing their heads and tails
HuggingFace's open-source large model leaderboard has been upended again.
The top spots are now occupied exclusively by fine-tuned versions of SOLAR 10.7B, squeezing out the various Mixtral 8x7B fine-tunes that dominated just a few weeks ago.
Where does the SOLAR large model come from?
The related paper has just been uploaded to arXiv. It comes from the Korean company Upstage AI and introduces a new model-scaling method called depth up-scaling (DUS).
Put simply: take two 7B "alpacas" (two copies of the same 7B model) and cut off the head of one and the tail of the other, so that one copy drops its last 8 layers and the other drops its first 8 layers.
The two remaining 24-layer stacks are then stitched together, with layer 24 of the first copy joined to layer 9 of the second, yielding a new 48-layer, 10.7B-parameter model.
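To make the splicing concrete, here is a minimal sketch in Python, assuming a Mistral-7B-style checkpoint loaded with Hugging Face transformers (32 decoder layers under model.model.layers). It illustrates the idea rather than reproducing the authors' actual code; the output path is made up.

```python
# Sketch of depth up-scaling (DUS) splicing: two copies of the same 7B base,
# one keeps its first 24 layers, the other its last 24, concatenated to 48.
import copy
import torch
from transformers import AutoConfig, AutoModelForCausalLM

base_name = "mistralai/Mistral-7B-v0.1"
donor_a = AutoModelForCausalLM.from_pretrained(base_name, torch_dtype=torch.bfloat16)
donor_b = copy.deepcopy(donor_a)               # second copy of the same base model

n_layers = donor_a.config.num_hidden_layers    # 32 for Mistral 7B
drop = 8

kept_a = list(donor_a.model.layers[: n_layers - drop])   # layers 1..24
kept_b = list(donor_b.model.layers[drop:])                # layers 9..32

# Build an empty 48-layer model with the same architecture, then graft in
# the spliced stack plus embeddings, final norm, and LM head.
config = AutoConfig.from_pretrained(base_name)
config.num_hidden_layers = 2 * (n_layers - drop)          # 48 layers, ~10.7B params
merged = AutoModelForCausalLM.from_config(config)

merged.model.embed_tokens = donor_a.model.embed_tokens
merged.model.layers = torch.nn.ModuleList(kept_a + kept_b)
merged.model.norm = donor_b.model.norm
merged.lm_head = donor_b.lm_head
# Note: recent transformers versions track self_attn.layer_idx per layer,
# which would also need renumbering before inference with a KV cache.

merged.save_pretrained("dus-spliced-10.7b")    # illustrative output path
```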
The paper claims that the new method outperforms traditional scaling approaches such as MoE while using exactly the same infrastructure as the base model.
There is no need for extra modules such as gating networks, no need for training frameworks specially optimized for MoE, and no need for custom CUDA kernels for fast inference; it slots into existing pipelines seamlessly while staying efficient.
The team chose Mistral 7B, the strongest single 7B model, as the raw material, spliced it with the new method, and surpassed both the original model and its MoE version.
At the same time, the aligned Instruct version also surpasses the corresponding MoE Instruct version.
Carrying the stitching through to the end
Why splice it this way? According to the paper, the method started from an intuition.
Start with the simplest expansion method: repeat the 32-layer base model twice to get 64 layers.
The advantage is that there is no heterogeneity, since every layer comes from the base model, but at the seam layer 32 meets layer 33 (identical to layer 1), creating a large "layer distance".
Previous research has shown that different layers of Transformer do different things. For example, deeper layers are better at processing more abstract concepts.
The team believes that too large a layer distance may hinder the model's ability to effectively utilize pre-trained weights.
One potential solution is to sacrifice the layers in the middle, reducing the mismatch at the seam, and this is where the DUS method comes from.
Trading off performance against model size, the team chose to delete 8 layers from each copy, so the seam changes from layer 32 meeting layer 1 to layer 24 meeting layer 9.
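As a quick sanity check on the indexing, a back-of-the-envelope sketch using the layer counts quoted above:

```python
# Seam indices for naive doubling vs. DUS, with n = 32 base layers and
# m = 8 layers dropped from each copy (1-based indices as in the text).
n, m = 32, 8
naive_seam = (n, 1)           # repeat-twice: layer 32 meets a copy of layer 1
dus_seam = (n - m, m + 1)     # DUS: layer 24 meets layer 9
total_layers = 2 * (n - m)    # 48 layers in the spliced model
print(naive_seam, dus_seam, total_layers)   # (32, 1) (24, 9) 48
```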
Right after splicing, the model's performance still falls below the original base model, but it recovers quickly with continued pretraining.
In the instruction fine-tuning phase, besides open-source datasets, the team also built a math-augmented dataset, and DPO was used in the alignment phase.
The last step is a weighted average of the model variants trained on the different datasets, which, fittingly, carries the stitching through to the very end.
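A minimal sketch of what such weight averaging can look like with transformers state dicts; the checkpoint paths and the equal 0.5/0.5 weights are illustrative, not taken from the paper:

```python
# Weighted average of two fine-tuned variants of the same architecture.
import torch
from transformers import AutoModelForCausalLM

# Hypothetical local checkpoints trained on different instruction datasets.
checkpoints = [("ckpt-instruct-general", 0.5), ("ckpt-instruct-math", 0.5)]

avg_state = None
for path, weight in checkpoints:
    state = AutoModelForCausalLM.from_pretrained(path).state_dict()
    if avg_state is None:
        avg_state = {k: weight * v.float() for k, v in state.items()}
    else:
        for k, v in state.items():
            avg_state[k] += weight * v.float()

merged = AutoModelForCausalLM.from_pretrained(checkpoints[0][0])
merged.load_state_dict(avg_state)        # copies values back into the model
merged.save_pretrained("solar-style-merged")
```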
Some netizens questioned the possibility of test data leakage.
The team anticipated this and reported data-contamination test results in the paper's appendix, which show low contamination.
Finally, both the SOLAR 10.7B base model and the fine-tuned model are open-sourced under the Apache 2.0 license.
Netizens who have tried it report that it performs well at extracting data from JSON-formatted input.
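A quick usage sketch for trying that yourself, assuming the publicly released Instruct checkpoint on Hugging Face and that its tokenizer ships a chat template; the JSON snippet and prompt are made up:

```python
# Load the released SOLAR 10.7B Instruct model and ask it to pull a field
# out of a small JSON object.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "upstage/SOLAR-10.7B-Instruct-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = (
    "Extract the user's city from this JSON and answer with the city only:\n"
    '{"user": {"name": "Kim", "address": {"city": "Seoul", "zip": "04524"}}}'
)
messages = [{"role": "user", "content": prompt}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=32)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```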
Paper address: https://arxiv.org/abs/2312.15166