The model will evolve after merging, and directly win SOTA! Transformer author's new entrepreneurial achievements are popular

Take ready-made models from Hugging Face and simply "piece them together": can you combine them directly into new, more powerful models?!

The Japanese large-model company Sakana.ai (founded by one of the "Transformer Eight", the authors of the original Transformer paper) got creative and came up with exactly such a trick: let the merged model evolve.


This method can not only generate new foundation models automatically, but the resulting performance is anything but bad:

Using a 7-billion-parameter Japanese math model produced this way, they achieved state-of-the-art results on the relevant benchmarks, surpassing previous models such as the 70-billion-parameter Llama-2.

Most importantly, arriving at such a model requires no gradient training at all, so the computing resources needed are greatly reduced.

After reading it, NVIDIA scientist Jim Fan praised:

This is one of the most imaginative papers I have read recently.


Merge and evolve: automatically generating new foundation models

On the open-source large-model leaderboards, most of the best-performing models are no longer "original" models like LLaMA or Mistral, but the result of fine-tuning or merging existing models. From this, we can see:

A new trend has emerged.

Sakana.ai explains that open-source foundation models can be easily extended and fine-tuned in hundreds of different directions, yielding new models that perform well in new domains.

Among these approaches, model merging shows great promise.


However, merging can be a kind of "black magic" that relies heavily on intuition and domain expertise.

Therefore, we need a more systematic approach.

Inspired by natural selection, Sakana.ai turned to evolutionary algorithms, introduced the concept of "Evolutionary Model Merge", and proposed a general method that can discover the best combinations of models.

This method combines two different ideas:

(1) merging models in the data flow space (layers), and (2) merging models in the parameter space (weights).

Specifically, the first approach, data flow space merging, uses evolution to discover the best combination of layers from different models to form a new model.

In the past, the community relied on intuition to determine which layers of one model can be combined with layers of another, and how.

In fact, Sakana.ai points out, this problem has a combinatorially huge search space, which is best explored by optimization algorithms such as evolutionary algorithms.

An example of the operation is shown below:

[Figure: example of data flow space (layer-level) merging]
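As a rough illustration of what layer-level merging could look like, here is a minimal sketch that assembles a new model from the transformer blocks of two source checkpoints. The model names and the hand-picked layer schedule are purely illustrative assumptions; in Sakana.ai's method the schedule is exactly what the evolutionary search discovers, and the real implementation is more involved.

```python
# A minimal sketch of "data flow space" (layer-level) merging.
# Assumptions: both checkpoints share the same Llama/Mistral-style architecture
# and hidden size; "model-a" / "model-b" are placeholder names; the layer
# schedule below is hand-picked for illustration only.
from transformers import AutoConfig, AutoModelForCausalLM

NAME_A, NAME_B = "model-a", "model-b"  # placeholder checkpoints of the same architecture

model_a = AutoModelForCausalLM.from_pretrained(NAME_A)
model_b = AutoModelForCausalLM.from_pretrained(NAME_B)

# A candidate "path" through the two models: take the first 16 transformer
# blocks from A and the last 16 from B. Evolution would search over this.
schedule = [("a", i) for i in range(16)] + [("b", i) for i in range(16, 32)]

config = AutoConfig.from_pretrained(NAME_A)
config.num_hidden_layers = len(schedule)
merged = AutoModelForCausalLM.from_config(config)

# Embeddings, the final norm, and the LM head are taken from model A.
merged.load_state_dict(model_a.state_dict(), strict=False)

# Copy each scheduled transformer block into the merged model
# (decoder blocks live under model.layers in Llama/Mistral-style models).
for dst_idx, (src, src_idx) in enumerate(schedule):
    src_model = model_a if src == "a" else model_b
    merged.model.layers[dst_idx].load_state_dict(
        src_model.model.layers[src_idx].state_dict()
    )

merged.save_pretrained("merged-dfs-model")
```

The search space described in the paper is far richer than this single hand-picked path, which is why an automated evolutionary search is needed at all.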

As for the second approach, parameter space merging, the weights of multiple models are mixed to form a new model.

There are countless ways to implement this, and in principle each layer can use a different mixing ratio, or even finer-grained ones.

Here, evolutionary methods can efficiently discover novel mixing strategies.

The following is an example of mixing the weights of two different models to obtain a new model:

[Figure: example of parameter space (weight) merging between two models]
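For intuition, here is a similarly minimal sketch of parameter space merging under simple assumptions: two checkpoints of the same architecture are blended by per-layer linear interpolation of their weights. The per-layer ratios below are arbitrary illustration values; in Evolutionary Model Merge these mixing parameters are what the evolutionary search would optimize, and the actual merging scheme is more sophisticated than plain interpolation.

```python
# A minimal sketch of "parameter space" (weight) merging: per-layer linear
# interpolation between two checkpoints of the same architecture.
# The mixing ratios are illustrative; evolution would search over them.
import torch
from transformers import AutoModelForCausalLM

NAME_A, NAME_B = "model-a", "model-b"  # placeholder model names

model_a = AutoModelForCausalLM.from_pretrained(NAME_A)
model_b = AutoModelForCausalLM.from_pretrained(NAME_B)

state_a, state_b = model_a.state_dict(), model_b.state_dict()

def mix_ratio_for(param_name: str) -> float:
    # Toy policy: blend early layers mostly from A, later layers mostly from B.
    # A real setup would expose one ratio per layer (or per tensor) to the optimizer.
    if ".layers." in param_name:
        layer_idx = int(param_name.split(".layers.")[1].split(".")[0])
        return 0.8 if layer_idx < 16 else 0.2
    return 0.5  # embeddings, norms, LM head

merged_state = {
    name: mix_ratio_for(name) * state_a[name].float()
    + (1.0 - mix_ratio_for(name)) * state_b[name].float()
    for name in state_a
}

# Reuse model A's architecture as the container for the merged weights.
model_a.load_state_dict({k: v.to(state_a[k].dtype) for k, v in merged_state.items()})
model_a.save_pretrained("merged-ps-model")
```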

Combining the two methods above gives the full picture:

[Figure: combining data flow space and parameter space merging]
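To make the evolutionary part concrete, here is a heavily simplified sketch of how an evolutionary loop could search over merge configurations (here, just per-layer mixing ratios), scoring each candidate and keeping the best. The mutation scheme, population size, and the toy evaluate() objective are all assumptions for illustration; the paper relies on established evolutionary optimizers and scores candidates on real benchmarks such as Japanese math problems.

```python
# A toy evolutionary search over per-layer mixing ratios.
# evaluate() below is a stand-in objective so the sketch runs end to end;
# a real setup would merge the models with the candidate ratios and score
# the result on a held-out benchmark (e.g. Japanese math word problems).
import random

NUM_LAYERS = 32
POPULATION = 16
GENERATIONS = 100


def evaluate(ratios: list[float]) -> float:
    # Stand-in objective: pretend the benchmark rewards taking early layers
    # mostly from model A and late layers mostly from model B.
    target = [0.8] * 16 + [0.2] * 16
    return -sum((r - t) ** 2 for r, t in zip(ratios, target))


def mutate(ratios: list[float], sigma: float = 0.05) -> list[float]:
    # Gaussian perturbation, clipped to the valid [0, 1] range.
    return [min(1.0, max(0.0, r + random.gauss(0.0, sigma))) for r in ratios]


population = [[random.random() for _ in range(NUM_LAYERS)] for _ in range(POPULATION)]

for generation in range(GENERATIONS):
    ranked = sorted(population, key=evaluate, reverse=True)
    parents = ranked[: POPULATION // 4]  # keep the best quarter
    children = [mutate(random.choice(parents)) for _ in range(POPULATION - len(parents))]
    population = parents + children

best_ratios = max(population, key=evaluate)
print("best score:", evaluate(best_ratios))
```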

The authors say they hope such merging can combine distant domains that have not been explored together before, such as mathematics with a non-English language, or vision with a non-English language.

The result is really a bit surprising.

New models easily win SOTA

Using the above evolutionary merging method, the team obtained three foundation models:

  • Large language model EvoLLM-JP: formed by merging the Japanese LLM Shisa-Gamma with the math LLMs WizardMath/Abel. It is good at solving Japanese math problems and was evolved for 100-150 generations.

  • Vision-language model EvoVLM-JP: merges the Japanese LLM Shisa Gamma 7B v1 with LLaVa-1.6-Mistral-7B, producing a VLM with Japanese language ability.

  • Image generation model EvoSDXL-JP: an SDXL diffusion model that supports Japanese.

The first two have been released on Hugging Face and GitHub, and the last one will be launched soon.
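For readers who want to try the released checkpoints, loading one with the transformers library might look roughly like the sketch below. The repository id and the prompt are assumptions for illustration; check Sakana.ai's Hugging Face page for the exact model names and the recommended prompt format.

```python
# A minimal sketch of loading one of the released merged models with transformers.
# The repository id below is an assumption -- verify the exact name on
# Sakana.ai's Hugging Face page before running.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "SakanaAI/EvoLLM-JP-v1-7B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Example Japanese math word problem (illustrative prompt).
prompt = "りんごが12個あります。3人で同じ数ずつ分けると、1人何個もらえますか？"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```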

Let's look at them one by one.

1. EvoLLM-JP

It achieved the following results on the Japanese subset of MGSM, a multilingual version of the GSM8K dataset:

[Table: MGSM-JA results for EvoLLM-JP and baseline models]

As can be seen, EvoLLM-JP's performance on Japanese math problems exceeds that of its source models, and also exceeds high-performance models such as Llama-2 and GPT-3.5.

Among them, Model 4 is optimized only in the parameter space, and Model 6 is the result of further optimization using Model 4 in the data flow space.

On the Japanese lm-evaluation-harness benchmark, which evaluates general Japanese language ability across 9 tasks, EvoLLM-JP achieved an average score as high as 70.5, using only 7 billion parameters to beat models such as the 70-billion-parameter Llama-2.

[Table: Japanese lm-evaluation-harness results across 9 tasks]

The team states that EvoLLM-JP is good enough to serve as a general-purpose Japanese model and handles some interesting cases:

For example, math problems that require knowledge of Japanese culture, or telling Japanese jokes in Kansai dialect.

2. EvoVLM-JP

On the following two image question-answering benchmarks, the higher the score, the more accurate the model's descriptions in Japanese.

As a result, EvoVLM-JP is not only better than LLaVa-1.6-Mistral-7B, the English VLM it is based on, but also better than existing Japanese VLMs.

[Table: Japanese VQA benchmark results for EvoVLM-JP and baselines]

As shown in the picture below, when asked what color the traffic light in the image is, only EvoVLM-JP answered correctly: blue (in Japanese, green traffic lights are conventionally described as blue, 青).

[Image: traffic light example answered correctly by EvoVLM-JP]

3. EvoSDXL-JP

This Japanese-capable SDXL model needs only 4 diffusion inference steps to generate an image, so generation is quite fast.

The specific running scores have not yet been released, but the team revealed that it is "quite promising."

You can enjoy some examples:

The prompt includes: Miso ラーメン, the highest quality Ukiyoe, Katsushika Hokusai, Edo era.

[Images: EvoSDXL-JP generation examples]
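As a rough idea of what running such a prompt could look like once the model is released, here is a hypothetical sketch using the diffusers library. The repository id is a placeholder (EvoSDXL-JP had not been published at the time of writing), and the few-step settings mirror typical usage of distilled SDXL variants rather than anything confirmed by the team.

```python
# A hypothetical sketch of few-step SDXL inference with diffusers.
# "SakanaAI/EvoSDXL-JP" is a placeholder repo id -- the model was not yet
# released when this article was written.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "SakanaAI/EvoSDXL-JP",  # placeholder
    torch_dtype=torch.float16,
).to("cuda")

prompt = "Miso ラーメン, the highest quality Ukiyoe, Katsushika Hokusai, Edo era"

# Few-step settings typical for distilled SDXL variants; the team's actual
# recommended settings may differ.
image = pipe(prompt, num_inference_steps=4, guidance_scale=0.0).images[0]
image.save("evo_sdxl_jp_sample.png")
```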

For the above 3 new models, the team pointed out:

In principle, gradient-based backpropagation could be used to further improve these models' performance.

But we deliberately don't, because the point is to show that even without backpropagation, we can still obtain sufficiently advanced foundation models and challenge the current "expensive paradigm" of training them.

Netizens have been praising the work one after another.

Jim Fan also added:

In the field of foundation models, the community is currently focused almost entirely on letting models learn and pays little attention to search. Yet the latter has huge potential at both training time (that is what the evolutionary algorithm in this paper does) and inference time.

[Screenshot: Jim Fan's post, liked by Musk]

So, as netizens said:

Are we now in the Cambrian-explosion era of models?


Paper address: https://arxiv.org/abs/2403.13187
