Has the language model learned to use search engines on its own? Meta AI proposes API call self-supervised learning method Toolformer

In natural language processing tasks, large language models have achieved impressive results in zero-shot and few-shot settings. However, these models have inherent limitations that can often only be partially addressed by further scaling. Specifically, they cannot access up-to-date information, they "hallucinate" facts, they struggle to understand low-resource languages, and they lack the mathematical skills needed for precise calculation, among other shortcomings.

A simple way to address these problems is to equip the model with external tools, such as a search engine, calculator, or calendar. However, existing approaches either rely on large amounts of human annotation or restrict tool use to task-specific settings, which makes it hard to generalize the combination of language models and external tools.

To break through this bottleneck, Meta AI recently proposed a new method called Toolformer, which lets a language model learn to "use" various external tools.


Paper address: https://arxiv.org/pdf/2302.04761v1.pdf

Toolformer quickly attracted widespread attention. Some believed that the paper solved many problems of current large language models, praising it: "This is the most important paper in recent weeks."


Others pointed out that Toolformer uses self-supervised learning to let large language models learn to use APIs and tools, which is both flexible and efficient:


Some even think that Toolformer takes us one step closer to artificial general intelligence (AGI).


Toolformer has received such high praise because it meets the following practical needs:

  • Large language models should learn to use tools in a self-supervised way, without requiring extensive human annotation. This is critical not only because annotation is costly, but more importantly because what humans consider useful may differ from what the model finds useful.
  • The language model should be able to use tools in a more general way, not tied to any specific task.

This clearly breaks through the bottleneck described above. Let's take a closer look at Toolformer's method and experimental results.

Method

Toolformer builds on the idea of generating datasets from scratch with a large language model via in-context learning (ICL) (Schick and Schütze, 2021b; Honovich et al., 2022; Wang et al., 2022): given just a few human-written examples of how an API is used, the LM can annotate a huge language modeling dataset with potential API calls. A self-supervised loss is then used to determine which of these calls actually help the model predict future tokens, and the model is finally fine-tuned on the API calls that are useful to the LM itself.
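To give a concrete sense of this annotation step, below is a minimal sketch of the kind of few-shot prompt that could be used to ask the LM to insert question-answering API calls into plain text. The prompt wording, the `[QA(...)]` call syntax shown to the model, and the helper name are simplified assumptions for illustration, not the paper's exact prompt.

```python
# A minimal, illustrative few-shot prompt for asking the LM to annotate text
# with question-answering API calls. The wording and the helper below are
# assumptions for illustration; the paper's actual prompts differ in detail.

QA_ANNOTATION_PROMPT = """Your task is to add calls to a Question Answering API to a piece of text.
You can call the API by writing "[QA(question)]" wherever an answer would help.
Example:
Input: Joe Biden was born in Scranton, Pennsylvania.
Output: Joe Biden was born in [QA("Where was Joe Biden born?")] Scranton, [QA("In which state is Scranton?")] Pennsylvania.

Input: {text}
Output:"""


def build_annotation_prompt(text: str) -> str:
    """Fill the few-shot template with the text the LM should annotate."""
    return QA_ANNOTATION_PROMPT.format(text=text)


if __name__ == "__main__":
    print(build_annotation_prompt("The Nile is the longest river in Africa."))
```

Sampling from the LM with such a prompt yields candidate API calls at various positions in the text, which are then filtered as described below.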

Since Toolformer is agnostic to the dataset used, it can be applied to exactly the same dataset the model was pre-trained on, which ensures that the model loses none of its generality or language modeling ability.

Specifically, the goal of this research is to equip a language model M with the ability to use various tools through API calls. This requires that the input and output of each API can be represented as text sequences, so that API calls can be inserted seamlessly into any given text, with special tokens marking the beginning and end of each call.

The study represents each API call as a tuple c = (a_c, i_c), where a_c is the name of the API and i_c is the corresponding input. Given an API call c with corresponding result r, the linearized sequences of the call excluding and including its result are written as:

e(c) = <API> a_c(i_c) </API>
e(c, r) = <API> a_c(i_c) → r </API>

where "<API>", "</API>" and "→" are special tokens. As shown in Figure 1 below, this approach lets the LM learn to control a variety of tools and to choose for itself which tool to use, when, and how.

[Figure 1]
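As an illustration of the notation above, here is a minimal sketch of how such a call could be serialized into plain text. The concrete surface forms chosen for the special tokens and the helper name are assumptions for illustration, not the paper's implementation:

```python
# Minimal sketch of linearizing an API call c = (a_c, i_c) into text,
# with or without its result r. The surface forms of the special tokens
# are illustrative assumptions.
from typing import Optional

API_START = "<API>"   # assumed start-of-call token
API_END = "</API>"    # assumed end-of-call token
ARROW = "->"          # stands in for the "→" special token


def linearize(api_name: str, api_input: str, result: Optional[str] = None) -> str:
    """Return e(c) if no result is given, otherwise e(c, r)."""
    call = f"{api_name}({api_input})"
    if result is None:
        return f"{API_START} {call} {API_END}"
    return f"{API_START} {call} {ARROW} {result} {API_END}"


# Example: embedding a calculator call directly in running text.
print("Out of 1400 participants, 400 (or "
      + linearize("Calculator", "400 / 1400", "0.29")
      + " 29%) passed the test.")
```

Because both forms are ordinary text, the augmented examples can be mixed directly into the fine-tuning corpus.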

Given a dataset C = {x^(1), …, x^(|C|)} of plain texts, the study first transforms it into a dataset C* augmented with API calls. This is done in three steps, as shown in Figure 2 below: first, the study exploits M's in-context learning ability to sample a large number of candidate API calls; these calls are then executed; finally, the study checks whether the obtained responses help the model predict future tokens, which serves as the filtering criterion. After filtering, the API calls to the different tools are merged, yielding the dataset C*, and M itself is fine-tuned on this dataset.

[Figure 2]
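The filtering step can be sketched as follows: an API call is kept only if conditioning on the call together with its result lowers the model's loss on the subsequent tokens by at least some threshold, compared with not providing the call at all or providing it without a result. The helper signature, the toy loss values, and the threshold below are illustrative assumptions, not the paper's exact implementation:

```python
# Illustrative sketch of the self-supervised filtering step. `loss_fn(prefix)`
# stands for the model's (weighted) loss on the tokens after the insertion
# position, conditioned on `prefix`; here it is mocked with a dictionary.
from typing import Callable


def keep_api_call(loss_fn: Callable[[str], float],
                  call_with_result: str,     # e(c, r)
                  call_without_result: str,  # e(c)
                  threshold: float = 1.0) -> bool:
    """Keep the call only if its result makes the following tokens easier to
    predict by at least `threshold` compared with the best baseline."""
    loss_with_result = loss_fn(call_with_result)
    # Baseline: no API call at all, or the call without its result.
    loss_baseline = min(loss_fn(""), loss_fn(call_without_result))
    return loss_baseline - loss_with_result >= threshold


# Toy usage with made-up loss values (a real setup would query the LM).
toy_losses = {
    "": 3.2,
    "<API> Calculator(400 / 1400) </API>": 3.1,
    "<API> Calculator(400 / 1400) -> 0.29 </API>": 1.8,
}
print(keep_api_call(toy_losses.__getitem__,
                    "<API> Calculator(400 / 1400) -> 0.29 </API>",
                    "<API> Calculator(400 / 1400) </API>"))  # -> True
```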

Experiments and Results

The study ran experiments on a variety of downstream tasks. The results show that Toolformer, based on a pre-trained 6.7B-parameter GPT-J model that has learned to use various APIs and tools, significantly outperforms the much larger GPT-3 model and several other baselines on a range of tasks.

The study evaluated several models on the SQuAD, Google-RE and T-REx subsets of the LAMA benchmark. The experimental results are shown in Table 3 below:

[Table 3]

To test Toolformer's mathematical reasoning abilities, the study ran experiments on the ASDiv, SVAMP, and MAWPS benchmarks. The experiments show that Toolformer uses the calculator tool in most cases and significantly outperforms OPT (66B) and GPT-3 (175B).


For question answering, the study ran experiments on three datasets: Web Questions, Natural Questions, and TriviaQA. Toolformer significantly outperforms baseline models of the same size, but falls short of GPT-3 (175B).


For cross-lingual question answering, the study compared Toolformer with all baseline models on MLQA; the results are shown in Table 6 below:

[Table 6]

To study the usefulness of the calendar API, the study ran experiments with several models on TEMPLAMA and on a newly created dataset called DATESET. Toolformer outperforms all baselines, but on TEMPLAMA it does not actually make use of the calendar tool.


Besides verifying the performance gains on various downstream tasks, the study also wants to ensure that Toolformer's language modeling performance is not degraded by fine-tuning on API calls. To this end, experiments were run on two language modeling datasets; the models' perplexity is shown in Table 8 below.

As the results show, when the model is used for plain language modeling without any API calls, fine-tuning on API-augmented data comes at no cost in perplexity.

[Table 8]
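For readers unfamiliar with the metric: perplexity is the exponential of the model's average per-token negative log-likelihood, so lower is better. A minimal illustration with made-up probabilities (unrelated to the paper's numbers):

```python
import math

# Perplexity = exp(mean negative log-likelihood per token); lower is better.
token_probs = [0.25, 0.10, 0.50, 0.05]  # made-up per-token probabilities
nll = [-math.log(p) for p in token_probs]
perplexity = math.exp(sum(nll) / len(nll))
print(round(perplexity, 2))  # ≈ 6.32
```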

Finally, the researchers analyzed how the ability to ask external tools for help affects performance as the size of the language model increases; the analysis results are shown in Figure 4 below.

[Figure 4]

Interested readers can refer to the original paper for more details of the research.
