Recently, OpenAI released a new ChatGPT feature: a plug-in system. ChatGPT can now be extended with additional functionality and perform new tasks, such as browsing the web.
ChatGPT's knowledge base was trained on data up to September 2021, but with these plugins ChatGPT can now search the web for up-to-date answers, removing the limitation of relying solely on its built-in knowledge base.
OpenAI also enables any developer to create their own plugins. Although developers currently need to join a waiting list (https://openai.com/waitlist/plugins), the documentation needed to create plugins is already available.
More information about the plug-in process can be found on this page (https://platform.openai.com/docs/plugins/introduction).
Example code can be found on this page (https://platform.openai.com/docs/plugins/examples).
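For a rough idea of what creating a plugin involves, the documentation describes a manifest file (ai-plugin.json) that points ChatGPT to an OpenAPI specification of a third-party API. Below is a minimal sketch of such a manifest, written as a Python dict for convenience; every value is a placeholder and the TODO-list service is purely illustrative:
import json
# Sketch of an ai-plugin.json manifest; field names follow the plugin documentation,
# all values are placeholders for a hypothetical TODO-list service.
manifest = {
    "schema_version": "v1",
    "name_for_human": "TODO List",
    "name_for_model": "todo",
    "description_for_human": "Manage your TODO list.",
    "description_for_model": "Plugin for managing a TODO list. Use it to add, remove and view items.",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "http://localhost:3333/openapi.yaml"},
    "logo_url": "http://localhost:3333/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "http://example.com/legal",
}
print(json.dumps(manifest, indent=2))
The description_for_model field is worth noting here, because, as discussed below, it is what the model reads in order to decide when to call the plugin.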
The documentation only shows how the integration between a third-party API and ChatGPT works. The rest of this article explores the inner workings of that integration:
"How do large language models perform operations without receiving relevant training?"
LangChain is a framework for building chatbots, generative question answering, summarization, and more.
LangChain is a tool developed by Harrison Chase (hwchase17) in 2022 to help developers integrate third-party applications with large language models (LLMs).
The example below illustrates how it works:
import os
os.environ["SERPAPI_API_KEY"] = "<your SerpAPI API key>"
os.environ["OPENAI_API_KEY"] = "<your OpenAI API key>"
from langchain.agents import load_tools
from langchain.agents import initialize_agent
from langchain.llms import OpenAI
# First, load the language model that will be used to control the agent
llm = OpenAI(temperature=0)
# Next, load some tools to use. Note that the llm-math tool uses an LLM, so it needs to be passed in
tools = load_tools(["serpapi", "llm-math"], llm=llm)
# Finally, initialize the agent with the tools, the language model and the agent type you want to use
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
# Test it
agent.run("Who is Olivia Wilde's boyfriend? What is his current age raised to the 0.23 power?")
Three main parts can be seen in this example:
zero-shot-react-description: the agent type used here. From its documentation, we learn that "this agent uses the ReAct framework and decides which tool to use based entirely on the tool's description." This information will be used later.
serpapi: a wrapper around the https://serpapi.com/ API, used for browsing the web.
llm-math: enables the agent to answer math-related questions in the prompt, such as "What is his current age raised to the 0.23 power?".
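The fact that the agent picks a tool based only on its description is easy to see when registering a custom tool. The sketch below is hypothetical (the tool name, function, and returned data are made up) and assumes the same LangChain version as the script above:
from langchain.agents import Tool, initialize_agent
from langchain.llms import OpenAI
# Hypothetical tool: the agent never sees its implementation, only the name and
# the description string, which is what drives the decision to use it.
def lookup_album_release_date(album_name: str) -> str:
    return "Harry's House was released on May 20, 2022"  # placeholder data
tools = [
    Tool(
        name="AlbumReleaseDate",
        func=lookup_album_release_date,
        description="Useful for answering questions about music album release dates.",
    )
]
llm = OpenAI(temperature=0)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)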
When the script is run, the agent performs several steps: it searches for who Olivia Wilde's boyfriend is, extracts the name, searches for Harry Styles' age, and then uses the llm-math tool to calculate 29^0.23, which is approximately 2.17.
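The last step is easy to verify by hand in plain Python:
print(29 ** 0.23)  # prints roughly 2.169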
The biggest advantage of LangChain is that it does not rely on a single provider, as documented (https://python.langchain.com/en/latest/modules/llms/integrations.html).
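In other words, the OpenAI model in the script above could in principle be swapped for another supported provider by changing a single import and constructor call. The sketch below uses the Cohere integration purely as an illustration; it requires its own API key and assumes the same environment variables as the earlier script:
import os
os.environ["COHERE_API_KEY"] = "<your Cohere API key>"
from langchain.agents import load_tools, initialize_agent
from langchain.llms import Cohere
# Same agent setup as before, only the underlying LLM provider changes.
llm = Cohere(temperature=0)
tools = load_tools(["serpapi", "llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)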
On March 21, Microsoft, OpenAI's closest partner, released MM-REACT, which demonstrates multimodal reasoning and action with ChatGPT (https://github.com/microsoft/MM-REACT).
When looking at the capabilities of this "system paradigm", you can see that each example involves an interaction between the language model and some external application.
By looking at the sample code provided (https://github.com/microsoft/MM-REACT/blob/main/sample.py), we can see that the model-tool interaction is implemented with LangChain. The README.md file (https://github.com/microsoft/MM-REACT/blob/main/README.md) also states that "MM-REACT's code is based on langchain".
Combining this evidence with the fact that the ChatGPT plug-in documentation mentions that "plug-in descriptions, API requests, and API responses are all inserted into the conversation with ChatGPT", it can be assumed that the plug-in system adds the different plug-ins as tools of an agent, where the agent in this case is ChatGPT itself.
It is also possible that, to support these plug-ins, OpenAI turned ChatGPT into an agent of the zero-shot-react-description type (the type we saw in the previous example). Because the description of each plug-in's API is inserted into the conversation, this matches what such an agent expects: it decides which tool to use based entirely on the tool's description.
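Under that assumption (and it is only an assumption, since OpenAI has not published the internals of the plug-in system), a plug-in would map onto an agent tool roughly as sketched below; the plug-in name, description, and endpoint are all made up:
import requests
from langchain.agents import Tool
# Speculative sketch: treat a plug-in's description_for_model as the tool
# description and its API endpoint as the tool function.
def call_todo_plugin(query: str) -> str:
    response = requests.get("http://localhost:3333/todos", params={"q": query})
    return response.text
todo_tool = Tool(
    name="todo",
    func=call_todo_plugin,
    description="Plugin for managing a TODO list. Use it to add, remove and view items.",
)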
Although the plug-in system is not yet open to all users, the published documentation and MM-REACT already give a good sense of how powerful ChatGPT's plug-in system can be.