ChatGPT big update! OpenAI offers programmers a gift package: the API gains a killer capability, prices drop, and new models with four times the context arrive
ChatGPT evolved again overnight, and OpenAI launched a large number of updates in one go!
The core is the API's new function calling capability: similar to plug-ins in the web version, the API can now use external tools too.
With this ability in developers' hands, capabilities the ChatGPT API lacks out of the box can be filled in by all kinds of third-party services.
Some consider it a killer feature and the most important update since the ChatGPT API was released.
Beyond that, every item in this round of ChatGPT API updates matters: not only does capacity go up, prices also come down:
When the news reached China, some netizens saw it as a major challenge for China's domestic large models.
According to the official introduction of OpenAI, function calls support both the new version of GPT-4 and GPT-3.5.
Developers only need to describe to the model the functions they want it to be able to use; the model itself decides, based on the prompt, when to call which function, the same mechanism ChatGPT uses to call plug-ins.
For specific usage, the official announcement gives three examples (a minimal code sketch follows the list):
First, the chatbot calls an external API to perform operations or answer questions, like "Send someone an email" or "What's the weather like today?"
Second, convert natural language into API calls or database queries; for example, "How many orders were there last month?" will automatically generate a SQL query.
Third, automatically extract structured data from text: just define the fields you need, such as "name, birthday, location", hand over a web link, and all the people mentioned in a Wikipedia article can be extracted automatically.
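To make the mechanism concrete, here is a minimal sketch of the first example, using the `openai` Python SDK that was current at the time; the `get_current_weather` function and its schema are made up for illustration:

```python
import json
import openai  # the 0.27.x-era SDK that was current when this update shipped

openai.api_key = "YOUR_API_KEY"  # placeholder

# Describe the function to the model; the model decides if and when to call it.
functions = [
    {
        "name": "get_current_weather",  # illustrative name and schema
        "description": "Get the current weather in a given city",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name, e.g. Beijing"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather like in Beijing today?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether to call the function
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model returns the function name plus JSON arguments; the developer runs
    # the real API call and sends the result back in a follow-up "function" message.
    args = json.loads(message["function_call"]["arguments"])
    print(message["function_call"]["name"], args)
```

Note that the model never executes anything itself: it only emits the name and arguments, and the developer's code performs the actual call.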
The new feature has cheered plenty of netizens, especially developers, who say it will greatly improve their work efficiency.
In the past, if you wanted GPT to call functions, you needed to use LangChain's tools.
……
Although LangChain is theoretically more efficient to run, its reliability falls short of the specially tuned new GPT.
The new model versions have now begun rolling out.
The latest versions of gpt-4-0613, gpt-3.5-turbo-0613 and the extended context length gpt-4-32k-0613 all support function calls.
gpt-3.5-turbo-16k does not support function calling but provides 4 times the context length, meaning one request can handle roughly 20 pages of text.
The old models are also being gradually deprecated.
Apps using the initial versions of gpt-3.5-turbo and gpt-4 will be automatically upgraded to the new versions on June 27. Developers who need more time to transition can manually specify that they want to keep using the old versions, but all requests to the old versions will stop being served after September 13.
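Pinning a version is just a matter of requesting the dated snapshot name instead of the floating alias; a minimal sketch with the `openai` Python SDK of the time (snapshot names are the ones from the announcement):

```python
import openai

# Pin the dated snapshot instead of the floating "gpt-3.5-turbo" alias, so the
# app is not auto-upgraded on June 27 and keeps working until the old
# snapshots are retired after September 13.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0301",  # old snapshot; swap in "gpt-3.5-turbo-0613" when ready
    messages=[{"role": "user", "content": "Hello"}],
)
```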
With the timeline covered, let's look at prices.
With this upgrade, OpenAI did not raise prices; it lowered them.
The first is the most used gpt-3.5-turbo (4k token version).
The input-token price has been cut by 25% to US$0.0015 per thousand tokens, or roughly 666,000 tokens per US$1.
Output tokens cost US$0.002 per thousand, or 500,000 tokens per US$1.
If converted into English text, it is roughly 700 pages for 1 US dollar.
The embeddings model's price has plummeted by a full 75%: just US$0.0001 per thousand tokens, or 10 million tokens per US$1.
In addition, the newly launched 16K-token version of gpt-3.5-turbo offers four times the context of the 4K version at only twice the price.
The prices of input and output tokens are US$0.003 and US$0.004 per thousand tokens respectively.
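As a quick sanity check on these numbers, here is a small script comparing per-request cost under the new prices; the token counts are made up for illustration:

```python
# Per-1K-token prices from the announcement, in USD
PRICES = {
    "gpt-3.5-turbo":     {"input": 0.0015, "output": 0.002},  # 4K version
    "gpt-3.5-turbo-16k": {"input": 0.003,  "output": 0.004},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost of one request in USD given its token counts."""
    p = PRICES[model]
    return input_tokens / 1000 * p["input"] + output_tokens / 1000 * p["output"]

# A hypothetical request with 2,000 input tokens and 500 output tokens
print(round(request_cost("gpt-3.5-turbo", 2000, 500), 4))      # 0.004
print(round(request_cost("gpt-3.5-turbo-16k", 2000, 500), 4))  # 0.008, exactly twice the 4K cost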
In addition, a few netizens reported their monthly bill dropping from 100 to a few cents; it is still unclear what exactly happened there.
Finally, if you need it, don't forget to join the waitlist for GPT-4 API access.
Many netizens pointed out that OpenAI's new "function calling" is basically a replica of LangChain's "Tools".
Perhaps OpenAI will go on to copy more LangChain features in the future, such as Chains and Indexes.
LangChain is the most popular open-source development framework in the large-model field; it ties together various large-model capabilities so applications can be built quickly.
The team also recently received US$10 million in seed round financing.
This OpenAI update will not directly "kill" LangChain as a startup project, but features that developers previously needed LangChain to implement now no longer require it.
Judging by LangChain's reaction, its will to survive is certainly strong.
Within 10 minutes of OpenAI officially releasing the update, LangChain announced that it was "already working on compatibility."
And a new version shipped in less than an hour: besides supporting the new official functions, it can also convert tools developers have already written into OpenAI functions.
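A sketch of what that conversion might look like; the import path and helper name below are taken from LangChain builds of that period and are assumptions, so they may differ in your version:

```python
# Sketch only: import path and helper name are assumptions about the
# LangChain release of that period and may differ in your installed version.
from langchain.agents import Tool
from langchain.tools.convert_to_openai import format_tool_to_openai_function

# A developer-defined LangChain tool (hypothetical example)
def lookup_order_count(month: str) -> str:
    return "42"  # placeholder implementation

order_tool = Tool(
    name="lookup_order_count",
    func=lookup_order_count,
    description="Return the number of orders for a given month",
)

# Convert the existing tool into the dict format expected by the new `functions` parameter
print(format_tool_to_openai_function(order_tool))
```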
Besides lining up to marvel at the ridiculously fast pace of development, netizens also pondered an unavoidable question:
What should you do if OpenAI destroys your entrepreneurial project?
In this regard, OpenAI CEO Sam Altman just made a statement recently.
At an event hosted by Humanloop at the end of May, Altman said:
Apart from consumer-level applications such as ChatGPT, OpenAI tries to avoid competing with its customers.
Now it seems that development tools are not included in the scope of avoiding competition.
In addition to startups that compete with OpenAI, there is another entity that cannot be ignored:
Microsoft, OpenAI's biggest backer, also offers the OpenAI API through its Azure cloud.
Just recently, some developers reported that after switching from the official OpenAI API to the Microsoft Azure version, the performance was significantly improved.
Specifically:
Including some discounts from Azure, it is even cheaper than before.
But Microsoft Azure's updates generally land several weeks later than OpenAI's.
Use OpenAI for rapid iteration during development, then move to Microsoft Azure for large-scale deployment. Got the trick?
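A rough sketch of what that switch looks like with the `openai` Python SDK of the time; the Azure resource name, deployment name, and API version string are placeholders to replace with your own values:

```python
import openai

USE_AZURE = False  # flip to True for the large-scale deployment phase

if USE_AZURE:
    # Azure OpenAI: same SDK, different endpoint. Resource name, deployment
    # name, and API version string are placeholders.
    openai.api_type = "azure"
    openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"
    openai.api_version = "2023-05-15"
    openai.api_key = "AZURE_KEY"
    model_kwargs = {"engine": "YOUR-DEPLOYMENT-NAME"}  # Azure addresses deployments, not models
else:
    # Official OpenAI endpoint
    openai.api_key = "OPENAI_KEY"
    model_kwargs = {"model": "gpt-3.5-turbo-0613"}

response = openai.ChatCompletion.create(
    messages=[{"role": "user", "content": "Hello"}],
    **model_kwargs,
)
print(response["choices"][0]["message"]["content"])
```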
Update announcement: https://openai.com/blog/function-calling-and-other-api-updates
GPT-4 API waitlist: https://openai.com/waitlist/gpt-4-api