OpenAI opens GPT-4 to all paying API users
According to news on July 7, OpenAI launched GPT-4 in March of this year but initially invited only developers who submitted applications to test it. In a blog post published today, OpenAI announced that GPT-4 is now open to all developers who pay for API access.
GPT-4 is another major breakthrough after GPT-3. OpenAI has not officially disclosed its parameter count, though it is widely reported to be substantially larger than GPT-3's 175 billion parameters. GPT-4 can generate many types and styles of natural-language output, such as articles, dialogue, summaries, poems, and lyrics, based on a given text input.
OpenAI also said that starting today, more APIs are generally available, including GPT-3.5 Turbo, Whisper, and its DALL·E image-generation API.
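As a minimal sketch of what API access to GPT-4 looks like, the snippet below builds a request body for OpenAI's Chat Completions endpoint (`POST https://api.openai.com/v1/chat/completions`). The prompt text and `temperature` value are illustrative, not taken from OpenAI's announcement; sending the request also requires an API key from a paid account.

```python
import json

# Request body following the Chat Completions API schema.
# The prompt and temperature here are illustrative placeholders.
payload = {
    "model": "gpt-4",  # now open to all paying API developers
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize GPT-4 in one sentence."},
    ],
    "temperature": 0.7,
}

# Serialize to JSON for the HTTP request body. To send it, POST to
# https://api.openai.com/v1/chat/completions with the headers:
#   Authorization: Bearer <your API key>
#   Content-Type: application/json
body = json.dumps(payload)
```

The same endpoint serves GPT-3.5 Turbo by swapping the `model` field to `"gpt-3.5-turbo"`; Whisper and DALL·E use separate audio and image endpoints.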
The blog post states that OpenAI is working on “implementing fine-tuning of GPT-4 and GPT-3.5 Turbo” and plans to make these services available to developers later in 2023.
IT House quoted the OpenAI blog post as saying that opening up GPT-4 is an important step toward OpenAI's vision of making the capabilities and opportunities of artificial general intelligence (AGI) available to everyone.
OpenAI stated that GPT-4 can help developers and companies in education, entertainment, healthcare, business, and other industries build more innovative and valuable applications and services.
At the same time, OpenAI also emphasized its oversight of GPT-4's safety and ethics to prevent abuse or misuse.
Technology outlet Neowin called the opening of GPT-4 a milestone for artificial intelligence and a major boost for natural language processing. GPT-4 not only demonstrates powerful generation capabilities and flexibility, but also opens up new possibilities for communication and cooperation between humans and machines.