GPT-3 plans to be open source! Sam Altman revealed an urgent need for GPUs, and GPT-4 multimodal capabilities will be available next year

After the Congressional hearing, Sam Altman took his team on a tour of Europe.

In a recent interview, Altman was candid: nearly all of OpenAI's near-term progress will have to wait for GPU supply to catch up.

He discussed OpenAI’s API and product plans, which attracted the attention of many people.

Many netizens said they appreciated Altman's candor.

It is worth mentioning that the multimodal capabilities of GPT-4 should be available to most Plus users in 2024, provided there are enough GPUs.

The supercomputer Microsoft spent US$1.2 billion building for OpenAI is still far from meeting the compute GPT-4 needs at runtime. After all, GPT-4 is rumored to have 100 trillion parameters.

Altman also revealed that GPT-3 is part of OpenAI's open-source plan.

Perhaps this interview revealed too many of OpenAI's "secrets": the original post has since been deleted, so catch it while you can.


Key points

The interview was hosted by Raza Habib, CEO of the AI development platform Humanloop, and brought Altman together with about 20 developers.

The discussion touched on practical developer issues as well as bigger questions about OpenAI's mission and the social impact of AI.


The following are the key points:

1. OpenAI is in urgent need of GPU

2. OpenAI's near-term roadmap: GPT-4 multimodal capabilities will open up in 2024

3. ChatGPT plug-ins accessed through the API will not be released in the near future

4. OpenAI only makes ChatGPT, the "killer application", with the goal of making ChatGPT a super-smart work assistant

5. GPT-3 is in the open-source plan

6. The scaling law of model performance continues to be effective

Next, let's walk through what Sam Altman said on each of these six points.

OpenAI currently relies heavily on GPU

Every topic in the interview revolved around one thing: OpenAI is severely short of GPUs.

This has delayed many of their short-term plans.

Currently, many OpenAI customers are complaining about the reliability and speed of the API. Sam Altman explained that the main reason is the GPU shortage.


OpenAI was the first customer of NVIDIA's DGX-1 supercomputer.

For now, the 32k-token context window cannot yet be rolled out to more users.

OpenAI has not yet cleared the technical hurdles. Context windows of 100k to 1M tokens look possible this year, but will require a research breakthrough.

The Fine-Tuning API is also currently limited by GPU availability.

OpenAI does not yet use efficient fine-tuning methods like adapters or LoRA, so fine-tuning is very compute-intensive to run and manage.

However, they will provide better support for fine-tuning in the future. OpenAI may even host a marketplace of community-contributed models.
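For context on why that matters, here is a minimal pure-Python sketch of the LoRA idea (sizes are illustrative and this is not OpenAI's implementation): instead of updating a full d x d weight matrix, LoRA freezes it and trains two small factors whose product is a low-rank update.

```python
# Sketch of the LoRA idea (illustrative only): freeze the pretrained
# weight W and train just two low-rank factors A (d x r) and B (r x d);
# the effective weight is W + A @ B.

def matmul(X, Y):
    # naive matrix multiply, good enough for a tiny demo
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

d, r = 6, 2                                    # tiny sizes for the demo
W = [[1.0 if i == j else 0.0 for j in range(d)] for i in range(d)]  # frozen weight
A = [[0.1] * r for _ in range(d)]              # trainable down-projection
B = [[0.0] * d for _ in range(r)]              # zero-init up-projection

delta = matmul(A, B)                           # low-rank update, all zeros at start
effective = [[W[i][j] + delta[i][j] for j in range(d)] for i in range(d)]
assert effective == W                          # LoRA begins as an exact no-op

# Trainable-parameter comparison at a more realistic size:
D, R = 1024, 8
print(D * D, D * R * 2)                        # 1048576 vs 16384: 64x fewer
```

Because only A and B receive gradients, the fine-tune touches under 2% of the parameters a full update would, which is why such methods cut the GPU cost the article describes.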

Finally, dedicated capacity provision is also limited by GPU availability.

At the beginning of this year, netizens revealed that OpenAI was quietly launching a new developer platform, Foundry, to allow customers to run the company’s new machine learning model on dedicated capacity.

This product is "designed for cutting-edge customers running larger workloads." To use this service, customers must be willing to pay $100k upfront.

However, as the leaked screenshot shows, the service is not cheap.

Running the lightweight version of GPT-3.5 costs $78,000 for a three-month commitment and $264,000 for a year.

Seen from another angle, this shows just how expensive GPU compute is.
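A quick back-of-the-envelope check on those Foundry figures (assuming a 90-day quarter and a 365-day year, which the leak does not specify):

```python
# Back-of-the-envelope on the leaked Foundry pricing for lightweight GPT-3.5.
three_month = 78_000        # USD for a 3-month commitment
annual = 264_000            # USD for a 1-year commitment

per_day_quarterly = three_month / 90     # ~866.67 USD/day
per_day_annual = annual / 365            # ~723.29 USD/day

# Four back-to-back quarterly terms vs one annual commitment:
savings = 4 * three_month - annual       # 48000 USD saved per year
discount = savings / (4 * three_month)   # ~15.4% effective discount
print(round(per_day_quarterly, 2), round(per_day_annual, 2), savings)
```

Either way the daily rate runs in the hundreds of dollars, which puts the "GPU compute is expensive" point in concrete terms.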

OpenAI's near-term roadmap

Altman shared the tentative near-term roadmap for the OpenAI API:

2023:

· Fast and cheap GPT-4: this is OpenAI's top priority.

In general, OpenAI's goal is to drive down the "cost of intelligence" as far as possible, so they will keep working to push down API prices.

· Longer context windows: in the near future, context windows may support up to 1 million tokens.


· Fine-tuning API: it will be extended to the latest models, but its exact form will depend on what developers actually want.

· Memory API: today most tokens are wasted re-sending the conversation context on every request. In the future there will be an API version that remembers conversation history.

2024:

· Multimodal capabilities: GPT-4 demonstrated powerful multimodal capabilities at launch, but the feature cannot be rolled out to everyone until GPU supply catches up.
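The memory point reflects how stateless chat APIs work today: every request resends the entire conversation, so cumulative token spend grows roughly quadratically with the number of turns. A toy sketch (the word-count "tokenizer" is a crude stand-in, not a real one):

```python
# Why stateless chat APIs waste tokens: each request pays again for
# the entire conversation history.

def tokens(text):
    # crude stand-in for a real tokenizer: about one token per word
    return len(text.split())

history = []
total_sent = 0
for turn in range(5):
    history.append("user message number %d with a few words" % turn)
    total_sent += sum(tokens(m) for m in history)   # resend everything

final_context = sum(tokens(m) for m in history)     # what one send would cost
print(total_sent, final_context)                    # 120 vs 40 tokens
```

A memory API that keeps the history server-side would only need to charge for the new tokens on each turn, which is the waste being pointed at here.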

Plug-ins "have no product-market fit (PMF)" and will not appear in the API in the short term

Many developers are keen to access ChatGPT plug-ins through the API, but Sam said these plug-ins will not be released anytime soon.

"Except for Browsing, the plug-in system has not found PMF."


He also pointed out that many people want to put their products into ChatGPT, when what they really need is to put ChatGPT into their products.

Except for ChatGPT, OpenAI will not release any more products

Every move OpenAI makes sends a shiver through developers.

Many developers said they were nervous about using the OpenAI API to build applications when OpenAI might release products that compete with them.

Altman said that OpenAI will not release more products beyond ChatGPT.

In his view, a great company has one "killer application", and ChatGPT is set to be that record-breaking application.


ChatGPT’s vision is to become a super smart work assistant. OpenAI will not touch many other GPT use cases.

Regulation is necessary, but so is open source

While Altman calls for regulation of future models, he does not believe that existing models are dangerous.

He believes that regulating or banning existing models would be a huge mistake.

In the interview, he reiterated his belief in the importance of open source and said that OpenAI is considering making GPT-3 open source.

Part of the reason OpenAI has not open-sourced its models is that Altman doubts how many individuals and companies are actually capable of hosting and serving such large models.

The "scaling law" of model performance is still valid

Recently, many articles have claimed that the era of giant AI models is over, but this does not accurately reflect what Altman meant.


OpenAI's internal data shows that the scaling laws for model performance still hold, and making models larger will continue to yield performance gains.

However, OpenAI has already scaled its models up millions of times in just a few years, and that rate of expansion cannot be sustained.

This doesn't mean OpenAI will stop trying to make models bigger; it means model size will likely grow by only 1-2x per year rather than by multiple orders of magnitude. The fact that scaling laws remain in effect has important implications for the timeline of AGI development.

The scaling assumption is that we probably already have most of the parts needed to build an AGI, and most of the remaining work will be scaling existing methods to larger models and larger data sets.

If the era of scaling is over, then we should probably expect AGI to be even further away. The continued validity of the scaling law strongly implies that the timeline for implementing AGI will become shorter.
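To make the scaling argument concrete, here is a toy power-law curve; the constants are invented for illustration and are not OpenAI's internal numbers:

```python
# Toy power-law scaling curve: loss falls as a power of parameter count.
# Constants are made up for illustration only.

def loss(n_params, n_c=8.8e13, alpha=0.076):
    # smooth power law: bigger models give lower loss, with diminishing returns
    return (n_c / n_params) ** alpha

for n in [1e9, 1e10, 1e11, 1e12]:
    print(f"{n:.0e} params -> loss {loss(n):.3f}")

# Doubling still helps at every scale, but each doubling buys less:
early_gain = loss(1e9) - loss(2e9)     # gain from doubling a 1B model
late_gain = loss(1e11) - loss(2e11)    # gain from doubling a 100B model
print(early_gain > late_gain)          # True: diminishing returns per doubling
```

On a curve like this, 1-2x annual growth in model size keeps improving performance, just far more slowly than the million-fold scale-ups of the past few years, which matches Altman's framing.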

Hot comments from netizens

Some netizens joked:

OpenAI: we must protect our moat through regulation. Also OpenAI: Meta is peeing in our moat, so perhaps our models need to be open source too.


Some say that if GPT-3 really goes open source, then, like LLaMA, it will take about 5 days before it's up and running on an M1 chip.


Community developers could help OpenAI solve the GPU bottleneck, provided it open-sources its models. Within days, developers would have them running on CPUs and edge devices.


Regarding the GPU shortage, some suspect a problem with OpenAI's cash flow: that it simply cannot afford them.

However, others said the supply shortfall is obvious: unless there is a revolution in chip manufacturing, supply will likely always lag demand, even relative to consumer GPUs.


Some netizens wonder whether Nvidia is still undervalued: this step change in compute demand may last for years...


Nvidia just joined the trillion-dollar club, and seemingly unlimited demand for compute could mean a world with chipmakers worth more than $2 trillion.

Reference:

https://www.php.cn/link/c55d22f5c88cc6f04c0bb2e0025dd70b

https://www.php.cn/link/d5776aeecb3c45ab15adce6f5cb355f3


Statement: This article is reproduced from 51CTO.COM. In case of infringement, please contact admin@php.cn to have it deleted.