Latest interview with OpenAI founder Sam Altman: GPT-3 may be open-sourced, scaling laws accelerate the construction of AGI

Produced by Big Data Digest

"We are very short of GPUs"

In a recent interview, that was OpenAI chief Sam Altman's response to the host's question about users' dissatisfaction with the API's reliability and speed.

The interview was conducted by Raza Habib, CEO of the artificial intelligence startup Humanloop, who compiled its highlights on Twitter.

Twitter address:

https://twitter.com/dr_cintas/status/1664281914948337664


In the interview, Altman also laid out OpenAI's plans for the next two years. The 2023 plan is to make GPT-4 cheaper and faster to respond. Other 2023 items include:

1. A longer context window, possibly supporting up to 1 million tokens;

2. A fine-tuning API, to help developers build better applications;

3. A stateful API, that is, an API that preserves conversation state across calls.

The 2024 plan calls for GPT-4 to support multimodality; it was pushed to 2024 because GPUs are in such short supply.


In the interview, Altman also mentioned that OpenAI has been considering open-sourcing GPT-3; there is no doubt that open source is very important. At the same time, he said that current AI models are not that dangerous: while regulating future models is very important, banning development would be a big mistake.

Raza Habib originally published a more detailed writeup of the interview on the Humanloop blog, but when Digest checked this morning, the page had already returned a 404. According to a translation by @宝玉xp on Weibo, Altman also discussed the scaling laws that will shape the future development of large models:

OpenAI's internal data show that the scaling laws for model performance remain in effect: making models bigger continues to yield gains. But because OpenAI has already scaled its models up by a factor of millions, that pace cannot be sustained. This does not mean OpenAI will stop trying to make models bigger, only that they will probably double or triple in size each year rather than grow by many orders of magnitude.

The fact that scaling continues to work has important implications for the timeline of AGI development. The scaling hypothesis holds that we probably already have most of the parts needed to build an AGI, and that most of the remaining work is scaling existing methods to larger models and larger datasets. If the era of scaling were over, we should probably expect AGI to be much further away. That the scaling laws remain in effect strongly hints at a shorter timeline.

In this view, scaling is the fastest path to AGI.

What are scaling laws?


Scaling laws describe an empirical phenomenon: a language model's performance follows a smooth power law in its parameter count, its amount of training data, and its amount of compute.

In other words, as the model's parameter count (Parameters), the amount of training data (Tokens), and the compute accumulated during training (FLOPs) grow exponentially, the model's loss on the test set falls linearly (that is, loss decreases as a power law of scale), meaning the model performs better.


Figure caption: Empirical performance exhibits a power-law relationship with each individual factor when not constrained by the other two.
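The power-law relationship described above can be sketched numerically. The constants below are rough illustrative values in the spirit of the parameter-count law from OpenAI's 2020 scaling-law work; treat `Nc` and `alpha_N` as assumptions for illustration, not OpenAI's actual fit:

```python
# Illustrative sketch of a power-law scaling curve (assumed constants,
# roughly in line with published parameter-count scaling fits):
#   L(N) = (Nc / N) ** alpha_N
Nc = 8.8e13      # assumed "critical" parameter count
alpha_N = 0.076  # assumed power-law exponent

def loss(n_params: float) -> float:
    """Predicted test loss for a model with n_params parameters."""
    return (Nc / n_params) ** alpha_N

# Each 10x increase in parameters multiplies the loss by the same
# constant factor, so loss is linear in log(N).
for n in (1e8, 1e9, 1e10, 1e11):
    print(f"N = {n:.0e}: predicted loss = {loss(n):.3f}")
```

Each tenfold increase in parameters multiplies the predicted loss by the same constant factor (about 0.84 with these assumed constants), which is exactly what "loss falls linearly as scale grows exponentially" means.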

In 2022, DeepMind analyzed scaling laws further. Its quantitative experiments verified that the amount of language-model training data should grow in proportion to the model's parameter count: for a fixed compute budget, training effectiveness has an optimal balance point between parameter count and data volume, and the lowest point on the loss curve marks that compromise.
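That balance point can be illustrated with a small sketch. It rests on two rules of thumb associated with DeepMind's 2022 analysis, both approximate assumptions rather than exact fits: training compute C ≈ 6·N·D FLOPs, and a compute-optimal data-to-parameter ratio of roughly D ≈ 20·N tokens:

```python
# Sketch of the compute-optimal parameter/data trade-off.
# Assumptions (rules of thumb, not exact fits):
#   C = 6 * N * D      (training FLOPs for N params on D tokens)
#   D = 20 * N         (compute-optimal tokens per parameter)
import math

def compute_optimal(C: float) -> tuple[float, float]:
    """Given a FLOP budget C, return (params N, tokens D) with D = 20 N."""
    # C = 6 * N * (20 * N) = 120 * N**2  =>  N = sqrt(C / 120)
    N = math.sqrt(C / 120)
    D = 20 * N
    return N, D

N, D = compute_optimal(5.76e23)  # an example FLOP budget
print(f"params = {N:.2e}, tokens = {D:.2e}")
```

Under these assumptions, a budget of about 5.76e23 FLOPs lands near 7e10 parameters and 1.4e12 tokens; spending the same compute on a much larger model trained on less data would sit above the optimal point on the loss curve.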

OpenAI's Success and GPT-4

OpenAI began as a non-profit artificial intelligence research lab that received US$1 billion in funding from Sam Altman, Elon Musk, and others in 2016.

In 2019, OpenAI transformed into a for-profit artificial intelligence research laboratory in order to take on investor funding.

When the laboratory was running short of funds to support its research, Microsoft announced it would invest another US$1 billion in it.

Every version of the GPT series that OpenAI has released has set off a frenzy in the industry. At the Microsoft Build 2023 developer conference, OpenAI founding member Andrej Karpathy gave a talk, "State of GPT," in which he said they have been training large models to act as a "human brain."

Andrej compared current LLMs to System 1 (the fast system) in human cognition, as opposed to System 2 (the slow system), which responds more slowly but carries out longer-term reasoning.

“System one is a fast automatic process that I think kind of corresponds to LLM, just sampling tokens.

System two is the slower, well-thought-out planning part of the brain.

Prompt engineering is basically about trying to restore some of these capabilities of our brains to LLMs."

Andrej Karpathy also said that GPT-4 is an amazing artifact and that he is very grateful it exists. It has a ton of knowledge across many areas; it can do math, write code, and more, all at your fingertips.

CEO Altman said that in the early days, GPT-4 was very slow, had bugs, and did many things poorly. But so did the earliest computers, which still pointed the way to something that would become very important in our lives, even if it took decades to develop.

OpenAI comes across as an organization that holds fast to its dream and pushes everything it does to the extreme.

As Zhou Ming, former vice president of Microsoft Research Asia and founder of Lanzhou Technology, mentioned in an interview:

OpenAI's greatest achievement is executing well in every respect; it is a model of integrated innovation.

There are several types of people in the world. Some focus purely on foundational innovation. Some build applications on top of foundational innovations, where a typical application solves a single task. Another approach is integrated innovation: concentrating all the work, applications, and algorithms on one large platform to create a milestone. OpenAI happens to do integrated innovation extremely well.

Reference:

https://mp.weixin.qq.com/s/p42pBVyjZws8XsstDoR2Jw
https://mp.weixin.qq.com/s/zmEGzm1cdXupNoqZ65h7yg
https://weibo.com/1727858283/4907695679472174?wm=3333_2001&from=10D5293010&sourcetype=weixin&s_trans=6289897940_4907695679472174&s_channel=4
https://humanloop.com/blog/openai-plans?cnotallow=bd9e76a5f41a6d847de52fa275480e22


Statement: This article is reproduced from 51CTO.COM. If there is any infringement, please contact admin@php.cn for removal.