Latest interview with OpenAI founder Sam Altman: GPT-3 may be open-sourced, scaling laws accelerate the construction of AGI

Produced by Big Data Digest

"We are very short of GPUs"

That is how Sam Altman, the head of OpenAI, responded in a recent interview to the host's question about users' dissatisfaction with the API's reliability and speed.

The interview was conducted by Raza Habib, CEO of the artificial intelligence startup Humanloop, who compiled its highlights in a Twitter thread.

Twitter thread:

https://twitter.com/dr_cintas/status/1664281914948337664


In this interview, Altman also laid out OpenAI's GPT roadmap for the next two years. The plan for 2023 is to reduce GPT-4's cost and improve its response speed. Other items include:

1. A longer context window, possibly supporting up to 1 million tokens;

2. A fine-tuning API, to help developers build better applications;

3. A stateful API, i.e., an API that maintains conversation state (see the sketch just after this list).
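To make the third item concrete, here is a minimal sketch of why a stateful API matters. With the chat API available around the time of the interview (the pre-1.0 interface of the openai Python package, assumed here), the client has to resend the entire conversation history on every call; a session-state API would let the server keep that history instead. The prompts below are purely illustrative.

```python
import openai  # pre-1.0 interface of the openai package (assumed here)

openai.api_key = "YOUR_API_KEY"  # placeholder

# Without a session-state API, the client must carry the conversation itself:
messages = [{"role": "system", "content": "You are a helpful assistant."}]

def ask(user_input: str) -> str:
    # Append the new user turn, then resend the *whole* history every time.
    messages.append({"role": "user", "content": user_input})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",   # any chat-capable model
        messages=messages,       # the full history goes over the wire each call
    )
    reply = response["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": reply})
    return reply

# Each call grows `messages`, and the growing list counts against the context
# window and the token bill -- exactly the overhead a stateful (session) API
# is meant to remove.
print(ask("Summarize scaling laws in one sentence."))
print(ask("Now explain that to a five-year-old."))
```

A session-state API of the kind described in the plan would let the client send only the new user turn plus a session identifier, with the server holding the rest.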

The 2024 plan calls for GPT-4 to support multimodality; this was pushed to 2024 because GPUs are in short supply.


In the interview, Altman also mentioned that OpenAI has been considering whether to open source GPT-3; there is no doubt in his mind that open source is important. He also said that current AI models are not that dangerous: while regulating future models is very important, banning development outright would be a serious mistake.

Raza Habib originally published a more detailed write-up of the interview on the Humanloop blog, but when Big Data Digest checked this morning, the page already returned a 404. According to a translation by @宝玉xp on Weibo, Altman also discussed the scaling laws that will shape the future development of large models:

OpenAI's internal data shows that the scaling laws for model performance continue to hold: making models bigger continues to yield better performance. However, OpenAI has already scaled its models up by millions of times, and continuing at that pace would be unsustainable. This does not mean OpenAI will stop trying to make models bigger, only that they will probably double or triple in size each year rather than grow by many orders of magnitude.

The fact that scaling continues to work has important implications for the timeline of AGI development. The scaling hypothesis is that we may already have most of the pieces needed to build AGI, and that most of the remaining work is scaling existing methods up to larger models and larger datasets. If the era of scaling were over, we should probably expect AGI to be much further away. That the scaling laws continue to hold strongly hints at a shorter timeline.

In this view, scaling laws are the fastest path to AGI.

What are scaling laws?


Scaling laws describe an empirical phenomenon: a language model's performance follows a smooth power law with respect to its parameter count, the amount of training data, and the amount of compute.

In other words, as the model's parameter count (parameters), the amount of training data (tokens), and the compute accumulated during training (FLOPs) increase exponentially, the model's loss on the test set decreases linearly on that log scale, i.e., the loss falls as a power law, which means the model performs better.
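For reference, the power laws reported in OpenAI's 2020 scaling-law paper (Kaplan et al.) are usually written in the following form; the constants are empirical fits and are only characterized approximately here:

```latex
% Power-law form of the scaling laws (Kaplan et al., 2020), valid when only
% one factor is the bottleneck. N = parameters, D = training tokens,
% C = training compute; N_c, D_c, C_c and the alpha exponents are fitted
% constants (the exponents are small, roughly in the 0.05-0.10 range).
L(N) = \left(\frac{N_c}{N}\right)^{\alpha_N}, \qquad
L(D) = \left(\frac{D_c}{D}\right)^{\alpha_D}, \qquad
L(C) = \left(\frac{C_c}{C}\right)^{\alpha_C}
```

Taking the logarithm of any of these gives, for example, log L = α_N (log N_c − log N): a straight line on a log-log plot, which is exactly the "linear decrease in loss" described above and shown in the figure legend below.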


Figure legend: empirical performance exhibits a power-law relationship with each individual factor when it is not constrained by the other two.

In 2022, DeepMind analyzed scaling laws further. Its quantitative experiments verified that the amount of training data should be scaled up in proportion to the model's parameter count: for a fixed total compute budget, there is an optimal balance between parameter count and training data, and the lowest point on the loss curve marks that compromise.
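A rough, back-of-the-envelope sketch of that balance point (using the common C ≈ 6·N·D approximation for transformer training compute and the frequently cited "about 20 tokens per parameter" rule of thumb distilled from DeepMind's results, rather than their exact fitted curves) might look like this:

```python
import math

def compute_optimal_split(flops_budget: float, tokens_per_param: float = 20.0):
    """Rough compute-optimal (params, tokens) for a fixed FLOPs budget.

    Assumptions (both approximations, not DeepMind's exact fits):
      * training compute C ~= 6 * N * D
      * compute-optimal training uses ~tokens_per_param tokens per parameter
    """
    # C = 6 * N * (tokens_per_param * N)  =>  N = sqrt(C / (6 * tokens_per_param))
    n_params = math.sqrt(flops_budget / (6.0 * tokens_per_param))
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

# Example: a budget of ~5.9e23 FLOPs lands near 70B parameters and 1.4T tokens,
# which is roughly the published Chinchilla configuration.
n, d = compute_optimal_split(5.9e23)
print(f"params ~ {n:.2e}, tokens ~ {d:.2e}")
```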

OpenAI's Success and GPT-4

OpenAI began as a non-profit artificial intelligence research lab, founded in late 2015 with US$1 billion in funding pledged by Sam Altman, Elon Musk, and other backers.

In 2019, OpenAI restructured into a for-profit artificial intelligence research lab so that it could take in investors' money.

When the lab was running short of funds to support its research, Microsoft announced that it would invest another US$1 billion in it.

Every release in OpenAI's GPT series has set off a frenzy in the industry. At the Microsoft Build 2023 developer conference, OpenAI founding member Andrej Karpathy gave a talk, "State of GPT," in which he said that OpenAI has been training large models to work like a "human brain."

Andrej noted that today's large language models can be compared to System 1 in the dual-process model of human thinking (the fast, automatic system), in contrast to System 2 (the slower system that deliberates and performs longer-horizon reasoning).

"System 1 is a fast, automatic process that I think roughly corresponds to an LLM just sampling tokens.

System 2 is the slower, deliberate, planning part of the brain.

Prompt engineering is basically about trying to recover some of our brain's capabilities for the LLM."
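As a concrete (and entirely illustrative) example of the kind of prompt engineering Karpathy is describing, the sketch below nudges a model toward more deliberate, System-2-style behavior simply by asking it to reason step by step before answering. The model name and prompt wording are assumptions for illustration, not anything from the talk, and the openai call again uses the pre-1.0 interface of that package.

```python
import openai  # pre-1.0 interface of the openai package (assumed, as above)

openai.api_key = "YOUR_API_KEY"  # placeholder

QUESTION = "A train leaves at 9:40 and arrives at 13:05. How long is the trip?"

# "System 1": just ask, and take whatever the next sampled tokens happen to be.
fast = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": QUESTION}],
)

# "System 2"-flavored prompt: request intermediate reasoning before the answer.
deliberate = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": (
            "Work through the following problem step by step, checking each "
            "step, and only then state the final answer.\n\n" + QUESTION
        ),
    }],
)

print(fast["choices"][0]["message"]["content"])
print(deliberate["choices"][0]["message"]["content"])
```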

Andrej Karpathy also said that GPT-4 is an amazing artifact and that he is very grateful it exists: it holds a ton of knowledge across many areas, and it can do math, write code, and much more, all at your fingertips.

CEO Altman said that in its early days GPT-4 was very slow, buggy, and did many things poorly. But so were the earliest computers, which still pointed the way to something that would become very important in our lives, even if it took decades to develop.

OpenAI comes across as an organization that holds on to its dreams and wants to push things to the extreme.

As Zhou Ming, former vice president of Microsoft Research Asia and founder of Lanzhou Technology, mentioned in an interview:

OpenAI's greatest strength is that it pursues excellence in every aspect; it is a model of integrated innovation.

There are several kinds of people in the world. Some focus purely on underlying innovation. Others build applications on top of that underlying innovation, where a typical application solves a single task. Yet another approach is integrated innovation: concentrating all of the work, applications, and algorithms on one large platform to create a milestone. OpenAI happens to be exceptionally good at integrated innovation.

References:

https://mp.weixin.qq.com/s/p42pBVyjZws8XsstDoR2Jw
https://mp.weixin.qq.com/s/zmEGzm1cdXupNoqZ65h7yg
https://weibo.com/1727858283/4907695679472174?wm=3333_2001&from=10D5293010&sourcetype=weixin&s_trans=6289897940_4907695679472174&s_channel=4
https://humanloop.com/blog/openai-plans?cnotallow=bd9e76a5f41a6d847de52fa275480e22
