Zuckerberg strongly supports open source AGI: fully training Llama 3, expected to reach 350,000 H100 by the end of the year
Zuckerberg has announced a new goal: going all in on open source AGI.
Yes, Zuckerberg is all in again, this time on the very ground where OpenAI and Google have to compete.
But before "AGI," the emphasis is on "open source."
The move drew widespread praise, much like when the Llama series of large models was open sourced.
But this new round of "all in" could not help but remind netizens of the last one: whatever happened to the Metaverse?
That said, the goal he has set this time is more concrete, and he even revealed some key figures, such as the GPU numbers discussed below.
He also slipped in a small advertisement: Meta is building new AI-centric computing devices, such as the Ray-Ban Meta smart glasses.
It seems the Metaverse is still alive.
Now, Zuckerberg has officially announced that he is joining the AGI race.
Although there is no clear timetable, he lays out two key points for this long-term vision:
open source it responsibly, and make it widely available so that everyone can benefit.
To achieve this goal, Meta is doing two main things.
First, closely integrate its two existing AI research teams, FAIR and GenAI.
According to LeCun, the two have become sister departments.
He also said that Llama 3 is on the way!
Second, build large-scale computing infrastructure: 350,000 H100s by the end of this year, with total computing power equivalent to 600,000 H100s.
Based on a sales price of US$25,000 to US$30,000 per unit, that computing power would be worth US$15 billion to US$18 billion.
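As a quick sanity check, the dollar figures follow from the 600,000 H100-equivalent total (the per-unit prices here are the article's assumed range, not official pricing):

```python
# Back-of-the-envelope check of the article's valuation.
H100_EQUIVALENTS = 600_000               # total compute, in H100 equivalents
PRICE_LOW_USD, PRICE_HIGH_USD = 25_000, 30_000  # assumed unit price range

low = H100_EQUIVALENTS * PRICE_LOW_USD    # lower bound of hardware value
high = H100_EQUIVALENTS * PRICE_HIGH_USD  # upper bound of hardware value
print(f"${low / 1e9:.0f}B to ${high / 1e9:.0f}B")  # prints "$15B to $18B"
```

Note that the range is computed from the 600,000-equivalent figure, not the 350,000 physical H100s, which is why it exceeds 350,000 × $30,000.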
Earlier, some analysts estimated that Nvidia shipped about 150,000 H100s to Meta in 2023, the same number as to Microsoft and at least three times as many as to any other company.
On this basis, Zuckerberg said Meta has built up compute capacity that may be larger than that of any other individual company.
Some netizens ran the numbers and concluded: brain-sized models are coming soon.
Others, however, questioned whether Nvidia can even produce that many.
But a Meta executive responded: the 350,000 H100 figure includes the units Meta already has.
In addition, he emphasized progress on hardware devices.
Before Zuckerberg's announcement, many industry leaders had already weighed in on AGI at the World Economic Forum in Davos.
For example, on the panel "The Expanding Universe of Generative Models," LeCun stressed the importance of open source to the path toward AGI:
"The reason we see such rapid progress in artificial intelligence is open research."
That said, he has often expressed doubt that AGI will arrive anytime soon, and certainly not within the next five years.
And Transformer co-author Aidan Gomez said:
"We have not finished scaling yet; we still need to keep pushing."
As for OpenAI CEO Sam Altman, he said human-level artificial intelligence will arrive soon, but its impact on the world will be far smaller than we imagine.
In his view, artificial general intelligence (AGI) may be developed in the "fairly near future."
What do you think of the development of AGI?