


To shed light on this question, the Zeta Alpha platform counted the top 100 most cited AI papers worldwide for each year from 2020 to 2022, and arrived at some very interesting results.
For example:
"Popular star" OpenAI ranks ninth among institutions by number of highly cited papers.
However, its name is nowhere to be found on the list of institutions with the most published papers.
For another example, Google, Meta, and Microsoft from industry are consistently among the best across all the data; even so, academia as a whole does not lag behind industry.
In addition, the earlier view that "the quantity and quality of China's AI research output may exceed that of the United States" appears to be contradicted by this report.
Let's look at the specific data one by one.
China ranks second; OpenAI and DeepMind win on quality
Before the detailed analysis, Zeta Alpha first identified the single most cited papers of each year from 2020 to 2022. They are:
2022:
1. AlphaFold Protein Structure Database: Massively expanding the structural coverage of protein-sequence space with high-accuracy models
Number of citations: 1372
Institution: DeepMind
Topic: Using AlphaFold to increase the coverage of protein structure database
2. ColabFold: making protein folding accessible to all
Number of citations: 1162
Institution: Completed by multiple collaborations
Topic: An open source and efficient protein folding model
3. Hierarchical Text-Conditional Image Generation with CLIP Latents
Number of citations: 718
Institution: OpenAI
Topic: DALL·E 2
4. A ConvNet for the 2020s
Number of citations: 690
Institution: Meta and UC Berkeley
Topic: Successfully modernizing CNNs amid the Transformer boom
5. PaLM: Scaling Language Modeling with Pathways
Number of citations: 452
Institution: Google
Topic: Google's 540B large language model, a new MLOps paradigm, and its implementation process
2021:
1. Highly accurate protein structure prediction with AlphaFold
Number of citations: 8965
Institution: DeepMind
Topic: AlphaFold, a huge breakthrough in protein structure prediction using deep learning
2. Swin Transformer: Hierarchical Vision Transformer using Shifted Windows
Number of citations: 4810
Institution: Microsoft
Topic: A powerful variant of ViT
3. Learning Transferable Visual Models From Natural Language Supervision
Number of citations: 3204
Institution: OpenAI
Topic: CLIP
4. On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?
Number of citations: 1266
Institution: University of Washington, Black in AI, The Aether
Topic: A famous position paper critical of the trend toward ever-larger language models, emphasizing their limitations and dangers
5. Emerging Properties in Self-Supervised Vision Transformers
Number of citations: 1219
Institution: Meta
Topic: DINO, revealing how self-supervision of images leads to some kind of proto-object segmentation in Transformers
2020:
1. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Number of citations: 11914
Institution: Google
Topic: The first work to show how a plain Transformer can perform well in computer vision
2. Language Models are Few-Shot Learners
Number of citations: 8070
Institution: OpenAI
Topic: GPT-3
3. YOLOv4: Optimal Speed and Accuracy of Object Detection
Number of citations: 8014
Institution: "Academia Sinica", Taiwan, China
Topic: YOLOv4
4. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
Number of citations: 5906
Institution: Google
Topic: A rigorous study of transfer learning with Transformers, which produced the famous T5
5. Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Number of citations: 2873
Institution: DeepMind and Imperial College
Topic: Showing that negative pairs are not necessary for representation learning
You will surely recognize many familiar faces among them.
Next, Zeta Alpha analyzed the information behind the highly cited papers of the past three years.
The first breakdown is by country (or region) across the top 100 most cited papers of each year.
It can be seen that the United States is the strongest, and its gap with China is quite obvious.
Accordingly, Zeta Alpha also concludes that the earlier claim that "China may surpass the United States in AI research" does not hold, at least based on this data.
In addition, Singapore's and Australia's rankings are quite unexpected, at fifth and sixth respectively.
"To properly assess the dominance of the United States," Zeta Alpha switched to another statistic, calculating each country's percentage share of the top-100 lists.
Of course, the United States still ranks first, but its share has visibly declined over the past three years.
The UK is the largest competitor outside of China and the United States, but its outstanding 2022 performance was actually driven mainly by DeepMind, which accounted for 69% of the UK's entries.
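The share-of-top-100 statistic above is simple to compute. The sketch below uses hypothetical country counts, not Zeta Alpha's actual data, purely to illustrate the calculation.

```python
# Illustrative only: the country labels for the 100 papers are made up,
# not taken from the Zeta Alpha report.
from collections import Counter

papers = ["US"] * 60 + ["China"] * 15 + ["UK"] * 10 + ["Other"] * 15

counts = Counter(papers)
shares = {country: 100 * n / len(papers) for country, n in counts.items()}

# Print countries from largest to smallest share of the top-100 list
for country, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {share:.0f}%")
```

Tracking this share per year, as the report does, is then just a matter of repeating the count over each year's top-100 list.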
Next is the ranking, by organization or institution, of the number of papers in the top-100 most cited lists.
Unsurprisingly, Google, Meta, and Microsoft take the top three, followed by UC Berkeley, DeepMind, and Stanford.
OpenAI also achieves a respectable ninth place. Tenth is MIT, and eleventh is Tsinghua University.
Although the top three are all from industry, if we group the entries by institution type, academia and industry perform roughly on par.
Then comes the ranking of the total number of papers published by each organization or institution over the past three years.
The leader is still Google. Second place is more eye-catching: Tsinghua University, followed by Microsoft, CMU, MIT, Stanford, UC Berkeley, Peking University (eighth), and Meta...
As can be seen, academic institutions occupy most of the top ten.
We searched for a long time but did not find OpenAI or DeepMind:
evidently they publish fewer papers and win mainly on quality.
To verify this guess, Zeta Alpha also ranked institutions by their conversion rate of publications into highly cited papers.
As expected, OpenAI took the crown, with DeepMind third.
Meta also did well, ranking fourth, and LeCun even chimed in to say:
"At Meta, we indeed value quality over quantity."
In contrast, Google, with many citations but also many publications, ranked only ninth, just barely staying inside the top 10.
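The "conversion rate" metric can be read as: of everything an organization published, what fraction landed on the top-100 most cited list? A minimal sketch, with made-up paper counts for two hypothetical organizations:

```python
# Illustrative only: the paper counts below are invented, not the report's data.

def conversion_rate(top100_papers: int, total_papers: int) -> float:
    """Fraction of an organization's output that reached the top-100 list."""
    return top100_papers / total_papers

orgs = {
    "OrgA": (7, 50),    # few publications, high hit rate
    "OrgB": (20, 600),  # many publications, lower hit rate
}

# Sort organizations by conversion rate, highest first
ranked = sorted(orgs, key=lambda o: conversion_rate(*orgs[o]), reverse=True)
print(ranked)  # → ['OrgA', 'OrgB']
```

This is why an organization can top the conversion-rate ranking while not even appearing on the publication-volume ranking.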
Beyond these, second place is another highlight: it went to Megvii,
and fellow Chinese company SenseTime also made the list.
Attached below is the complete list of the top-100 most cited papers of 2022.
The popularity of ChatGPT has truly reinvigorated the AI industry. In what direction will the latest cutting-edge research head? We will have to keep watching.
To this end, Zeta Alpha also provides the full list of the top-100 most cited AI papers of 2022, which may be inspiring to everyone.
[The original article presents the full ranked list as images, in segments 1-30, 31-60, 61-90, and 91-100.]
That's all for Zeta Alpha's report.
Original report: https://www.php.cn/link/ea20aed6df7caa746052d227d194a395
The above is the detailed content of "The ranking of 'high citation conversion rate' of AI papers is released: OpenAI ranks first, Megvii ranks second, and Google ranks ninth".

