
Tongyi Qianwen open sources a 32-billion-parameter model, bringing all seven of its major language models to open source

WBOY
2024-04-08 15:31:27

On April 7, Alibaba Cloud announced that Tongyi Qianwen has open sourced the 32-billion-parameter model Qwen1.5-32B, which strikes a balance between performance, efficiency, and memory usage, giving enterprises and developers a more cost-effective model option. With this release, Tongyi Qianwen has open sourced a total of seven large language models, whose cumulative downloads across open-source communities at home and abroad have exceeded 3 million.

Tongyi Qianwen had previously open sourced models with 500 million, 1.8 billion, 4 billion, 7 billion, 14 billion, and 72 billion parameters, all of which have been upgraded to version 1.5. The smaller models can be easily deployed on end devices, while the 72-billion-parameter model delivers industry-leading performance and has repeatedly appeared on HuggingFace and other model leaderboards. The newly open sourced 32-billion-parameter model strikes a more practical balance between performance, efficiency, and memory usage: compared with the 14B model, the 32B model is stronger in agent scenarios; compared with the 72B model, it has lower inference costs. The Tongyi Qianwen team hopes the 32B open-source model will provide better solutions for downstream applications.
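As a minimal sketch of how a developer might try the newly released chat variant, the snippet below loads it with the Hugging Face `transformers` library. The model ID `Qwen/Qwen1.5-32B-Chat`, the system prompt, and the generation settings are assumptions based on the standard Qwen1.5 release conventions, not details taken from this article; running the main block downloads tens of gigabytes of weights and needs a large GPU.

```python
# Hedged sketch: querying Qwen1.5-32B-Chat via transformers.
# The model ID and prompts below are assumptions, not from the article.
from typing import Dict, List

MODEL_ID = "Qwen/Qwen1.5-32B-Chat"  # assumed Hugging Face repo name


def build_messages(user_prompt: str) -> List[Dict[str, str]]:
    """Build the chat-format message list that Qwen1.5 chat models expect."""
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_prompt},
    ]


if __name__ == "__main__":
    # Heavy part: downloads the 32B weights and requires substantial GPU memory.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Render the messages with the model's built-in chat template.
    text = tokenizer.apply_chat_template(
        build_messages("Give a short introduction to large language models."),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer([text], return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    reply = tokenizer.decode(
        output_ids[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
    )
    print(reply)
```

The chat-template step matters because Qwen1.5 chat models are tuned on a specific conversation format; passing raw text without it typically degrades response quality.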


In terms of base-model capabilities, the Qwen 32-billion-parameter model performed well on benchmarks such as MMLU, GSM8K, HumanEval, and BBH. Its scores are close to those of the Qwen 72-billion-parameter model and far exceed those of other models at the 30-billion-parameter scale.


As for the chat model, Qwen1.5-32B-Chat scored above 8 on the MT-Bench evaluation, a relatively small gap behind Qwen1.5-72B-Chat.


For multilingual capability, 12 languages including Arabic, Spanish, French, Japanese, and Korean were selected, with evaluations covering areas such as comprehension, mathematics, and translation. The multilingual capability of Qwen1.5-32B is only slightly behind that of the Qwen 72-billion-parameter model.



Statement:
This article is reproduced from jiqizhixin.com. In case of any infringement, please contact admin@php.cn for deletion.