
70-Billion-Parameter APUS Large Model 3.0 Lingli Officially Open-Sourced

[DataYuan Digest] The 70-billion-parameter APUS Large Model 3.0 Lingli has been officially open-sourced.


On February 7, APUS and the National Engineering Laboratory for Big Data System Computing Technology at Shenzhen University (hereinafter the "National Engineering Laboratory for Big Data") announced that Lingli Linly-70B, a Chinese large language model jointly trained by the two parties, has been officially open-sourced on GitHub, making it the first open-source model of APUS Large Model 3.0. APUS Large Model 3.0 Lingli scored 80.6 on the Chinese benchmark C-Eval, surpassing GPT-4 in Chinese proficiency and ranking third among all models on the leaderboard, a significant improvement over its open-source base model, LLaMA2-70B. According to APUS, training took three months, supported by the computing power of the APUS Zhengzhou Intelligent Computing Center, and the model's current context length is set to 4K (roughly 8,000 to 10,000 Chinese characters).


Source: DIYuan

