70-Billion-Parameter APUS Large Model 3.0 Lingli Officially Open-Sourced

On February 7, APUS and the National Engineering Laboratory of Big Data System Computing Technology at Shenzhen University (hereinafter the "National Engineering Laboratory of Big Data") officially open-sourced on GitHub the Lingli Linly-70B Chinese large model, which the two parties jointly trained. It is the first open-source model in the APUS Large Model 3.0 family. APUS Large Model 3.0 Lingli scored 80.6 on the Chinese benchmark leaderboard C-Eval, surpassing GPT-4 in Chinese proficiency and ranking third among all participating models, a significant improvement over its open-source base model, LLaMA2-70B. According to the announcement, training took three months, supported by the computing power of the APUS Zhengzhou Intelligent Computing Center, and the model's context length is currently set to 4K (roughly 8,000 to 10,000 Chinese characters).
Source: DIYuan
