
Tongyi Qianwen open-sources a 100-billion-parameter model

[DataYuan Digest] Tongyi Qianwen has open-sourced its first 100-billion-parameter model.

April 28 news: Tongyi Qianwen has open-sourced Qwen1.5-110B, a 110-billion-parameter model and the first 100-billion-class open-source model in the series. The 110B model retains the Transformer decoder architecture of the Qwen1.5 series and adopts grouped-query attention (GQA) for more efficient inference. It supports a 32K context length and offers strong multilingual capability, covering Chinese, English, French, German, Spanish, Russian, Japanese, Korean, Vietnamese, Arabic, and other languages.
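The grouped-query attention (GQA) mentioned above lets several query heads share one key/value head, shrinking the KV cache at inference time. A minimal NumPy sketch of the idea follows; the head counts and dimensions are toy values for illustration, not Qwen1.5-110B's actual configuration:

```python
import numpy as np

def grouped_query_attention(q, k, v):
    """Grouped-query attention: each group of query heads shares one KV head.

    q: (n_q_heads, seq_len, d_head)
    k, v: (n_kv_heads, seq_len, d_head), with n_q_heads % n_kv_heads == 0
    Returns: (n_q_heads, seq_len, d_head)
    """
    n_q, seq_len, d = q.shape
    n_kv = k.shape[0]
    group = n_q // n_kv  # query heads per shared key/value head
    out = np.empty_like(q)
    for h in range(n_q):
        kv = h // group  # index of the KV head this query head shares
        scores = q[h] @ k[kv].T / np.sqrt(d)
        # numerically stable softmax over the key axis
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        out[h] = weights @ v[kv]
    return out

# Toy sizes: 8 query heads sharing 2 KV heads, so the KV cache is 4x smaller
# than standard multi-head attention with 8 KV heads.
rng = np.random.default_rng(0)
q = rng.standard_normal((8, 4, 16))
k = rng.standard_normal((2, 4, 16))
v = rng.standard_normal((2, 4, 16))
print(grouped_query_attention(q, k, v).shape)  # (8, 4, 16)
```

With one KV head per group rather than one per query head, the memory traffic for cached keys and values drops by the group size, which is the inference-efficiency gain the announcement refers to.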


Source: DIYuan

Statement: DataYuan (数据猿) respects media-industry norms and credits the source and author of related content. When republishing our original content, please be sure to credit "Source: DataYuan" along with the author's name; otherwise DataYuan will pursue accountability.

