Tongyi Qianwen Open-Sources 110-Billion-Parameter Model
[DataYuan Digest] Tongyi Qianwen open-sources a 110-billion-parameter model

On April 28, Tongyi Qianwen open-sourced Qwen1.5-110B, a 110-billion-parameter model and the first open-source model in the series at the 100-billion-parameter scale. The 110B model retains the Transformer decoder architecture of the Qwen1.5 series and adopts grouped-query attention (GQA) to make inference more efficient. It supports a 32K context length and offers strong multilingual capability, covering Chinese, English, French, German, Spanish, Russian, Japanese, Korean, Vietnamese, Arabic, and other languages.
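A rough sketch of why GQA improves inference efficiency: during decoding, the KV cache stores keys and values per key/value head, so letting groups of query heads share a smaller set of KV heads shrinks that cache. The head counts and layer count below are hypothetical placeholders for illustration, not Qwen1.5-110B's actual configuration.

```python
# Compare KV-cache memory for multi-head attention (MHA) vs grouped-query
# attention (GQA) at a 32K context. All model dimensions here are
# hypothetical, chosen only to illustrate the scaling.

def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_elem=2):
    """Bytes for the K and V caches across all layers for one sequence (fp16)."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

# Hypothetical shape: 80 layers, 64 query heads, head_dim 128, 32K tokens.
# MHA keeps one KV head per query head; GQA shares 8 KV heads across groups.
mha = kv_cache_bytes(n_layers=80, n_kv_heads=64, head_dim=128, seq_len=32768)
gqa = kv_cache_bytes(n_layers=80, n_kv_heads=8, head_dim=128, seq_len=32768)

print(f"MHA KV cache: {mha / 2**30:.1f} GiB")   # 80.0 GiB
print(f"GQA KV cache: {gqa / 2**30:.1f} GiB")   # 10.0 GiB
print(f"Reduction: {mha / gqa:.0f}x")           # 8x
```

The cache shrinks in direct proportion to the number of KV heads, which is why GQA matters most at long context lengths like 32K, where the KV cache rather than the weights can dominate memory use.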
Source: DIYuan
