About Baichuan-7B

Baichuan-7B is an open-source, commercially usable large-scale pretrained language model developed by Baichuan Intelligence. Built on the Transformer architecture, the 7-billion-parameter model was trained on roughly 1.2 trillion tokens, supports both Chinese and English, and has a context window of 4,096 tokens. It achieves the best results among models of its size on the standard Chinese and English benchmarks (C-Eval and MMLU).

Visit Official Website

https://github.com/baichuan-inc/baichuan-7B
