Alibaba's large model is here! Why is it called "Tongyi Qianwen"? We asked the model itself

There is news about the long-rumored Alibaba large model!
On April 7, Alibaba Cloud announced that its self-developed large model "Tongyi Qianwen" has begun inviting users for testing. At this stage, the invitation is aimed mainly at enterprise users. Users can apply through the official website (tongyi.aliyun.com), and those who qualify can take part in the trial.
Alibaba's DAMO Academy has reportedly been active for many years in frontier research fields such as natural language processing (NLP), and began developing large models in 2019.
Alibaba Cloud said that for now this is only an invitation for enterprise testing; the product has not been officially released. According to earlier reports, Alibaba Cloud will announce further progress at its Beijing summit on April 11.
Why is it called "Tongyi Qianwen"? Alibaba Cloud has not yet given an official explanation, but we have already obtained an invitation code for the closed beta, so we asked "Xiaoqian" (the model itself) to answer the question:
The name "Tongyi Qianwen" comes from two parts. "Tongyi" (universal understanding) means the model has broad knowledge and generality, and can understand and answer questions across many fields. As a large-scale pre-trained language model, Tongyi Qianwen learned from a large amount of text data during training, and therefore possesses cross-domain knowledge and language understanding ability.
"Qianwen" (a thousand questions) means the model can answer all kinds of questions, whether common, complex, or even rare. It expresses that Tongyi Qianwen is committed to meeting users' needs across different scenarios, no matter how complex or unusual the questions are. Taken together, the name "Tongyi Qianwen" conveys the powerful capabilities and wide applicability of this artificial intelligence language model.
As is well known, when ChatGPT set off a wave of excitement, Baidu was the first in China to announce its progress and release a product. Is it afraid of being compared with Wenxin Yiyan (ERNIE Bot)? Apparently "Xiaoqian" was well prepared, while Wenxin Yiyan came across as a little "tsundere".


As one of the first teams to invest in pre-trained language model research, DAMO Academy developed Alibaba's Tongyi AliceMind, which covers general pre-trained models, multilingual pre-trained models, super-large Chinese pre-trained models, and more, with capabilities in reading comprehension, machine translation, dialogue and question answering, and document processing.
In November 2022, on CLUE, the authoritative benchmark for Chinese language understanding, Alibaba's AI set a new record with a total score of 86.685, the first time in the nearly three years since the leaderboard was launched that an AI surpassed the human score (86.678).
Two years ago, DAMO Academy announced the latest progress on its multimodal large model M6: its parameter count had grown from one trillion to ten trillion, making it the world's largest AI pre-trained model at the time. Compared with traditional AI, such large models have orders of magnitude more "neurons", along with stronger cognitive and creative capabilities.
And a few days ago, stand-up comedian Niao Niao showed off her "digital clone" on Weibo. The avatar is a ChatGPT-like voice assistant trained by Alibaba that can imitate her voice, tone, and writing style.
According to earlier reports, the main focuses of its research are: 1. how to run large models safely and efficiently on personal devices, in home scenarios, and the like; 2. multimodal-driven AIGC (generative AI), covering text, images, voice, and video.
Recently, Hangzhou has frequently been trending as a "culinary desert". We also put this question to Tongyi Qianwen and Wenxin Yiyan:


Earlier, Zhang Yong, chairman and CEO of Alibaba Group, said on an earnings call that the combination of cloud computing and artificial intelligence is at a critical stage of technological breakthrough and development, and that Alibaba will go all in on building its own AI pre-trained large model, while providing solid computing power support for the surge of models and applications in the market.
Both Jack Ma and Wang Jian have recently shared their views on ChatGPT. Today, Alibaba's research results in the field of large AI models have finally surfaced. What are the characteristics of "Tongyi Qianwen", and when will it open to individual users? We will continue to follow up.