Materia Medica (formerly known as HuaTuo): instruction fine-tuning of large language models based on Chinese medical knowledge. A collection of models, including LLaMA, Alpaca-Chinese, Bloom, HuoZi, and others.
We built a Chinese medical instruction fine-tuning dataset from a medical knowledge graph and medical literature, with the help of the ChatGPT API, and used it to instruction-tune several base models, improving their question-answering performance in the medical domain.
News
[2023/08/07] 🔥🔥Released a model instruction-fine-tuned on HuoZi, with significantly improved performance. 🔥🔥
[2023/08/05] The Materia Medica model will be presented in the CCL 2023 Demo Track.
[2023/08/03] SCIR Lab open-sourced HuoZi, a general-purpose question-answering model; a fine-tuned model release based on it is welcome to follow.
[2023/05/12] The model was renamed from "HuaTuo" to "Materia Medica".
[2023/04/28] Added a model release for instruction fine-tuning based on the Chinese Alpaca large model.
[2023/04/24] Added a model release for instruction fine-tuning based on LLaMA and medical literature.
[2023/03/31] Added a model release for instruction fine-tuning based on LLaMA and a medical knowledge base.
Visit Official Website