【CN LLM】2.2 LangChain Application
Introduction: A question-answering application based on a local knowledge base. The goal is a knowledge-base QA solution that is friendly to Chinese scenarios and open-source models and can run fully offline; the entire pipeline can be built with open-source models. It currently supports direct access to large language models such as ChatGLM-6B, as well as models such as Vicuna, Alpaca, LLaMA, Koala, and RWKV through the FastChat API.
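The retrieve-then-generate flow these projects share can be sketched in plain Python. This is a toy illustration, not code from any of the listed projects: the bag-of-words "embedding" stands in for a real sentence-embedding model, and the final LLM call is omitted (the assembled prompt would be sent to ChatGLM-6B or a FastChat-served model).

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; real systems use a trained embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    # Rank knowledge-base chunks by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, context: list[str]) -> str:
    # The retrieved chunks are stuffed into the prompt handed to the LLM.
    joined = "\n".join(context)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{joined}\nQuestion: {query}"
    )

chunks = [
    "ChatGLM-6B is an open bilingual language model.",
    "LangChain helps build applications with LLMs.",
    "FastChat serves models such as Vicuna through an API.",
]
top = retrieve("Which model does FastChat serve?", chunks, k=1)
print(build_prompt("Which model does FastChat serve?", top))
```

In a production system the ranked chunks come from a vector store, but the shape of the pipeline (embed, retrieve, assemble prompt, generate) is the same.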
Introduction: A Web UI built with LangChain and the ChatGLM-6B series of models, providing large-model applications based on local knowledge. It currently supports uploading text files in formats such as txt, docx, md, and pdf, and provides model files for the ChatGLM-6B series, the Belle series, and others, as well as Embedding models such as 3.0-nano-zh.
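Before embedding, the file-upload workflow described above typically splits each document into overlapping chunks. A minimal fixed-size splitter sketch (the chunk size and overlap values are illustrative assumptions, not settings from this project):

```python
def split_text(text: str, chunk_size: int = 100, overlap: int = 20) -> list[str]:
    # Fixed-size character splitter with overlap, similar in spirit to the
    # text splitters used in LangChain pipelines; parameters are illustrative.
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

doc = "x" * 250
parts = split_text(doc, chunk_size=100, overlap=20)
print(len(parts))  # chunks start at offsets 0, 80, 160, 240 -> 4 chunks
```

The overlap keeps sentences that straddle a chunk boundary retrievable from at least one chunk.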
Introduction: Based on langchain-ChatGLM, this project adds a local-knowledge-base question-answering application that loads the TigerBot model.
Introduction: Localized knowledge-base retrieval and intelligent answer generation based on ChatGLM-6B + LangChain, including the ability to incorporate Internet search results.
Introduction: ⚡ DemoGPT enables you to create quick demos just by using prompts. ⚡