mirror of
https://github.com/csunny/DB-GPT.git
synced 2025-09-09 12:59:43 +00:00
update:doc
@@ -3,7 +3,7 @@
Chatting with your own knowledge base is a very useful capability. In this chapter we introduce how to build your own knowledge base through the knowledge base API. A knowledge store can currently be initialized by running "python tool/knowledge_init.py", which loads your own content into the knowledge base, as introduced in the previous knowledge base module. Alternatively, you can call the knowledge embedding API we provide to store knowledge.
-We currently support four document formats: txt, pdf, url, and md.
+We currently support many document formats: txt, pdf, md, html, doc, ppt, and url.
```
vector_store_config = {
"vector_store_name": name
@@ -11,7 +11,7 @@ vector_store_config = {
file_path = "your file path"
-knowledge_embedding_client = KnowledgeEmbedding(file_path=file_path, model_name=LLM_MODEL_CONFIG["text2vec"], local_persist=False, vector_store_config=vector_store_config)
+knowledge_embedding_client = KnowledgeEmbedding(file_path=file_path, model_name=LLM_MODEL_CONFIG["text2vec"], vector_store_config=vector_store_config)
knowledge_embedding_client.knowledge_embedding()
@@ -37,7 +37,7 @@ vector_store_config = {
query = "your query"
-knowledge_embedding_client = KnowledgeEmbedding(file_path="", model_name=LLM_MODEL_CONFIG["text2vec"], local_persist=False, vector_store_config=vector_store_config)
+knowledge_embedding_client = KnowledgeEmbedding(file_path="", model_name=LLM_MODEL_CONFIG["text2vec"], vector_store_config=vector_store_config)
knowledge_embedding_client.similar_search(query, 10)
```
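Putting the two updated snippets together, here is a hedged sketch of the full ingest-then-search flow after this change. `KnowledgeEmbedding` and `LLM_MODEL_CONFIG` come from DB-GPT; the exact import paths, the collection name `my_knowledge_space`, and the `ingest_and_search` wrapper are illustrative assumptions, not part of the documented API.

```python
# A minimal end-to-end sketch of the flow above: embed a document into a
# vector store, then run a similarity search against it. Import paths for
# KnowledgeEmbedding and LLM_MODEL_CONFIG are assumptions and may differ
# between DB-GPT versions.

vector_store_config = {
    # Name of the vector store collection to write to and query from
    # (illustrative value).
    "vector_store_name": "my_knowledge_space",
}


def ingest_and_search(file_path: str, query: str, top_k: int = 10):
    """Embed `file_path` into the store, then return the top_k similar chunks.

    Requires a DB-GPT checkout on the Python path; the import locations
    below are assumptions based on the project layout.
    """
    from pilot.configs.model_config import LLM_MODEL_CONFIG
    from pilot.source_embedding.knowledge_embedding import KnowledgeEmbedding

    # 1) Embed and store the document (note: no local_persist argument
    # after this change).
    ingest_client = KnowledgeEmbedding(
        file_path=file_path,
        model_name=LLM_MODEL_CONFIG["text2vec"],
        vector_store_config=vector_store_config,
    )
    ingest_client.knowledge_embedding()

    # 2) Query the same store; file_path is empty because nothing new
    # is being ingested.
    search_client = KnowledgeEmbedding(
        file_path="",
        model_name=LLM_MODEL_CONFIG["text2vec"],
        vector_store_config=vector_store_config,
    )
    return search_client.similar_search(query, top_k)
```

Separating the ingest client (with a real `file_path`) from the search client (with an empty `file_path`) mirrors the two snippets in the doc: the same `vector_store_config` ties both clients to one collection.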