privateGPT/private_gpt/server/chunks
Latest commit: 45f05711eb by Iván Martínez, 2024-03-06 17:51:30 +01:00
feat: Upgrade to LlamaIndex to 0.10 (#1663)

* Extract optional dependencies
* Separate local mode into llms-llama-cpp and embeddings-huggingface for clarity
* Support Ollama embeddings
* Upgrade to llama-index 0.10.14; remove the legacy use of ServiceContext in ContextChatEngine (see the migration sketch below)
* Fix vector retriever filters
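
To illustrate the ServiceContext removal mentioned in the commit notes, here is a minimal sketch of how a ContextChatEngine can be wired in LlamaIndex 0.10 by passing components directly; the mock LLM, mock embedding, and sample document are placeholders, not PrivateGPT's actual configuration.

```python
# Minimal sketch of the ServiceContext -> direct-component migration in
# LlamaIndex 0.10, using mock components; PrivateGPT's real wiring differs.
from llama_index.core import Document, Settings, VectorStoreIndex
from llama_index.core.chat_engine import ContextChatEngine
from llama_index.core.embeddings import MockEmbedding
from llama_index.core.llms import MockLLM
from llama_index.core.memory import ChatMemoryBuffer

# The global Settings object (or direct constructor arguments) replaces the
# old ServiceContext bundle.
Settings.llm = MockLLM()
Settings.embed_model = MockEmbedding(embed_dim=8)

index = VectorStoreIndex.from_documents(
    [Document(text="PrivateGPT answers questions over local documents.")]
)

# ContextChatEngine now receives its collaborators explicitly instead of a
# ServiceContext.
chat_engine = ContextChatEngine.from_defaults(
    retriever=index.as_retriever(similarity_top_k=2),
    llm=Settings.llm,
    memory=ChatMemoryBuffer.from_defaults(token_limit=2048),
)
print(chat_engine.chat("What does PrivateGPT do?"))
```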
File               Last commit                                    Date
__init__.py        Next version of PrivateGPT (#1077)             2023-10-19 16:04:35 +02:00
chunks_router.py   fix: Remove global state (#1216)               2023-11-12 22:20:36 +01:00
chunks_service.py  feat: Upgrade to LlamaIndex to 0.10 (#1663)    2024-03-06 17:51:30 +01:00
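
The file names suggest a thin-router layout: chunks_router.py exposes the HTTP endpoint and delegates to chunks_service.py, which queries the vector store. Below is a hypothetical sketch of that split; the class, field, and endpoint names are illustrative stand-ins, not PrivateGPT's actual API.

```python
# Hypothetical sketch of the router/service split implied by the file names
# above; identifiers here are illustrative, not PrivateGPT's own.
from fastapi import APIRouter, Depends
from pydantic import BaseModel


class ChunksBody(BaseModel):
    text: str          # query to match against ingested documents
    limit: int = 10    # maximum number of chunks to return


class ChunksService:
    """Stands in for the service that queries the vector index for chunks."""

    def retrieve_relevant(self, text: str, limit: int) -> list[str]:
        # A real implementation would run a vector retriever (with filters)
        # over the ingested documents and return the matching chunks.
        return []


def get_chunks_service() -> ChunksService:
    # PrivateGPT resolves services via dependency injection rather than
    # global state; a fresh instance stands in for that here.
    return ChunksService()


chunks_router = APIRouter(prefix="/v1")


@chunks_router.post("/chunks")
def chunks_retrieval(
    body: ChunksBody,
    service: ChunksService = Depends(get_chunks_service),
) -> list[str]:
    return service.retrieve_relevant(body.text, body.limit)
```

Mounting chunks_router on a FastAPI app and POSTing a body with the sketched fields to /v1/chunks would then return whatever the service retrieves.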