docs(openai proxy): Fix openai proxy configuration in llms.md (#233)

Fix the name of the OpenAI proxy model in the `llms.md` document,
changing the `LLM_MODEL` variable from "proxy_llm" to "proxyllm".
magic.chen 2023-06-16 11:15:32 +08:00 committed by GitHub
commit 3ccd939fe3

@@ -115,7 +115,7 @@ PROXY_SERVER_URL=https://api.openai.com/v1/chat/completions
 - If you can't access OpenAI locally but have an OpenAI proxy service, you can configure as follows.
 ```
-LLM_MODEL=proxy_llm
+LLM_MODEL=proxyllm
 MODEL_SERVER=127.0.0.1:8000
 PROXY_API_KEY=sk-xxx
 PROXY_SERVER_URL={your-openai-proxy-server/v1/chat/completions}
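For context, a minimal sketch of how a client might consume these proxy settings: it reads `PROXY_API_KEY` and `PROXY_SERVER_URL` from the environment and builds an OpenAI-compatible chat completions request for the proxy to forward. The function name and the forwarded model name are illustrative assumptions, not part of the project's actual code.

```python
import json
import os
import urllib.request

# Illustrative defaults mirroring the .env keys from the docs above.
os.environ.setdefault("PROXY_API_KEY", "sk-xxx")
os.environ.setdefault("PROXY_SERVER_URL",
                      "http://127.0.0.1:8000/v1/chat/completions")


def build_proxy_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for the proxy.

    Hypothetical helper: the request body follows the OpenAI chat
    completions schema, which the proxy is expected to forward upstream.
    """
    headers = {
        "Content-Type": "application/json",
        # The proxy authenticates with the same bearer scheme as OpenAI.
        "Authorization": f"Bearer {os.environ['PROXY_API_KEY']}",
    }
    body = {
        "model": "gpt-3.5-turbo",  # model the proxy forwards to (assumption)
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        os.environ["PROXY_SERVER_URL"],
        data=json.dumps(body).encode("utf-8"),
        headers=headers,
        method="POST",
    )


req = build_proxy_request("Hello")
```

This only constructs the request; actually sending it (e.g. via `urllib.request.urlopen(req)`) requires the proxy service at `PROXY_SERVER_URL` to be running.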