together: add chat models, use openai base (#21337)

**Description:** Adds chat completions to the Together AI package; chat
completions is our most popular API. The integration stays backwards
compatible with the old API, so folks can continue to use the completions
API as well. The embeddings integration has also been moved onto the
OpenAI client library to standardize it further.
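
Sketched below (not part of this PR's diff) is how the pieces fit together after this change, based on the updated notebook: `ChatTogether` for the new chat completions, the existing `Together` class kept for plain completions, and embeddings now routed through the OpenAI client. The `TogetherEmbeddings` class name and the embedding model id are assumptions for illustration only.

```python
# Minimal sketch; assumes `langchain-together` is installed and TOGETHER_API_KEY is set.
from langchain_together import ChatTogether, Together

# New: chat completions via Together's OpenAI-compatible chat endpoint.
chat = ChatTogether()  # a specific model can be selected with `model=...`
print(chat.invoke("Tell me fun things to do in NYC").content)

# Unchanged for existing users: plain completions.
llm = Together(model="codellama/CodeLlama-70b-Python-hf")
print(llm.invoke("def bubble_sort(): "))

# Embeddings now use the OpenAI client under the hood; the class name and
# model id here are assumptions, shown only to illustrate the call shape.
from langchain_together import TogetherEmbeddings

embeddings = TogetherEmbeddings(model="togethercomputer/m2-bert-80M-8k-retrieval")
print(len(embeddings.embed_query("hello world")))
```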

**Twitter handle:** @nutlope

- [x] **Add tests and docs**: If you're adding a new integration, please
include a test for the integration and an example notebook showing its use.
- [x] **Lint and test**: Run `make format`, `make lint` and `make test`
from the root of the package(s) you've modified. See contribution
guidelines for more: https://python.langchain.com/docs/contributing/

If no one reviews your PR within a few days, please @-mention one of
baskaryan, efriis, eyurtsev, hwchase17.

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
Authored by Hassan El Mghari on 2024-05-06 20:47:06 -04:00; committed by GitHub
parent a2d31307bb
commit d6ef5fe86a
22 changed files with 1501 additions and 473 deletions


@@ -7,15 +7,21 @@
"source": [
"# Together AI\n",
"\n",
"> The Together API makes it easy to fine-tune or run leading open-source models with a couple lines of code. We have integrated the worlds leading open-source models, including Llama-2, RedPajama, Falcon, Alpaca, Stable Diffusion XL, and more. Read more: https://together.ai\n",
"> The Together API makes it easy to query and fine-tune leading open-source models with a couple lines of code. We have integrated the worlds leading open-source models, including Llama-3, Mixtral, DBRX, Stable Diffusion XL, and more. Read more: https://together.ai\n",
"\n",
"To use, you'll need an API key which you can find here:\n",
"https://api.together.xyz/settings/api-keys. This can be passed in as init param\n",
"https://api.together.ai/settings/api-keys. This can be passed in as init param\n",
"``together_api_key`` or set as environment variable ``TOGETHER_API_KEY``.\n",
"\n",
"Together API reference: https://docs.together.ai/reference"
"Together API reference: https://docs.together.ai"
]
},
+{
+"cell_type": "markdown",
+"id": "1c47fc36",
+"metadata": {},
+"source": []
+},
{
"cell_type": "code",
"execution_count": null,
@@ -28,40 +34,43 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": null,
"id": "637bb53f",
"metadata": {},
"outputs": [],
"source": [
"# Running chat completions with Together AI\n",
"\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"from langchain_together import ChatTogether\n",
"\n",
"chat = ChatTogether()\n",
"\n",
"# using chat invoke\n",
"chat.invoke(\"Tell me fun things to do in NYC\")\n",
"\n",
"# using chat stream\n",
"for m in chat.stream(\"Tell me fun things to do in NYC\"):\n",
" print(m)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e7b7170d-d7c5-4890-9714-a37238343805",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"A: A large language model is a neural network that is trained on a large amount of text data. It is able to generate text that is similar to the training data, and can be used for tasks such as language translation, question answering, and text summarization.\n",
"\n",
"A: A large language model is a neural network that is trained on a large amount of text data. It is able to generate text that is similar to the training data, and can be used for tasks such as language translation, question answering, and text summarization.\n",
"\n",
"A: A large language model is a neural network that is trained on\n"
]
}
],
"outputs": [],
"source": [
"# Running completions with Together AI\n",
"\n",
"from langchain_together import Together\n",
"\n",
"llm = Together(\n",
" model=\"togethercomputer/RedPajama-INCITE-7B-Base\",\n",
" temperature=0.7,\n",
" max_tokens=128,\n",
" top_k=1,\n",
" model=\"codellama/CodeLlama-70b-Python-hf\",\n",
" # together_api_key=\"...\"\n",
")\n",
"\n",
"input_ = \"\"\"You are a teacher with a deep knowledge of machine learning and AI. \\\n",
"You provide succinct and accurate answers. Answer the following question: \n",
"\n",
"What is a large language model?\"\"\"\n",
"print(llm.invoke(input_))"
"print(llm.invoke(\"def bubble_sort(): \"))"
]
}
],
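
As additional context for the notebook changes above: the updated markdown cell describes two ways of supplying credentials (the `TOGETHER_API_KEY` environment variable or the `together_api_key` init param), and the new chat cell demonstrates streaming. A minimal sketch of both, using only parameters mentioned in the diff:

```python
import getpass
import os

# Option 1: environment variable, as described in the notebook's markdown cell.
if "TOGETHER_API_KEY" not in os.environ:
    os.environ["TOGETHER_API_KEY"] = getpass.getpass("Together API key: ")

from langchain_together import ChatTogether

# Option 2: pass the key explicitly via the `together_api_key` init param.
chat = ChatTogether(together_api_key=os.environ["TOGETHER_API_KEY"])

# Streaming, as in the new chat cell: print chunks as they arrive.
for chunk in chat.stream("Tell me fun things to do in NYC"):
    print(chunk.content, end="", flush=True)
```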