diff --git a/docs/docs/integrations/chat/together.ipynb b/docs/docs/integrations/chat/together.ipynb new file mode 100644 index 00000000000..4a3b07e57d1 --- /dev/null +++ b/docs/docs/integrations/chat/together.ipynb @@ -0,0 +1,119 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "2970dd75-8ebf-4b51-8282-9b454b8f356d", + "metadata": {}, + "source": [ + "# Together AI\n", + "\n", + "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models) in a couple lines of code.\n", + "\n", + "This example goes over how to use LangChain to interact with Together AI models." + ] + }, + { + "cell_type": "markdown", + "id": "1c47fc36", + "metadata": {}, + "source": [ + "## Installation" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1ecdb29d", + "metadata": {}, + "outputs": [], + "source": [ + "%pip install --upgrade langchain-together" + ] + }, + { + "cell_type": "markdown", + "id": "89883202", + "metadata": {}, + "source": [ + "## Environment\n", + "\n", + "To use Together AI, you'll need an API key which you can find here:\n", + "https://api.together.ai/settings/api-keys.
This can be passed in as an init param\n", + "``together_api_key`` or set as environment variable ``TOGETHER_API_KEY``.\n" + ] + }, + { + "cell_type": "markdown", + "id": "8304b4d9", + "metadata": {}, + "source": [ + "## Example" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "637bb53f", + "metadata": {}, + "outputs": [], + "source": [ + "# Querying chat models with Together AI\n", + "\n", + "from langchain_together import ChatTogether\n", + "\n", + "# choose from our 50+ models here: https://docs.together.ai/docs/inference-models\n", + "chat = ChatTogether(\n", + " # together_api_key=\"YOUR_API_KEY\",\n", + " model=\"meta-llama/Llama-3-70b-chat-hf\",\n", + ")\n", + "\n", + "# stream the response back from the model\n", + "for m in chat.stream(\"Tell me fun things to do in NYC\"):\n", + " print(m.content, end=\"\", flush=True)\n", + "\n", + "# if you don't want to do streaming, you can use the invoke method\n", + "# chat.invoke(\"Tell me fun things to do in NYC\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e7b7170d-d7c5-4890-9714-a37238343805", + "metadata": {}, + "outputs": [], + "source": [ + "# Querying code and language models with Together AI\n", + "\n", + "from langchain_together import Together\n", + "\n", + "llm = Together(\n", + " model=\"codellama/CodeLlama-70b-Python-hf\",\n", + " # together_api_key=\"...\"\n", + ")\n", + "\n", + "print(llm.invoke(\"def bubble_sort(): \"))" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": ".venv", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.4" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/docs/docs/integrations/providers/together.ipynb b/docs/docs/integrations/providers/together.ipynb 
index aaffcb65abe..4a3b07e57d1 100644 --- a/docs/docs/integrations/providers/together.ipynb +++ b/docs/docs/integrations/providers/together.ipynb @@ -2,89 +2,102 @@ "cells": [ { "cell_type": "markdown", + "id": "2970dd75-8ebf-4b51-8282-9b454b8f356d", "metadata": {}, "source": [ "# Together AI\n", "\n", - "> [Together AI](https://together.ai) is a cloud platform for building and running generative AI.\n", - "> \n", - "> It makes it easy to fine-tune or run leading open-source models with a couple lines of code.\n", - "> We have integrated the world’s leading open-source models, including `Llama-2`, `RedPajama`, `Falcon`, `Alpaca`, `Stable Diffusion XL`, and more. Read mo\n", + "[Together AI](https://www.together.ai/) offers an API to query [50+ leading open-source models](https://docs.together.ai/docs/inference-models) in a couple lines of code.\n", "\n", - "## Installation and Setup\n", - "\n", - "To use, you'll need an API key which you can find [here](https://api.together.xyz/settings/api-keys).\n", - "\n", - "API key can be passed in as init param\n", - "``together_api_key`` or set as environment variable ``TOGETHER_API_KEY``.\n", - "\n", - "See details in the [Together API reference](https://docs.together.ai/reference)\n", - "\n", - "You will also need to install the `langchain-together` integration package:" + "This example goes over how to use LangChain to interact with Together AI models." + ] + }, + { + "cell_type": "markdown", + "id": "1c47fc36", + "metadata": {}, + "source": [ + "## Installation" ] }, { "cell_type": "code", "execution_count": null, + "id": "1ecdb29d", "metadata": {}, "outputs": [], "source": [ - "%pip install --upgrade --quiet langchain-together" + "%pip install --upgrade langchain-together" ] }, { "cell_type": "markdown", + "id": "89883202", "metadata": {}, "source": [ - "## LLMs\n", + "## Environment\n", "\n", - "See a [usage example](/docs/integrations/llms/together)." 
- ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "id": "y8ku6X96sebl" - }, - "outputs": [], - "source": [ - "from langchain_together import Together" + "To use Together AI, you'll need an API key which you can find here:\n", + "https://api.together.ai/settings/api-keys. This can be passed in as an init param\n", + "``together_api_key`` or set as environment variable ``TOGETHER_API_KEY``.\n" ] }, { "cell_type": "markdown", - "metadata": { - "execution": { - "iopub.execute_input": "2024-04-03T18:49:24.701100Z", - "iopub.status.busy": "2024-04-03T18:49:24.700943Z", - "iopub.status.idle": "2024-04-03T18:49:24.705570Z", - "shell.execute_reply": "2024-04-03T18:49:24.704943Z", - "shell.execute_reply.started": "2024-04-03T18:49:24.701088Z" - } - }, + "id": "8304b4d9", + "metadata": {}, "source": [ - "## Embedding models\n", - "\n", - "See a [usage example](/docs/integrations/text_embedding/together)." + "## Example" ] }, { "cell_type": "code", "execution_count": null, + "id": "637bb53f", "metadata": {}, "outputs": [], "source": [ - "from langchain_together.embeddings import TogetherEmbeddings" + "# Querying chat models with Together AI\n", + "\n", + "from langchain_together import ChatTogether\n", + "\n", + "# choose from our 50+ models here: https://docs.together.ai/docs/inference-models\n", + "chat = ChatTogether(\n", + " # together_api_key=\"YOUR_API_KEY\",\n", + " model=\"meta-llama/Llama-3-70b-chat-hf\",\n", + ")\n", + "\n", + "# stream the response back from the model\n", + "for m in chat.stream(\"Tell me fun things to do in NYC\"):\n", + " print(m.content, end=\"\", flush=True)\n", + "\n", + "# if you don't want to do streaming, you can use the invoke method\n", + "# chat.invoke(\"Tell me fun things to do in NYC\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e7b7170d-d7c5-4890-9714-a37238343805", + "metadata": {}, + "outputs": [], + "source": [ + "# Querying code and language models with Together AI\n", + 
"\n", + "from langchain_together import Together\n", + "\n", + "llm = Together(\n", + " model=\"codellama/CodeLlama-70b-Python-hf\",\n", + " # together_api_key=\"...\"\n", + ")\n", + "\n", + "print(llm.invoke(\"def bubble_sort(): \"))" ] } ], "metadata": { - "colab": { - "provenance": [] - }, "kernelspec": { - "display_name": "Python 3 (ipykernel)", + "display_name": ".venv", "language": "python", "name": "python3" }, @@ -98,9 +111,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.12" + "version": "3.11.4" } }, "nbformat": 4, - "nbformat_minor": 4 + "nbformat_minor": 5 }