diff --git a/docs/docs/get_started/quickstart.mdx b/docs/docs/get_started/quickstart.mdx
index 6bb49da682e..2f73bbd24e6 100644
--- a/docs/docs/get_started/quickstart.mdx
+++ b/docs/docs/get_started/quickstart.mdx
@@ -65,10 +65,10 @@ We will link to relevant docs.
 
 ## LLM Chain
 
-We'll show how to use models available via API, like OpenAI and Cohere, and local open source models, using integrations like Ollama.
+We'll show how to use models available via API, like OpenAI, and local open source models, using integrations like Ollama.
 
-
+
 
 First we'll need to import the LangChain x OpenAI integration package.
 
@@ -115,7 +115,36 @@ llm = Ollama(model="llama2")
 ```
 
-
+
+
+First we'll need to import the LangChain x Anthropic package.
+
+```shell
+pip install langchain-anthropic
+```
+
+Accessing the API requires an API key, which you can get by creating an account [here](https://claude.ai/login). Once we have a key we'll want to set it as an environment variable by running:
+
+```shell
+export ANTHROPIC_API_KEY="..."
+```
+
+We can then initialize the model:
+
+```python
+from langchain_anthropic import ChatAnthropic
+
+llm = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)
+```
+
+If you'd prefer not to set an environment variable you can pass the key in directly via the `anthropic_api_key` named parameter when instantiating the Anthropic Chat Model class:
+
+```python
+llm = ChatAnthropic(anthropic_api_key="...")
+```
+
+
 
 First we'll need to import the Cohere SDK package.
diff --git a/docs/docs/modules/model_io/quick_start.mdx b/docs/docs/modules/model_io/quick_start.mdx
index 191117418ad..872378234e6 100644
--- a/docs/docs/modules/model_io/quick_start.mdx
+++ b/docs/docs/modules/model_io/quick_start.mdx
@@ -7,7 +7,7 @@ sidebar_position: 0
 
 The quick start will cover the basics of working with language models. It will introduce the two different types of models - LLMs and ChatModels.
 It will then cover how to use PromptTemplates to format the inputs to these models, and how to use Output Parsers to work with the outputs. For a deeper conceptual guide into these topics - please see [this documentation](./concepts)
 
 ## Models
 
-For this getting started guide, we will provide two options: using OpenAI (a popular model available via API) or using a local open source model.
+For this getting started guide, we will provide a few options: using an API like Anthropic or OpenAI, or using a local open source model via Ollama.
 
 import Tabs from '@theme/Tabs';
 import TabItem from '@theme/TabItem';
 
@@ -62,6 +62,35 @@ from langchain_community.chat_models import ChatOllama
 
 llm = Ollama(model="llama2")
 chat_model = ChatOllama()
+```
+
+
+
+First we'll need to import the LangChain x Anthropic package.
+
+```shell
+pip install langchain-anthropic
+```
+
+Accessing the API requires an API key, which you can get by creating an account [here](https://claude.ai/login). Once we have a key we'll want to set it as an environment variable by running:
+
+```shell
+export ANTHROPIC_API_KEY="..."
+```
+
+We can then initialize the model:
+
+```python
+from langchain_anthropic import ChatAnthropic
+
+chat_model = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)
+```
+
+If you'd prefer not to set an environment variable you can pass the key in directly via the `anthropic_api_key` named parameter when instantiating the Anthropic Chat Model class:
+
+```python
+chat_model = ChatAnthropic(anthropic_api_key="...")
 ```
 
@@ -84,7 +113,7 @@ We can then initialize the model:
 
 ```python
 from langchain_community.chat_models import ChatCohere
 
-llm = ChatCohere()
+chat_model = ChatCohere()
 ```
 
 If you'd prefer not to set an environment variable you can pass the key in directly via the `cohere_api_key` named parameter when initiating the Cohere LLM class:
 
@@ -92,7 +121,7 @@ If you'd prefer not to set an environment variable you can pass the key in direc
 ```python
 from langchain_community.chat_models import ChatCohere
 
-llm = ChatCohere(cohere_api_key="...")
+chat_model = ChatCohere(cohere_api_key="...")
 ```
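The docs added by this patch describe a precedence rule: a key passed directly via `anthropic_api_key` is used as-is, and the `ANTHROPIC_API_KEY` environment variable is the fallback. A minimal sketch of that lookup logic, using a hypothetical `resolve_anthropic_api_key` helper (illustration only, not part of `langchain-anthropic`):

```python
import os

def resolve_anthropic_api_key(explicit_key=None):
    # Hypothetical helper: a key passed directly takes precedence;
    # otherwise fall back to the ANTHROPIC_API_KEY environment variable,
    # mirroring the behavior the docs describe.
    key = explicit_key or os.environ.get("ANTHROPIC_API_KEY")
    if not key:
        raise ValueError(
            "Set ANTHROPIC_API_KEY or pass anthropic_api_key directly"
        )
    return key

os.environ["ANTHROPIC_API_KEY"] = "env-key"
print(resolve_anthropic_api_key("direct-key"))  # direct-key
print(resolve_anthropic_api_key())              # env-key
```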