@@ -141,10 +141,10 @@ from langchain_anthropic import ChatAnthropic
 llm = ChatAnthropic(model="claude-3-sonnet-20240229", temperature=0.2, max_tokens=1024)
 ```

-If you'd prefer not to set an environment variable you can pass the key in directly via the `anthropic_api_key` named parameter when initiating the Anthropic Chat Model class:
+If you'd prefer not to set an environment variable you can pass the key in directly via the `api_key` named parameter when initiating the Anthropic Chat Model class:

 ```python
-llm = ChatAnthropic(anthropic_api_key="...")
+llm = ChatAnthropic(api_key="...")
 ```

 </TabItem>
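Both initialization styles from the hunk above keep working after the rename; only the keyword changes from `anthropic_api_key` to `api_key`. A minimal sketch of the two options, assuming `langchain_anthropic` is installed and using placeholder key values:

```python
import os

from langchain_anthropic import ChatAnthropic

# Option 1: rely on the ANTHROPIC_API_KEY environment variable, which the
# client picks up automatically (placeholder value, not a real key).
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."
llm = ChatAnthropic(model="claude-3-sonnet-20240229", temperature=0.2, max_tokens=1024)

# Option 2: pass the key explicitly via the renamed `api_key` parameter.
llm = ChatAnthropic(model="claude-3-sonnet-20240229", api_key="sk-ant-...")
```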
@@ -69,7 +69,7 @@
 "source": [
 "The code provided assumes that your ANTHROPIC_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:\n",
 "```python\n",
-"chat = ChatAnthropic(temperature=0, anthropic_api_key=\"YOUR_API_KEY\", model_name=\"claude-3-opus-20240229\")\n",
+"chat = ChatAnthropic(temperature=0, api_key=\"YOUR_API_KEY\", model_name=\"claude-3-opus-20240229\")\n",
 "\n",
 "```\n",
 "\n",
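Unescaping the notebook JSON above, the updated snippet amounts to the code below; the `invoke` call at the end is an illustrative addition (not part of the diff) showing the model being used:

```python
from langchain_anthropic import ChatAnthropic

# Explicit API key and model choice via the renamed `api_key` parameter.
chat = ChatAnthropic(temperature=0, api_key="YOUR_API_KEY", model_name="claude-3-opus-20240229")

# Illustrative usage: send a single prompt and print the reply text.
response = chat.invoke("Hello, Claude")
print(response.content)
```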
@@ -52,8 +52,8 @@
 "source": [
 "```{=mdx}\n",
 "<ChatModelTabs\n",
+" anthropicParams={`model=\"claude-3-sonnet-20240229\", api_key=\"...\"`}\n",
 " openaiParams={`model=\"gpt-3.5-turbo-0125\", api_key=\"...\"`}\n",
-" anthropicParams={`model=\"claude-3-sonnet-20240229\", anthropic_api_key=\"...\"`}\n",
 " mistralParams={`model=\"mistral-large-latest\", api_key=\"...\"`}\n",
 " fireworksParams={`model=\"accounts/fireworks/models/mixtral-8x7b-instruct\", api_key=\"...\"`}\n",
 " googleParams={`model=\"gemini-pro\", google_api_key=\"...\"`}\n",
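`ChatModelTabs` is the docs-site component that expands each `*Params` string into a per-provider code tab. Assuming it simply interpolates the params into the matching constructor call (an assumption about the component, not shown in this diff), the Anthropic tab would now render roughly the first call below, while the Google tab in the same hunk keeps its provider-prefixed keyword:

```python
from langchain_anthropic import ChatAnthropic
from langchain_google_genai import ChatGoogleGenerativeAI  # presumed class behind googleParams

# Anthropic tab after this change: generic `api_key` keyword.
llm = ChatAnthropic(model="claude-3-sonnet-20240229", api_key="...")

# Google tab still uses `google_api_key` in this diff.
llm = ChatGoogleGenerativeAI(model="gemini-pro", google_api_key="...")
```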
@@ -158,10 +158,10 @@ from langchain_anthropic import ChatAnthropic
 chat_model = ChatAnthropic(model="claude-3-sonnet-20240229", temperature=0.2, max_tokens=1024)
 ```

-If you'd prefer not to set an environment variable you can pass the key in directly via the `anthropic_api_key` named parameter when initiating the Anthropic Chat Model class:
+If you'd prefer not to set an environment variable you can pass the key in directly via the `api_key` named parameter when initiating the Anthropic Chat Model class:

 ```python
-chat_model = ChatAnthropic(anthropic_api_key="...")
+chat_model = ChatAnthropic(api_key="...")
 ```

 </TabItem>
@@ -87,10 +87,10 @@ from langchain_anthropic import ChatAnthropic
 chat_model = ChatAnthropic(model="claude-3-sonnet-20240229", temperature=0.2, max_tokens=1024)
 ```

-If you'd prefer not to set an environment variable you can pass the key in directly via the `anthropic_api_key` named parameter when initiating the Anthropic Chat Model class:
+If you'd prefer not to set an environment variable you can pass the key in directly via the `api_key` named parameter when initiating the Anthropic Chat Model class:

 ```python
-chat_model = ChatAnthropic(anthropic_api_key="...")
+chat_model = ChatAnthropic(api_key="...")
 ```

 </TabItem>